About

Who should be involved

A key objective of the CARDIO assessment is to improve communication and understanding between the various stakeholders involved in data management and curation within an institution. Within CARDIO, we've grouped the various stakeholders into three broad categories:

Coordinators

This may be a bespoke post within an organisation, or a more established position such as a subject librarian. Their role is to coordinate between those generating data and representatives of the infrastructural services required to support its effective management. Expected to act as an independent facilitator for both communities, the Coordinator is conscious of the roles and responsibilities of each, with their primary motivation being the pursuit of wider organisational objectives consistent with data management best practice.

Data Originators

In most cases this is likely to be a researcher, typically generating digital content and associated descriptive material and relying on infrastructural services to support its creation, curation and dissemination.

Service Providers

Within an academic context this role is highly heterogeneous; service providers may include representatives from the library, information services, legal support, and financial services.

CARDIO assessments should involve a representative from each of these groups to ensure that data management needs and requirements can be fully understood and assessed against institutional infrastructure and staff resources. Every CARDIO assessment needs a Coordinator: someone who will initiate and oversee the assessment. The assessment doesn't necessarily have to be initiated by research staff or research support staff; it is just as valid for it to be initiated by central support staff such as repository managers.

Top tip: We recommend that CARDIO assessments are undertaken at the project, research group or departmental level in the first instance. A realistic scope helps to ensure that the assessment yields valuable results.

Getting started

Once you've identified your Coordinator and pinpointed who else needs to be involved at your institution, you're ready to get started. There are five principal phases of workflow, but several can be skipped or extended to reflect an organisation's circumstances. The tool is designed to reward a higher level of detail with increasingly focused, relevant and accurate results, but will never demand it.

Stage 1 - Starting Point

  • The Coordinator registers their data management context, completing a small number of fields describing organisational and data facets in very broad terms. For example, they may be asked to rate technological storage infrastructure on a scale from 1 to 5.
  • Responses are compared to benchmarking metrics and an estimated level of data management capacity within that setting is issued to the user (a simplified sketch of this comparison follows this list).
  • Evidence of real-world provisions that match or exceed these levels is presented to support further refinement of this initial assessment, and to prompt thoughts about what might be implemented in order to improve.
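
To make the benchmarking step a little more concrete, here is a minimal sketch in Python of how a set of broad 1-to-5 ratings might be mapped to an estimated capacity level. The facet names, the benchmark bands and the level labels are invented for illustration; they are not CARDIO's actual benchmarking metrics.

```python
# Illustrative sketch only: the facet names, benchmark bands and level
# labels below are invented; they are not CARDIO's actual metrics.

# Coordinator's broad 1-5 ratings for a few organisational and data facets
ratings = {
    "technological_storage_infrastructure": 3,
    "staff_skills_and_training": 2,
    "policy_and_planning": 4,
}

# Hypothetical benchmark bands mapping an average rating to a capacity level
BENCHMARK_BANDS = [
    (4.0, "well developed"),
    (3.0, "managed"),
    (2.0, "developing"),
    (0.0, "initial"),
]


def estimate_capacity(facet_ratings):
    """Return (level, average) estimated from the coordinator's broad ratings."""
    average = sum(facet_ratings.values()) / len(facet_ratings)
    for threshold, level in BENCHMARK_BANDS:
        if average >= threshold:
            return level, average


level, average = estimate_capacity(ratings)
print(f"Average rating {average:.1f} -> estimated capacity: {level}")
```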

Stage 2 - Collaboration

A critical dimension of this tool is its multi-user focus. The Coordinator has the option of breaking up the questions within the tool and distributing them to the departments or individuals considered to have the clearest responsibility in each area.
  • Representatives of all three stakeholder communities present a structured assessment of data management provisions within their institutional context (indicating their own perceptions of various aspects of data management capacity).
  • Respondents must also indicate their own perceived level of responsibility for that area.
  • Each party must respond with a quantitative evaluation justified with accompanying comments and thoughts (a sketch of such a structured response follows this list). These evaluations are intended to be as objective as possible, with accompanying documentation making explicit what must be practically demonstrable to satisfy each level of capability.
  • Respondents' confidence in their stated maturity ratings is challenged by exposing them to possible risks associated with that area.
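
As an illustration of the kind of structured response each stakeholder submits at this stage, the sketch below models a single assessment entry. The field names, the 1-to-5 scale and the example values are assumptions made for illustration; they are not CARDIO's actual data model.

```python
# Hypothetical sketch of one stakeholder's structured response for a single
# area; the field names and 1-5 scale are illustrative, not CARDIO's schema.
from dataclasses import dataclass, field


@dataclass
class AssessmentResponse:
    area: str              # e.g. "Data backup and storage"
    stakeholder_group: str  # "Coordinator", "Data Originator" or "Service Provider"
    rating: int            # quantitative capability rating, 1 (lowest) to 5 (highest)
    responsibility: str    # respondent's perceived responsibility: "mine", "shared" or "unclear"
    justification: str     # comments making the rating practically demonstrable
    acknowledged_risks: list = field(default_factory=list)  # risks the respondent has considered


response = AssessmentResponse(
    area="Data backup and storage",
    stakeholder_group="Service Provider",
    rating=4,
    responsibility="mine",
    justification="Nightly off-site backups with quarterly restore tests.",
    acknowledged_risks=["Restore procedure not documented for all systems"],
)
print(response)
```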

Stage 3 - Clarification and Consensus

Once the individual assessments are complete, the tool enters a subsequent phase in which individuals' responses are revealed to the group as a whole.

  • Perceptions match: In some cases the perceptions of infrastructural or researcher provisions (from the three stakeholder groups) will more or less match - this represents consensus on the state of the data management context, which means that stakeholders can reflect together on opportunities for improvement (the sketch after this list illustrates how matching and diverging ratings might be distinguished).
  • Perceptions differ, responsibility clear: Sometimes the points of view of researchers and service providers differ. For example, technology service providers may believe their data backup provisions do satisfy curation requirements, but data originators consider them ill-suited. If this is the case, and responsibilities for the provision are clear:
    • Recourse is made to the initial justifications that accompanied the original capacity level assessment; in some cases these will provide the necessary clarity to highlight misconceptions.
    • In other cases the Coordinator must arbitrate to determine the correct assertion. Even where those communities agree on current capacity, if the Coordinator does not, this will warrant further investigation, since an objective perspective is critical.
  • Perceptions differ, responsibility unclear: In such cases the role of the Coordinator becomes critical. They may allocate responsibility to an existing stakeholder, or alternatively highlight the lack of an appropriate responsible individual as a compelling risk; the assumption of responsibility becomes a key priority for subsequent improvement activity.
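
The comparison of perceptions can be thought of as a simple divergence check across stakeholder ratings for each area. The sketch below is illustrative only: the tolerance, area names, ratings, responsibility flags and outcome wording are assumptions, not CARDIO's actual consensus logic.

```python
# Illustrative consensus check across stakeholder ratings for each area.
# The tolerance, areas, ratings and responsibility flags are invented.

ratings_by_area = {
    # area: {stakeholder group: 1-5 rating}
    "Data backup": {"Data Originator": 2, "Service Provider": 4, "Coordinator": 3},
    "Metadata support": {"Data Originator": 3, "Service Provider": 3, "Coordinator": 3},
    "Preservation planning": {"Data Originator": 2, "Service Provider": 4, "Coordinator": 2},
}

responsibility_is_clear = {
    "Data backup": True,
    "Metadata support": True,
    "Preservation planning": False,
}

TOLERANCE = 1  # ratings within this spread of one another count as a match

for area, ratings in ratings_by_area.items():
    spread = max(ratings.values()) - min(ratings.values())
    if spread <= TOLERANCE:
        outcome = "consensus - reflect together on opportunities for improvement"
    elif responsibility_is_clear[area]:
        outcome = "divergence - revisit justifications; Coordinator arbitrates if needed"
    else:
        outcome = "divergence, responsibility unclear - flag as a key risk"
    print(f"{area}: {outcome}")
```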

Stage 4 - Conclusion

Assuming that the previous stages yield consensus on the capacity for data management within the contextual setting, this stage is about asserting in practical terms what this really means. The key outcomes include:

  • A visual representation of current areas of strength and weakness (sketched after this list);
  • A list of practical recommendations for improvement, based on real-world evidence collated from the legacy tools that precede this work. Any given area may exhibit greater or lesser levels of maturity than others.
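
As a rough illustration of what a visual summary of strengths and weaknesses might look like, the sketch below renders per-area maturity scores as a simple text chart. The area names, scores and strength/weakness thresholds are invented for the example; CARDIO's own reports will differ.

```python
# Illustrative text rendering of per-area maturity scores on a 1-5 scale.
# The areas, scores and strength/weakness thresholds are invented.

maturity_scores = {
    "Storage and backup": 4,
    "Metadata and documentation": 2,
    "Preservation planning": 1,
    "Policy and governance": 3,
}

MAX_SCORE = 5

for area, score in sorted(maturity_scores.items(), key=lambda item: item[1]):
    bar = "#" * score + "." * (MAX_SCORE - score)
    label = "strength" if score >= 4 else "weakness" if score <= 2 else "adequate"
    print(f"{area:<28} [{bar}] {score}/{MAX_SCORE} ({label})")
```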

Stage 5 - Commitment

Representatives from each stakeholder group reflect on the conclusions that have been reached, and consider (by reference to the objective benchmarking evidence) what must be done in each area to improve, and progress to the next phase of maturity.

  • Requirements (what must be in place) are allocated to those responsible, who must in turn accept that responsibility and commit to deliver within a stated time period.
  • All areas will normally be reassessed simultaneously, during a subsequent iteration through the stages of the tool, but there may be particular areas prioritised for earlier or later assessment.
  • Users will be prompted by the tool when particular areas are due to be reviewed, and the previous phases are simply repeated (a minimal sketch of such a due-date check follows this list).
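
The review prompts described above amount to checking each area's agreed reassessment date against the current date. The sketch below illustrates that check with invented commitments and dates; it is not CARDIO's actual scheduling mechanism.

```python
# Hypothetical sketch: flag commitments whose agreed review date has passed.
# The areas, responsible parties and dates are invented for illustration.
from datetime import date

commitments = [
    # (area, responsible party, agreed review date)
    ("Data backup", "Service Provider", date(2024, 3, 1)),
    ("Metadata support", "Data Originator", date(2026, 9, 1)),
]

today = date.today()
for area, owner, review_date in commitments:
    if review_date <= today:
        print(f"Review due: '{area}' ({owner}) - repeat the assessment stages")
    else:
        print(f"'{area}' next review scheduled for {review_date.isoformat()}")
```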

It is anticipated that the responses provided in justifying the adopted data management capability ratings will provide input for the specialist legacy tools that precede this work. For example, if one is focusing on issues directly concerning the digital preservation environment, any identified ambitions, activities or risks may be automatically imported into the DRAMBORA Interactive online tool.

 
