
Towards an evaluation framework: some components from JISC work

Tuesday 8 May 2012, SIM4RDM Project Meeting, Utrecht. Towards an evaluation framework: some components from JISC work. Simon Hodson, JISC Programme Manager, Managing Research Data.


Presentation Transcript


  1. Tuesday 8 May 2012, SIM4RDM Project Meeting, Utrecht. Towards an evaluation framework: some components from JISC work. Simon Hodson, JISC Programme Manager, Managing Research Data

  2. SIM4RDM Evaluation Framework • Aim and deliverable: develop ‘a model and framework that can be used for evaluating funding programmes developed using the [intervention] models from WP3, including what metrics to monitor, how to ascertain impact etc.’ • Provides: • ‘a benchmark from which to evaluate the intended change and improvement of funding interventions by programme owners’ • ‘mechanisms for monitoring funding interventions and their effectiveness in improving research data management skills and support’

  3. Evaluation Methodologies • Evaluation methodologies from JISC and elsewhere. • Potential components for the evaluation framework: • Identifying costs • Defining and quantifying benefits • Maturity models for benchmarking • Understanding immediate and mediated impacts • SIM4RDM to examine and evaluate these, consider their broader applicability, and incorporate them into an evaluation framework.

  4. Questions? • How are we evaluating these guides/tools (and others)? • Is the methodology sound? • Do they approach the challenge in an effective way? • Do they help gather appropriate information? • Are they adaptable to other contexts? • Can they form the basis of an evaluation framework? • What other methodologies are available that may be useful? • How might they be combined into an evaluation framework?

  5. Identifying Costs • Various models exist for identifying the costs of digital preservation of research data. • Keeping Research Data Safe (activity-based cost model, identification of cost variables): http://www.beagrie.com/krds.php • Includes activity model, cost drivers, resources template. • Danish Cost Model for Digital Preservation: http://www.dcc.ac.uk/resources/external/cost-model-digital-preservation • KE Nordbib Workshop: http://www.knowledge-exchange.info/Default.aspx?ID=512 • DANS work…
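
A minimal sketch, in Python, of what an activity-based cost model of this kind looks like in practice; the activity names, staff rate, overhead and cost-driver figures are illustrative assumptions, not values from the KRDS resources template or the Danish model.

# Illustrative activity-based costing in the spirit of KRDS: each activity has
# an assumed staff effort and non-staff cost per unit of a cost driver (number
# of deposits, terabytes stored). All figures are hypothetical.

STAFF_RATE_GBP_PER_HOUR = 35.0   # assumed loaded staff rate
FIXED_OVERHEAD_GBP = 2000.0      # assumed annual planning/management overhead

activities = [
    # (activity, staff_hours_per_unit, non_staff_gbp_per_unit, units, unit_name)
    ("Ingest and checking", 2.0,  5.0, 120, "deposits"),
    ("Metadata creation",   3.0,  0.0, 120, "deposits"),
    ("Storage and backup",  1.0, 40.0,   5, "TB"),
]

total = FIXED_OVERHEAD_GBP
for name, hours, non_staff, units, unit_name in activities:
    cost = units * (hours * STAFF_RATE_GBP_PER_HOUR + non_staff)
    total += cost
    print(f"{name:22s} {cost:10.2f} GBP  ({units} {unit_name})")

print(f"{'Total annual cost':22s} {total:10.2f} GBP")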

  6. Defining and Quantifying Benefits • KRDS-I2S2 Toolkit: http://beagrie.com/krds-i2s2.php • Developed from KRDS, from working with the MRD Programme and from subsequent project funding. • Comprises: • Benefits framework: a guide to identifying, assessing and communicating benefits. • Value chain and benefits impact tool: a tool for ‘quantifying benefits/impact’ and targeting interventions (within a lifecycle model). • Appraised by ADS and UKDA: used for analysis and to provide evidence; felt to be useful. • Open questions: how easy is it to adopt in institutions? Could it offer better guidance on suggested metrics?
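
A minimal sketch of how identified benefits might be recorded alongside beneficiaries, metrics and supporting evidence; the field names and example entries are assumptions for illustration, not the KRDS-I2S2 toolkit's own schema.

# Hypothetical record structure for captured benefits; the fields and the
# example entries are assumptions for illustration, not the KRDS-I2S2 schema.
from dataclasses import dataclass

@dataclass
class Benefit:
    description: str   # what improves
    beneficiary: str   # who gains (researcher, institution, funder, ...)
    timeframe: str     # near-term or long-term
    metric: str        # how the benefit could be quantified
    evidence: str      # source of evidence, e.g. a case study or survey

benefits = [
    Benefit("Improved organisation and retrieval of data", "researcher",
            "near-term", "hours saved per month locating datasets",
            "researcher survey"),
    Benefit("Avoided cost of data loss", "institution", "long-term",
            "estimated re-collection cost avoided", "storage audit"),
]

for b in benefits:
    print(f"{b.description}: measure via '{b.metric}' ({b.evidence})")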

  7. Incentives and Benefits for RDM and Sharing • 37% projected saving in staff time and infrastructure costs from moving the Oxford Roman Economy Project database to a centralised virtual service. • A one-day delay cut to 5 minutes: the estimated time saving for crystallography researchers accessing results from the Diamond synchrotron, achieved by deploying a digital processing pipeline and metadata capture system. • Making the Case for RDM, DCC Briefing Paper: http://www.dcc.ac.uk/resources/briefing-papers/making-case-rdm
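
A back-of-envelope calculation of the kind that underpins such time-saving figures; every number below is a hypothetical assumption, not data from the Oxford or Diamond cases.

# Illustrative benefit estimate: value of reducing a waiting time.
# All figures are hypothetical assumptions used only for illustration.

accesses_per_year = 200          # assumed result retrievals per researcher
old_wait_hours = 8.0             # roughly a one-working-day delay
new_wait_hours = 5 / 60          # five minutes
staff_rate_gbp_per_hour = 35.0   # assumed loaded staff rate

hours_saved = accesses_per_year * (old_wait_hours - new_wait_hours)
print(f"Hours saved per researcher per year: {hours_saved:.0f}")
print(f"Indicative value: {hours_saved * staff_rate_gbp_per_hour:.0f} GBP")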

  8. Incentives and Benefits • Benefits Identified by JISC MRD Projects • Improved organisation and retrieval of data > improved research efficiency, better use of research time • Secure data storage and backup > avoidance of cost of data loss • Improved awareness of departmental data assets > better reuse of data • Platform for data sharing, citation and discovery > better reuse of data, new research opportunities • Clear and accessible guidance > improved practice, compliance with funder requirements • Improved systems for design and execution of data management plans > easier and more frequent production of DMPs > better compliance with funder requirements • Efficiency savings through centralised and coordinated support, hosting > cost benefits

  9. Incentives and Benefits • Benefits Case Studies: http://www.jisc.ac.uk/whatwedo/programmes/mrd/outputs/benefit_studies.aspx • Report on Benefits from Infrastructure Projects in the JISCMRD Programme: http://www.jisc.ac.uk/whatwedo/programmes/mrd/outputs/benefitsreport.aspx

  10. How MaDAM Met User Requirements MaDAM met identified user requirements by: • providing an accessible and simple platform for the organization and annotation of research data • providing trusted secure storage to reduce risks of data loss • providing a mechanism to make metadata visible and searchable • facilitating easier, more secure owner-controlled data sharing • reducing redundancy/duplication by making existing resources easier to discover • maintaining media format accessibility for long-term reuse • enabling compliance with legal and funder obligations

  11. JISCMRD Programme Evidence Gathering • Building evidence for project business cases. • Building evidence for benefits at a programme level. • Identify likely benefits; identify likely evidence and metrics. • Present these via blogs and reports to form an evidence base. • Evidence Gatherers’ Blog: http://mrdevidence.jiscinvolve.org/wp/
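
A small sketch of rolling project-level evidence up into a programme-level view; the project names, benefit categories and metrics are hypothetical illustrations, not entries from the Evidence Gatherers' blog.

# Sketch of aggregating project-level evidence to programme level; the
# projects, categories and metrics below are hypothetical illustrations.
from collections import defaultdict

project_evidence = [
    # (project, benefit_category, metric_reported)
    ("Project A", "research efficiency", "2 hours/week saved per researcher"),
    ("Project B", "research efficiency", "faster access to instrument data"),
    ("Project B", "data reuse", "3 external reuse requests in year 1"),
    ("Project C", "funder compliance", "DMPs produced for all new grants"),
]

by_category = defaultdict(list)
for project, category, metric in project_evidence:
    by_category[category].append((project, metric))

for category, items in by_category.items():
    print(f"{category} ({len(items)} items of evidence):")
    for project, metric in items:
        print(f"  - {project}: {metric}")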

  12. Maturity Model for Benchmarking CARDIO: http://www.dcc.ac.uk/projects/cardio ; http://cardio.dcc.ac.uk/

  13. Maturity Model for Benchmarking • CARDIO ‘Lite’: see introduction to RDM activities, DCC Roadshow: http://www.dcc.ac.uk/events/data-management-roadshows/dcc-roadshow-northeast • UWE ‘Case Study’: http://www1.uwe.ac.uk/library/usingthelibrary/servicesforresearchers/datamanagement/managingresearchdata/projectoutputs/workpackages12/casestudy.aspx • In particular, ‘benefits and risks matrix’; and ‘enterprise maturity model’.
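
A simple sketch of maturity-style benchmarking in which several stakeholders rate capability areas and large disagreements are flagged for discussion; the capability areas, the 1-5 scale and the ratings are illustrative assumptions, not statements taken from CARDIO.

# Illustrative maturity benchmarking; areas, scale and ratings are assumed.

ratings = {
    # area: ratings (1 = ad hoc ... 5 = optimised) from different stakeholders,
    # e.g. researcher, IT service, library
    "Policy and strategy":        [2, 3, 2],
    "Storage and backup":         [4, 3, 4],
    "Metadata and documentation": [1, 2, 2],
    "Training and support":       [2, 2, 3],
}

for area, scores in ratings.items():
    mean = sum(scores) / len(scores)
    spread = max(scores) - min(scores)
    flag = "  <- disagreement, discuss" if spread >= 2 else ""
    print(f"{area:28s} mean {mean:.1f}  spread {spread}{flag}")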

  14. Seeking win + win + win + win + win… • Where do I safely keep my data from my fieldwork as I travel home? • How do we ensure we have access to our research data after some of the team have left? • How can our research collaborations share data, and make them available once complete? • How can I best keep years’ worth of research data secure and accessible for when I and others need to re-use it? • How do we ensure compliance with funders’ requirements for several years of open access to data? • Levels shown on the slide: individual researcher; PhD student; supra-university research team; university. • Source: Research Integrity, London, September 2011.
