The Snowball project aims to define a common set of research metrics that help higher education institutions make better decisions. Born of dissatisfaction with existing tools, it seeks to establish global sector standards for data and metric definitions. Working with Elsevier, the project has demonstrated that benchmarks can feasibly be generated from multiple data sources. The next steps are to refine the metrics and promote their adoption worldwide.
Project Snowball – sharing data for cross-institutional benchmarking
Lisa Colledge, Anna Clements, M'hamed el Aisati, Scott Rutherford
euroCRIS 2012, with modifications for the JISC RIM Meeting, Bristol, 28 June 2012
The aims of Snowball
• Higher education institutions:
• Agree a common set of metrics to support institutional decision making
• Reach consensus on standard methodologies for calculating these metrics
• Publish the "recipe book" as open standard definitions
• These metrics will cover the entire landscape of research activity
• These metrics will become global sector standards
The origins of these aims...
BACKGROUND
• Growing recognition of the value of metrics to support strategies
• Dissatisfaction with the tools available
• Frustration over the availability of metrics to make sensible measurements
RECOMMENDATIONS
• Institutions and funders should work more collaboratively, and develop stronger relationships with suppliers
• An agreed national framework for data and metric standards is needed
• Suppliers should participate in the development of data and metric standards
Joint Imperial-Elsevier JISC-funded study of research information management, available via http://www.projectsnowball.info/
Snowball has evolved from these recommendations
• Agree methodologies for a standard set of metrics to support strategic decision making
• Driven by higher education institutions - with recognised common challenges and a shared goal - working with a supplier (Elsevier), with everyone contributing voluntarily
The goal? To enable cross-institutional benchmarking
Comprehensive metrics landscape
Metrics require institutional, proprietary and third party data
Test 1 of metric calculation across the landscape
Approach: institution and Elsevier contribute data on 10 chemistry researchers as a proxy for the whole university
Challenges identified:
• Definitions of metrics
• Data availability across the landscape
• Sensitivity of some data types (next slide)
• Researcher-level data
• Manual labour in data collection
Test 2 of metric calculation feasibility
Approach: institution and Elsevier test scalability by contributing data on the whole university for a smaller set of metrics
How the test 1 challenges were addressed:
• Definitions of metrics: experts group formed to select and define phase 1 metrics - impactful, do-able, requiring data from all 3 sources
• Data availability and sensitivity of some data types: data agreement prepared by partners; the most sensitive data types were excluded from phase 1
• Researcher-level data: used minimally, reflected in metric granularity
• Manual labour in data collection: institution and Elsevier supply data as close to its native form as possible
Test 2 of metric calculation feasibility
HESA FTE (research, and research & teaching), HESA cost centre
Metrics require institutional, proprietary and third party data
Project Snowball recap
• Driven by the sector
• Facilitated and supported by Elsevier
• Public service
The project has demonstrated the feasibility of scalably inputting data from 3 sources to generate metrics and benchmarks:
• Institutional
• Proprietary
• Third party
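To make the three-source model concrete, the sketch below shows the kind of calculation such a pipeline could support: combining institutional staff data (e.g. HESA FTE by cost centre), supplier-provided publication counts, and third-party grant income to produce simple per-FTE benchmark metrics. All names and figures here are illustrative assumptions, not actual Snowball metric definitions or project data.

```python
# Hypothetical sketch: combining the three Snowball data-source types
# to compute per-FTE benchmark metrics by cost centre.
# All values below are invented for illustration only.

# Institutional data: research FTE per HESA cost centre
institutional_fte = {"Chemistry": 40.0, "Physics": 55.0}

# Proprietary (supplier) data: publication counts per cost centre
supplier_publications = {"Chemistry": 320, "Physics": 410}

# Third-party data: external grant income per cost centre, in GBP
third_party_income = {"Chemistry": 2_100_000, "Physics": 3_400_000}


def per_fte(values, fte):
    """Normalise a per-cost-centre measure by research FTE.

    Skips cost centres with no FTE figure or zero FTE to avoid
    division errors; real metric definitions would specify this.
    """
    return {
        cc: values[cc] / fte[cc]
        for cc in values
        if cc in fte and fte[cc] > 0
    }


if __name__ == "__main__":
    pubs_per_fte = per_fte(supplier_publications, institutional_fte)
    income_per_fte = per_fte(third_party_income, institutional_fte)
    for cc in pubs_per_fte:
        print(f"{cc}: {pubs_per_fte[cc]:.1f} publications/FTE, "
              f"£{income_per_fte[cc]:,.0f} income/FTE")
```

The point of the normalisation step is that it makes the figures comparable across institutions of different sizes, which is what cross-institutional benchmarking requires.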
Next steps
• Publish the phase 1 metrics "recipe book" as open standards – Sep 2012
• Refine phase 1 metrics as global standards, and extend the same approach to more metrics
• CERIFy metrics – meeting scheduled Sep 2012
• Spread the word – Russell Group, 94 Group, vendors, funders
Thank you for your attention
www.projectsnowball.info
l.colledge@elsevier.com
j.green@imperial.ac.uk
anna.clements@st-andrews.ac.uk