
Survey Results and extended use cases (CIA and ROS)




  1. Survey Results and extended use cases (CIA and ROS) James Toon University of Edinburgh @jamestoon

  2. CIA Scenarios • Researcher moving to another research organisation (HEI => HEI) • Researcher uploading data to research council at end of project (HEI => ROS)

  3. Survey scope • To collect data that would allow a before/after comparison for data exchange • Two surveys, one for each use case • To use the findings to test scenarios and assess whether previously held efficiency claims are realistic • To try to identify any clear gaps and possible extensions to CIA use cases.

  4. About the CIA survey • 20 institutional responses – a poor return • Survey open 28th Aug – 5th Oct • Distributed across a number of lists, with particular interest in ARMA respondents • Why the poor response? Unclear, but perhaps a lack of understanding of the area • Produced using Bristol Online Surveys

  5. Q1. Respondent role types

  6. Q2. Within your institution, who has responsibility for transferring research information to or from core systems?

  7. Q3. Which of the following types of research information data are typically requested by staff members for transfer between institutions?

  8. Q4. What are the typical challenges faced when working on the transfer of research information data in or out of an institution?

  9. Q5. Do you have a formal information transfer service?

  10. Q6/7 Process and frequency • Q6/7 asked for an indication of the process, time and effort involved in effecting transfer of data • In general, results indicate no clear approach, and low-frequency ad hoc activity • Different role responses suggest not much ‘joined-up thinking’ • For example:

  11. Q6/7 Significant variation • CRIS/Repository Manager “Research Support office does this. I expect it takes around 5 minutes in total to find, extract, format and send data.” • Research Support Officer “Download from Research Information System plus additional download of grants information from research grants database and/or finance system. Estimate of effort: 0.5 day”

  12. Q8 Additional comments • Q8 (final question) asked for any additional comments on the transfer of data • Respondents painted a picture of a developing requirement • A need to understand local context • The desire to standardise is very welcome, but it is also very early days.

  13. Survey Synopsis • It's primarily about the money • There is a demand for non-publication output data – such as esteem indicators, impacts etc. • Requests to transfer data in or out of an institution for HEI-HEI transfer are ad hoc at best • For HEI-HEI transfer, we seem to be asking about a problem that's not seen as a problem.

  14. Immediate thoughts for extended use cases • No clear HEI-to-HEI demand identified; we want to investigate this further (discussion on demand/lack of demand invited) • Obvious demand for bulk importing identified from the ROS survey work – HEI-RCUK (50% of submissions by bulk approach) • Also an obvious lack of structured data management for non-publication impact/esteem data from the CIA survey.

  15. Roadmap • Practical adoption of CERIF is now a reality • Leadership was identified as critical [1], and is now coming from RCUK members/HEFCE • The barriers to adoption are diminishing and are now mainly practical, e.g. REF taking priority at the moment, and capital outlay • Some barriers remain substantial – for example, standardisation of data types/classifications needs to be agreed and cascaded down to HEI installations [1] Stuart Bolton, The Business Case for the Adoption of a UK Standard for Research Information Interchange, report to JISC, July 2010. http://www.jisc.ac.uk/media/documents/publications/reports/2010/Businesscasefinalreport.pdf

  16. What next for CIA – extended use cases • Complete mapping of RCUK ROS/Researchfish entities to CERIF and implementation in local systems • Define a taxonomy of common RIM data types and establish these as data sources • Benchmarking – data re-ingest into local systems from RCUK/HEFCE to institutions • Information sharing for public/researcher use • Subject or geographic aggregations (engagement with Gateway to Research) • Dynamic linking of data at the institutional level (to support collaboration opportunities)

  17. RCUK ROS Survey • 236 replies • 79.2% Principal Investigator • 11.9% Research Office Manager / Administrator • 5.9% Delegate (co-investigator, associate researcher) • 3.0% Institute Manager / Administrator

  18. Headlines • ROS ease of use • 64.2% satisfactory or better • Look-up services (useful or very useful) • DOI - 51.9% • ISBN/ISSN - 44.7% • RoMEO guidance - 28.2% • PubMed - 27.3% • 67.8% said that they use an institutional repository or CRIS • No Research Office Managers answered this question!

  19. Analysis of uploading method • Even split between single submission through the website and bulk upload • Submit by look-up reference (e.g. DOI) = 1 minute on average to submit • Submit through the web form = between 4 and 8 minutes per outcome • Bulk submit = between 1 and 3 minutes per outcome to prepare • Total community effort per month: • If 5 minutes per single outcome, then 214 "working" days • If 2 minutes per bulk outcome, then 90 "working" days • A 57% reduction in effort through using the bulk submit feature
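The headline reduction on this slide can be reproduced with simple arithmetic. A back-of-envelope sketch, using only the day totals quoted above (the underlying monthly outcome counts were not published in the survey summary):

```python
# Back-of-envelope check of the community-effort figures quoted on the slide.
single_days = 214  # "working" days/month at ~5 min per single outcome
bulk_days = 90     # "working" days/month at ~2 min per bulk outcome

# Proportional reduction in total community effort from bulk submission
reduction = (single_days - bulk_days) / single_days
print(f"Effort reduction from bulk submission: {reduction:.0%}")
```

This reproduces the slide's headline figure to within rounding (about 58% versus the quoted 57%).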

  20. Reporting costs… • The "reporting" cost per grant per year: • £15.40 using the single method • £6.50 using bulk submit • The CERIF business case was based on application-submission savings, but… • £0.50 for CERIF?
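As a rough cross-check (a sketch using only the £ figures quoted above, not recomputed from staff rates), the per-grant costs imply roughly the same proportional saving as the effort analysis on the previous slide:

```python
# Per-grant annual reporting cost, as quoted on the slide.
cost_single = 15.40  # £ per grant per year, single-submission method
cost_bulk = 6.50     # £ per grant per year, bulk-submit method

saving = cost_single - cost_bulk          # £ saved per grant per year
reduction = saving / cost_single          # proportional reduction
print(f"Saving per grant per year: £{saving:.2f} ({reduction:.0%} reduction)")
```

The roughly 58% cost reduction is consistent with the ~57% effort reduction from bulk submission, which is what you would expect if reporting cost scales with staff time.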

  21. Questions? • Note: the survey has temporarily been re-opened until 26th October to encourage further responses. https://www.survey.ed.ac.uk/cia_r2/
