
What do we want?: reports from practising researchers in three UK universities



Presentation Transcript


  1. What do we want?: reports from practising researchers in three UK universities
  Bruce Beckles (University of Cambridge Computing Service), David Spence (Reading and Oxford), Luis Martinez (Oxford)

  2. Background
  Mid-2004, Cambridge: post of e-Science Specialist, University Computing Service (UCS) created: …the UCS “champion” of eScience [sic] within the University, raising the profile of what is available.
  Late 2007, Reading and Oxford: shared post (Reading IT Services / Oxford e-Research Centre) of e-Science Development Officer created

  3. Problem It is difficult to be an effective champion of something if one does not understand its relevance to the community to whom one is championing it

  4. Solution Understanding the work practices of a community is a prerequisite to determining what is appropriate for that community

  5. Methodology (Cambridge)
  Snowball technique to find interviewees:
  • Initial contact usually IT support personnel
  In-depth semi-structured interviews:
  • Interviewees promised anonymity
  • Usually just one interviewer (Bruce)
  • Usually just one interviewee; occasionally groups of 2-3 interviewees
  • Interview audio recorded where agreed

  6. Methodology (Oxford)
  Focussed on those involved with e-Research
  Informal interviews:
  • One of two interviewers (David, Luis)
  • Primary focus of Luis’ interviews was research data management
  Also information gathered from websites

  7. Methodology (Reading)
  Snowball technique to find interviewees:
  • Started from e-Research advisory group; initial contact usually IT support personnel
  Informal interviews and an on-line survey:
  • Only one interviewer (David)
  • All based on same fixed set of questions
  Second on-line survey to rank results:
  • Disseminated more widely within institution

  8. Scope

  9. Subject Areas

  10. Staff Type

  11. Analysis (to date)
  Common categorisation of issues raised in each institution:
  • Partly informed by classification of “barriers” from e-Uptake/ENGAGE
  Simple count of number of (adjusted) interviews in which these issues have been raised
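  The tally described above is straightforward. As a rough sketch only (not the analysis code actually used, and with invented issue-category names), counting the number of interviews in which each issue is raised might look like this in Python:

      from collections import Counter

      # Each interview reduced to the set of issue categories it raised
      # (hypothetical categories, for illustration only).
      interviews = [
          {"training needs", "data storage"},
          {"training needs", "compute capacity"},
          {"data storage"},
      ]

      # Count the number of interviews in which each issue was raised;
      # using sets means an issue counts at most once per interview.
      counts = Counter()
      for issues in interviews:
          counts.update(issues)

      total = len(interviews)
      for issue, n in counts.most_common():
          print(f"{issue}: raised in {n} of {total} interviews ({100 * n / total:.0f}%)")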

  12. Overall Aggregated Issues (Interviews: 182)

  13. Top 10 Overall Issues (Interviews: 182)

  14. Across Institutions
  Percentage figures are the % of interviews at the specified institution raising each issue

  15. Across Institutions
  Percentage figures are the % of interviews at the specified institution raising each issue
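  The percentage here is simply the count of interviews at an institution raising a given issue, divided by the number of interviews at that institution. A minimal sketch (the per-issue counts below are invented placeholders, not the real figures; only the interview totals of 83, 59 and 40 come from the later slides):

      # Total interviews per institution (from the slides) and a
      # hypothetical per-institution count of interviews raising one issue.
      interviews_at = {"Cambridge": 83, "Oxford": 59, "Reading": 40}
      raising_issue = {"Cambridge": 50, "Oxford": 30, "Reading": 20}   # invented values

      for inst, total in interviews_at.items():
          pct = 100 * raising_issue[inst] / total
          print(f"{inst}: {pct:.0f}% of interviews raised the issue")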

  16. Sample Bias
  Cambridge:
  • Interviewers’ interest in training issues
  Oxford:
  • Interviewers’ interest in data issues (particularly Luis’ interviews)
  • Primary aim to find possible new collaborations for OeRC (“expertise register”)
  Reading:
  • Snowball seeded by e-Research advisory group
  • Interviewer (David) same as one of the interviewers in Oxford
  • Primary aim to discover current work and inform future strategy

  17. Top 10 Cambridge Issues (Interviews: 83)
  Aggregated training needs: 86%
  [Chart: top 10 Cambridge issues; figures in red give each issue’s ranking in the overall aggregated issues]

  18. Top 10 Oxford Issues (Interviews: 59)
  [Chart: top 10 Oxford issues; figures in bold give each issue’s ranking in the overall aggregated issues]

  19. Top 10 Reading Issues (Interviews: 40)
  [Chart: top 10 Reading issues; figures in red give each issue’s ranking in the overall aggregated issues]

  20. Across Subject Areas
  Percentage figures are the % of interviews in the specified subject area raising each issue

  21. Across Subject Areas
  Percentage figures are the % of interviews in the specified subject area raising each issue

  22. Natural/Formal Sciences vs. Social Sciences & Humanities
  Percentage figures are the % of interviews in the specified type of subject raising each issue

  23. Natural/Formal Sciences vs. Social Sciences & Humanities
  Percentage figures are the % of interviews in the specified type of subject raising each issue

  24. Top 10 Natural/Formal Sciences Issues (Interviews: 134)
  [Chart: top 10 natural/formal sciences issues; figures in red give each issue’s ranking in the overall aggregated issues]

  25. Top 10 Social Sciences & Humanities Issues (Interviews: 48)
  [Chart: top 10 social sciences & humanities issues; figures in red give each issue’s ranking in the overall aggregated issues]

  26. Questions?
