
Research Excellence Framework (REF) and Impact


Presentation Transcript


  1. Research Excellence Framework (REF) and Impact
     Ellie James, Research & Enterprise Services

  2. Dual support system
     • Research funding comes through two streams: QR funding (allocated on the basis of RAE/REF results) and research grants
     • 2009/10 Keele's total income for research: £18.3m
       • £6.6m from QR (RAE 2008 results)
       • £11.7m from research grant income
     Research Assessment
     • National exercise, every 6 years
     • Purpose: to assess, by discipline (Unit of Assessment, UoA), the quality of research in each university since the last RAE (enables benchmarking)
     • Process is based on 'peer review'
     • Results are the basis for allocating QR funding (in the HEFCE grant letter)
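As a quick illustrative check of the dual-support split, the two streams sum to the stated total, with QR providing roughly a third of Keele's research income; the share calculation is added here for illustration and is not taken from the slide.

```latex
% Illustrative arithmetic from the 2009/10 figures on the slide above
\pounds 6.6\text{m (QR)} + \pounds 11.7\text{m (grants)} = \pounds 18.3\text{m total},
\qquad \frac{6.6}{18.3} \approx 36\% \text{ of research income from QR}
```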

  3. Definitions of quality levels

  4. Keele's previous RAE submissions
     RAE 2001
     • 357.7 FTE staff were submitted across 27 Units of Assessment
     • 80% of academic staff submitted
     • Two thirds of Keele's submissions were rated '4' or above (national to international excellence)
     RAE 2008
     • More selective approach, with high-quality, focused submissions
     • Fewer staff were submitted to fewer UoAs: 286.15 FTE staff submitted to 14 Units of Assessment
     • 48% of academic staff were submitted

  5. RAE 2008 results
     • All universities' results: 52,409 FTEs across 2,363 submissions
     • Keele's results: 286 FTEs submitted to 14 UoAs

  6. REF guidance documents
     Comprehensive information on preparing submissions and on how panels will assess them will be set out in:
     • 'Assessment framework and guidance on submissions' (July 2011)
     • 'Panel criteria and working methods':
       • July 2011: published in draft form for consultation
       • January 2012: final form

  7. REF: main changes from RAE
     • Inclusion of assessment of the 'non-academic' impact of research
     • Standardisation of three elements across UoAs
     • Reduced number of UoAs, i.e. disciplines (from 67 to 36), and of main panels (from 15 to 4)
     • Structured templates (for consistency)
     • Use of standardised HESA data
     • Limited use of citation data in some UoAs
     • Measures to promote equality & diversity
     • Quality profiles in steps of 1%, not 5%
     • Removal of 'esteem' as a distinct element

  8. Content of submission

  9. REF weightings across UoAs: environment 15%, outputs 65%, impact 20%
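A minimal sketch of how the three weighted sub-profiles combine into an overall quality profile, assuming the standard REF elements (outputs, impact, environment) and a simple linear weighting; the combination formula itself is not stated on the slide.

```latex
% Overall quality profile as a weighted combination of the three element
% sub-profiles, evaluated at each quality level k (4*, 3*, 2*, 1*, unclassified).
% Weights are those shown above; the linear combination is an assumption.
\text{Overall}_k = 0.65\,\text{Outputs}_k + 0.20\,\text{Impact}_k + 0.15\,\text{Environment}_k,
\qquad k \in \{4^*,\, 3^*,\, 2^*,\, 1^*,\, \text{U}\}
```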

  10. Impact definition – read carefully!
      • An effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, BEYOND ACADEMIA
      • It includes an effect on, change or benefit to:
        • the activity, attitude, awareness, behaviour, capacity, opportunity, performance, policy, practice, process or understanding
        • of an audience, beneficiary, community, constituency, organisation or individuals
        • in any geographic location, whether locally, regionally, nationally or internationally

  11. Impact definition (continued)
      Impact:
      • that has taken place during the assessment period (2008 to 2013), and was
      • underpinned by excellent research (2*) produced by the submitting institution (1993 to 2013)
      • Includes reduction or prevention of harm, risk, cost or other negative effects
      • Panels will provide (non-restrictive) guidance on the kinds of impact they would anticipate in their UoA, and on appropriate forms of evidence

  12. Impact categories
      • REF = retrospective impact
      • Research Councils (RCs) = prospective impact

  13. Methodology
      • Based on expert peer review of case studies (80%)
      • Impact template: submissions will include contextual and strategic information about how the unit has supported and enabled impact between 1 January 2008 and 31 July 2013 (20%)
      • Impacts may be at any stage of development, but must have taken place during the period
      • NOT future or potential impacts!
      • NOT dissemination activity without evidence of benefits
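A sketch of how the two parts of the impact element might combine into the impact sub-profile, using the 80/20 weights on this slide; the linear combination is an assumption rather than something stated here.

```latex
% Impact sub-profile as a weighted mix of the case-study scores and the
% impact-template score (80/20 weights from the slide; combination assumed)
\text{Impact}_k = 0.80\,\text{CaseStudies}_k + 0.20\,\text{Template}_k
```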

  14. Number of case studies (80%)
      • Case studies are not expected to be representative of the spread of research

  15. Format of case studies (80%)
      Generic template with word limits:
      • Summary of the impact (100 words)
      • Underpinning research (500 words): how the research made a 'material and distinct' contribution to the impact
      • References to the research (max of 6)
      • Details of the impact (750 words): to explain how the research underpinned the impact and the nature and extent of the impact, i.e. who/what was affected? how were they affected?
      • Sources to corroborate the impact (max of 10): references to independent sources that could verify the claims
      See Annex G for further details

  16. Format of the impact template
      Statement describing the unit's approach to supporting and enabling impact during the assessment period, including:
      • Context
      • Approach during 2008-2013
      • Strategy and plans for supporting impact
      • Relationship between the approach and the submitted case studies

  17. Attribution and timeframe
      • Must show that Keele undertook research that made a distinctive contribution to achieving the claimed impact or benefit
      • Underpinning research may date from up to 15 years before 2008 (i.e. 1993 to 2013)
      • Some UoAs may extend this timeframe by a further 5 years

  18. Assessment
      • Produces an impact sub-profile (20%)
      • Assessed against reach and significance:
        • 'Reach': how widely the impacts have been felt
        • 'Significance': how transformative the impacts have been
      • REF panels will provide more information on criteria in their working methods
      • Impacts cannot be compared across UoAs
      • Expert users will be involved, alongside academics (workshops during 2011)

  19. Lessons learned from the pilot: HEIs
      • 'Clarity of presentation' is key (writing teams!)
      • It takes time to understand the concept of 'impact', 'non-academic impact' or 'benefit'; HEIs need to raise awareness NOW
      • Interim impact is ambiguous compared with final impact
      • Standard approaches emerged:
        • central administration project-managing submissions
        • departmental academics leading drafting
        • a high-level committee reviewing and giving tactical advice
      • A big challenge is acquiring supporting evidence (heavy reliance on the personal knowledge of senior academics)
      • Impact will impose a real additional cost
      • Subject-specific challenges, e.g. English is more conceptual

  20. Lessons learned from the pilot: panels
      • The best case studies make explicit the non-academic benefit from the research
      • 'Brief is best'
      • Good case studies showed the link between research and impact and provided supporting evidence
      • Case studies can get a high rating on either 'reach' or 'significance' (or both)
      • Engagement isn't impact
      • It is not convincing to simply state 'distinguished Professor'
      • Universities need to improve their presentation of evidence
      • There are issues for new departments, early-career researchers and small submissions
      • Don't expect panels to follow up references; these are just for verification

  21. Timeline

  22. Impact at Keele
      • Most Research Institutes (RIs) are now identifying case studies for each UoA (with reserves!)
      • These will need careful crafting and redrafting, and supporting evidence collating
      • The impact template will also need support
      • Impact support at Keele:
        • Research support (Ellie & Nicola)
        • Enterprise & Business Managers
        • PVC Research & Enterprise and Head of RES
        • RI Managers & Directors
        • Research group/UoA leads
