
Research Reviews in the Netherlands: A Practical Approach



Presentation Transcript


  1. Research Reviews in the Netherlands: A Practical Approach. Roel Bennink, coordinator research reviews, Quality Assurance Netherlands Universities, www.qanu.nl

  2. Contents • What? The Dutch System of Research Reviews • Why? Aims and Owners of the System • How? 1993-2003 and 2003-2009

  3. Universities in the Netherlands • Fourteen research-based universities (incl. OU) • 4.8 billion Euro; 50,000 staff, 160,000 students

  4. Funding of Research Three funding sources: • Direct funding (60%): main source, stable, plus extra for dissertations and research schools (each about 10%) • Competitive funding (10%) • Contracts (30%)

  5. Research Reviews in NL • All publicly funded research must be submitted for external review every 6 years. • System started in 1993, organised by the universities collectively (VSNU) • Nationwide, per discipline (± 35) • New Protocol in 2003: SEP • Not always nationwide • (individual) Boards are responsible

  6. Standard Evaluation Protocol • Internal and external objectives combined: • Improving quality of research • Improving research management & leadership • Accountability to government and society • Object of the assessment: • Research quality by international standards • Depending on mission per Institute/Programme: • Social or economic objectives • Technical or infrastructural objectives

  7. Standard Evaluation Protocol • Four main aspects: • Quality • International recognition and innovative potential • Productivity • Scientific output • Relevance • Scientific and socio-economic impact • Prospects • Flexibility, management, leadership

  8. Five-point scale 5. Excellent: internationally leading; important and substantial impact 4. Very Good: internationally competitive, national leader; significant contribution 3. Good: internationally visible, nationally competitive; valuable contribution 2. Satisfactory: nationally visible; adds to understanding 1. Unsatisfactory: flawed, not worth pursuing
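
As a side illustration (not on the original slide), the five levels map naturally onto a small lookup table; a minimal Python sketch, assuming the 5-to-1 numbering shown above:

    # Hypothetical helper, not part of the SEP itself: maps a score to its label and criteria.
    SEP_SCALE = {
        5: ("Excellent", "internationally leading; important and substantial impact"),
        4: ("Very Good", "internationally competitive, national leader; significant contribution"),
        3: ("Good", "internationally visible, nationally competitive; valuable contribution"),
        2: ("Satisfactory", "nationally visible; adds to understanding"),
        1: ("Unsatisfactory", "flawed, not worth pursuing"),
    }

    def describe(score: int) -> str:
        """Return a one-line reading of a SEP score (5 = best, 1 = worst)."""
        label, criteria = SEP_SCALE[score]
        return f"{score} ({label}): {criteria}"

    # describe(4) -> "4 (Very Good): internationally competitive, national leader; significant contribution"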

  9. Method: Self-analysis and Peer review • Close link between research management, quality control and accountability to higher levels • Multi-purpose data collection • Uniform criteria (all universities use SEP) • (sometimes) Citation analyses • (often) Simultaneous and comparative • (always) Public reports

  10. Descriptive elements: • Mission, leadership, strategy, policies • Research processes (teamwork, supervision, quality control) • Reputation (reviews, awards, citations) • Internal evaluation (management, culture) • External validation (spin-offs, stakeholder survey) • SWOT-analysis • Key publications (list of 5, copies of 3)

  11. Quantitative elements: • Research staff (tenured, non-tenured, PhD, support) per year • Funding (ministry; research councils; contracts) per year • Spending (personnel; other) per year • Results (publications)
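
Purely as an illustration of how such per-year figures might be recorded (the schema and field names below are hypothetical, not prescribed by the SEP), a minimal sketch:

    from dataclasses import dataclass

    @dataclass
    class YearlyFigures:
        """One year of quantitative input for a research programme (hypothetical schema)."""
        year: int
        staff_fte: dict        # research staff in FTE, e.g. {"tenured": 12.0, "non_tenured": 6.5, "phd": 20.0, "support": 3.0}
        funding_keur: dict     # funding in kEUR per source, e.g. {"direct": 1800, "research_councils": 400, "contracts": 900}
        spending_keur: dict    # spending in kEUR, e.g. {"personnel": 2500, "other": 600}
        publications: int      # refereed output counted for that year

    # A fictitious example record:
    sample = YearlyFigures(
        year=2007,
        staff_fte={"tenured": 12.0, "non_tenured": 6.5, "phd": 20.0, "support": 3.0},
        funding_keur={"direct": 1800, "research_councils": 400, "contracts": 900},
        spending_keur={"personnel": 2500, "other": 600},
        publications=85,
    )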

  12. Evaluative questions • Documentation • What is missing, what is not needed? • How much effort is spent in producing the self-studies? • Committees • How are committee members selected? • What is a good site visit? What can go wrong? • What determines the quality of the committee reports? • Consequences • What are the effects of the reviews? What measures are taken by faculties/universities/funding agencies/ministries? • Lessons learned • What mistakes can be avoided? • What are critical success factors for research reviews?

  13. Documentation: What is missing, what is not needed? • It is always too much and never enough • The SEP-set is what every Institute or Programme should have anyway • Bibliometrics are expensive, time-consuming and only useful in some disciplines How much effort is spent in producing the self-studies? • That depends on what you already have • “It’s a lot of work, but very useful”

  14. Committees: How are committee members selected? • Proposals by Faculties, approval by Boards (and QANU) • Members must be independent, unbiased, and internationally acknowledged experts What is a good site visit? • Honest and open discussions, well-prepared peers • Critical and constructive questions; good teamwork What determines the quality of the reports? • Public nature, feedback loop • Comparative overview; general chapters per subfield • Combination of scores and text (including recommendations) • Directed at management, not ministry

  15. Consequences: What are the effects and measures? • Visibility is increased • Management dialogues are enhanced • Management information improves • Publishing in high-impact international journals is stimulated • Groups are merged, extended, redirected or stopped • High marks are (sometimes) financially rewarded • Low marks lead to critical questions • Recommendations are taken seriously • Ministry is kept at a distance

  16. Lessons learned in NL: What mistakes can be avoided? • Don’t assess individuals, only groups • Don’t focus too much on the scores only, or on ranking • Don’t ask too far-reaching (‘strategic’) questions What are the critical success factors? • Keep it simple • Stay close to real organisational structures and management processes • Agree on uniform definitions and data • Cooperate with other universities • Build/buy and maintain research information systems (METIS RU).

  17. Weaknesses • Committees find it difficult to assess the Institute level • Scoring the ‘management’ was abolished • Central scheduling was abolished • Costs (time & money) remain an issue • Tools for measuring “exchange of knowledge” need further development • Managers (and journalists and politicians) attach too much value to rankings

  18. Strengths • SEP works as a basic and practical tool • Peer review is authoritative about programmes (main strength) • External reviews are a useful addition to quality assurance • External reviews shift power to faculty and university • Policy decisions at the responsible level are supported • Reviews are used to look ahead • Peers from abroad add international dimension to quality • Cooperation between universities facilitates benchmarking

  19. Any questions?
