Comprehensive evaluation

Balance between Research Quality and Relevance

(The Dutch Models)

Jack Spaapen

Coimbra Group – HSIS

Dublin, 19 September 2008


Polynesian Visual Art


Research Impact Framework (AHRC)

  • interactions between research and society

  • non-linear approach

  • metrics alone not enough

  • metrics, impact assessment, and quality assessment combined [↔ knowledge exchange]


Problems evaluating the humanities / social sciences / MIT research

  • Bibliometrics inadequate for evaluating research quality → bad scores in evaluation procedures

  • Current indicators for societal relevance (patents, contracts) not very useful for the humanities and other fields

  • Lack of indicators for important communications to broader audiences, but new metrics for socio-cultural studies (NL)

  • General direction seems to be traditional metrics only (Australia: RQF → ERA; RAE in the UK), but…

  • Netherlands, other countries, are looking for more comprehensive methods


Evaluating research quality under pressure

  • Peer review: trouble with new developments, MIT research, socio-economic relevance, referee fatigue

  • Bibliometrics: main focus on ISI journals

  • Lack of indicators for important communications to broader audiences

  • General direction still seems to be traditional metrics only (Australia: RQF → ERA; RAE in the UK), but…

  • Netherlands, other countries, are looking for more comprehensive methods


The struggle for comprehensive evaluation systems

  • Dimension 1: metrics dominated by the research practices of the natural and biomedical sciences; inadequate for many fields

  • Dimension 2: growing necessity to be relevant for economy and society

  • Dimension 3: attuning scientific quality and societal relevance in evaluation

  • Dimension 4: policy makers want simple metrics for reallocation purposes


Many solutions are being tried…

  • UK Research Councils (AHRC, ESRC); also debate about the RAE

  • Australia (RQF)

  • France (INRA)

  • Norway (research councils)

  • Denmark (radar graph…)

  • Canada: HSSFC (focus on impacts and performance)

  • HERA


Development of new evaluation systems

  • growing tension between policy makers / government and the research community about how to account for research (criteria, indicators, metrics, but also too many evaluations and their consequences)

  • growing tension between so-called scientific quality and societal relevance


2 debates

  • Current national evaluation system: SEP (Standard Evaluation Protocol) 2003–2009

  • ERiC, Evaluating Research in Context


SEP (2003–2009)

  • Self-evaluation report by the research unit

    review of past performance and a forward look (SWOT)

  • Focus in site visit report on 4 criteria:

    • quality (output, position internationally)

    • relevance (to policy, industry and society)

    • research management

    • accountability

  • Evaluation both retrospective and prospective

    the accent is on the latter

  • External site visits every six years

    mid-term evaluation every three years


  • Humanities, social sciences, and many others are critical

    • Criteria and indicators not geared to humanities, social sciences, technical disciplines

    • No instruments to evaluate social relevance

    • 2005 Academy councils (Humanities and Social Sciences) issued a report: Judging research on its merits

    • 2006 Advisory Council for S&T Policy: Alfa stralen

    • 2007 Meta Evaluation Committee: Trust but verify


    ERiC project → relevance

    A joint effort of the Academy, the Research Council, the university association, and others

    Support institutions with the evaluation of the societal quality / impact of research

    Develop criteria and indicators, and a methodology, for assessment

    Suggest how to integrate these methods into the new SEP (2009–2015)


    4 common steps identified

    • Mission of research group or institute is starting point of evaluation

    • Identify productive interactions with social context : industry, policy, society at large

    • Data gathering: focus on the research group's performance in the various social domains, including a stakeholder analysis, yielding a comprehensive profile of the research group

    • Feedback and forward look


    ERiC evaluation principles

    • Comprehensive evaluation, focus on both scientific quality and relevance

    • Contextual: identify the mission, involve stakeholders in indicator selection / benchmarking

    • Combine quantitative and qualitative data

    • Forward-looking, with a focus on improving, learning, and coaching instead of judging


    Example: REPP table graph


    Example of evaluation of societal quality: radar graph [concise format]


    Example of evaluation of societal quality: radar graph [extended format]
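
A radar graph of the kind shown in the two example slides above plots each societal-quality indicator on its own axis, so a research group's profile can be read at a glance. Below is a minimal sketch in Python with matplotlib of how such a chart can be drawn; the indicator names and scores are hypothetical placeholders for illustration, not the actual REPP / ERiC indicators, which this transcript does not reproduce.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical societal-quality indicators and scores (0-5) for one
# research group; the real REPP/ERiC indicator set is not reproduced here.
labels = ["Policy advice", "Industry contracts", "Media appearances",
          "Professional training", "Stakeholder collaboration", "Public events"]
scores = [4, 2, 3, 5, 4, 3]

# One axis per indicator, evenly spaced around the circle;
# repeat the first point so the polygon closes.
angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
angles += angles[:1]
scores += scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, scores, linewidth=2)
ax.fill(angles, scores, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(labels)
ax.set_ylim(0, 5)
ax.set_title("Societal quality profile (hypothetical data)")
plt.tight_layout()
plt.show()
```

Plotting several groups on the same axes (one plot/fill call per group) would support the kind of mission-based benchmarking the ERiC principles describe; the extended format presumably adds more indicators per social domain.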

