Comprehensive evaluation

Balance between Research Quality and Relevance (The Dutch Models)

Jack Spaapen

Coimbra Group – HSIS

Dublin, 19 September 2008

Research Impact Framework (AHRC)
  • interactions between research and society
  • non-linear approach
  • metrics alone not enough
  • metrics combined with impact assessment and quality assessment [↔ knowledge exchange]
Problems evaluating Humanities / Social Sciences / MIT (multi-, inter- and transdisciplinary) research
  • Bibliometrics not adequate for evaluating research quality → bad scores in evaluation procedures
  • Current indicators for societal relevance (patents, contracts) not very useful for the humanities and other fields
  • Lack of indicators for important communications to broader audiences, but new metrics for socio-cultural studies (NL)
  • General direction seems to be: traditional metrics only (Australia: RQF → ERA; RAE in the UK), but…
  • The Netherlands and other countries are looking for more comprehensive methods
Evaluating research quality under pressure
  • Peer review: trouble with new developments, MIT research, socio-economic relevance, referee fatigue
  • Bibliometrics: main focus on ISI journals
  • Lack of indicators for important communications to broader audiences
  • General direction still seems to be: traditional metrics only (Australia: RQF → ERA; RAE in the UK), but…
  • The Netherlands and other countries are looking for more comprehensive methods
Struggle for comprehensive evaluation systems
  • Dimension 1: metrics dominated by the research practices of the natural and biomedical sciences; inadequate for many fields
  • Dimension 2: growing necessity to be relevant for economy and society
  • Dimension 3: attuning scientific quality and societal relevance in evaluation
  • Dimension 4: policy makers want simple metrics for reallocation purposes
Many solutions are being tried…
  • UK: Research Councils (AHRC, ESRC); also debate about the RAE
  • Australia: RQF
  • France: INRA
  • Norway: research councils
  • Denmark: radar graph…
  • Canada: HSSFC (focus on impacts and performance)
  • HERA
Development of new evaluation systems
  • growing tension between policy makers / government and the research community about how to account for research (criteria, indicators, metrics, but also too many evaluations and their consequences)
  • growing tension between so-called scientific quality and societal relevance
2 debates
  • Current national evaluation system: SEP (2003–2009)
  • ERiC: Evaluating Research in Context
SEP (2003–2009)
  • Self-evaluation report by research unit
          • review of past performance and forward look (SWOT)
  • Focus in site visit report on 4 criteria:
          • quality (output, international position)
          • relevance (to policy, industry and society)
          • research management
          • accountability
  • Evaluation both retrospective and prospective
          • the accent is on the latter
  • External site visits every 6 years
          • mid-term evaluation every three years

Humanities, social sciences, and many others are critical
  • Criteria and indicators not geared to the humanities, social sciences, or technical disciplines
  • No instruments to evaluate societal relevance
  • 2005: Academy councils (Humanities and Social Sciences) issued the report Judging research on its merits
  • 2006: Advisory Council for S&T Policy: Alfa stralen
  • 2007: Meta-Evaluation Committee: Trust, but verify
ERiC project → relevance

  • Joint effort of the Academy, the Research Council, the university association, and others
  • Support institutions with the evaluation of the societal quality / impact of research
  • Develop criteria and indicators, a methodology, for assessment
  • Suggest how to integrate these methods into the new SEP (2009–2015)

4 common steps identified
  • The mission of the research group or institute is the starting point of the evaluation
  • Identify productive interactions with the social context: industry, policy, society at large
  • Data gathering: focus on the research group’s performance in the various social domains, including stakeholder analysis → a comprehensive profile of the research group
  • Feedback and forward look
ERiC evaluation principles
  • Comprehensive: evaluation focuses on both scientific quality and relevance
  • Contextual: identify the mission, involve stakeholders in indicators / benchmarking
  • Combine quantitative and qualitative data
  • Forward-looking: focus on improving, learning, and coaching instead of judging