The role of evaluation and ranking of universities in the quality culture



  1. The role of evaluation and ranking of universities in the quality culture. Professor Jean-Marc Rapp, EUA President, 2 July 2009

  2. Overview
  • Evaluation vs. ranking: different concepts
  • Evaluation: an EUA perspective
  • EUA's quality policy
  • EUA's activities focusing on the development of a quality culture
  • Rankings: an EUA perspective
  • Reflection on current initiatives
  • Rankings and their impact on quality culture
  • Conclusion: the way forward

  3. Ranking vs. Quality Assurance/evaluation
  Quality Assurance/evaluation:
  • Judgement of strengths & concerns on a number of measures related to the input, process and output of HE -> quality enhancement
  • Data always obtained from the HEI in the form of a self-evaluation report
  • Almost always involves a site visit by peers
  Ranking:
  • Relative positions within the participating group, derived via mathematical "formulas" from performance on a number of selected measures (a generic sketch follows below)
  • Data obtained independently, or obtained from and usually verified by the HEI
  • Usually no site visit
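To make the contrast concrete, here is a minimal sketch of how such a composite ranking score is typically built. The min-max normalisation and the weighted sum are illustrative assumptions, not the published method of any particular ranking:

\[
S_i = \sum_{j=1}^{m} w_j\,\hat{x}_{ij},
\qquad
\hat{x}_{ij} = \frac{x_{ij} - \min_k x_{kj}}{\max_k x_{kj} - \min_k x_{kj}},
\qquad
\sum_{j=1}^{m} w_j = 1,
\]

where \(x_{ij}\) is institution \(i\)'s raw value on indicator \(j\) (e.g. citations per staff member), \(\hat{x}_{ij}\) its normalised value, and \(w_j\) the weight the compiler assigns to that indicator. Institutions are then sorted by \(S_i\): every design decision, which indicators, which normalisation, which weights, is made by the compiler rather than by the institution.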

  4. Evaluation & Quality Assurance

  5. Key EUA activities in the field of QA
  • Institutional Evaluation Programme, since 1994
  • Quality Culture project, 2002-2006
  • Creativity project, 2007
  • Quality Assurance for Higher Education Change Agenda (QAHECA), 2008-2009
  • European Quality Assurance Forum, in co-operation with the other E4 partners (ESU, ENQA and EURASHE)
  • Founding member of EQAR

  6. Quality Culture

  7. EUA's Policy Positions on QA/Evaluations
  • Main responsibility for quality assurance lies with the institutions
  • Context-sensitive (institutional and disciplinary diversity)
  • Fitness-for-purpose approach
  • Enhancement-oriented
  • Internal and external evaluations or QA processes should be complementary
  • Transparency and co-operation

  8. Rankings

  9. The present landscape 1: Global initiatives
  • Global rankings:
    • Shanghai ARWU
    • Times-QS World University Ranking
    • Leiden Ranking
  • Newspaper-driven
  • Emerging Global Model (EGM) of a "world class university"
  • Therefore, rankings increasingly reflect the prestige and reputation of HEIs according to one specific model
  • OECD feasibility study for the international assessment of HE learning outcomes: AHELO

  10. The present landscape 2: European initiatives
  • European Commission feasibility study to develop a multi-dimensional university ranking
  • EU Commission-supported statistical database on Higher Education (via Eurostat)
  • European Commission-supported projects to develop a Classification of European HEIs
  • DG Research expert group working on methodologies for University-Based Research Assessment
  • CHE (D): various classification initiatives with different foci (universities, research rankings, departmental excellence, employability rating)

  11. The present landscape: some observations
  • Significant limitations of existing rankings:
    • Not comprehensive: provide an incomplete, one-off snapshot of a small segment of a rapidly changing sector
    • "One-size-fits-all" methodology: do not take account of the increasingly differentiated HE landscape in Europe
    • Lack of transparency in the way they are compiled
    • Compilers use available data rather than compiling data
    • Reflect largely reputational factors (40% of the THES score)
    • Dominance of research and metrics; little focus on other missions of the university
  • Therefore, existing rankings typically favor old, large, Anglo-Saxon, research-intensive institutions with roughly 24,000 students and a $2 billion annual budget

  12. The present landscape: some observations
  • Despite the commonly acknowledged limitations, rankings are increasingly used
  • Institutions:
    • seek to influence compilers (HEFCE 2008)
    • senior management KPIs are influenced (HEFCE 2008)
    • change promotion & marketing efforts (HEFCE 2008)
    • argue for a "value added" approach
  • Governments:
    • increased interest in transparency instruments at the HE system level (Leuven Communique 2009)
    • key priorities such as LLL and widening access are not accounted for in the rankings -> explore alternatives

  13. Rankings: what is their purpose?
  • The common "politically correct" purpose: providing transparent information to students and reflecting the prestige of institutions
  • What are the "real" purposes?
    • to drive research and teaching performance
    • to allocate and lobby for resources
    • to identify stakeholders and partners
    • to promote other policy objectives

  14. Rankings & quality issues
  • Rankings are increasingly equated with quality standards, which is a danger, as:
    • they rest on externally defined indicators that are not necessarily linked to an institution's core mission and objectives
    • some HEIs are tempted to chase rankings and focus on improving what can be measured (the indicators) rather than on their core mission
    • rankings are based on a one-size-fits-all methodology that does not take account of diversity
  • Poor positioning in the rankings can have a negative impact on staff morale (HEFCE 2008)

  15. Rankings: the way forward?
  • For those developing rankings: promote the use of the Berlin Principles (CEPES, CHE, IHEP, 2006):
    • Recognise the diversity of HEIs and take account of different missions and goals
    • Be transparent regarding methodology
    • Measure outcomes in preference to inputs
    • Use audited and verifiable data wherever possible
    • Provide consumers with a clear understanding of the factors involved and offer a choice in how they are displayed, i.e. let users attach their own weightings

  16. Ranking: EUA's response
  • The debate on rankings has been launched in EUA policy bodies, the Board and the Council
  • There has also been discussion in policy dialogue with Asian universities
  • EUA has established an internal working group on rankings to consider next steps; proposals will be made to the October 2009 Council meeting
  • EUA has commissioned a study on institutional diversity
  • EUA continues to advocate that rankings should not be used as a proxy for quality, and thus not for QA purposes

  17. Conclusions
  • There is a fundamental difference between quality assurance and rankings:
    • QA processes should always be internally driven (even if there are external incentives) and aim at enhancing the quality of activities (usually through recommendations), thereby fostering a quality culture
    • rankings are externally driven and only state the current situation of an institution in comparison to other institutions on the basis of selected indicators

  18. Conclusions (continued)
  • QA and evaluations usually take into account the variety of missions (the diversity of HE) and the processes behind the indicators
  • Rankings measure the performance of an institution against a certain (ideal) model of an institution, reflected in the compilers' choice of selective indicators
  • Whilst the compiler may use objective indicators, combining these indicators is always subject to judgement and hence subjective (see the worked example below)
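To illustrate that last point, consider a hypothetical calculation with two institutions and two normalised indicators; the numbers are invented purely for illustration. Changing nothing but the weights reverses the ranking:

\[
\begin{aligned}
&\text{Normalised scores:}\quad A = (0.9,\ 0.4), \qquad B = (0.5,\ 0.9)\\
&\text{Weights } (0.7,\ 0.3):\quad S_A = 0.7(0.9) + 0.3(0.4) = 0.75, \qquad S_B = 0.7(0.5) + 0.3(0.9) = 0.62 \quad\Rightarrow\quad A \text{ ranks first}\\
&\text{Weights } (0.3,\ 0.7):\quad S_A = 0.3(0.9) + 0.7(0.4) = 0.55, \qquad S_B = 0.3(0.5) + 0.7(0.9) = 0.78 \quad\Rightarrow\quad B \text{ ranks first}
\end{aligned}
\]

Both weight vectors are defensible, yet they produce opposite orderings: the objectivity of the individual indicators does not carry over to the composite score.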
