
U-Multirank – The implementation of a multidimensional international ranking

U-Multirank – The implementation of a multidimensional international ranking. Don Westerheijden, Frank Ziegele. Higher Education Conference: Rankings and the Visibility of Quality Outcomes in the European Higher Education Area, Dublin, 30-31 January 2013.


Presentation Transcript


  1. U-Multirank – The implementation of a multidimensional international ranking. Don Westerheijden, Frank Ziegele. Higher Education Conference: Rankings and the Visibility of Quality Outcomes in the European Higher Education Area, Dublin, 30-31 January 2013. Team Leaders: Gero Federkeil, Jon File, Frans van Vught, Frank Ziegele

  2. Why one more ranking? – The purpose of U-Multirank
  • Policy purposes
    • Transparency (on diversity) about the European Higher Education Area: a triad of projects – EUMIDA, e3m, U-Multirank
    • Benchmarking with non-European higher education (Modernisation Agenda)
  • Stakeholder-related purposes, e.g.
    • Students: informed choices
    • Institutions: strategy development through comparison & benchmarking
    • Policy makers: diversity/performance of systems
    • Employers: partners for cooperation

  3. How can those purposes be achieved? - The basic approach
  • Multidimensional ranking: going beyond the traditional focus on research excellence
    • 5 dimensions: teaching & learning, research, knowledge transfer, international orientation, regional engagement
    • No composite indicators, no pre-defined weights on individual indicators (a sketch of the idea follows after this slide)
  • User-driven ranking
    • Personalised ranking allows users to rank by their own preferences and priorities on dimensions and indicators
    • Flexible web tool
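The "no composite indicator" idea can be made concrete with a small sketch. The following Python fragment is purely illustrative: the institutions, indicator values and the median-based grouping rule are assumptions made for this example, not U-Multirank's actual data or grouping method. It shows how a user-selected set of indicators yields a separate comparison per indicator instead of a single weighted league-table position.

# A minimal sketch of a user-driven ranking without composite scores.
# Institution names, indicator values and the median-based grouping rule
# are illustrative assumptions, not U-Multirank's actual data or method.
from statistics import median

scores = {
    "University A": {"graduation_rate": 0.81, "citations": 1.4, "patents": 12},
    "University B": {"graduation_rate": 0.67, "citations": 2.1, "patents": 30},
    "University C": {"graduation_rate": 0.90, "citations": 0.9, "patents": 4},
}

def rank_groups(scores, chosen_indicators):
    """Report, per chosen indicator, which institutions score above or
    below the median; each indicator is reported separately, so there
    are no weights and no single composite position."""
    result = {}
    for ind in chosen_indicators:
        mid = median(prof[ind] for prof in scores.values())
        result[ind] = {
            name: ("above median" if prof[ind] > mid else
                   "below median" if prof[ind] < mid else "at median")
            for name, prof in scores.items()
        }
    return result

# A student interested only in a teaching and a research indicator:
print(rank_groups(scores, ["graduation_rate", "citations"]))

Because each indicator is reported on its own, changing the selection changes the picture without ever collapsing it into one number.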

  4. How can those purposes be achieved? - The basic approach
  • Comparing like with like
    • Link to mapping indicators allowing identification of institutions with similar institutional profiles (see the sketch after this slide)
  • Multi-level ranking
    • Combining institutional rankings (whole institutions) and field-based rankings
  • Stakeholder-oriented processes
    • Intensive inclusion of stakeholders in the development and continuous refinement of U-Multirank
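How "comparing like with like" could work inside a tool is sketched below. The profile fields (size, PhD intensity, share of international students) and the exact-match filter are assumptions chosen for illustration, not the project's actual mapping indicators.

# A minimal sketch of "comparing like with like": filter institutions by
# profile attributes before any performance comparison is made.
# The profile fields and the exact-match rule are illustrative assumptions.
institutions = [
    {"name": "University A", "size": "large", "phd_intensive": True,  "international_students": 0.22},
    {"name": "University B", "size": "small", "phd_intensive": False, "international_students": 0.05},
    {"name": "University C", "size": "large", "phd_intensive": True,  "international_students": 0.15},
]

def similar_profile(pool, **criteria):
    """Keep only institutions whose profile attributes match the criteria."""
    return [inst for inst in pool
            if all(inst.get(key) == value for key, value in criteria.items())]

peers = similar_profile(institutions, size="large", phd_intensive=True)
print([inst["name"] for inst in peers])   # ['University A', 'University C']

Only after this filtering step would performance indicators be compared, so that a small teaching-oriented college is never ranked against a large research university.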

  5. With this approach U-Multirank will create performance profiles respecting mission diversity (example: institutional level)
  • Teaching & learning: expenditure on teaching, graduation rate, interdisciplinary programmes, relative graduate employment, time to degree
  • Research: art-related output, expenditure on research, citations, publications and highly cited publications, interdisciplinary research, international awards, number of post-docs, competitive research income
  • Knowledge transfer: incentives, university-industry publications, third-party funding, patents, size of TTO, CPD courses offered, co-patents, spin-offs
  • International orientation: programmes in a foreign language, academic staff, PhD graduations, joint research publications, joint degree programmes
  • Regional engagement: graduates in the region, income from the region, research publications, contracts with the region, regional student internships

  6. The feasibility study proved that U-Multirank works
  • Feasibility study (2009–2011)
  • Pilot study with 150 HEIs: institutional ranking and 3 pilot fields (mechanical/electrical engineering, business)
  • Some general results:
    • Indicators and processes of data collection and analysis are feasible
    • Good response from most countries
    • "Pre-filling" options exist, but institutional data collection remains inevitable
    • Need for further refinement of indicators on knowledge transfer, regional engagement and employability/labour market

  7. The real implementation has just started
  • 2012–2014: new project to implement U-Multirank and to develop a sustainable business model
    • Kick-off: Presidency Conference, Dublin
    • Two years + option for another two years
  • First ranking to be published in February 2014
    • Institutional ranking and four fields (pilot fields + physics)
    • Minimum coverage of 500 institutions
  • 2015 onwards: gradual extension in the number of institutions and fields
    • Special task: feasibility study on the inclusion of research-performing institutions (non-university research)

  8. U-Multirank is carried out by a consortium of partners combining different functions and expertise
  • Coordination/lead and rankings
    • CHE Centre for Higher Education
    • CHEPS Center for Higher Education Policy Studies
  • Partners
    • Data collection: CWTS Center for Science and Technology Studies, U Leiden; Incentim International Centre for Research on Entrepreneurship, Technology and Innovation Management, KU Leuven
    • Web tool experts: folge3, Johnny Rich (push)
    • Business model: Elsevier, Bertelsmann Foundation
  • Associate partners
    • National rankings: OST (France), Perspektywy (Poland), Fundación CYD (Spain)
    • Stakeholder organisations: ESU, CESAER, IRUN, UASNet

  9. The U-Multirank tool will provide substantial benefits to users
  • For prospective / mobile students
    • Selection of field of study
    • Guided journey to identify personal preferences / priorities
    • Personalised selection of indicators
    • Result: shortlist of institutions matching personal preferences, and ranking outcomes for personally relevant indicators
  • For higher education institutions
    • Selection of institutional profiles (mapping indicators) to compare like with like
    • Compare all/selected indicators or one dimension (institutional + field-based for all faculties)
    • See and compare full institutional performance profiles
  • Additional benefits for participating institutions:
    • Detailed analysis of their data compared to the averages of the total sample
    • Option of creating benchmarking networks
    • Services such as widgets for the institution's website

  10. The U-Multirank tool will provide substantial benefits to users
  • For policy makers
    • Identify and compare performance profiles of national institutions, not only research universities
    • Insight into institutional diversity
    • Performance of national HE institutions in an international context
    • Benchmarking of national systems, probably also for specific dimensions/indicators
  • Crucial: a user-friendly and flexible web tool which provides comfortable "user journeys" and sufficient guidance

  11. Illustration of a user journey (student): we could take a first look into the web designers' laboratory…

  12. …and start user journeys by defining an institutional profile or by selecting a particular university…

  13. …defining an institutional profile to compare like with like… (only examples; other criteria possible)

  14. …leading to the result of a ranking of those institutions…

  15. … with the possibility to personalise the ranking by selecting indicators according to personal preferences (example for the choice of indicators)

  16. Example for the choice of indicators: Personalisation depends on individual preferences
  Personalise your ranking: choose the indicators that are relevant from your perspective.
  • Want a quick study? Graduation rate within time
  • Want small groups? Student-staff ratio
  • Want satisfied students? Overall satisfaction
  • Want practically oriented teachers? Professors with work experience outside higher education
  • Probably will go for a PhD? Doctorate productivity
  Result: a new list with the chosen indicators, i.e. a personal ranking! (a sketch of this mapping follows below)
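The preference-to-indicator pairs on this slide can be read as a simple lookup table. The sketch below encodes exactly those pairs; the function name and the data structure are illustrative assumptions about how a web tool might hold them, not the project's actual implementation.

# A minimal sketch of mapping student preferences to the indicators that
# drive a personal ranking. The pairs come from the slide text; the
# structure and function are illustrative assumptions.
PREFERENCE_TO_INDICATOR = {
    "quick study": "graduation rate within time",
    "small groups": "student-staff ratio",
    "satisfied students": "overall satisfaction",
    "practically oriented teachers": "professors with work experience outside higher education",
    "plans a PhD": "doctorate productivity",
}

def personalised_indicator_list(preferences):
    """Translate a student's stated preferences into the list of
    indicators used for his or her personal ranking."""
    return [PREFERENCE_TO_INDICATOR[p] for p in preferences
            if p in PREFERENCE_TO_INDICATOR]

print(personalised_indicator_list(["small groups", "plans a PhD"]))
# ['student-staff ratio', 'doctorate productivity']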

  17. There are also typical “user journeys” from the perspective of HEIs, for instance for rectors

  18. There are also typical "user journeys" from the perspective of HEIs, for instance for rectors
  How is my university performing against the others that are similar to mine?
  • Show the overall performance profile of my university
  • Filter the universities with a similar activity profile
  • Compare performances: overview on all dimensions
  • Compare performances: select indicators/dimensions with high relevance for the institutional strategy
  • Use the benchmarking results for internal discussions, internal analysis of reasons and strategic development (a sketch of such a peer-group comparison follows below)
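As referenced above, a peer-group comparison of this kind can be sketched in a few lines. The institutions and figures below are invented for illustration, and reporting the difference from the peer-group average is just one plausible way to present such a benchmark, not U-Multirank's prescribed method.

# A minimal sketch of a rector's benchmarking step: compare one
# institution's indicator values against the average of a peer group
# with a similar activity profile. All names and numbers are invented.
from statistics import mean

peer_group = {
    "My University": {"citations": 1.1, "graduation_rate": 0.78, "spin_offs": 3},
    "Peer 1":        {"citations": 1.6, "graduation_rate": 0.72, "spin_offs": 5},
    "Peer 2":        {"citations": 0.9, "graduation_rate": 0.81, "spin_offs": 2},
}

def benchmark(own_name, group, indicators):
    """Report the gap between one institution and the peer-group average
    for each indicator relevant to its strategy."""
    own = group[own_name]
    peers = [prof for name, prof in group.items() if name != own_name]
    return {ind: own[ind] - mean(p[ind] for p in peers) for ind in indicators}

print(benchmark("My University", peer_group, ["citations", "graduation_rate"]))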

  19. What will be the next steps? - Recruitment of institutions
  • Phase 1: individual expressions of interest
    • 150 pilot institutions
    • Partner networks (CESAER, UASnet, IRUN)
    • Research-intensive universities (bibliometrics)
  • Phase 2: country-specific recruitment (getting a certain amount and scope per country on board)
  • Phase 3: targeted recruitment to ensure an adequately balanced sample for the first ranking

  20. What will be the next steps? - Consultations and data collection
  • Stakeholder consultations
    • On the refined indicators (with partners and stakeholder/field organisations): starting in February
    • On the web tool (information needs, functions, presentation modes): starting in April (until April: development of a prototype)
  • Data collection
    • Starting May/June 2013
  • Continuous communication
    • Transparency about all steps and activities
    • Responsiveness channels
    • Presentations (contact us if needed!)
    • Media partners

  21. Information / Contact
  • Information about U-Multirank
    • www.u-multirank.eu
    • Final report of the feasibility study: http://ec.europa.eu/education/higher-education/doc/multirank_en.pdf
  • Contact / expression of interest in participation
    • info@u-multirank.eu

  22. U-Multirank – The implementation of a multidimensional international ranking. Don Westerheijden, Frank Ziegele. Higher Education Conference: Rankings and the Visibility of Quality Outcomes in the European Higher Education Area, Dublin, 30-31 January 2013. Team Leaders: Gero Federkeil, Jon File, Frans van Vught, Frank Ziegele
