
Innovation for Growth – i4g


Presentation Transcript


  1. Innovation for Growth – i4g Universities are portfolios of (largely heterogeneous) disciplines. Further problems in university rankings Warsaw, 16 May 2013 Andrea Bonaccorsi University of Pisa and ANVUR I4G Expert group, European Commission In collaboration with Tindaro Cicero, University of Roma Tor Vergata, Luca Secondi, Uninettuno, and Enza Setteducati, ANVUR

  2. Debate on university rankings in the European context
  Largely used by governments and decision makers, but also criticized:
  • biased towards large and established universities
  • biased towards medicine and science
  • non-neutral with respect to disciplinary specialization (broad fields vs niches)
  • large impact of a few top journals
  • correlation among individual components of composite indicators
  • single source of bibliometric data (ISI Thomson)
  • no statistical representativeness of surveys
  • monodimensionality of rankings

  3. Two lines of action
  • Multi-dimensional ranking
    • combine research with other dimensions of university activity
    • survey-based
    • customized rankings based on the selection of indicators
    • large effort in progress
  • Multi-disciplinary research benchmarking
    • research only
    • but transparent in the disciplinary mix
    • different rankings available depending on the choice of weights
    • still experimental

  4. Multi-disciplinary research benchmarking
  • Use an alternative bibliometric data source
  • Allow disaggregation by scientific discipline at a fine-grained level (i.e. individual scientific fields)
  • Build up measures of overall competitiveness as a bottom-up aggregation of performance in individual disciplines
  • Combine quantity (= volume of publications), impact (= number of citations) and quality (= share of publications in high-quality journals)
  • Allow benchmarking of individual universities
  • Transparency in weights
  • Allow fine-tuning of weights in composite indicators
  • Allow multi-dimensionality as a construction of several, alternative, non-commensurable measures

  5. Global Research Benchmarking System
  Over 24,000 source titles of types Journal, Conference Proceedings, and Book Series from Elsevier's Scopus database.
  Period covered: 2007-2010 (4-year window).
  Over 250 disciplinary and interdisciplinary subject areas.
  1,337 universities from Asia-Pacific, North America (USA and Canada) and Europe.
  New release 2013, covering the 2008-2011 period, just announced!
  www.researchbenchmarking.org

  6. Indicators available at GRBS
  For each subject area:
  • Number of publications
  • Percentage of publications in top source titles (top 10% and top 25% journals)
  • Number of citations
  • Percentage of citations from top source titles (top 10% and top 25% journals)
  • H-index 2007-2010
  • Percentage of publications from international collaborations
  • Percentage of citations to publications from international collaborations
  Top source titles are determined by their SNIP values.
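The "percentage of publications in top source titles" indicator can be sketched as follows. This is an illustrative reconstruction, not the GRBS implementation: the function name, data structures, and the rule of taking the top decile of titles ranked by SNIP are all assumptions.

```python
def pct_in_top_titles(pubs, snip_by_title, top_share=0.10):
    """Share (%) of a university's publications appearing in top source titles.

    pubs: list of journal names, one entry per publication (assumed structure).
    snip_by_title: {journal name: SNIP value} for all ranked source titles.
    top_share: fraction of titles counted as "top" (0.10 for the top 10%).
    """
    # Rank all source titles by SNIP, highest first, and keep the top share.
    ranked = sorted(snip_by_title, key=snip_by_title.get, reverse=True)
    cutoff = max(1, int(len(ranked) * top_share))  # at least one top title
    top_titles = set(ranked[:cutoff])
    if not pubs:
        return 0.0
    return 100.0 * sum(1 for j in pubs if j in top_titles) / len(pubs)
```

With ten ranked titles, `top_share=0.10` keeps only the single highest-SNIP title, so a university with two of four papers there scores 50%.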

  7. Using different indicators may lead to different rankings. Example: information systems. Our metric (number of cites in top 10% SNIP titles) vs. total number of cites.
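The point of the slide can be shown with a toy example (the universities and numbers are invented, not data from the presentation): the same institutions ranked by the two metrics come out in different orders.

```python
unis = {
    # name: (cites in top-10%-SNIP titles, total cites) -- invented numbers
    "U1": (120, 900),
    "U2": (200, 700),
    "U3": (80, 1100),
}

by_top_snip = sorted(unis, key=lambda u: unis[u][0], reverse=True)
by_total_cites = sorted(unis, key=lambda u: unis[u][1], reverse=True)
print(by_top_snip)      # ['U2', 'U1', 'U3']
print(by_total_cites)   # ['U3', 'U1', 'U2']
```

U2 leads on citations from top-SNIP titles but comes last on total citations; the choice of indicator reverses the ranking.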

  8. Construction of a composite indicator
  Indicators chosen to provide a balanced measure of key dimensions of research performance: output, scholarly impact, volume, quality.
  • Number of publications
  • Percentage of publications in top source titles (top 10% and top 25% journals)
  • Number of citations
  • 4-year H-index
  • Percentage of citations from top source titles (top 10% and top 25% journals)

  9. Composite indicator Each of the 7 dimensions is weighted equally. This composite indicator gives large weight to the quality indicators expressed as percentages, which are therefore independent of absolute size. This is a major departure, among many other substantive differences, from existing rankings, which implicitly place weight on the absolute size of universities. At the same time, given the correlation between the percentages of publications and citations in the top 10% and 25%, respectively, this measure gives visibility to excellence as measured by the ability to compete for good journals.
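A minimal sketch of the equal-weights composite described above. The slide specifies equal weights over the 7 dimensions but not a normalisation step; min-max normalisation of each indicator across universities is an assumption here, as are the function name and data layout.

```python
def composite_scores(data):
    """Equal-weights composite over 7 indicators, min-max normalised.

    data: {university: [v1, ..., v7]} -- one value per indicator
    (assumed layout; real indicators mix counts and percentages).
    """
    n = len(next(iter(data.values())))
    lo = [min(v[k] for v in data.values()) for k in range(n)]
    hi = [max(v[k] for v in data.values()) for k in range(n)]

    def norm(x, k):
        # Scale indicator k to [0, 1]; a constant column contributes 0.
        return (x - lo[k]) / (hi[k] - lo[k]) if hi[k] > lo[k] else 0.0

    # Equal weight (1/n each) on every normalised indicator.
    return {u: sum(norm(v[k], k) for k in range(n)) / n
            for u, v in data.items()}
```

Because percentages are bounded regardless of output volume, a small university with high shares in top titles can outscore a much larger one on this composite, which is the departure from size-driven rankings the slide describes.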

  10. Remarks
  • Two definitions of excellence: top 10% and top 30%
  • Ranking of universities is done by counting the number of disciplinary fields in which a university is present in the top 10% or 30%
    • Unweighted
    • Weighted
  Caveats:
  • Size matters
    • Threshold at 50 publications per field in 4 years
    • No visibility of small but excellent institutions (e.g. Ecole Normale in France, Scuola Normale or SISSA in Italy)
  • Language matters (English-language bias)
  • Correlation between the 10% and 25% SNIP indicators (publications and citations)
  • No humanities and social sciences
  • Granularity of the classification of disciplines may differ across fields
  • No Public Research Organizations; data should not be interpreted as an evaluation of national public research systems
  • No national Science Academies

  11. Distribution of universities by number of fields (unweighted) in top 10%, by region

  12. Number of universities in top 10% and number of fields by country. EU 27 + Norway and Switzerland

  13. Distribution of universities by number of fields (unweighted) in top 30%, by region

  14. Scientific excellence at top 30% by European country. Source: Innovation 4 Growth (2013), based on Global Research Benchmarking.
