
Open Access and the assessment of research quality



Presentation Transcript


  1. Stevan Harnad, University of Quebec at Montreal & University of Southampton — Open Access and the assessment of research quality

  2. Basic metric types • User (reader)-generated data = USAGE-based metrics • Author-generated data = CITATION-based metrics

  3. Rankings (institutions) • Academic Ranking of World Universities (“Shanghai” Ranking) • THES World University Ranking • G-factor Ranking • Webometrics Ranking of World Universities

  4. Rankings (journals) • Web of Science • SCImago

  5. Existing USAGE measures • COUNTER statistics (per publisher, per journal, per site) • MESUR (LANL): using SFX logs in CA • On repositories: • Citeseer • LogEc • Google Analytics • AWstats • IRS

  6. Southampton’s ECS repository

  7. Repository usage: daily downloads

  8. Repository usage: monthly downloads

  9. Interoperable Repository Statistics

  10. Single eprint: daily downloads

  11. Single eprint: monthly downloads

  12. Single eprint: referrers

  13. Single eprint: search terms

  14. Academic users

  15. Every e-print tells a story… • Link placed on “Canonical correlation” page in Wikipedia • NIPS Workshop linked to this eprint from its web page

  16. Existing IMPACT measures (individuals) • Number of articles / other outputs • Number of citations • h-index (Hirsch) • g-index (Egghe) – plus other modifications of the h-index • Eigenfactor • Y-factor (LANL) • Series of weighted indices in a multiple regression equation (Harnad et al.)
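The h-index and g-index named above have simple definitions that can be sketched directly: an author has h-index h if h of their papers have at least h citations each, and g-index g if their top g papers together have at least g² citations. A minimal sketch (the function names are ours, not from the slides):

```python
def h_index(citations):
    """h-index (Hirsch): largest h such that h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank      # this paper still has at least `rank` citations
        else:
            break
    return h

def g_index(citations):
    """g-index (Egghe): largest g such that the top g papers
    together have at least g*g citations."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c        # cumulative citations of the top `rank` papers
        if total >= rank * rank:
            g = rank
    return g
```

For example, an author with per-paper citation counts [10, 8, 5, 4, 3] has h-index 4 (four papers with ≥4 citations) but g-index 5 (the top five papers total 30 ≥ 25 citations), illustrating how the g-index gives more weight to highly cited papers.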

  17. Other quantifiable impact (related) measures • Immediacy • Latency • Decay rate • Authority (‘fan in’) • Hub (‘fan out’) • Co-citation • Cited by • Citing rank (cites from high-ranking journals) • Semiometrics (semantic distance) Harnad, 2006, 2007

  18. OAR2008

  19. Sample citation and download growth with time. (Downloads only start in 2005 because that is when this paper was deposited.) Early growth rate and late decay metrics for downloads and citations can also be derived.

  20. Metrics validation • Citation-related metrics have in general not yet been systematically face-validated • Benchmarks are still up for grabs • N.B. Early citation counts predict later ones • N.B. Download counts predict citations (even earlier)

  21. RAE 2001: Ranking for psychology departments

  22. Correlation of citations and peer review • Correlation between RAE ratings and mean departmental citations +0.91 (1996) +0.86 (2001) (Psychology) • RAE and citation counting measure broadly the same thing • Citation counting is both more cost-effective and more transparent Source: Eysenck
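The +0.91 and +0.86 figures cited above are Pearson correlation coefficients between panel ratings and mean departmental citation counts. The computation itself is straightforward; a minimal sketch with made-up toy data (not the actual RAE figures):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative only: hypothetical RAE ratings and mean citations per department.
ratings = [3, 4, 4, 5, 5, 5]
mean_cites = [12.0, 18.5, 21.0, 30.2, 28.7, 35.1]
r = pearson_r(ratings, mean_cites)
```

A value of r near +0.9, as reported for psychology, means departmental citation averages and panel judgments rank departments in very nearly the same order, which is the basis for the cost-effectiveness argument on this slide.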

  23. Using the UK’s RAE • UK’s RAE 2008 will be a parallel panel/metric exercise • Possible to develop a rich spectrum of candidate metrics • Validate each metric against the panel rankings, discipline by discipline, through multiple regression analysis • Determine and calibrate the weights on each metric
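The validation step described above amounts to fitting a multiple regression: panel rankings as the dependent variable, candidate metrics as predictors, and the fitted coefficients as the calibrated weights. A minimal sketch with simulated data (the metric set, weights, and sample size here are all illustrative assumptions, not RAE results):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40  # hypothetical number of departments in one discipline

# Three hypothetical candidate metrics per department
# (e.g. citations, downloads, h-index), standardized.
X = rng.normal(size=(n, 3))

# Simulated panel rating: a weighted combination of the metrics plus noise.
true_w = np.array([0.6, 0.3, 0.1])
y = X @ true_w + 0.1 * rng.normal(size=n)

# Least-squares fit: the recovered coefficients are the per-metric weights
# that would be calibrated against the panel rankings.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Repeating this fit discipline by discipline, as the slide proposes, yields a separate weight vector per panel, so that a metric like downloads can count more heavily in fields where it tracks panel judgment more closely.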

  24. Thank you for listening harnad@ecs.soton.ac.uk
