
OVERCOMING APPLES TO ORANGES: ACADEMIC RESEARCH IMPACT IN THE REAL WORLD



  1. OVERCOMING APPLES TO ORANGES: ACADEMIC RESEARCH IMPACT IN THE REAL WORLD Netspeed 2014 Thane Chambers Faculty of Nursing & University of Alberta Libraries Leah Vanderjagt University of Alberta Libraries

  2. Welcome • Who we are • What we’ll cover • What is the broad context for the research performance metrics need in post-secondary education? • What are some issues with research performance metrics? • How can we improve research performance assessment?

  3. What is this all about? A brief history of research evaluation

  4. Fundamental principles of research metrics: • Single indicators are misleading on their own • Integration of both qualitative & quantitative data is necessary • Various frameworks for research performance already exist

  5. Fundamental principles of research metrics: • Times cited ≠ Quality • Discipline to discipline comparison is inappropriate • Citation databases cover certain disciplines better than others
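
Raw citation counts make those comparison problems concrete: a count that is modest in a high-citing field can be strong in a low-citing one, which is why evaluators normalize against a field baseline rather than comparing counts directly. Below is a minimal, hypothetical Python sketch of that field-normalization idea; the field averages and papers are invented numbers, not data from any product.

    # Minimal sketch of field-normalized citation scores (invented numbers).
    # A score of 1.0 means "cited at the average rate for its own field",
    # so the two papers below become comparable even though their fields differ.

    field_average_citations = {      # hypothetical field baselines
        "cell biology": 30.0,
        "history": 3.0,
    }

    papers = [                       # hypothetical papers: (title, field, citations)
        ("Protein folding study", "cell biology", 12),
        ("Archival study of trade routes", "history", 12),
    ]

    for title, field, cites in papers:
        normalized = cites / field_average_citations[field]
        print(f"{title}: raw = {cites}, field-normalized = {normalized:.2f}")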

  6. Broad/Global Context • Recent upswing in interest about research evaluation • Nationwide assessments in UK, Australia, New Zealand • An audit culture is growing

  7. Metrics wanted • For what? • Performance (strength and weakness) • Comparison (with other institutions) • Collaboration (potential and existing) • Traditional metrics • Altmetrics

  8. Who’s in the game: consumers • Senior university administrators • Funding agencies • Institutional funders • Researchers • Librarians

  9. Who’s in the game: producers (vendors) • Elsevier: Scopus and SciVal • Thomson Reuters: Web of Science and InCites • Digital Science: Symplectic Elements • Article-level metrics (altmetrics) solutions

  10. Vendor Claims • Quick, easy, and meaningful benchmarking • Ability to devise optimal plans • Flexibility • Insightful analysis to identify unknown areas of research excellence … all with the push of a button!

  11. What do we find when we test these claims?

  12. What’s needed: Persistent Identifiers • Without DOIs, how can impact be tracked? • ISBNs, repository handles • Disciplinary and geographic differences in DOI coverage: DOI assignment costs $$ • What about grey literature? • Altmetrics may still depend on DOIs
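
One way to see why persistent identifiers matter: once a work carries a DOI, its metadata and citation traces can be pulled programmatically, while grey literature without a DOI has no equivalent lookup point. A minimal sketch against the public Crossref REST API follows; the DOI in it is a placeholder, not a real record.

    # Sketch: resolve a DOI against the public Crossref REST API and read
    # back its title and Crossref citation count. The DOI is a placeholder.
    import json
    import urllib.request

    doi = "10.1000/example-doi"      # hypothetical DOI; substitute a real one
    url = f"https://api.crossref.org/works/{doi}"

    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            record = json.load(response)["message"]
        print(record.get("title"), record.get("is-referenced-by-count"))
    except Exception as exc:         # missing DOI, typo, or network failure
        print(f"Could not resolve {doi}: {exc}")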

  13. WHAT’S NEEDED: NAME DISAMBIGUATION (THE BIGGEST PROBLEM)
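
A hypothetical sketch of why this is so hard: without an author identifier, analysts fall back on string similarity over name fields, and crude matching cannot separate variant spellings of one person from a genuinely different person with a similar name. The names below are invented for illustration.

    # Sketch: crude name matching of the kind forced on analysts when no
    # author identifier exists. Note that a different person ("Theo") scores
    # higher than a legitimate abbreviated form of the target name.
    from difflib import SequenceMatcher

    target = "Chambers, Thane"
    author_fields = [                # invented author strings from records
        "Chambers, Thane",
        "Chambers, T.",
        "T. Chambers",
        "Chambers, Theo",            # a different researcher entirely
    ]

    def similarity(a: str, b: str) -> float:
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    for name in author_fields:
        score = similarity(target, name)
        verdict = "merge?" if score > 0.8 else "separate?"
        print(f"{name!r}: similarity {score:.2f} -> {verdict}")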

  14. What’s needed: Source Coverage • Source coverage in the most prominent products still comes from Scopus and Web of Science (STEM-heavy) • Integration of broader sources is packaged with more expensive implementations • Some products specifically market broad source coverage (Symplectic Elements)

  15. What’s needed: National Subject Area Classification (TO A FINE LEVEL) • Subject classification in products is EXTREMELY broad, so broad that comparisons are inappropriate • Integration of a national standard of granular subject classification would help everyone (see the sketch after the example below)

  16. SUBJECT CLASSIFICATION example http://www.rcuk.ac.uk/RCUK-prod/assets/documents/documents/ResearchAreasProposalClassificationsList.pdf
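
As a hypothetical sketch of why granularity matters: under a broad product category the three outputs below all land in one undifferentiated “Social Sciences” bucket and get benchmarked against each other, while granular codes (invented here, standing in for entries like those in the RCUK list above) give each output a peer group whose citation behavior is actually comparable.

    # Sketch: the same outputs grouped at a broad level versus a granular level.
    # The labels and records are invented; real codes would come from a national
    # standard such as the RCUK research-area list linked above.
    from collections import Counter

    outputs = [   # (title, broad product category, granular national code)
        ("Nursing workforce survey",   "Social Sciences", "Health Services Research"),
        ("Classroom assessment study", "Social Sciences", "Education"),
        ("Urban migration analysis",   "Social Sciences", "Human Geography"),
    ]

    broad_groups = Counter(broad for _, broad, _ in outputs)
    granular_groups = Counter(code for _, _, code in outputs)

    print("Broad peer groups:   ", dict(broad_groups))     # one big bucket
    print("Granular peer groups:", dict(granular_groups))  # three distinct groups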

  17. What’s needed: Training & Knowledge • Do all CONSUMERS want/need training? • Have we analyzed our services for citation impact and metrics analysis? • Top-to-bottom organizational training couched in strategic needs for metrics identified

  18. What’s needed: Processes & Workflows • Data cleaning • Verification of new data • Running analysis • Verifying analysis
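
None of these steps is push-button; a rough, hypothetical sketch of the kind of cleaning and verification pass a raw export needs before any analysis is run appears below (the field names and rules are assumptions for illustration, not a prescribed workflow).

    # Sketch: minimal cleaning/verification pass over exported publication
    # records before analysis. Field names and rules are illustrative only.
    raw_records = [
        {"doi": "10.1000/AAA", "year": "2013", "cites": "4"},
        {"doi": "10.1000/aaa", "year": "2013", "cites": "4"},   # duplicate (case differs)
        {"doi": "",            "year": "2014", "cites": "1"},   # missing identifier
        {"doi": "10.1000/bbb", "year": "20014", "cites": "2"},  # implausible year
    ]

    cleaned, flagged, seen = [], [], set()
    for rec in raw_records:
        doi = rec["doi"].strip().lower()
        if not doi:
            flagged.append((rec, "missing DOI"))
        elif doi in seen:
            flagged.append((rec, "duplicate DOI"))
        elif not (rec["year"].isdigit() and 1900 <= int(rec["year"]) <= 2014):
            flagged.append((rec, "implausible year"))
        else:
            seen.add(doi)
            cleaned.append({"doi": doi, "year": int(rec["year"]), "cites": int(rec["cites"])})

    print(f"{len(cleaned)} records kept, {len(flagged)} flagged for manual verification")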

  19. What’s needed: Cultural Understanding • How is the data going to be used? And who will be rewarded? • An audit culture • Social sciences, humanities, arts would have justified concerns with the adoption of tools that are citation based

  20. How can academic libraries help? • Share our knowledge of best practices/other effective implementations • Challenge vendors to address problems • Train for author ID systems and assignment and integrate author IDs with digital initiatives

  21. How can academic libraries help? • Advocate for national comparison standards (CASRAI) • Employ our subject-focused outreach model • As a central unit, make broad organizational connections to help with implementation • Promote our expertise: bibliographic analysis is an LIS domain

  22. Recommendations • Strategic leaders need to initiate university-wide conversations about what research evaluation means for the institution • Tools need to be flexible enough to incorporate non-journal-based scholarly work/data • New workflows need to be minimized and incorporated into existing workflows as much as possible • Broad adoption of the ORCID iD system
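
On the ORCID recommendation, a minimal sketch of what broad adoption buys: once researchers register ORCID iDs and attach their works, an institution can pull publication lists from the public ORCID API instead of disambiguating names by hand. The endpoint and JSON layout below are assumptions to confirm against current ORCID documentation; the iD used is ORCID’s long-standing example test record.

    # Sketch: list a researcher's works from the public ORCID API. Endpoint
    # and response layout should be checked against current ORCID docs; the
    # iD is ORCID's example test record (Josiah Carberry).
    import json
    import urllib.request

    orcid_id = "0000-0002-1825-0097"
    url = f"https://pub.orcid.org/v3.0/{orcid_id}/works"
    request = urllib.request.Request(url, headers={"Accept": "application/json"})

    with urllib.request.urlopen(request, timeout=10) as response:
        data = json.load(response)

    for group in data.get("group", []):
        summary = group["work-summary"][0]
        print(summary["title"]["title"]["value"])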

  23. References Marjanovic, S., Hanney, S., & Wooding, S. (2009). A Historical Reflection on Research Evaluation Studies, Their Recurrent Themes and Challenges. Technical Report. RAND Corporation. Moed, H.F. (2005). Citation analysis in research evaluation. Dordrecht: Springer.
