The role of citation analysis in research evaluation

THE ROLE OF CITATION ANALYSIS IN RESEARCH EVALUATION

Philip Purnell

September 2010



HOW DO WE EVALUATE RESEARCH?

  • Research grants

    • Number and value

  • Prestigious awards

    • Nobel Prizes

  • Patents

    • Demonstrating innovative research

  • Faculty

    • Number of post-graduate researchers

  • Citation analysis

    • Publication and citation counts

    • Normalised by benchmarks

  • Peer Evaluation

    • Expensive, time-consuming and subjective



A BRIEF HISTORY OF THE CITATION INDEX

Concept first developed by Dr Eugene Garfield

Science, 1955

The Science Citation Index (1963)

SCI print (1960s)

Online with SciSearch in the 1970s

CD-ROM in the 1980s

Web interface (1997): Web of Science

Content enhanced:

Social Sciences Citation Index (SSCI)

Arts & Humanities Citation Index (AHCI)

The Citation Index

Primarily developed for information retrieval

The development of electronic media and powerful search tools has increased its use and popularity for research evaluation



WEB OF SCIENCE JOURNAL SELECTION POLICY

Why do we select journals?



WHY NOT INDEX ALL JOURNALS?

  • 40% of the journals:

    • 80% of the publications

    • 92% of cited papers

  • 4% of the journals:

    • 30% of the publications

    • 51% of cited papers



HOW TO DECIDE WHICH JOURNALS TO INDEX

  • Approx. 2000 journals evaluated annually

    • 10-12% accepted

  • Thomson Reuters editors

    • Information professionals

    • Librarians

    • Experts in the literature of their subject area

[Diagram labels: journal ‘quality’; Web of Science; journals under evaluation]



THOMSON REUTERS JOURNAL SELECTION POLICY

  • Publishing Standards

    • Peer review, Editorial conventions

  • Editorial content

    • Addition to knowledge in specific subject field

  • Diversity

    • International, regional influence of authors, editors, advisors

  • Citation analysis

    • Editors’ and authors’ prior work



GLOBAL RESEARCH REPRESENTATION

WEB OF SCIENCE COVERAGE



SUMMARY: CONSISTENCY IS THE KEY TO VALIDITY

  • Analyses based on authoritative, consistent data from the world’s leading provider of Research Evaluation solutions

  • Thomson Reuters has developed a selection policy over the last 50 years designed to hand-pick the relevant journals containing the core content over the full range of scholarly disciplines

  • This has created a large set of journals containing comparable papers and citations

  • Thomson Reuters has always had one consistent editorial policy: to index all journals cover-to-cover, index all authors and index all addresses. This unique consistency makes Web of Science the only suitable data source for citation analysis.



GOVERNMENTS AND INSTITUTIONS USING TR DATA FOR EVALUATION (INCL.)

Germany: IFQ, Max Planck Society, DKFZ, MDC

Netherlands: NWO & KNAW

France: Min. de la Recherche, OST - Paris, CNRS

United Kingdom: King’s College London; HEFCE

European Union: EC’s DG XII (Research Directorate)

US: NSF: biennial Science & Engineering Indicators report (since 1974)

Canada: NSERC, FRSQ (Quebec), Alberta Research Council

Australia: Australian Academy of Science, gov’t lab CSIRO

Japan: Ministry of Education, Ministry of Economy, Trade & Industry

People’s Republic of China: Chinese Academy of Science

Times Higher Education: World University Rankings (from 2010)



EVALUATING COUNTRIES



SCIENTIFIC RESEARCH IMPACT IN CENTRAL EUROPE

Thomson Reuters InCites



OUTPUT AND PRODUCTIVITY: BULGARIAN RESEARCH 1998-2008



COMPARATIVE IMPACT IN SELECTED FIELDS BETWEEN COUNTRIES

Source: Thomson Reuters InCites



BULGARIAN RESEARCH RELATIVE PRODUCTIVITY BY FIELD

22% of Bulgarian papers are in Chemistry

<1% of Bulgarian papers are in Psychiatry

Source: Thomson Reuters InCites



EVALUATING INSTITUTIONS



EVALUATING INSTITUTIONS

Source: Thomson Reuters

North America University Science Indicators



CITATIONS PER PAPER: MATHEMATICS

Source: Thomson Reuters InCites



COMPARISON OF TOP MATHEMATICS INSTITUTES AROUND THE WORLD

Source: Thomson Reuters InCites



WITH WHOM DOES OUR FACULTY COLLABORATE?

Source: Thomson Reuters InCites



WHICH COLLABORATIONS ARE THE MOST VALUABLE?

Collaborations with these institutions have produced highly cited papers within their subject fields

Source: Thomson Reuters InCites



EVALUATING JOURNALS


CALCULATING THE 2009 IMPACT FACTOR: JOURNAL OF CONTAMINANT HYDROLOGY

Citations in 2009

  • To items published in 2008 = 153

  • To items published in 2007 = 239

  • Sum = 392

Number of citable items

  • Published in 2008 = 97

  • Published in 2007 = 98

  • Sum = 195

2009 Impact Factor = 392 / 195 = 2.01
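
The same two-year calculation, expressed as a minimal Python sketch using only the figures above; the function name and data layout are illustrative assumptions, not part of any Thomson Reuters product.

```python
# Minimal sketch of the two-year Journal Impact Factor calculation, using the
# Journal of Contaminant Hydrology figures above. Function name and data
# layout are illustrative, not a Thomson Reuters API.

def two_year_impact_factor(citations_to_prior_two_years, items_in_prior_two_years):
    """IF(year) = citations in `year` to items from the two prior years,
    divided by the number of citable items published in those two years."""
    return sum(citations_to_prior_two_years.values()) / sum(items_in_prior_two_years.values())

citations_2009 = {2008: 153, 2007: 239}   # citations received in 2009
citable_items = {2008: 97, 2007: 98}      # citable items published

print(round(two_year_impact_factor(citations_2009, citable_items), 2))  # 2.01
```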



JOURNAL IMPACT FACTOR: SELECTED CHEMISTRY JOURNALS

Thomson Reuters Journal Citation Reports



USING THE IMPACT FACTOR: EVALUATING JOURNALS

  • Appropriate use

    • To evaluate journals within a subject field

  • Misuse

    • Comparison of journals from different fields

    • Evaluation of individual articles

    • Evaluation of an institution or researcher



USING THE IMPACT FACTOR (MISUSE): EVALUATING INDIVIDUAL PAPERS

30% of articles in Food Policy were not cited at all

Journal Impact Factor = 2.01



BENCHMARK YOUR PAPERS AGAINST GLOBAL AVERAGES – IS THIS A HIGHLY CITED PAPER?

This article is ranked in the 12.92nd percentile in its field by citations

Articles published in ‘Blood’ in 2004 have been cited 34.30 times on average

Hematology articles from the same year have been cited 18.83 times on average

This paper has received 40 / 34.30 = 1.17 times the expected citations for its journal

This paper has received 40 / 18.83 = 2.12 times the expected citations for its subject category
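
A minimal sketch of the benchmarking arithmetic above; the variable names are illustrative, and the baselines are the journal and category averages quoted on this slide.

```python
# Sketch of citation benchmarking: an article's citation count divided by the
# expected (average) citations for its journal and its subject category.
# Variable names are illustrative; the figures come from the slide above.

article_citations = 40
journal_baseline = 34.30    # average citations, 'Blood' articles from 2004
category_baseline = 18.83   # average citations, Hematology articles from 2004

journal_normalised = article_citations / journal_baseline     # ~1.17x expected
category_normalised = article_citations / category_baseline   # ~2.12x expected

print(f"vs journal: {journal_normalised:.2f}, vs category: {category_normalised:.2f}")
```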



EVALUATING INDIVIDUALS



HOW CAN WE COMPARE RESEARCHERS?

Author A: 60 papers

Author B: 117 papers

Source: Thomson Reuters InCites



OBTAIN MULTIPLE MEASURES
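
As a rough illustration of combining measures, the sketch below derives paper count, total citations, citations per paper and h-index from per-paper citation counts. The citation lists are invented for illustration and do not correspond to Author A or Author B above.

```python
# Hedged sketch: derive several complementary indicators for two authors from
# per-paper citation counts. The citation lists below are invented and do not
# correspond to the authors on the previous slide.

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def profile(citations):
    return {
        "papers": len(citations),
        "total citations": sum(citations),
        "citations per paper": round(sum(citations) / len(citations), 2),
        "h-index": h_index(citations),
    }

author_x = [12, 9, 7, 5, 5, 3, 2, 1, 0, 0]   # broad, steady impact
author_y = [40, 2, 1, 1, 0, 0, 0, 0, 0, 0]   # one highly cited paper

print(profile(author_x))   # same citations per paper as author_y, higher h-index
print(profile(author_y))
```

No single number captures both profiles, which is why several measures should be read together.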



RECOGNIZE THE SKEWED NATURE OF CITATION DATA

  • Citation distribution is always skewed

    • Few highly cited papers

    • Majority cited little or not at all (see the sketch after this list)

  • Distribution type

    • Always distorted

    • Human decision

      • E.g. Criticality
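
A minimal sketch of that skew, using an invented citation distribution: the mean is pulled up by one highly cited paper, while the median and the uncited share describe the typical paper.

```python
# Sketch of why skew matters: in a typical citation distribution the mean is
# pulled up by a few highly cited papers, so medians, percentiles, or the
# uncited share are often more informative. The counts below are invented.

from statistics import mean, median

citations = [310, 42, 15, 8, 6, 4, 3, 2, 1, 1, 0, 0, 0, 0, 0]

print(f"mean    = {mean(citations):.1f}")   # ~26, dominated by one paper
print(f"median  = {median(citations)}")     # 2: the typical paper
uncited_share = sum(1 for c in citations if c == 0) / len(citations)
print(f"uncited = {uncited_share:.0%}")     # a third of the papers
```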



SUMMARY (I): TREAT AS A SCIENTIFIC STUDY

  • Ask whether the results are reasonable

  • Follow scientific process for evaluating data

  • Apply scientific skepticism



SUMMARY (II): HOW DO WE EVALUATE RESEARCH?

  • Research grants

    • Number and value

  • Prestigious awards

    • Nobel Prizes

  • Patents

    • Demonstrating innovative research

  • Faculty

    • Number of post-graduate researchers

  • Citation analysis

    • Publication and citation counts

    • Normalised by benchmarks

  • Peer Evaluation

    • Expensive, time-consuming and subjective



THANK YOU

Philip Purnell

September 2010

