Research evaluation at CWTS: Meaningful metrics, evaluation in context

Ed Noyons, Centre for Science and Technology Studies, Leiden University

RAS Moscow, 10 October 2013



Outline

  • Centre for Science and Technology Studies (CWTS, Leiden University): history in short;

  • CWTS research program;

  • Recent advances.



25 years CWTS

History in Short




25 years of CWTS: history in short (1985-2010)

  • Started around 1985 by Anthony van Raan and Henk Moed; one and a half positions funded by the university;

  • Context is science policy, research management;

  • Mainly contract research and services (research evaluation);

  • Staff stable around 15 people (10 researchers);

  • Main focus on publication and citation data (in particular Web of Science).



25 years of CWTS: history in short (2010-…)

  • Block funding since 2008;

  • Since 2010

    • moving from mainly services (with some research) to a research institute with services;

    • New director Paul Wouters;

  • New recruitments: now ~35 people.



Research and services

CWTS Research programme



Bibliometrics (in the context of science policy) is ...



Opportunities

  • Research Accountability => evaluation

  • Need for standardization, objectivity

  • More data available



Vision

  • Quantitative analyses

  • Beyond the ‘lamppost’

    • Other data

    • Other outputs

  • Research 360º

    • Input

    • Societal impact/quality

    • Researchers themselves



Background of the CWTS research program

  • Already existing questions

  • New questions:

    • How do scientific and scholarly practices interact with the “social technology” of research evaluation and monitoring knowledge systems?

    • What are the characteristics, possibilities and limitations of advanced metrics and indicators of science, technology and innovation?



Current CWTS research organization

  • Chairs

    • Scientometrics

    • Science policy

    • Science, Technology & Innovation

  • Working groups

    • Advanced bibliometrics

    • Evaluation Practices in Context (EPIC)

    • Social sciences & humanities

    • Society using research Evaluation (SURE)

    • Career studies



A look under the lamppost

Back to Bibliometrics



Recent advances at CWTS

  • Platform: Leiden Ranking

  • Indicators: New normalization to address:

    • Multidisciplinary journals

    • (Journal based) classification

  • Structuring and mapping

    • Advanced network analyses

    • Publication based classification

    • Visualization: VOSviewer



http://www.leidenranking.com

The Leiden Ranking




Platform: Leiden Ranking http://www.leidenranking.com

  • Based on Web of Science (2008-2011);

  • Only universities (~500);

  • Only dimension is scientific research;

  • Indicators (state of the art):

    • Production

    • Impact (normalized and ‘absolute’)

    • Collaboration.



Leiden Ranking – world top 3 (PPtop10%)

  • PPtop10%: normalized impact, the proportion of a university’s publications that belong to the world’s top 10% most frequently cited;

  • Stability intervals to indicate how certain the score is (a minimal sketch follows below).
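
The PPtop10% indicator is the share of a university’s publications that belong to the world’s top 10% most frequently cited of their field and publication year, and the Leiden Ranking reports stability intervals around it. Below is a minimal Python sketch, assuming a simple bootstrap-style resampling interval; the function names and the flags are illustrative, not the exact CWTS procedure.

```python
import random

def pp_top10(in_top10_flags):
    """PPtop10%: share of publications that are among the top 10% most cited
    of their field and publication year (1 = in top 10%, 0 = not)."""
    return sum(in_top10_flags) / len(in_top10_flags)

def stability_interval(in_top10_flags, n_samples=1000, alpha=0.05, seed=42):
    """Bootstrap-style stability interval: resample the publication set with
    replacement and take empirical quantiles of the resulting scores."""
    rng = random.Random(seed)
    n = len(in_top10_flags)
    scores = sorted(
        pp_top10([rng.choice(in_top10_flags) for _ in range(n)])
        for _ in range(n_samples)
    )
    lower = scores[int((alpha / 2) * n_samples)]
    upper = scores[int((1 - alpha / 2) * n_samples) - 1]
    return lower, upper

# Hypothetical flags for one university's publications in the 2008-2011 window.
flags = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
print(pp_top10(flags))            # point estimate: 0.2
print(stability_interval(flags))  # roughly (0.05, 0.40) for this toy sample
```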



Russian universities (impact)



Russian universities (collaboration)



Dealing with field differences

Impact Normalization (MNCS)




Background and approach

  • Impact is measured by numbers of citations received;

  • Excluding self-citations;

  • Fields differ regarding citing behavior;

  • One citation in one field is worth more than in another;

  • Normalization

    • By journal category

    • By citing context.
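
The journal-category normalization mentioned above is commonly expressed as the Mean Normalized Citation Score (MNCS): each publication’s citation count (self-citations excluded) is divided by the world-average citation count for publications of the same field, year and document type, and the ratios are averaged. A minimal sketch, assuming the expected values have already been looked up from the database:

```python
def mncs(publications):
    """Mean Normalized Citation Score: average of each publication's citation
    count divided by the expected (world-average) citation count for its
    field, publication year and document type."""
    ratios = [p["citations"] / p["expected_citations"] for p in publications]
    return sum(ratios) / len(ratios)

# Hypothetical publication records; expected_citations would come from the
# world average of the matching journal category in Web of Science.
pubs = [
    {"citations": 12, "expected_citations": 8.0},   # above field average
    {"citations": 2,  "expected_citations": 8.0},   # below field average
    {"citations": 5,  "expected_citations": 2.5},   # strong paper in a low-citation field
]
print(mncs(pubs))  # 1 = world average; here (1.5 + 0.25 + 2.0) / 3 = 1.25
```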




Issues related to journal category-based approach

  • Scope of category;

  • Scope of journal.



Journal classification ‘challenge’ (scope of category, e.g. cardio research)



Approach: source-normalized MNCS

  • Source normalization (a.k.a. citing-side normalization):

    • No field classification system;

    • Citations are weighted differently depending on the number of references in the citing publication;

    • Hence, each publication has its own environment to be normalized by.




Source-normalized MNCS (cont’d)

  • Normalization based on citing context;

  • Normalization at the level of individual papers (e.g., X)

  • Average number of refs in papers citing X;

  • Only active references are considered:

    • Refs in period between publication and being cited

    • Refs covered by WoS.
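
A minimal sketch of the citing-side idea above: every citation received by publication X is weighted by one over the number of active references in the citing paper, so no field classification system is needed. This is one common formulation for illustration, not necessarily the exact CWTS weighting; the reference counts are made up.

```python
def source_normalized_score(citing_active_ref_counts):
    """Citing-side (source) normalization: each citation received by a
    publication is weighted by 1 / (number of active references in the citing
    publication), so a citation from a reference-heavy paper counts for less
    than one from a paper with a short reference list. 'Active' references are
    those in the citation window and covered by the database (WoS here)."""
    return sum(1.0 / a for a in citing_active_ref_counts if a > 0)

# Hypothetical case: publication X is cited by three papers with 5, 20 and 50
# active references respectively.
print(source_normalized_score([5, 20, 50]))  # 0.2 + 0.05 + 0.02 = 0.27
```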




Collaboration, connectedness, similarity, ...

Networks and visualization




VOSviewer: collaboration of Lomonosov Moscow State University (MSU)

  • WoS (1993-2012)

  • Top 50 most collaborative partners

  • Co-published papers
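
A sketch of how the underlying data for such a collaboration map can be prepared: for every paper with the focal university among its affiliations, count the co-occurring partner institutions and keep the most frequent ones. The affiliation records below are hypothetical stand-ins for WoS address data.

```python
from collections import Counter

def top_partners(publications, focal="Lomonosov Moscow State University", n=50):
    """Count co-published papers per partner institution and return the n
    most frequent partners of the focal institution."""
    counts = Counter()
    for affiliations in publications:          # one set of affiliations per paper
        if focal in affiliations:
            counts.update(a for a in affiliations if a != focal)
    return counts.most_common(n)

# Toy affiliation sets standing in for WoS (1993-2012) address data.
papers = [
    {"Lomonosov Moscow State University", "CERN", "Joint Institute for Nuclear Research"},
    {"Lomonosov Moscow State University", "CERN"},
    {"Saint Petersburg State University", "CERN"},
]
print(top_partners(papers, n=2))  # [('CERN', 2), ('Joint Institute for Nuclear Research', 1)]
```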



Other networks

  • Structure of science output (maps of science);

  • Oeuvres of actors;

  • Similarity of actors (benchmarks based on profile);



Publication-based classification

Structure of science, independent of journal classification




Publication-based classification (WoS 1993-2012)

  • Publication-based clustering (each publication in exactly one cluster);

  • Independent from journals;

  • Clusters based on citing relations between publications;

  • Three levels:

    • Top (21)

    • Intermediate (~800)

    • Bottom (~22,000)

  • Challenges:

    • Labeling

    • Dynamics.
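
The sketch below illustrates the principle on a toy direct-citation network with an off-the-shelf community-detection routine from networkx; the actual CWTS classification uses a purpose-built clustering algorithm on the full WoS, so this is only an illustration of grouping publications by their citing relations.

```python
import networkx as nx
from networkx.algorithms import community

# Toy direct-citation data: an edge (A, B) means publication A cites B.
citations = [
    ("p1", "p2"), ("p2", "p3"), ("p1", "p3"),   # one tightly citing group
    ("p4", "p5"), ("p5", "p6"), ("p4", "p6"),   # another group
    ("p3", "p4"),                               # weak link between the groups
]

G = nx.Graph()            # direction ignored: citing relations treated as ties
G.add_edges_from(citations)

# Modularity-based community detection: every publication ends up in one cluster.
clusters = community.greedy_modularity_communities(G)
for i, cluster in enumerate(clusters):
    print(f"cluster {i}: {sorted(cluster)}")
```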



Map of all sciences (784 fields, WoS 1993-2012)

[Map figure] Each circle represents a cluster of publications; surface represents volume; distance represents relatedness (citation traffic); colors indicate clusters of fields/disciplines: social and health sciences; cognitive sciences; maths and computer sciences; biomedical sciences; physical sciences; earth, environmental and agricultural sciences.


Positioning of an actor in map

  • Activity overall (world and, e.g., Lomonosov Moscow State University, MSU)

    • Proportion Lomonosov relative to world;

  • Activity per ‘field’ (world and MSU)

    • Proportion MSU in field;

  • Relative activity MSU per ‘field’;

  • Scores between 0 (Blue) and 2 (Red);

  • ‘1’ if proportion same as overall (Green).
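
A minimal sketch of the relative-activity score described above: the actor’s share of world output in a field divided by its overall share, so that 1 means the field is as prominent for the actor as expected, and values are capped at 2 for the blue-to-red color scale. Field names and publication counts are hypothetical.

```python
def relative_activity(actor_by_field, world_by_field):
    """Relative activity per field: the actor's share of world output in a
    field divided by its share of world output overall. 1 = field as prominent
    as the actor's overall share; capped at 2 for the blue (0) - red (2) scale."""
    actor_total = sum(actor_by_field.values())
    world_total = sum(world_by_field.values())
    overall_share = actor_total / world_total
    return {
        field: min(2.0, (actor_by_field.get(field, 0) / world_by_field[field]) / overall_share)
        for field in world_by_field
    }

# Hypothetical publication counts per field for the actor (e.g., MSU) and the world.
actor = {"physics": 300, "chemistry": 100, "sociology": 10}
world = {"physics": 10_000, "chemistry": 10_000, "sociology": 10_000}
print(relative_activity(actor, world))  # physics capped at 2.0 (red), sociology near 0 (blue)
```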



Positioning Lomonosov MSU



Positioning Lomonosov MSU (cont’d)



Positioning Russian Academy of Sciences (RAS)



Alternative view: Lomonosov (density)



Using the map: benchmarks

  • Benchmarking on the basis of research profile

    • Distribution of output over 784 fields;

  • Profile of each university in Leiden Ranking;

    • Distributions of output over 784 fields;

  • Compare to MSU profile;

  • Identify most similar.
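
A sketch of the profile-based benchmarking described above: each university is represented by its distribution of output over the 784 fields, and the others are ranked by similarity to the MSU profile. The slides do not state which similarity measure is used; cosine similarity is assumed here, and the profiles are toy data.

```python
import math

def cosine_similarity(profile_a, profile_b):
    """Cosine similarity between two output distributions over fields
    (dicts mapping field id -> number of publications)."""
    fields = set(profile_a) | set(profile_b)
    dot = sum(profile_a.get(f, 0) * profile_b.get(f, 0) for f in fields)
    norm_a = math.sqrt(sum(v * v for v in profile_a.values()))
    norm_b = math.sqrt(sum(v * v for v in profile_b.values()))
    return dot / (norm_a * norm_b)

def most_similar(target_profile, other_profiles, n=6):
    """Rank the other universities' profiles by similarity to the target."""
    scores = {name: cosine_similarity(target_profile, p) for name, p in other_profiles.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:n]

# Toy profiles over three of the 784 fields.
msu = {"f1": 120, "f2": 30, "f3": 5}
others = {
    "Saint Petersburg State University": {"f1": 90, "f2": 25, "f3": 4},
    "Example Medical University": {"f1": 5, "f2": 10, "f3": 200},
}
print(most_similar(msu, others, n=2))
```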



Universities most similar to MSU (in the Leiden Ranking)

  • FR - University of Paris-Sud 11

  • RU - Saint Petersburg State University

  • JP - Nagoya University

  • FR - Joseph Fourier University

  • CN - Peking University

  • JP - University of Tokyo



Density view MSU



Density view St. Petersburg State University



VOSviewer (Visualization of Similarities), http://www.vosviewer.com

  • Open source application;

  • Software to create maps;

  • Input: publication data;

  • Output: similarities among publication elements:

    • Co-authors

    • Terms co-occurring

    • Co-cited articles
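
As an illustration of the kind of input such maps are built from, the sketch below counts how often two elements (terms, authors or cited references) appear together in the same publication record; VOSviewer turns pair counts like these into a similarity-based layout. The records are made up for illustration.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_counts(records):
    """Count how often pairs of items (e.g., terms, authors, cited references)
    occur together in the same publication record; such pair counts are the
    kind of input from which a similarity map can be built."""
    pairs = Counter()
    for items in records:
        for a, b in combinations(sorted(set(items)), 2):
            pairs[(a, b)] += 1
    return pairs

# Toy records: terms extracted from titles/abstracts of three publications.
records = [
    ["citation", "impact", "normalization"],
    ["citation", "impact"],
    ["collaboration", "citation"],
]
print(cooccurrence_counts(records))  # e.g. ('citation', 'impact') occurs twice
```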



More information: CWTS and methods

  • www.cwts.nl

  • www.journalindicators.com

  • www.vosviewer.com

  • [email protected]



THANK YOU



Basic model in which we operate (research evaluation)

  • Research in context



Example (49 research communities of a FI university)

‘Positive’ effect

‘Negative’ effect



RC with a ‘positive’ effect

  • Most prominent field: impact increases



RC with a ‘negative’ effect

  • Most prominent field: impact stays the same

  • Less prominent field: impact decreases



Wrap-up: normalization

Normalization based on journal classification has its flaws;

We have recently developed an alternative;

Test sets in recent projects show small (but relevant) differences.

