
InCites™


Presentation Transcript


  1. InCites™ rachel.mangan@thomsonreuters.com http://researchanalytics.thomsonreuters.com/incites/

  2. InCites Overview
  • Annual subscription to bibliometrics reports based on Web of Science
  • One Research Performance Profile (RPP) report containing articles extracted from WoS with associated metrics
  • Thomson Reuters staff support building the dataset to client specifications
  • Dataset may be based on author, address, topic, or journal
  • Entire Global Comparisons (GC) set containing aggregated comparative statistics for fields, countries, and institutions
    • National
    • Institutional by region
    • Multiple subject schemas

  3. InCites Overview
  • Web-based interface as well as data on demand
  • Choices for delivery of a record set: FTP, MS Access
  • Frequent updates: quarterly for RPP, annually for GC (spring)
  • Access to TR Web Services: WS Premium, ResearcherID Services
  • Data preparation support and training
  • Ability to create custom reporting with sharing/saving of data
  • Links to WoS records: limited view for non-subscribers, full view for subscribers
  • Licensed right to export reports and statistics about the institution for display on the institutional web site for promotional or informational purposes
  • Licensed right to load data into an institutional repository or local database for institutional business purposes

  4. CRITICAL ANSWERS TO PRACTICAL QUESTIONS
  • What is the published output of my institution in various disciplines of study over the past 10 years?
  • What impact did this research have? How frequently has it been cited, where, and by whom?
  • With which researchers and institutions is our faculty collaborating?
  • Which collaborations are most valuable, producing influential work – the greatest return on investment?
  • How does my institution compare to our peer institutions in the volume and influence of published work in particular fields?
  • Which programs within our institution perform best in terms of research output, producing research that is the most influential compared to other research in the world within those particular disciplines?

  5. THE DATA

  6. WEB OF SCIENCE
  • Selectivity and control of content: high, consistent standards
  • 11,000+ journals and 716 million+ cited references
  • Multidisciplinary: Science, Social Sciences, Arts & Humanities
  • Depth: 100+ years, including cited references
  • Consistency and reliability: ideal for research evaluation, e.g. field averages
  • Unmatched expertise: 40+ years of citation analysis and research evaluation
  • Conference proceedings: 12,000 conferences annually
  • Funding acknowledgments
  • The gold standard: used by over 3,200 institutions in more than 90 countries

  7. SOURCE OF DATA SET: WEB OF SCIENCE RECORDS

  8. THE METRICS
  • Absolute counts
  • Normalized metrics (for journal, document type, period, and category)
  • Golden rule: compare like with like
  • All document types included in RPP

  9. NO ALL-PURPOSE INDICATOR A university may have many different purposes for evaluating its research performance, and each purpose calls for particular kinds of information. Identify the question the results will help to answer and collect the data accordingly.

  10. IS THIS A HIGH CITATION COUNT?

  11. CREATING A BENCHMARK: WHAT IS THE EXPECTED CITATION RATE FOR SIMILAR PAPERS? Articles published in ‘Monthly Notices of the Royal Astronomical Society’ in 2007 have been cited 13.87 times on average. This is the expected count. We compare the total citations received by a paper to what is expected: 215 (Journal Actual) / 13.87 (Journal Expected) = 15.5. The paper has been cited 15.5 times more than expected. We call this ratio Journal Actual/Journal Expected.
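
The same arithmetic can be written in a few lines of Python. This is a minimal illustrative sketch using the figures from the slide; the function name is ours, not part of InCites.

```python
def journal_ratio(actual_citations, journal_expected):
    """Journal Actual / Journal Expected: how many times more (or less) a paper
    has been cited than the average article from the same journal and year."""
    return actual_citations / journal_expected

# Figures from the slide: the paper has 215 citations; 2007 articles in
# 'Monthly Notices of the Royal Astronomical Society' average 13.87 citations.
print(round(journal_ratio(215, 13.87), 1))  # 15.5 -> cited 15.5 times more than expected
```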

  12. PERCENTILE IN FIELD: HOW MANY PAPERS ARE IN THE TOP 1%, 5%, OR 10% OF THEIR RESPECTIVE FIELDS? This is an example of the citation frequency distribution of a set of papers (for an author, journal, institution, or subject category). The papers are ordered from uncited/least cited on the left to the most highly cited papers in the set on the right, and each paper can be assigned a percentile within the set. In any given set there are always many uncited or low-cited papers, and always only a few highly cited papers (top 1%). Only the document types article, note, and review are used to determine the percentile distribution, and only those same document types receive a percentile value.
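
A hedged sketch of how percentiles might be assigned within such a set; this is illustrative only and not the exact InCites procedure (which also restricts the calculation to articles, notes, and reviews). The citation counts are hypothetical.

```python
def percentiles_in_set(citation_counts):
    """Assign each paper the share (in %) of papers in the set cited at least as
    often as it is; the most highly cited papers end up nearest the top 1%."""
    n = len(citation_counts)
    return [100 * sum(1 for other in citation_counts if other >= c) / n
            for c in citation_counts]

counts = [0, 0, 1, 2, 3, 5, 8, 13, 40, 120]  # hypothetical citation counts
print(percentiles_in_set(counts))            # the 120-citation paper gets 10.0, i.e. top 10% of this tiny set
```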

  13. RESEARCH PERFORMANCE PROFILES
  • Executive Summary
  • Source Articles Listing
  • Author Ranking
  • Summary Metrics
  • Author Ranking with Self-Citation Analysis
  • Collaborating Institutions
  • Field Rankings
  • Custom Reports – how to generate an author ranking for a particular field/category
  • Citing Dataset (view citing authors, institutions, and journals)

  14. SUMMARY METRICS

  15. SOURCE ARTICLE LISTING METRICS

  16. AUTHOR RANKING

  17. COLLABORATION AND RESEARCH NETWORK

  18. GLOBAL COMPARISONS • Web of Science document types included: Articles and Reviews

  19. Quick view of entire dataset

  20. Article Level Metrics

  21. Thomson Reuters value added metrics Basic bibliographic information about the article (including the field), the number of citations, and the Journal Impact Factor from the latest edition of the Journal Citation Reports.

  22. Thomson Reuters value added metrics 2nd generation citation data, the articles that have cited the citing articles

  23. Thomson Reuters value added metrics Expected performance metrics: we calculate the number of citations a typical article would be expected to receive. This is calculated for each journal (JXC) and for each category (CXC); these metrics are also normalized for the year and document type.

  24. Thomson Reuters value added metrics JXC Ratio (157 / 45.09) = 3.48, CXC Ratio (157 / 3.66) = 42.90. Although it is not displayed on this screen, we also calculate the ratio between the actual and expected performance. This gives meaning and context to the citation counts and is a normalized performance measure.
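
The two ratios from the slide, reproduced in Python; the variable names follow the slide's JXC/CXC labels, and the expected values are the journal and category baselines quoted above.

```python
actual_citations = 157
journal_expected = 45.09    # JXC: expected citations for this journal, year, and document type
category_expected = 3.66    # CXC: expected citations for this category, year, and document type

jxc_ratio = actual_citations / journal_expected     # 157 / 45.09 ≈ 3.48
cxc_ratio = actual_citations / category_expected    # 157 / 3.66  ≈ 42.90
print(round(jxc_ratio, 2), round(cxc_ratio, 2))
```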

  25. Thomson Reuters value added metrics The percentile, compared against the set of documents in the same field and the same year. This paper is in the top 0.2% of all papers in “General & Internal Medicine” for the year 2007. The percentile is not calculated for all document types.

  26. We generate summary metrics based on totals and averages of all the articles in the dataset: total citation counts, mean and median citations, total and mean 2nd generation citation counts, the mean Actual/Expected Citation Ratio, and the mean percentile.
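
A minimal sketch of how such dataset-level summary metrics could be computed; the record fields and figures are illustrative, not the InCites export schema.

```python
from statistics import mean, median

# Hypothetical records: each paper carries its citations, 2nd generation citations,
# category expected citation rate (CXC), and percentile in field.
papers = [
    {"cites": 12, "cites_2nd_gen": 60,  "cxc_expected": 4.0, "percentile": 8.5},
    {"cites": 0,  "cites_2nd_gen": 0,   "cxc_expected": 3.5, "percentile": 72.0},
    {"cites": 45, "cites_2nd_gen": 310, "cxc_expected": 5.2, "percentile": 1.3},
]

summary = {
    "total_citations":            sum(p["cites"] for p in papers),
    "mean_citations":             mean(p["cites"] for p in papers),
    "median_citations":           median(p["cites"] for p in papers),
    "total_2nd_gen_citations":    sum(p["cites_2nd_gen"] for p in papers),
    "mean_2nd_gen_citations":     mean(p["cites_2nd_gen"] for p in papers),
    "mean_actual_expected_ratio": mean(p["cites"] / p["cxc_expected"] for p in papers),
    "mean_percentile":            mean(p["percentile"] for p in papers),
}
print(summary)
```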

  27. Author reports with self cites removed

  28. GLOBAL COMPARISONS

  29. COMPARE COUNTRY / TERRITORIES Data is available in tabular format with all metrics in one location. Graphical summaries make the data easy to interpret. In this example we can see the Citation Impact of Sweden in selected fields compared to the world.

  30. COMPARE COUNTRY / TERRITORIES Normalized metrics are included for better understanding and relevant comparisons. In this example you can see the citation impact of selected countries normalized to the world average
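
As a hedged sketch (hypothetical figures, not InCites output), the normalization described above is simply the ratio of a country's citations per paper to the world's citations per paper in the same field and period, so values above 1.0 mean above-world-average impact.

```python
def normalized_citation_impact(country_cites_per_paper, world_cites_per_paper):
    """Citation impact normalized to the world average for the same field and period."""
    return country_cites_per_paper / world_cites_per_paper

# Hypothetical example: a country averaging 9.8 citations per paper in a field where
# the world average is 7.1 has a normalized citation impact of about 1.38.
print(round(normalized_citation_impact(9.8, 7.1), 2))
```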

  31. COMPARE COUNTRY / TERRITORIES Various regional groupings, such as EU or Asia Pacific are included.

  32. COMPARE COUNTRY / TERRITORIES There are different category classification schemes available: 250+ narrow categories from the Web of Science, 22 broad categories, and 56 OECD classifications (including 6 broad). The inclusion of the OECD classification scheme makes for easy integration of InCites data with OECD data, such as R&D spending. RAE 2008 Units of Assessment are included for RAE comparisons.

  33. HOW DOES OUR RESEARCH COMPARE TO OUR PEERS? Focus on Nanoscience & Nanotechnology: chart of % Articles in Organization against Citation Impact.

  34. HOW DOES OUR RESEARCH COMPARE TO OUR PEERS? Article count, normalized for field. Make comparisons with your peer institutions effectively and easily. In this example, normalized metrics demonstrate that Stanford Univ and Univ Sydney have significant output in the field of Economics & Business; this information would ordinarily be difficult to identify.

  35. AGGREGATE PERFORMANCE INDICATOR – DIFFERENT RESEARCH FOCUS Harvard’s research is heavily focused on the Life Sciences, which are generally highly cited fields. Princeton’s research focus is weighted more towards less-cited subjects such as Space Science, Physics, Mathematics, and Geosciences. The Aggregate Performance Indicator takes into account these differences in the research focus of the university and the different characteristics of the subject areas to generate a single metric.

  36. AGGREGATE PERFORMANCE INDICATOR Here we can see various metrics, including the Aggregate Performance Indicator, for selected global institutions. We can see that Harvard produces many more articles than Princeton and that its average Citations per Article is also higher. However, as discussed on the previous slide, Harvard’s research focuses heavily on highly cited fields such as the Life Sciences, which improves Harvard’s gross citation statistics. A more granular approach is required to understand performance in different fields. The Aggregate Performance Indicator measures the relative performance and volume of research in each field and generates a single normalized metric, which provides a more balanced approach when comparing the research performance of institutions.
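
The slides do not give the exact formula for the Aggregate Performance Indicator, so the sketch below shows only one plausible way to build such a single field-normalized metric: an output-weighted average of field-normalized citation impact, with weights given by the institution's article counts in each field. All names and figures are hypothetical.

```python
def aggregate_indicator(fields):
    """fields: list of (article_count, cites_per_article, world_cites_per_article_in_field).
    Returns an output-weighted average of field-normalized citation impact."""
    total_articles = sum(count for count, _, _ in fields)
    return sum(count * (cpp / world_cpp) for count, cpp, world_cpp in fields) / total_articles

# Hypothetical institutions: one concentrated in a highly cited field,
# one concentrated in less-cited fields.
inst_life_sci_focus  = [(5000, 30.0, 20.0), (500, 6.0, 5.0)]
inst_math_phys_focus = [(800, 12.0, 8.0), (1200, 7.0, 4.0)]
print(round(aggregate_indicator(inst_life_sci_focus), 2),   # 1.47
      round(aggregate_indicator(inst_math_phys_focus), 2))  # 1.65 despite lower raw citation counts
```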
