
Scholarly Impact Metrics: An Overview

Scholarly Impact Metrics: An Overview. Johan Bollen – jbollen@indiana.edu, Indiana University School of Informatics and Computing, Center for Complex Networks and Systems Research.





Presentation Transcript


  1. Scholarly Impact Metrics: An Overview Johan Bollen – jbollen@indiana.edu Indiana University School of Informatics and Computing Center for Complex Networks and Systems Research OAI8 - June 2013

  2.

  3. Science: ideas not bricks • Science and scholarly communication matters. • Economic and cultural value is enormous, and rests on considerable investments of • Capital • Infrastructure • Human resources • Education • Outcomes: ideas and information • Not the amount of paper pulp produced, number of bricks laid, metal forged, tractors built, fields plowed • It’s largely about the ideas and how they are communicated, BUT: • Not all ideas matter equally • Not all ideas should be communicated

  4. Science as a GIFT economy • Gift economy: • services and goods are shared freely without implicit or explicit expectation/agreement of reciprocation • “economy of abundance, not scarcity” • found in some societies • Science is a little like that: • information is shared as freely as possible through publications • information is perishable (half-life of a good idea) • reward for sharing is essentially a social phenomenon: “esteem”, “prestige”, “influence”

  5. Impact ~ Publication • Scholarly outcomes and ideas are traditionally perceived to be shared mainly through the peer-reviewed literature, aka publications • An entire industry has emerged to support this modus operandi • Not universal, has not always been this way, might not always be this way, but presently dominant • Our idea of scholarly impact is now strongly tied to scholarly publications • Ideas that impact or influence fellow scholars reach them via peer-reviewed publications • Influence and impact are thus expected to be expressed through the medium of peer-reviewed publications • -> Citation data has become the de facto currency of impact or influence: • When one scholar cites the work of another, this is deemed recognition of their influence • Measuring impact from citations

  6. Citation data

  7. Citation networks M. Rosvall, D. Axelsson, and C.T. Bergstrom, “The map equation,” European Physical Journal Special Topics, 178, 13–23 (2009). M. Rosvall and C.T. Bergstrom, “Maps of random walks on complex networks reveal community structure,” PNAS 105(4), 1118–1123 (2008).

  8. From citation data to journal Impact Factor (Figure: citations counted in 2003, from all journals, to journal X’s 2001 and 2002 articles.) Impact Factor = mean 2-year citation rate
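The 2-year Impact Factor shown on this slide can be sketched as a short calculation. A minimal sketch; the journal and all counts below are invented for illustration, not data from the talk:

```python
def impact_factor(year, citations_received, citable_items):
    """2-year Impact Factor for `year`: citations received in `year` by
    articles published in the two preceding years, divided by the number
    of citable items published in those two years."""
    window = (year - 1, year - 2)
    cites = sum(citations_received.get(y, 0) for y in window)
    items = sum(citable_items.get(y, 0) for y in window)
    return cites / items if items else 0.0

# Hypothetical journal X: citations counted in 2003, keyed by cited year
cites_2003 = {2001: 120, 2002: 80}   # citations in 2003 to 2001/2002 articles
items = {2001: 50, 2002: 50}         # citable items published per year
print(impact_factor(2003, cites_2003, items))  # 200 / 100 = 2.0
```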

  9. That concludes this lecture • Thank you for your undivided attention.

  10. Hold on • It’s just not that simple

  11. A few things left to discuss…

  12. “The map is not the territory” • Impact, or influence, is a social phenomenon • It already exists in the scholarly community • Most scholars already have a notion of which ideas, publications, journals, and authors matter the most • To measure this social construct of scholarly impact we can choose many different “operationalizations”/measurements: • Ask scientists: surveys, questionnaires, awards • Correlates: funding decisions, publication data, citation data • “Behavioral” data: readership, interlibrary loan, reshelving, download data, Twitter mentions, etc.

  13. many permutations • 1. Data type and which community it represents • Citation data: authors • Usage data: authors, readers, public • Social media data: everyone • 2. Type of metric calculated from (1) • Counts, normalized counts • Social network metrics • Trend metrics • 3. Level of granularity • Entities: authors, journals, articles, teams, countries • Time: 5-year span, 2-year span, etc.

  14. Metrics, cubed (Figure: a cube of metric permutations with three axes: data type (citation, usage, social media), metric type (counts, social network, trends), and granularity (journal, article, author).)

  15. back to Citation data and networks Johan Bollen, Herbert Van de Sompel and Marko A. Rodriguez. Towards usage-based impact metrics: first results from the MESUR project, JCDL 2008, Pittsburgh, PA, June 2008. (arXiv:0804.3791v1)

  16. Citation-based metrics • Author-level metrics: • Total citations • H-index: • largest n such that the n-th publication has at least n citations (rank publications by decreasing citations) • g-index, e-index, a-index • Co-author network indicators • Article-level metrics: • Total citations • Normalized citation counts • Journal-level: • Impact Factor • SNIP, Crown indicator • Social network metrics from the citation network (next slide: PageRank, Eigenfactor, Y-factor, betweenness, etc.) Hirsch (2005) PNAS 102(46) 16569-16572; Radicchi et al. (2008) PNAS 105(45) 17268-17272
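The h-index definition on this slide can be made concrete in a few lines of Python. The citation counts are invented for illustration:

```python
def h_index(citation_counts):
    """h-index: the largest n such that the author's n-th most-cited
    publication has at least n citations."""
    h = 0
    # Rank publications by decreasing citation count
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4 (the 4th-ranked paper has 4 citations)
print(h_index([1, 1, 1]))         # -> 1
```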

  17. INNOVATION I : Citation-based social network metrics • Degree • In-degree • Out-degree • Random walk • PageRank • Eigenvector • Shortest path • Closeness • Betweenness

  18. Social network analysis

  19. PAGERANK FOR JOURNALS 2003 JCR, Science Edition 5709 journals, L=0.85 Pinski, G., & Narin, F. (1976). Citation influence for journal aggregates of scientific publications: theory, with application to the literature of physics. Information processing and management, 12(5), 297-312. Chen, P., Xie, H., Maslov, S., & Redner, S. (2007). Finding scientific gems with Google. Journal of Informetrics, 1(1), arxiv.org/abs/physics/0604130.
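The PageRank computation applied to journal citation networks on this slide can be sketched with plain power iteration. A sketch only: the three-journal toy graph is invented, and the damping factor 0.85 matches the slide's setting, not the 5709-journal JCR data:

```python
def pagerank(out_links, d=0.85, iterations=100):
    """Power-iteration PageRank over a directed citation graph.
    out_links: dict mapping each node to the list of nodes it cites."""
    nodes = set(out_links) | {v for vs in out_links.values() for v in vs}
    n = len(nodes)
    pr = {u: 1.0 / n for u in nodes}
    for _ in range(iterations):
        nxt = {u: (1.0 - d) / n for u in nodes}
        for u in nodes:
            targets = out_links.get(u, [])
            if targets:
                share = d * pr[u] / len(targets)
                for v in targets:
                    nxt[v] += share
            else:
                # Dangling node (cites nothing): redistribute its rank uniformly
                for v in nodes:
                    nxt[v] += d * pr[u] / n
        pr = nxt
    return pr

# Toy citation graph: A cites B; B cites C; C cites A and B
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["A", "B"]})
print(sorted(ranks.items(), key=lambda kv: -kv[1]))
```

B ranks highest here because it receives all of A's endorsement plus half of C's, which is the "prestige" intuition behind using PageRank rather than raw citation counts.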

  20. PAGERANK FOR JOURNALS Popularity vs. prestige Outliers reveal differences in aspects of “status” IF ~ general popularity PR ~ prestige, influence Johan Bollen, Marko A. Rodriguez, and Herbert Van de Sompel. Journal status. Scientometrics, 69(3), December 2006 (DOI: 10.1007/s11192-006-0176-z) Philip Ball. Prestige is factored into journal ratings. Nature 439, 770-771, February 2006 (doi:10.1038/439770a)

  21. INNOVATION II: “behavioral” data • The scholarly community and scholarly communication are moving online. • Data pertaining to online activities (implicit, behavioral) vs. citation data (explicit declarations of influence) (Figure: metrics connect the scholarly community to scholarly communication items via citation/bibliographic data and behavioral data.)

  22. behavioral DATA • Reading/usage statistics • Interlibrary loan data • Reshelving data • Online catalogue systems • Daily, weekly, monthly access or reading statistics • Usage data: • Web server logs • Link resolver data (SFX, etc) • Detailed data on “who”, “what”, “where”, “when”: ability to track scholarly activity in real-time

  23. Usage statistics • COUNTER: member organization defining an auditable standard for reporting and aggregating monthly usage statistics (www.projectcounter.org) • Journal and article level • Initiative to define “usage factor” • PLoS Article Level Metrics • Download numbers • download trends

  24. MESUR • Andrew W. Mellon and NSF funded project at LANL Digital Library Research and Prototyping and Indiana University • Very large-scale usage data from publishers, aggregators, and library consortia • Metrics of scholarly impact derived from aggregated usage data • Mapping scientific activity from log clickstream data • Examine “scholarly impact” itself (more later!) • Notable distinction: use of log data that contains clickstreams enables metrics and analysis beyond the level of usage statistics • Presently concluding a planning process (Andrew W. Mellon funded) to evolve into a community-supported, sustainable entity

  25. Innovation III: Alt-metrics • Behavioral AND “attention” data. • Social media attention, bookmarking, mentions • Attempt to also capture “social” attention or public impact of scholarly work (not just articles!), another possible dimension of impact

  26. Some relevant research Eysenbach G (2011) Can tweets predict citations? Metrics of social impact based on twitter and correlation with traditional metrics of scientific impact. Journal of Medical Internet Research 13: e123. Shuai X, Pepe A, Bollen J (2012) How the Scientific Community Reacts to Newly Submitted Preprints: Article Downloads, Twitter Mentions, and Citations. PLoS ONE 7(11): e47523. doi:10.1371/journal.pone.0047523

  27. Twitter mentions ~ downloads, citations?

  28. Twitter mentions correlate with downloads and citations!
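Correlations like the ones reported here are typically rank correlations. A Spearman coefficient can be computed in plain Python as the Pearson correlation of ranks; the mention and download counts below are invented to illustrate, not the cited studies' data:

```python
def rank(xs):
    """Rank values in ascending order, averaging ranks over ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.
    Assumes both sequences have some variation (nonzero rank variance)."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

mentions  = [12, 3, 45, 7, 1, 20]          # hypothetical Twitter mentions
downloads = [300, 120, 800, 150, 90, 420]  # hypothetical article downloads
print(round(spearman(mentions, downloads), 3))  # -> 1.0 (perfectly monotone toy data)
```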

  29. Alt-metrics as part of impact assessment

  30. Citation data, metrics, impact, alt-metrics, usage data: let’s step back for a second

  31. Blind map-makers • Odd, nearly tautological situation: • We have many different metrics or ways to measure impact. • But no formal or consistent definition of scholarly impact. • No idea of what exactly impact is, how it manifests itself, what its structure is, along which dimensions it varies, etc. • No idea whether our metrics actually measure or represent impact • Our metrics ARE the definition of “impact”

  32. Validity & reliability (Figure: metrics 1 through 6 point at different targets: “impact”, “some form of impact”, “not quite impact”, and “not impact”.)

  33. Mapping out impact, one metric at a time • Bollen J, Van de Sompel H, Hagberg A, Chute R (2009) A Principal Component Analysis of 39 Scientific Impact Measures. PLoS ONE 4(6): e6022. doi:10.1371/journal.pone.0006022 • Priem et al. Altmetrics in the wild. • Thelwall M, Haustein S, Larivière V, Sugimoto CR (2013) Do Altmetrics Work? Twitter and Ten Other Social Web Services. PLoS ONE 8(5): e64841. doi:10.1371/journal.pone.0064841 • PLoS ONE alt-metrics correlations: investigated by L Juhl Jensen, Novo Nordisk Foundation • Bornmann, L., Mutz, R., & Daniel, H.-D. (2008). …A comparison of nine different variants of the h index using data from biomedicine. JASIST, 59(5), 830-837.
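The PCA approach in the first paper above, finding the dominant dimensions along which many impact metrics vary together, can be sketched with NumPy. The tiny metric matrix is fabricated for illustration and stands in for the paper's items-by-metrics data, not its actual 39 measures:

```python
import numpy as np

def metric_pca(X):
    """PCA via SVD of the column-standardized matrix X (items x metrics).
    Returns the principal axes (rows of Vt) and the fraction of total
    variance each component explains."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    explained = S**2 / np.sum(S**2)
    return Vt, explained

# Two nearly redundant hypothetical "metrics" for four items:
# one principal component should dominate
X = np.array([[1.0, 1.1], [2.0, 1.9], [3.0, 3.2], [4.0, 3.8]])
axes, explained = metric_pca(X)
print(explained)  # first component explains almost all the variance
```

When many metrics load onto the same few components, they are largely measuring the same underlying dimension of "impact"; the interesting metrics are the ones that separate.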

  34. Finally… WHY? • Just like social status, scholarly impact is an interesting area of scientific study. It emerges from the scholarly communication process. • BUT pure science is clearly not the only motivation: • Metrics used in assessment • Decision-making: funding, promotion, … • Information filtering • Some of these applications are tremendously useful and potentially enabling of radical changes in scholarly communication, e.g. information filtering and assessing the broader community impact of scholarly work.

  35. However… • Assuming that scholarly impact exists, independently of whether we measure it or not: • Why measure it at all in cases where the scholarly community truly has decision-making power, autonomy? Isn’t the latter a more desirable option than administrators, politicians, and bureaucrats making decisions on the basis of numbers they don’t understand? • So buy me a beer and ask me about our crazy crowd-sourced funding idea… • Johan Bollen, David Crandall, Damion Junk, Ying Ding, Katy Boerner. Collective allocation of science funding: from funding agencies to scientific agency. http://arxiv.org/abs/1304.1067

  36. Thank you
