
Analyzing Usage Statistics of Electronic Resources


Presentation Transcript


  1. Analyzing Usage Statistics of Electronic Resources
  Jagdish Arora, Director, INFLIBNET Centre

  2. "Not everything that counts can be measured. Not everything that can be measured counts." (Einstein)

  3. Why do we measure Usage? Usage statistics provide essential evidence:
  • of the extent of usage of e-resources
  • to show trends in usage over a period of time; patterns of usage can be a helpful guide for future collection-development decisions
  • to take informed decisions regarding renewal / cancellation of resources
  • to demonstrate value for money / return on investment

  4. Why Collect Usage Statistics?
  • To make the best and most justifiable use of financial resources
  • Calculating Return on Investment (RoI)
  • Accountability
  • To identify emerging subject disciplines
  • Reporting usage to administration, funding agencies, etc.
  • Strategic planning
  • Comparison with other libraries / institutions

  5. Why Collect Usage Statistics?
  • Justification for a change in document format
    • From print to electronic
    • Fewer users visiting the library physically
    • Fewer books being issued / less re-shelving
    • Increase in usage of e-resources
  • Benchmarking
    • Top-cited journals available in e-format as compared to other libraries
    • Usage of e-resources at existing libraries can serve as a benchmark for other libraries

  6. Why Collect Usage Statistics?: Additional Information
  • Journals that are used heavily
  • Journals not being used at all
  • Number of denials where simultaneous usage is limited
  • Preference for formats: PDF, HTML
  • Breach of license agreement: heavy or systematic downloads; how to handle it?

  7. Acquiring Usage Statistics
  • Content providers (publishers / database vendors)
    • Some publishers do not provide usage data (e.g. MathSciNet, ISID)
    • Data inadequate and inconsistent
    • Data retained on the publisher's web site only
    • Inconsistencies in usage counting not reflected
    • Usage served from caches (server caching) not reflected

  8. What do libraries want from usage data?
  • Reliable usage reports in a consistent format
  • Usage at the journal-title level
  • Usage by subject area
  • Analysis of trends over time
  • Ready access for reporting
  • Evidence of value for money
  • Benchmarking (comparative usage)

  9. Adding More Value
  • Cost-benefit analysis and RoI
  • Impact of usage on research output
  • Benchmarking

  10. Why Evaluate at the Consortium Level?
  • Evaluation is necessary
    • Negotiation for renewal
    • Cost / benefit analysis
  • Evaluation is possible
    • Relativity
    • Comparability
    • Generalizability

  11. Why Evaluate at the Consortium Level?
  • Review of current & prospective contracts
  • Continuing price escalation is not sustainable
  • Evaluate prices to the consortium and its members
  • Review contracts with additional criteria
  • Promote models for quality, not just quantity
  • Plan for the future

  12. Problems with Manual Collection of Usage Statistics
  • Usage statistics have to be gathered manually from different publishers
  • Each publisher has
    • different formats for data and delivery
    • different access methods
    • different availability dates
  • Cost has to be calculated separately
  • Data needs to be cleaned up and aggregated manually (a sketch of this aggregation follows below)
  • It is a labor-intensive and cumbersome process prone to data loss and errors
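
The clean-up and aggregation step above is what SUSHI later automates. A minimal sketch of doing it manually in Python, assuming each publisher exports a CSV with its own column names; the file names and column mappings below are hypothetical:

```python
# Minimal sketch of manually aggregating heterogeneous publisher usage
# exports. File names and column mappings are hypothetical.
import pandas as pd

# Each publisher exports usage in its own layout; map them to one schema.
SOURCES = {
    "publisher_a.csv": {"Journal Title": "title", "FT Downloads": "downloads"},
    "publisher_b.csv": {"Title": "title", "Full-Text Requests": "downloads"},
}

frames = []
for path, columns in SOURCES.items():
    df = pd.read_csv(path)
    df = df.rename(columns=columns)[["title", "downloads"]]
    df["source"] = path
    frames.append(df)

usage = pd.concat(frames, ignore_index=True)
# Aggregate total downloads per journal title across publishers.
print(usage.groupby("title")["downloads"].sum().sort_values(ascending=False))
```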

  13. Harvesting Usage Statistics using SUSHI
  • Automated import of consortium statistics (a request sketch follows below)
  • The consortium can track statistics for each member
  • Data can be retrieved across a series of dates, e.g. a period of months
  • Member logins are pre-populated
  • The library can access all COUNTER-compliant usage statistics across its serials holdings
  • The library can obtain a top-level view of its most- and least-viewed publishers and titles
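
A minimal harvesting sketch. SUSHI as presented here is the SOAP-based ANSI/NISO Z39.93 protocol; the sketch below uses its simpler successor, the COUNTER_SUSHI REST API of COUNTER Release 5, to show the same idea. The base URL, identifiers, and dates are hypothetical.

```python
# Minimal sketch of harvesting a COUNTER report over the COUNTER_SUSHI
# REST API (COUNTER Release 5). The base URL and credentials are
# hypothetical; real endpoints and IDs come from each publisher.
import requests

BASE_URL = "https://sushi.example-publisher.com/counter/r5"
params = {
    "customer_id": "MEMBER-001",   # one consortium member
    "requestor_id": "INFLIBNET",
    "begin_date": "2011-01-01",
    "end_date": "2011-12-31",
}

# TR_J1: journal full-text requests, the usual cost-per-use input.
resp = requests.get(f"{BASE_URL}/reports/tr_j1", params=params, timeout=60)
resp.raise_for_status()
report = resp.json()

for item in report.get("Report_Items", []):
    print(item.get("Title"), item.get("Performance", [])[:1])
```

Looping the same request over every member's `customer_id` is what lets the consortium track statistics for each member automatically.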

  14. Negotiate More Effectively
  • With COUNTER-compliant cost-per-view figures in hand, negotiate with publishers to arrive at more realistic cost models (a sample calculation follows below)
  • Uncover previously hidden cost information
  • Utilize consortium-wide data to negotiate optimal terms for the group as a whole
  • Obtain a better understanding of consortium members' usage patterns and collection needs
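
A minimal sketch of the cost-per-view arithmetic behind such negotiations; all figures are illustrative, not actual consortium data:

```python
# Minimal sketch of a cost-per-view calculation used in negotiations.
# Figures are illustrative, not actual consortium data.
subscription_cost = 1_250_000.0   # annual license fee (illustrative)
full_text_downloads = 480_000     # COUNTER full-text requests for the year

cost_per_view = subscription_cost / full_text_downloads
print(f"Cost per full-text download: {cost_per_view:.2f}")
# Compare against a typical pay-per-view price to argue for better terms.
```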

  15. INFLIBNET Usage Portal

  16. Benefits of the Portal for Usage
  • Usage statistics for every e-journal package for every member institution are collected automatically
  • Consortium-wide data is readily available to the whole group for analysis and reporting
  • The usage data can be exposed completely or partially to member institutions / consortium administrators

  17. Consortium Usage Analysis

  18. Cost Incurred vs Cost Recovered in 2011
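
The chart itself is not reproduced in the transcript. As a hedged illustration of the underlying arithmetic, "cost recovered" is commonly estimated by valuing each COUNTER download at a notional pay-per-view price and comparing the result with the license fee paid; all figures below are hypothetical:

```python
# Illustrative cost-incurred vs cost-recovered comparison, assuming
# "cost recovered" values each download at a notional pay-per-view
# price. All figures are hypothetical, not the 2011 consortium data.
PAY_PER_VIEW_PRICE = 30.0           # notional per-article price

packages = {                        # license fee, full-text downloads
    "Publisher A": (250_000.0, 40_000),
    "Publisher B": (120_000.0, 2_500),
}

for name, (fee, downloads) in packages.items():
    recovered = downloads * PAY_PER_VIEW_PRICE
    print(f"{name}: incurred {fee:,.0f}, recovered {recovered:,.0f}, "
          f"ratio {recovered / fee:.1f}x")
```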

  19. Usage Trend Analysis for a Single Publisher (ACS)

  20. Top Ten Journals of ACS in 2011

  21. Measuring Research Output and Impact of E-Resources

  22. Measuring Research Output
  • The Science Citation Index (SCI), Social Science Citation Index (SSCI) and Arts & Humanities Citation Index (A&HCI) are internationally recognized databases that work as a filtering mechanism, indexing quality research output from selected journals.
  • Source articles appearing in the three indices for the 50 first-phase universities of the Consortium were searched in blocks of five years from 1975 to 2010, with the aim of comparing research output in the last block, i.e. 2005–2009.
  • An unprecedented increase in research productivity, in terms of number of research articles, is evident during 2005–2009 as compared to the previous five-year blocks, i.e. 1975–1979 to 2000–2004.

  23. Increase in No. of Articles in the Past 35 Years (in Blocks of Five Years)

  24. Correlation: Usage vs. Publishing Output (rank-order correlation coefficient = 0.75; a computation sketch follows below)
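
A minimal sketch of this correlation, with illustrative numbers standing in for the per-institution download and article counts. Pearson's r computed on ranks is Spearman's rho, which scipy provides directly:

```python
# Minimal sketch of the usage-vs-output correlation. Numbers are
# illustrative; the real analysis used per-institution download counts
# and article counts from the three citation indices.
from scipy.stats import spearmanr

downloads = [120_000, 95_000, 64_000, 40_000, 22_000, 18_000]  # per institution
articles  = [1_450,   1_100,  900,    350,    300,    150]

rho, p_value = spearmanr(downloads, articles)
print(f"rank-order correlation = {rho:.2f} (p = {p_value:.3f})")
```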

  25. Usage Analysis for a Single Institution

  26. Does this Institution Need the Complete Collection or Selected Subject Collections?

  27. No. of Titles Fulfilling the User Needs of the Library

  28. Correlation: Usage vs. Publishing Output, Banaras Hindu University (rank-order correlation coefficient = 0.98)

  29. Banaras Hindu University: Annual Average Growth Rate
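
The chart is not reproduced here. A minimal sketch of the annual average growth rate (AAGR) arithmetic, taken as the mean of year-on-year percentage growth; the publication counts are illustrative, not BHU's actual figures:

```python
# Minimal sketch of an annual average growth rate (AAGR) calculation:
# the mean of the year-on-year percentage growth in publications.
# The publication counts below are illustrative.
counts = [610, 640, 690, 735, 800]  # publications in consecutive years

growth_rates = [
    (curr - prev) / prev * 100 for prev, curr in zip(counts, counts[1:])
]
aagr = sum(growth_rates) / len(growth_rates)
print(f"AAGR = {aagr:.2f}% per year")
```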

  30. Relative Specialization Index: Banaras Hindu University
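
The chart is not reproduced here. A minimal sketch of the Relative Specialization Index as it is commonly defined in bibliometrics, RSI = (AI - 1) / (AI + 1), where the Activity Index AI compares the institution's share of output in a field with the world's share; all counts below are illustrative:

```python
# Minimal sketch of the Relative Specialization Index as commonly
# defined in bibliometrics: RSI = (AI - 1) / (AI + 1), where the
# Activity Index AI = (inst_field / inst_total) / (world_field / world_total).
# Counts below are illustrative, not BHU's actual data.
def rsi(inst_field, inst_total, world_field, world_total):
    ai = (inst_field / inst_total) / (world_field / world_total)
    return (ai - 1) / (ai + 1)  # ranges -1..1; > 0 means specialization

# e.g. 220 of 1,000 papers in chemistry vs 120,000 of 1,200,000 worldwide
print(f"RSI = {rsi(220, 1_000, 120_000, 1_200_000):.2f}")
```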

  31. Contribution of BHU as Compared to the World's and India's Total Publications

  32. Publication Output of BHU

  33. BHU’s Citation Impact in Nine Subject Areas

  34. Contribution of BHU to the World’s Most Productive Areas of Research
