
Monitoring uptake, impact and value for money in a consortium: usage data and beyond


Presentation Transcript


  1. Monitoring uptake, impact and value for money in a consortium: usage data and beyond John Cox Deputy Librarian National University of Ireland, Galway

  2. IReL in Brief • Established in 2004 • Government-funded • Focused on research community • Accessible at 7 Irish universities • Covers most disciplines • c. 90 “products” • 25000+ e-journals • 40000+ e-books

  3. IReL Content: STM

  4. IReL Content: AHSS

  5. IReL Management

  6. Monitoring Group: remit • Collate and monitor performance statistics in relation to the value for money of IReL titles. • Collate and monitor downtime of IReL titles. • Suggest retention or cancellation of IReL titles based on information gathered. • Provide summaries of changes of content in IReL major services i.e. deletions of titles or addition of new titles. • Note deficiencies of IReL information supply with regard to specific areas of research. • Suggest ways of continuing to promote the IReL service.

  7. Monitoring Group: members • Rosarii Buttimer, University College Cork • John Cox (chair), National University of Ireland, Galway • Aoife Geraghty, University of Limerick • Arlene Healy, Trinity College Dublin • Jack Hyland, Dublin City University • Fiona McGoldrick, IRIS • Niall McSweeney, National University of Ireland, Galway • Claire Moran, University College Dublin • Val Payne, National University of Ireland, Maynooth

  8. Activities • Downtime register • Usage statistics • User survey • Ongoing interaction with: • Steering Group • Users • Vendors

  9. Downtime (known)

  10. Usage Statistics • Excel templates for e-journals, databases • Basic quantitative indicators of uptake and value • Number of downloads/searches • Cost per download/search • Top 10 journals per resource according to downloads • Number and % of journals per download “band” • Turnaways • Annual frequency, with trend reports
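
A minimal sketch of how the quantitative indicators listed above could be computed from a per-journal usage export. The column names, download bands, file name and figures are illustrative assumptions, not the actual IReL Excel templates.

```python
# Minimal sketch of the indicators above, assuming a simple per-journal CSV
# with "journal" and "downloads" columns plus a known annual subscription cost.
# Column names, band boundaries and figures are illustrative assumptions only.
import csv
from collections import Counter

BANDS = [(0, 0), (1, 49), (50, 499), (500, 4999), (5000, float("inf"))]

def summarise(rows, annual_cost):
    downloads = {r["journal"]: int(r["downloads"]) for r in rows}
    total = sum(downloads.values())
    cost_per_download = annual_cost / total if total else None

    # Top 10 journals per resource according to downloads
    top_10 = Counter(downloads).most_common(10)

    # Number and % of journals per download "band"
    bands = {}
    for low, high in BANDS:
        n = sum(1 for d in downloads.values() if low <= d <= high)
        label = f"{low}+" if high == float("inf") else f"{low}-{high}"
        bands[label] = (n, 100.0 * n / len(downloads) if downloads else 0.0)

    return {"total_downloads": total,
            "cost_per_download": cost_per_download,
            "top_10": top_10,
            "bands": bands}

if __name__ == "__main__":
    with open("ejournal_usage.csv", newline="") as f:   # hypothetical file name
        report = summarise(list(csv.DictReader(f)), annual_cost=250_000.0)
    print(report["total_downloads"], report["cost_per_download"])
```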

  11. E-Journal Example

  12. Database Example

  13. Annual Report of IReL Usage • Mix of figures and commentary • Summary table of download volumes and costs • Most downloaded journals overall • Analysis by download band • Usage by type of resource • Trends by discipline • Comparison with earlier year(s)
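
The "comparison with earlier year(s)" element can be illustrated with a small year-on-year calculation; the resource names and figures below are invented for the example.

```python
# Illustrative sketch of the "comparison with earlier year(s)" element:
# year-on-year change in downloads and cost per download per resource.
def year_on_year(current, previous):
    """current/previous: {resource: (downloads, cost)} for one year each."""
    rows = []
    for resource, (downloads, cost) in sorted(current.items()):
        prev_downloads, _prev_cost = previous.get(resource, (0, 0.0))
        change = (downloads - prev_downloads) / prev_downloads * 100 if prev_downloads else None
        cost_per_download = cost / downloads if downloads else None
        rows.append((resource, downloads, cost_per_download, change))
    return rows

usage_2006 = {"Nursing collection": (80_000, 120_000.0)}   # hypothetical figures
usage_2007 = {"Nursing collection": (95_000, 125_000.0)}   # hypothetical figures

for resource, downloads, cpd, change in year_on_year(usage_2007, usage_2006):
    print(f"{resource}: {downloads} downloads, €{cpd:.2f} per download, {change:+.1f}% on prior year")
```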

  14. Value for Money

  15. Top STM Journals

  16. Consolidation of Usage

  17. Low-Usage Titles

  18. Trends of interest (STM only) • Strong uptake in nursing, chemistry • Journal of Advanced Nursing has most downloads • Cost per download compares very favourably to ILL • Usage tends to increase over time • Significant % of journals with <50 downloads • Lower usage, higher costs for non-journal resources

  19. Compilation Difficulties • Labour-intensive • Mix of COUNTER/non-COUNTER data • Costs – need to factor in: • IReL/local payments initially • “maintained spend” • VAT • Total consortium figures • Some vendors slow to respond • Timing, e.g. synchronisation with subscription decisions • Unanswered questions, e.g. impact, quality, satisfaction?
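
The cost elements listed above (central IReL payment, locally "maintained spend" and VAT) combine into a single consortium-wide figure before a cost per download is calculated. A worked sketch, with an assumed VAT rate and invented amounts:

```python
# Worked sketch of the cost elements above: the central IReL payment, any
# "maintained spend" retained by member libraries, and VAT, combined into a
# single consortium-wide figure before dividing by total downloads.
# The VAT rate and all amounts are illustrative assumptions only.
def total_consortium_cost(irel_payment, maintained_spend, vat_rate=0.21):
    net = irel_payment + sum(maintained_spend.values())
    return net * (1 + vat_rate)

maintained = {"Library A": 12_000.0, "Library B": 8_500.0}   # hypothetical local payments
cost = total_consortium_cost(irel_payment=200_000.0, maintained_spend=maintained)
downloads = 150_000                                          # hypothetical consortium total
print(f"Total cost €{cost:,.2f}; cost per download €{cost / downloads:.4f}")
```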

  20. 2007 IReL Impact Survey • Given priority over 2006 usage stats compilation • Essential complement to statistical data • Pre-consultation with researchers and funders • Focused distinctively on: • Value to researchers • Purpose of use • Impact on work • Satisfaction with coverage • (Recognised) use of IReL resources • Role of print • Access

  21. Who Participated? • 2266 researchers in all disciplines • Staff • PhDs • Research Masters • 7 institutions

  22. Survey Questionnaire

  23. Findings of Note • IReL includes 75% of researchers’ “top 5” journals • But… gaps include journal backfiles, newspaper archives • Significant access (e.g. off-campus) and discovery issues • Lack of association with IReL • 55% don’t need print copies of IReL journals http://www.library.nuigalway.ie/resources/irelsurvey07.pdf

  24. Use of IReL

  25. How IReL Benefits Research • Speed • Ease of online access • Coverage, including multidisciplinary • Currency • Stronger competitiveness • Easier collaboration • New areas of research now possible

  26. How IReL Benefits Teaching • Faster transfer of ideas to lecture hall • Integration of online journals in Blackboard • Easier access to course readings • Wider choice of sources • Updated teaching materials

  27. IReL is a Luxury, not a Necessity • Discontinuation = “disaster”, “Dark Ages”, “would leave”

  28. Blackwell Synergy: AHSS users

  29. Survey experience • Labour-intensive • Seemed to engage senior stakeholders more than stats • Good on impact, quality of experience • Influential in likely continuation of IReL funding • Helpful in identifying specific gaps in coverage • But important to correlate findings with stats

  30. Different Messages?

  31. Measuring Uptake and Impact • Usage data and user survey complementary • Stats • valuable indicators of activity • identifiers of uptake and value for money • guidance on subscription decisions • what and when? • Survey: • actual user experience • satisfaction levels • impact, return on investment • who, how and why?

  32. IReL Usage Data: development • Zero-use titles? • Correlation with impact factors? • Outputs, e.g. researcher publications • Cost per student/staff FTE? • Multi-platform access/federated searching • E-book data
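
Two of these possible developments, cost per student/staff FTE and correlation of downloads with impact factors, reduce to simple calculations once the inputs are gathered. A hedged sketch with invented numbers:

```python
# Sketches of two of the possible developments above: cost per staff/student
# FTE and a simple correlation between downloads and journal impact factors.
# All numbers are invented; a real analysis would use the consortium's data.
from statistics import correlation   # Pearson's r, Python 3.10+

def cost_per_fte(total_cost, fte_count):
    return total_cost / fte_count

downloads      = [12_400, 9_800, 350, 72]   # hypothetical per-journal downloads
impact_factors = [4.2, 3.1, 1.0, 0.6]       # hypothetical impact factors

print(f"Cost per FTE: €{cost_per_fte(5_000_000.0, 60_000):.2f}")   # hypothetical spend and FTE count
print(f"Downloads vs impact factor r = {correlation(downloads, impact_factors):.2f}")
```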

  33. E-Books: What is a Section? “Some publishers make online books available only as a single file that can be downloaded in its entirety, with no further vendor monitoring of usage being possible. Other publishers allow the downloading of individual chapters or entries, such as dictionary definitions or chemical structures.” Shepherd, P. T. (2006). The COUNTER Code of Practice for books and reference works. Serials, 19(1), 23-27.

  34. E-Books: What is a Section? “For Oxford Reference Online ‘full-content units requested’ correspond to the number of discrete entries viewed in its constituent reference works, but five print pages viewed online is the measure for Oxford Scholarship Online. Safari sections typically correspond to three pages of the printed book, while there is no differentiation by section in NetLibrary which simply counts “accesses” to a title.” Cox, J. (in press). Making sense of e-book usage data. The Acquisitions Librarian.

  35. E-Books: Access Restrictions “DRM is more complex for e-books than e-journals due to publishers’ concerns about possible loss of print revenue for academic textbooks especially. … Restrictions such as allowing only the printing of one page at a time (e.g. NetLibrary), not supporting downloading (e.g. ACLS Humanities E-Books) or disallowing copying and pasting (e.g. informaworld) inevitably impact on user behavior. Usage reporting … only reflects the extent of actual permitted use rather than full potential activity. DRM significantly affects opportunities for comparing usage across e-book platforms.” Cox, J. (in press). Making sense of e-book usage data. The Acquisitions Librarian.

  36. E-Books: Who to Contact? “The one thing that really struck me from conversations was that publishers had no single e-books department …. Several departments had to be consulted, such as textbooks, sales, those in charge of platforms, finance, rights and so on.” Milloy, C. (2007). E-books: setting up the national observatory project. Library and Information Update, 6(11), 32-33.

  37. E-Resources: Who to Contact? “In addition, vendors often provide little or no information about how the data were collected, and it can be difficult and time consuming to find a vendor customer service or technical representative who knows exactly how to interpret the data.” Duy, J., & Vaughan, L. (2003). Usage data for electronic resources: a comparison between locally collected and vendor-provided statistics. Journal of Academic Librarianship, 29(1), 16-22.

  38. Usage Statistics: Future Hope • COUNTER • Journals/Databases code of practice: 80+ vendors, 10000+ titles • Books/Reference Works code of practice, 2006: 10 vendors • http://www.projectcounter.org/compliantvendors.html • SUSHI • Standardized Usage Statistics Harvesting Initiative • Automatic downloading of usage stats • http://www.niso.org/committees/SUSHI/SUSHI_comm.html Hendricks, A. (2007). SUSHI, not just a tasty lunch anymore: the development of the NISO Committee SU's SUSHI standard. Library Hi Tech, 25(3), 422-429.
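
Once a COUNTER report has been obtained, whether downloaded manually or harvested automatically via SUSHI, it still has to be read into the consortium's own templates. A minimal parsing sketch, assuming a simplified JR1-like CSV layout rather than the official COUNTER format:

```python
# Minimal sketch of reading a COUNTER-style journal report once retrieved
# (manually or via a SUSHI harvest). The column layout is a simplified
# stand-in for a JR1 report, not the official COUNTER format: a header row,
# then one row per journal with a full-text request total.
import csv

def read_counter_like_report(path):
    """Return {journal_title: total_fulltext_requests} from a simplified CSV."""
    totals = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            totals[row["Journal"]] = int(row["Reporting Period Total"])
    return totals

if __name__ == "__main__":
    totals = read_counter_like_report("jr1_2007.csv")   # hypothetical file name
    print(sum(totals.values()), "full-text requests across", len(totals), "journals")
```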
