
Usage Data for Electronic Resources WRAPS/FRIP Presentation April 24, 2007 Gayle Baker, Maribeth Manoff, Eleanor Read




Presentation Transcript


  1. Usage Data for Electronic Resources WRAPS/FRIP Presentation April 24, 2007 Gayle Baker, Maribeth Manoff, Eleanor Read

  2. MaxData
  http://web.utk.edu/~tenopir/imls/index.htm
  “Maximizing Library Investments in Digital Collections Through Better Data Gathering and Analysis”
  Funded by the Institute of Museum and Library Services (IMLS), 2005-2007

  3. MaxData Project Purpose
  • Evaluate and compare methods of usage data collection and analysis
  • Develop cost/benefit model to help librarians select appropriate method(s) for electronic resource usage assessments

  4. MaxData Project Teams
  • UT Libraries: COUNTER data from vendors, link resolver, database usage logs, federated search engine
  • David Nicholas et al. (Ciber): deep log analysis on OhioLINK journal usage data
  • Carol Tenopir and Donald King: readership surveys at UT and four Ohio universities

  5. FRIP Equipment Award (Fall 2005)
  • Requested
    • PC with extra capacity for handling data
    • HP LaserJet printer
    • Microsoft Office 2003 Professional
    • Archival DVDs
  • $2,477
  • Consulted with David Ratledge
  • Housed in faculty study in Hodges

  6. Project File Sharing
  • Account (Usestat) on library server for project files for UT Libraries team
  • BlackBoard group site for MaxData team

  7. Presentations
  • Charleston 2005 (GB, ER/project intro)
  • ER&L 2006 (GB/vendor data issues)
  • Lib Assessment 2006 (ER, MM/combining data)
  • Charleston 2006 (GB/vendor data results)
  • ER&L 2007 (GB/vendor data survey)
  • ELUNA 2007 (MM/SFX data)
  • ALA/ACRL/EBSS 2007 (MM/data presentation)
  • Charleston 2007 (all 3/comparing data types)

  8. Publications
  • “MaxData: A Project to Help Librarians Maximize E-Journal Usage Data.” In Usage Statistics of E-Serials (summer 2007)
  • “All That Data: Finding Useful and Practical Ways to Combine Electronic Resource Usage Data from Multiple Sources.” Library Assessment Conference Proceedings (May 2007)
  • Article on vendor data survey results in Learned Publishing (due June 1, 2007)

  9. The Usage Data Challenge
  • Vendor-supplied data
  • Other data

  10. Vendor Reports: Background
  • Vendor-supplied data primary source of e-journal usage information
  • Project COUNTER helpful, but…
  • Manipulation may be required to compare use among vendors

  11. Vendor Reports: Consolidating
  • COUNTER Journal Report 1 (JR-1)
  • Data from each vendor combined in Excel spreadsheet
  • Facilitates additional analyses
    • Sorting by selected fields
    • Subject analysis
    • Cost per use calculations
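The cost-per-use calculation above could be scripted instead of done cell-by-cell in Excel. A minimal sketch, assuming consolidated JR1-style rows as dicts with hypothetical `title` and `downloads` fields and a separate table of subscription costs (all names are illustrative, not the project's actual spreadsheet columns):

```python
# Cost-per-use from consolidated JR1-style rows.
# "title", "downloads", and the costs table are hypothetical
# stand-ins for the real spreadsheet's columns.

def cost_per_use(rows, costs):
    """Return {title: subscription cost / full-text downloads} for used titles."""
    out = {}
    for row in rows:
        title, downloads = row["title"], row["downloads"]
        if downloads > 0 and title in costs:
            out[title] = round(costs[title] / downloads, 2)
    return out

rows = [
    {"title": "Journal A", "downloads": 420},
    {"title": "Journal B", "downloads": 15},
]
costs = {"Journal A": 1260.00, "Journal B": 900.00}
print(cost_per_use(rows, costs))  # {'Journal A': 3.0, 'Journal B': 60.0}
```

Titles with zero downloads are skipped here; in practice a library might instead flag them, since unused subscriptions are exactly what a cost/benefit review wants to surface.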

  12. COUNTER: JR1 Format

  13. Vendor Reports: Challenges
  • Inconsistencies in data fields
    • Journal title (articles, upper/lower case, extra information)
    • ISSN (with and without hyphen)
  • Time-consuming to fix
  • ScholarlyStats, SUSHI, ERMS may help
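The title and ISSN inconsistencies above lend themselves to simple normalization before matching records across vendors. A sketch of one possible approach (the exact cleanup rules a library applies would vary):

```python
import re

def normalize_issn(issn):
    """Strip hyphens and other separators; uppercase a final 'x' check digit."""
    return re.sub(r"[^0-9Xx]", "", issn).upper()

def normalize_title(title):
    """Lowercase, collapse extra whitespace, and drop a leading article."""
    t = " ".join(title.split()).lower()
    t = re.sub(r"^(the|a|an)\s+", "", t)
    return t

# Variants from two vendors now compare equal:
print(normalize_issn("0028-0836") == normalize_issn("00280836"))        # True
print(normalize_title("The  Journal of Ecology")
      == normalize_title("journal of ecology"))                         # True
```

Even this much automation removes a large share of the hand-editing the slide calls time-consuming, though oddities like trailing "(Online)" qualifiers still need their own rules.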

  14. Survey: Purpose
  • How much effort is involved in working with vendor-supplied use data?
  • How are the data used?
  • What data are most useful in managing electronic resources?

  15. Survey: Subjects
  • Sent to Library Directors at Carnegie I and II research institutions (360+)
  • April 2006
  • 92 respondents

  16. Number of Vendors Providing Usage Reports

  17. Reports for Different Types of Resources

  18. Purpose for Reviewing and/or Analyzing Vendor Data

  19. Number of Hours Processing Usage Reports in 2005

  20. Percentage of Time Processing Vendor Data

  21. Biggest Challenges
  • Lack of consistency / standards (61)
  • Takes too much time (27)
  • COUNTER standards help but… (14)

  22. Most Useful Statistic(s)
  • Number of full-text downloads (67)
  • Number of searches (41)
  • Number of sessions (27)
  • COUNTER statistics (26)
  • Number of turnaways (17)
  • Other (17)

  23. Other (Local) Data
  • UT – database “hits” recorded from database menu pages
  • Federated search system (MetaLib) statistics
  • Some libraries using proxy server logs
  • Link resolver (SFX) data

  24. Link Resolver Data
  • SFX includes a statistical module with a number of “canned” reports
  • For journal-level data, one report in particular (“Requests and clickthroughs by journal and target”) is analogous to COUNTER JR1

  25. SFX “Request” and “Clickthrough” Data
  • UT student searching in an SFX “source” discovers an article of interest
  • Clicks on the FindText button
  • Article is available electronically in Journal A, Packages Y and Z – a “Request” statistic is recorded for each
  • Student chooses the link to Journal A in Package Y – a “Clickthrough” statistic is recorded
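The request/clickthrough bookkeeping described above can be sketched in a few lines. This is an illustration of the counting logic only, not SFX's actual implementation; the function and package names are hypothetical:

```python
from collections import Counter

requests = Counter()       # (journal, package) -> SFX "request" count
clickthroughs = Counter()  # (journal, package) -> SFX "clickthrough" count

def findtext_click(journal, available_packages, chosen_package):
    """One FindText menu: a request per available target, one clickthrough."""
    for pkg in available_packages:
        requests[(journal, pkg)] += 1
    clickthroughs[(journal, chosen_package)] += 1

# The slide's scenario: Journal A is available in Packages Y and Z,
# and the student follows the Package Y link.
findtext_click("Journal A", ["Package Y", "Package Z"], "Package Y")
print(requests[("Journal A", "Package Z")])       # 1 (request, no clickthrough)
print(clickthroughs[("Journal A", "Package Y")])  # 1
```

The asymmetry is the point: one user action can record several requests but at most one clickthrough, which is why the two statistics answer different questions.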

  26. SFX “Clickthroughs” vs. JR1 “Full-Text Article Requests”
  • Clickthrough is less specific, does not measure actual download
  • But, clickthrough is a “known quantity,” not dependent on package interface
  • SFX report as a useful supplement to JR1, comparing trends and patterns
  • SFX contains data not in JR1 reports, e.g., non-COUNTER packages, open access journals, backfiles

  27. Formatting the SFX Report
  • Report from SFX is not formatted like JR1, but does contain the data elements
  • Request to software vendor: include in statistical module
  • Incorporate into ERMS
  • Manual or programming approach, depending on time and expertise available
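As one illustration of the programming approach, the sketch below rolls SFX-style rows up into a JR1-like journal-by-month table. The input row layout (journal, target, month, clickthroughs) and all sample values are assumptions for illustration, not the actual SFX export format:

```python
from collections import defaultdict

# Hypothetical SFX export rows: (journal, target/package, month, clickthroughs).
sfx_rows = [
    ("Journal A", "Package Y", "2006-01", 12),
    ("Journal A", "Package Z", "2006-01", 3),
    ("Journal A", "Package Y", "2006-02", 8),
    ("Journal B", "Package Y", "2006-01", 5),
]

def to_jr1_layout(rows):
    """Sum clickthroughs across targets into a JR1-like journal x month table."""
    table = defaultdict(lambda: defaultdict(int))
    for journal, _target, month, clicks in rows:
        table[journal][month] += clicks
    return {journal: dict(months) for journal, months in table.items()}

print(to_jr1_layout(sfx_rows))
# {'Journal A': {'2006-01': 15, '2006-02': 8}, 'Journal B': {'2006-01': 5}}
```

Summing across targets mirrors JR1's one-row-per-journal shape; a library that wanted per-package detail would keep the target in the key instead.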

  28. Other Useful Link Resolver Reports and Data
  • Unmet user needs
    • Journals “requested” with no electronic full-text available
    • Interlibrary loan requests
  • Unused full-text report
  • Overlap reports
  • Subject categories

  29. Conclusions So Far
  • Collecting, consolidating, and analyzing vendor data is time-consuming and difficult
  • Survey of electronic resource librarians indicates many do not have enough time
  • Acquiring data from local systems provides consistency, but also requires time and effort
  • Libraries face difficult decisions about what methods are most practical and useful

  30. Into the Future
  • Present selected data sets to subject librarians to see what they find useful
  • Investigate usefulness of new COUNTER standards
  • Will SUSHI solve our problems? ERMS?
  • Compare our findings with those of the other MaxData teams
