
Evaluating the Value of Next Generation Databases
Arnaud Delivet, Regional Sales Manager






Presentation Transcript


1. Evaluating the Value of Next Generation Databases
Arnaud Delivet, Regional Sales Manager

2. Usage statistics and determining value
What we will cover:
• Why it’s important to consider this question
• What we collect and analyze now, and why
• Other measurements the industry could consider
• Your thoughts on future directions in usage reporting

3. Why it’s important to reconsider usage
Understanding usage is critical:
• It assists in evaluating the investment in electronic resources
• It assists in evaluating the satisfaction of users
• It assists in assessing the value of the library
Usage of databases is changing:
• The user is evolving
• Search engines are improving
• The volume of content is increasing exponentially
• Other online resources compete heavily with research databases
Evaluation of databases should evolve to meet this changing landscape.

4. What information are we tracking today?
Driven largely by COUNTER requirements:
• Number of sessions
• Number of searches
• Number of downloaded/viewed citations and abstracts
• Number of downloaded/viewed full-text documents or equivalent
• Number of federated searches
Data on which databases and which journals:
• Reporting of usage by database
• Reporting of usage by publication
Other useful measurements:
• Searches by search mode (basic, advanced, command line, etc.)
• Searches by time of day
• Document usage by format (abstract, full-text HTML, preview, etc.)
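As a rough illustration of what these counts involve, here is a minimal sketch of aggregating them from a raw usage log. The event names, tuple layout, and database names are invented for illustration; this is not an implementation of the COUNTER standard itself.

```python
from collections import Counter

# Hypothetical raw usage events as (session_id, event_type, database, format).
# All names here are assumptions for illustration only.
events = [
    ("s1", "search", "DatabaseA", None),
    ("s1", "view",   "DatabaseA", "abstract"),
    ("s1", "view",   "DatabaseA", "fulltext_html"),
    ("s2", "search", "DatabaseB", None),
    ("s2", "view",   "DatabaseB", "pdf"),
]

sessions  = len({sid for sid, _, _, _ in events})
searches  = sum(1 for _, etype, _, _ in events if etype == "search")
fulltext  = sum(1 for _, _, _, fmt in events if fmt in ("fulltext_html", "pdf"))
by_db     = Counter(db for _, etype, db, _ in events if etype == "search")
by_format = Counter(fmt for _, _, _, fmt in events if fmt)

print(f"Sessions: {sessions}, searches: {searches}, full-text views: {fulltext}")
print("Searches by database:", dict(by_db))  # usage broken down by database
print("Views by format:", dict(by_format))   # usage broken down by format
```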

5. What is the value of COUNTER?
• COUNTER provides a standardized way of looking at the data
• It is useful for tracking basic information in aggregate and monitoring usage thresholds
• It gives you an apples-to-apples comparison

6. Need to consider user behavior
• Users enter fewer search terms, expecting to filter later
• Users are interacting differently with content:
  • tagging and commenting on content
  • sharing content
  • creating their own content (e.g. clips of video)
• This means users are searching less:
  • potentially fewer searches and sessions logged
  • fewer full-text document accesses as well

7. The new questions
• Does what we are measuring today provide enough insight into the value of databases?
• Did the user have a positive or negative interaction with the database?
• Would the user come back to this resource again?
• Would the user recommend this database to a friend?
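That last question is essentially the Net Promoter question. The slide does not prescribe NPS, but as one plausible way to quantify it, here is a minimal sketch of the standard calculation with invented survey responses:

```python
# Hypothetical 0-10 responses to "Would you recommend this database?"
responses = [9, 10, 7, 8, 3, 10, 6, 9, 2, 8]

promoters  = sum(1 for r in responses if r >= 9)  # 9-10: enthusiastic users
detractors = sum(1 for r in responses if r <= 6)  # 0-6: unhappy users

# Net Promoter Score = % promoters minus % detractors
nps = 100 * (promoters - detractors) / len(responses)
print(f"NPS: {nps:+.0f}")  # 4 promoters - 3 detractors over 10 responses -> +10
```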

8. Other considerations on database evolution
• Increased use of PDFs
• New types of content: rich media (images, maps, audio, video, other interactive content)
• More browse capabilities
• More post-search manipulation and filtering
• Ability to tag content and make comments
• Ability to store citations in a personal workspace or folder

9. So what else do we really want to measure?
User satisfaction and success:
• Are users able to accomplish tasks efficiently?
• Is the research experience engaging? (Essential to holding the attention of users.)
• Did they find what they need? This speaks to relevance.
• Do they come back to this database time after time for their research? This speaks to usability.
• Does the user consider this a “primary” or “go-to” resource?
• Do perceptions change for different users?

10. So what else do we really want to measure? (continued)
• Engagement vs. efficiency: the “quality time” users spend on the site versus how efficiently they get their information
• Conversions: what percentage of users performed a desired task, and what was that task? This could refer to the number of alerts, RSS feeds, saved searches, or the type of retrieval (a minimal calculation is sketched below)
• Attribution: are users responsible for getting other users to the database content? Are they using social media to do this? Who should get credit for the referral: the referring user, the search engine used to reach the database, or the social media platform that enabled the referral?
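Of these, the conversion measure reduces to a simple ratio. A minimal sketch, assuming per-user activity records; the field names are hypothetical:

```python
# Hypothetical per-user activity records; the field names are invented.
users = [
    {"id": "u1", "alerts": 2, "saved_searches": 0},
    {"id": "u2", "alerts": 0, "saved_searches": 0},
    {"id": "u3", "alerts": 0, "saved_searches": 3},
    {"id": "u4", "alerts": 0, "saved_searches": 0},
]

# A "conversion" here means the user performed at least one desired task
converted = sum(1 for u in users if u["alerts"] > 0 or u["saved_searches"] > 0)
rate = 100 * converted / len(users)
print(f"Conversion rate: {rate:.0f}%")  # 2 of 4 users -> 50%
```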

11. Are there new value indicators to consider?
• Time spent browsing and interacting with the site (‘stickiness’): how long are we holding the user’s attention? (A minimal duration calculation is sketched below.)
• Number of times an article is tagged or a list is created: how are users interacting with content?
• Number of alerts, RSS feeds, saved searches, etc.: an indicator of the value derived from the database and the content included
• Usage of references: going deeper into the content demonstrates value
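‘Stickiness’ is essentially time on site per session. A minimal sketch of computing it from timestamped events; the flat log format and timestamps are assumed for illustration:

```python
from collections import defaultdict

# Hypothetical (session_id, unix_timestamp) pairs from a page-event log
events = [("s1", 100), ("s1", 460), ("s1", 900),
          ("s2", 200), ("s2", 230)]

by_session = defaultdict(list)
for sid, ts in events:
    by_session[sid].append(ts)

# Session duration = last event minus first event. This is a common
# approximation; it undercounts time spent on the final page of a session.
durations = [max(ts) - min(ts) for ts in by_session.values()]
avg = sum(durations) / len(durations)
print(f"Average session duration: {avg:.0f} seconds")  # (800 + 30) / 2 = 415
```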

12. Are there new value indicators to consider? (continued)
• Amount of traffic into the site via links: are users pointing other users to database content? (A referrer breakdown is sketched below.)
• Amount of traffic out of the site via links (A&I): the user finds a document and wants the full text
• Number of times users employ navigation filters, etc.: can indicate the success of a search query and the value of post-search tools
• Number of ‘My Research’ accounts: how frequently they’re used and for what, and who is using them (undergraduate students, faculty, etc.)
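Inbound link traffic can be broken down by referrer. A minimal sketch, assuming the platform logs an HTTP Referer value per session; the domain lists are illustrative, not exhaustive:

```python
from urllib.parse import urlparse

# Hypothetical Referer values captured at session start
referrers = [
    "https://www.google.com/search?q=research+databases",
    "https://twitter.com/somelibrarian/status/123",
    "https://library.example.edu/databases",
    "",  # no referrer: direct traffic
]

SOCIAL  = {"twitter.com", "facebook.com"}
ENGINES = {"www.google.com", "www.bing.com"}

def classify(referrer: str) -> str:
    host = urlparse(referrer).netloc
    if not host:
        return "direct"
    if host in SOCIAL:
        return "social media"
    if host in ENGINES:
        return "search engine"
    return "other site"

for ref in referrers:
    print(classify(ref))
# search engine / social media / other site / direct
```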

13. Are there user indicators to track?
• Who are my users? What is their role?
• Where are users coming from?
  • iPhone, iPad, or other mobile devices?
  • Dorm room, library, or off campus?
  • Federated search and other discovery tools?
• Where do users go after they get their information? What is their next step?
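Device indicators like these can be inferred, imperfectly, from request metadata. A minimal sketch of bucketing sessions by device from the User-Agent string; the substring checks are deliberately crude and the strings are invented:

```python
from collections import Counter

# Hypothetical User-Agent strings from an access log (truncated)
agents = [
    "Mozilla/5.0 (iPhone; CPU iPhone OS 4_0 like Mac OS X) ...",
    "Mozilla/5.0 (iPad; CPU OS 4_3 like Mac OS X) ...",
    "Mozilla/5.0 (Windows NT 6.1; rv:2.0) ...",
]

def device(ua: str) -> str:
    # Deliberately crude substring checks; a real pipeline would use a
    # proper user-agent parsing library.
    if "iPhone" in ua:
        return "iPhone"
    if "iPad" in ua:
        return "iPad"
    return "desktop/other"

print(Counter(device(ua) for ua in agents))
# Counter({'iPhone': 1, 'iPad': 1, 'desktop/other': 1})
```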

14. For your consideration
• What other value indicators should we be considering?
• How do we measure and analyze them?
• Where does better/different usage reporting stand in the grand prioritization of database features?
• How do we move from measuring to providing intelligence that drives positive action?

15. Thank You
arnaud.delivet@proquest.co.uk
