
Usability of Next-gen interfaces & Discovery Tools



Presentation Transcript


  1. Usability of Next-gen interfaces & Discovery Tools Rice Majors Faculty Director of Libraries Information Technology University of Colorado, Boulder Innovative Users Group * April 2011 * San Francisco

  2. Agenda • Why I did this study • Study design & methodology • Quibbles • Inadvertent findings • Preliminary (deliberate) findings

  3. Why I did this study

  4. Confluence of reasons • Software companies lack access/resources for consistent & comparative studies • Many libraries lack access/resources to carry out studies (especially comparative) • Working on a campus, I have relatively easy access to study participants • Pursuing this research let us justify acquiring usability resources we can use for other purposes • Tenure requirements

  5. Study design and methodology

  6. Study design & methodology • Task-based testing with undergraduate students • One hour; hence four tasks • Many features went untested • User-defined success • Usability software (Morae) • Video capture of actions taken • Video/audio capture of “thinking aloud” • Survey instrument • Tested against five discovery tools / next-gens

  7. Partner libraries • James Madison University (EBSCO Discovery Service) • University of Colorado-Boulder (Encore Synergy) • Vanderbilt University (Primo Central) • Arizona State University (Summon) • Auraria Library, Denver (WorldCat Local)

  8. Typical participant schedule • 10 minutes of intake (IRB requirements) • 25-40 minutes of task completion • 10 minutes of outtake (survey instrument & an opportunity to ask further questions)

  9. Tasks 1 & 2: browsing/discovery, sharing, & peer-reviewed • You have a group project on water quality in [state]. Find three books that might be good for this project. Email information about these books to your group so they can find the books. • You have a group project on political campaign financing. Find three articles that might be good for this project, at least two of which must be peer-reviewed. Email your group so they can find/read the articles.

  10. Task 3: facets/filtering & incremental discovery • Find all recordings that the library owns by The Beatles. Somehow remind yourself to look at these again later.

  11. Task 4: Just get it for me already • The library doesn’t own the book “[title].” Have the library get this book for you.

  12. Survey Instrument 1 • Demographic data: year, school, major • Likert items for: • The tasks in this study were easy to understand • The tasks in this study were easy to complete • I was able to find what I needed for these tasks using this discovery platform • If I were doing my own research, I would be able to find what I needed using this discovery platform • The discovery platform I used today is easy to use • Prior to this test, I had used library catalogs quite a bit • Prior to this test, I had used the libraries at CU quite a bit
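Likert items like the ones above are conventionally scored on a five-point scale before aggregation. As a minimal, hypothetical sketch (the study's raw responses are not included here, so the response labels and sample data below are invented for illustration), per-item aggregation might look like:

```python
from statistics import mean, median

# Hypothetical five-point scoring; the actual labels used in the
# survey instrument are not specified in the slides.
LIKERT = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
          "Agree": 4, "Strongly agree": 5}

def summarize_item(responses):
    """Convert Likert labels to a 1-5 scale and report n, mean, and median."""
    scores = [LIKERT[r] for r in responses]
    return {"n": len(scores),
            "mean": round(mean(scores), 2),
            "median": median(scores)}

sample = ["Agree", "Agree", "Neutral", "Strongly agree", "Disagree"]
print(summarize_item(sample))  # {'n': 5, 'mean': 3.6, 'median': 4}
```

Reporting the median alongside the mean is a common choice for ordinal Likert data, since the mean alone can mask a skewed distribution.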

  13. Survey Instrument 2 • Short answer questions: • What is easy to use about this platform? • What is hard to use about this platform? • What one change would make the biggest improvement to this discovery platform?

  14. Quibbles Mine and yours

  15. Quibbles with study design 1 • User-defined success was plausibly a great idea • Definition of success differs in a study vs. real life • Some subjects very uncomfortable with ambiguity • Inescapable testing of information literacy • Students are contaminated (but this might be ok?) • Already know Encore Synergy (maybe) • Already know EBSCO’s look & feel (maybe) • Already know INN-Reach – Prospector (our INN-Reach system) is the Meow Mix™ of Colorado

  16. Quibbles with study design 2 • Millennium ILS (except Primo Central) • Library implementation choices • Who knows? Order/design of facets, presence/absence of other features…? • Classic catalog implementation • Library website design choices • Help features (ask a librarian, chat-based help) • Prominence of other features (access to INN-Reach, access to ILL, access to purchase suggestion)

  17. Inadvertent findings Jargon, library practices, & behavior

  18. Inadvertent findings: jargon & library practices • Students do know what “inter-library loan” is • Students do not know what “government publications” are • Students do not know what “electronic resources” are • Existential musings about “book” qua “book” • Students want FRBR solutions (maybe) • New students do not know the clever nickname you’ve given your catalog

  19. Inadvertent findings: participant behavior • Students will type anything into the search box • Not at all clear what is indexed / available • “Inter-library loan” • “Help” • “Chat with librarian” • Students will fumble around looking for the kind of feature they want (maybe) • Tasks may inescapably lead participants to believe features will exist?

  20. Preliminary Findings Some data are easier to analyze than others, so this is an update on the findings so far, mostly from the survey instrument. Further data should be available in June.

  21. Demographics

  22. Demographics

  23. The tasks in this study were easy to understand

  24. Perceptions of ease of use Because these data don’t tell a complete story, and the complementary data are not yet analyzed, product names are not included.

  25. Short answer replies Two of the three questions asked for constructive criticism, so the data are skewed toward the “constructive”

  26. Features appreciated • Finding different types of resources & narrowing search (12) • Ability to email books & articles (7) • Found relevant articles/results (2) • Ability to save records for later • Smooth interface to library website • Simple interface • Integration with INN-Reach • Clearly labeling peer-reviewed articles

  27. Features not found / non-existent 1 • Hard to find ILL (8) • Add more refining/search options (7) & put near the search box (2) & clarify what they do (2) • More options without leaving browse (2) • Facet behavior: • Allow multiple options per facet • Have results update automatically as you select facets • Add shortcuts for common tasks • Add spell checking to search box

  28. Features not found / non-existent 2 • Saving results: • Removing items from saved folder is hard • Add a Save All Results function • (Couldn’t find way to save records for later) • Make purpose(s) of saved lists clearer / easier to use • Email: • Remove word-typing check for email (3) • Hard to find email option • Allow emailing to multiple addresses • Not clear what is being emailed

  29. More feedback • Finding books is hard • Too many detours • Messy, unattractive layout • “Help” link not very helpful • (Perception that only EBSCO dbs included) • (Hard to find Ask a Librarian) • Article metadata is too thin to make appropriate decisions • Make articles layout match books more closely, increase readability (2) • Make it clearer what journals vs. articles are • Lots of broken links trying to find articles

  30. Task achievement for tasks 3 & 4 • Using email feature, sent some/all records (8) • Created account & saved search/records (6) • Pasted bib data / URL into own email (5) • Would write a note (2) • Saved records to buffer (3) • Forgot to remind self (3) • Gave up • Requested via INN-Reach (10) • ILL / library webpage (8) • Request purchase form • Sought help online (4) • Would seek help in person (2) • Gave up (3)

  31. What’s next?

  32. Next steps for data analysis • Measure task success & time to complete tasks • Compare task success to perception of ease of use • Transcribe “thinking aloud” comments and correlate to other qualitative data • What features did participants expect to be present? • What features were participants unable to find?
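The planned comparison of task success against perceived ease of use could be sketched as below. This is a hypothetical illustration only: the participant records are invented, and a binary success flag plus a 1-5 ease rating are assumed, since the study's actual coding scheme is not described in the slides.

```python
from statistics import mean

# Invented example records; real data would come from Morae logs
# and the survey instrument.
participants = [
    {"task_success": True,  "ease_rating": 5},
    {"task_success": True,  "ease_rating": 4},
    {"task_success": False, "ease_rating": 2},
    {"task_success": True,  "ease_rating": 4},
    {"task_success": False, "ease_rating": 3},
]

def ease_by_success(records):
    """Mean ease-of-use rating, split by whether the task was completed."""
    succeeded = [r["ease_rating"] for r in records if r["task_success"]]
    failed = [r["ease_rating"] for r in records if not r["task_success"]]
    return {"succeeded": round(mean(succeeded), 2),
            "failed": round(mean(failed), 2)}

print(ease_by_success(participants))  # {'succeeded': 4.33, 'failed': 2.5}
```

A gap between the two group means would suggest that perceived ease tracks actual task success; a formal analysis would likely use a point-biserial correlation instead of a simple mean comparison.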

  33. Next steps for sharing results • ALA Annual (LITA track) in New Orleans • Presentation of further findings • Including “highlights” video footage to demonstrate findings • Article for submission to special issue of Library Trends

  34. Thank you! Rice.majors@colorado.edu ALA Annual in New Orleans, Sunday, 26 June 2011 @ 10:30am Rice Majors Faculty Director of Libraries Information Technology University of Colorado, Boulder Innovative Users Group * April 2011 * San Francisco
