Usability testing for library catalogs

Presentation Transcript


  1. Usability testing for library catalogs October 25, 2001 Nicole Hennig, Web Manager libraries.mit.edu libraries.mit.edu/barton

  2. Thank you: Tracy Gabridge, Librarian for Civil & Environmental Engineering • led the HTML customization team

  3. Details available ... http://macfadden.mit.edu:9500/webgroup/usability2001/barton/test1/overview.html

  4. Outline 1. background 2. the tests 3. problems & solutions 4. future directions

  5. 1. Background

  6. 6-month process: January - June 2001 • old system: GEAC Advance • new system: ExLibris ALEPH

  7. Web OPAC project teams • web OPAC team - public service librarians - circulation staff - processing staff - cataloger - web manager

  8. Web OPAC project teams • HTML customization team same as previous, plus - systems office staff - programmer

  9. Bibliography • on handout • includes background on display and interface design of library catalogs

  10. Background research • a lot of research on OPAC design available • but not based on observing users or usability testing • library system vendors are not following basic good design principles

  11. Who makes design decisions? • we have more control now that we can customize HTML screens • the vendors need to practice good design in building the system

  12. A work in progress • libraries.mit.edu/barton • more rounds of testing and improvements are coming later in the spring

  13. Usable design goals • every page is self-explanatory • “self-teaching” interfaces

  14. Will it apply? • some things are specific to ExLibris systems • many things are general - could apply to any OPAC

  15. General principles • success summary: http://macfadden.mit.edu:9500/webgroup/usability2001/barton/test2/success.html

  16. 2. The tests

  17. The test • we had already done extensive usability testing while redesigning our web site

  18. Latest thinking has changed • 1999: large test, 30 users, timed participants - quantitative • 2001: more frequent, smaller tests, 5-6 people at a time - qualitative

  19. The test • 1/2 hour long • 10 questions • think out loud

  20. The test • observer takes detailed notes • train observers not to explain how things were supposed to work until the end of the test • each observer tests 2 people (2-week time frame)

  21. Designing questions • easy, basic tasks that a first-time user should be able to accomplish • real-world tasks (give them a real article citation)

  22. Designing the questions • no need to obsess over perfect, “scientific” questions • you will learn plenty from watching people use the catalog

  23. The questions • 1 - 5: known items • 6 - 10: general research • complete list: http://macfadden.mit.edu:9500/webgroup/usability2001/barton/test1/questions.html

  24. The questions • test the questions • get the bugs out • print out the questions in large type

  25. Who we learned from • Washington State University: Janet Chisman et al., “Usability Testing: A Case Study,” College & Research Libraries, Nov. 1999

  26. What we learned • multi-part questions: if the user can't complete the first part, the observer completes it so they can try the second part

  27. What we looked for • features that were confusing or unclear • aspects of the system that worked well

  28. The tests
                  test 1                            test 2
      Who         7 students, 3 library staff       3 students, 4 library staff, 4 disabled users
      Catalogs    our old web catalog: Barton (6),  1st draft of new Barton screens
                  McGill: MUSE (2),
                  Boston College: QUEST (2)
      Dates       Jan. 22 - Feb. 1, 2001            May 21 - June 1, 2001
      Successes   4 of 10 tasks                     7 of 10 tasks

  29. 3. Problems & solutions

  30. Problem 1 • people usually picked the default choices or the first choices without thinking much about it (not always the best strategy for their search)

  31. Example: people used the first box and ignored the second

  32. Solution: default choice is keyword. This casts a broad net for those who forget to make a choice.
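
As a concrete illustration (not the actual Barton templates), here is a minimal sketch of a search-type menu that preselects keyword; the element name, option values, and labels are assumptions for the example, not Ex Libris parameters.

```typescript
// Hypothetical sketch: build a search-type menu whose default option is the
// forgiving keyword search, so users who skip the menu still cast a broad net.
const searchTypes = [
  { value: "keyword", label: "keyword anywhere" }, // preselected default
  { value: "title", label: "title begins with ..." },
  { value: "author", label: "author (last name first)" },
  { value: "subject", label: "subject begins with ..." },
];

function buildSearchTypeMenu(): HTMLSelectElement {
  const menu = document.createElement("select");
  menu.name = "searchType"; // illustrative name, not an ALEPH parameter
  for (const t of searchTypes) {
    const opt = document.createElement("option");
    opt.value = t.value;
    opt.textContent = t.label;
    opt.selected = t.value === "keyword"; // keyword wins if the user never looks
    menu.appendChild(opt);
  }
  return menu;
}
```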

  33. Problem 2 • Difference between browse & keyword search not clear

  34. Example

  35. Solution: no need to know the difference between keyword and browse search; both are combined in one menu.

  36. Problem 3 • it wasn't clear how to input a search string (people included initial articles, entered the author's first name first, and thought they had to type the entire title)

  37. Example • carefully typed complete title, with article: The Journal of the American Chemical Society
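
The deck's fix (next slides) is better-placed examples and instructions; a complementary approach, sketched here purely for illustration, is to make the system forgiving of exactly this input. The sketch assumes a simple string match and English articles only.

```typescript
// Illustrative only: trim, lowercase, and strip a leading article before
// matching, so "The Journal of the American Chemical Society" and
// "journal of the american chemical society" become the same query.
// Real catalogs handle initial articles through indexing rules.
const LEADING_ARTICLE = /^(the|an|a)\s+/i;

function normalizeTitleQuery(input: string): string {
  return input.trim().toLowerCase().replace(LEADING_ARTICLE, "");
}

console.log(normalizeTitleQuery("The Journal of the American Chemical Society"));
console.log(normalizeTitleQuery("  journal of the american chemical society"));
// Both print: journal of the american chemical society
```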

  38. Example: existing examples and instructions sat far away from the search box

  39. Solution • include examples and instructions on how to input data near the search box and in the search menu

  40. Examples for each type: the example changes when the menu changes.
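
A minimal sketch of how that behavior might be wired up, assuming a hint element next to the search box; the example strings, menu values, and ids are all hypothetical.

```typescript
// Hypothetical wiring: rewrite the hint line whenever the search-type menu
// changes, so the on-screen example always matches the selected search.
const EXAMPLES: Record<string, string> = {
  keyword: "e.g., global warming policy",
  title: "e.g., journal of the american chemical society (omit initial articles)",
  author: "e.g., hawking, stephen",
  subject: "e.g., solar energy",
};

function wireExampleHint(menu: HTMLSelectElement, hint: HTMLElement): void {
  const update = () => {
    hint.textContent = EXAMPLES[menu.value] ?? "";
  };
  menu.addEventListener("change", update);
  update(); // show the example for the default choice on page load
}
```

This would be called once on page load, e.g. wireExampleHint(document.querySelector("select")!, document.getElementById("search-hint")!), with whatever element ids the real templates use.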

  42. Grouping: group different title searches, author searches, and subject searches together.
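
One way to express that grouping inside a single menu is the standard HTML optgroup element; a minimal sketch with illustrative group and option labels.

```typescript
// Illustrative grouping: all title searches sit together in the menu,
// all author searches sit together, and so on.
const GROUPED_SEARCHES: Record<string, string[]> = {
  Title: ["title begins with ...", "keywords in title"],
  Author: ["author (last name first)", "keywords in author"],
  Subject: ["subject begins with ...", "keywords in subject"],
};

function buildGroupedMenu(): HTMLSelectElement {
  const menu = document.createElement("select");
  for (const [groupLabel, options] of Object.entries(GROUPED_SEARCHES)) {
    const group = document.createElement("optgroup");
    group.label = groupLabel;
    for (const text of options) {
      const opt = document.createElement("option");
      opt.textContent = text;
      group.appendChild(opt);
    }
    menu.appendChild(group);
  }
  return menu;
}
```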

  43. Problem 4 • very busy screens with many buttons were overwhelming for people

  44. Example

  45. Solution • Present choices only where needed • Group navigation links in ways that make sense

  46. Problem 5 • it was difficult to find clickable URLs for electronic titles

  47. No URL on brief results
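
The deck's solution slides for this problem are not part of this excerpt, but the implied direction is to surface the link directly in the brief-results row. A hypothetical sketch with an invented record shape, not the ALEPH data model:

```typescript
// Hypothetical: render a brief-results row with a visible, clickable link
// whenever the record carries a URL for the electronic version.
interface BriefRecord {
  title: string;
  url?: string; // electronic access point, when one exists
}

function renderBriefRow(rec: BriefRecord): HTMLLIElement {
  const row = document.createElement("li");
  row.textContent = `${rec.title} `;
  if (rec.url) {
    const link = document.createElement("a");
    link.href = rec.url;
    link.textContent = "Connect online";
    row.appendChild(link);
  }
  return row;
}
```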
