
Enhancing Usability in NSDL Projects Through Formative Evaluation Methods

This paper discusses the application of formative evaluation methods in the development of digital libraries, focusing on NSDL (National Science Digital Library) projects. Led by Dr. Manuel A. Pérez-Quiñones of Virginia Tech, the evaluations assessed interfaces while they were still being designed. Key findings highlight issues that recur across digital libraries, including unclear wording, mismatches with users' mental models, and weak support for user tasks. Emphasizing the importance of usability evaluation, the paper argues that applying usability methods early can guide and validate design solutions and improve the user experience.


Presentation Transcript


  1. Usability Engineering: Formative Evaluation of NSDL Projects • Dr. Manuel A. Pérez-Quiñones • Dept. of Computer Science • Virginia Tech • Blacksburg VA • perez@cs.vt.edu | www.citidel.org

  2. Usability Engineering Methods • Formative evaluation - evaluation of an interface as it is being designed • We have applied formative evaluation methods to several existing digital libraries: iLumina, NCSTRL, CITIDEL • The results showed consistency in the types of errors found across digital libraries • Wording, users’ mental models, support for user tasks

  3. Why Usability Evaluation? • You can’t manage what you can’t measure • What to measure: throughput, error rate, task completion, completion rate, user satisfaction, learnability, etc. • Why? • Interactive applications have high development costs, and acceptance often depends on usability
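The slide lists the measures but not how they were computed; the following is a minimal sketch (not from the original presentation) of how such measures could be summarized from hypothetical test-session records. The Session fields and example values are illustrative assumptions.

```python
# Minimal sketch: summarizing common usability measures from test sessions.
# The data model and values below are illustrative, not from the NSDL study.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Session:
    completed: bool      # did the participant finish the task?
    errors: int          # number of errors observed during the task
    seconds: float       # time on task
    satisfaction: int    # post-task rating, e.g. 1 (low) to 5 (high)

def summarize(sessions: list[Session]) -> dict[str, float]:
    """Compute completion rate, mean errors, mean time, and mean satisfaction."""
    n = len(sessions)
    return {
        "completion_rate": sum(s.completed for s in sessions) / n,
        "mean_errors": mean(s.errors for s in sessions),
        "mean_time_s": mean(s.seconds for s in sessions),
        "mean_satisfaction": mean(s.satisfaction for s in sessions),
    }

if __name__ == "__main__":
    data = [Session(True, 1, 95.0, 4),
            Session(False, 3, 180.0, 2),
            Session(True, 0, 70.0, 5)]
    print(summarize(data))
```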

  4. Usability Engineering Methods “Use them early and use them often”

  5. Inspection Findings 32 usability problems found using co-inspection
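The slide reports the total number of problems found but not how inspection yield was analyzed. As one way to reason about how many evaluators an inspection needs, here is a sketch of the widely cited Nielsen-Landauer problem-discovery model, found(i) = N(1 - (1 - L)^i); the parameter values below are assumptions chosen only to illustrate the calculation.

```python
# Nielsen-Landauer problem-discovery model (illustrative, not the study's method):
# found(i) = N * (1 - (1 - L)**i), where N is the total number of problems
# and L is the probability that a single evaluator finds any given problem.
def problems_found(n_total: int, find_prob: float, evaluators: int) -> float:
    return n_total * (1 - (1 - find_prob) ** evaluators)

# Assumed example: if 32 problems exist and each inspector finds ~31% of them,
# three inspectors would be expected to uncover roughly 21.5 of the 32.
print(round(problems_found(32, 0.31, 3), 1))
```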

  6. Heuristic Evaluation • Asked users to evaluate the interface with respect to given rules/heuristics • Sample heuristics: use simple and natural language; speak the user’s language; minimize memory load; be consistent; provide feedback; provide clearly marked exits; etc.
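One simple way to consolidate such an evaluation is to tally how often each heuristic was violated and how severe the worst report was; the sketch below assumes a flat list of (heuristic, severity) reports, with names and the severity scale made up for illustration.

```python
# Minimal sketch: tallying heuristic-violation reports by heuristic.
# Report contents and the 1-4 severity scale are illustrative assumptions.
from collections import Counter

reports = [
    ("speak the user's language", 3),
    ("speak the user's language", 4),
    ("provide feedback", 2),
    ("be consistent", 3),
]

violations = Counter(heuristic for heuristic, _ in reports)
worst = {h: max(sev for hh, sev in reports if hh == h) for h in violations}

for heuristic, count in violations.most_common():
    print(f"{heuristic}: {count} report(s), worst severity {worst[heuristic]}")
```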

  7. Heuristic Evaluation Results • Done with Jr/Sr Computer Science students • User language: cc, ccs, corr, msc, citidel, acm, nsf, nsdl, siguccs, collection, digital library… • Interface text: “12/3/03 - NSDL Opening, CITIDEL debuts!” User reaction: “What does this mean? What is NSDL?” • Interface text: “…allows you to sequence resources” User reaction: “What does ‘sequence’ mean?”

  8. Wider User Testing: CITIDEL

  9. Conclusions • Common problems found: technical jargon, mismatches with users’ mental models, search vs. browsing usability, portal pass-through problems • Understanding the potential of UE methods • When applied at the end of a project, they can only give an “ok or not ok” answer; when applied earlier, they can help formulate and validate design solutions
