
Show and Tell: Usability Testing Shows Us the User Experience and Tells Us What to Do About It


Presentation Transcript


  1. Show and Tell: Usability Testing Shows Us the User Experience and Tells Us What to Do About It. Carol Barnum, Director of the Usability Center & Professor of Information Design

  2. The problem “most major producers of e-learning are not doing substantial usability testing… In fact, we don’t seem to even have a way to talk about usability in the context of e-learning.” Michael Feldstein, “What is ‘usable’ e-learning?” eLearn Magazine (2002)

  3. What we’ll cover
  • Aligning our terminology
  • Using the tools in the UCD arsenal
  • Overcoming obstacles to utesting
  • Learning from our users

  4. How do you measure usability?
  • useful, usable, desirable – pick any 2
  • knowledge acquisition/mastery
  • an attribute
  • learnability
  • findability
  • motivation
  • satisfaction

  5. QA versus UA
  • QA – What do we mean?
  • UA – What do we mean?

  6. UA versus QA
  Usability testing – focus is on the user:
  • user’s satisfaction with the product
  • ease of use
  • ease of self-learning
  • intuitiveness of the product
  QA testing – focus is on the product:
  • functional operation tests for errors
  • performance/benchmark testing
  • click button, get desired action (the contrast is sketched in code below)
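To make the contrast on this slide concrete, here is a minimal, hypothetical sketch in Python. A QA-style functional test passes or fails on the product’s behavior alone; the endpoint, fake client, and names are invented for illustration, not taken from any real system.

    # Hypothetical QA-style functional check: it verifies the spec --
    # click button, get desired action -- and nothing more.
    class FakeClient:
        """Stand-in for a real test client; the endpoint is invented."""
        def post(self, path, data):
            return type("Response", (), {"status_code": 200})()

    def test_post_assignment(client=FakeClient()):
        response = client.post("/assignments", data={"title": "Essay 1"})
        assert response.status_code == 200  # functional pass/fail only

    test_post_assignment()  # passes, yet says nothing about ease of use

A passing run here tells us nothing about whether students can find the posting feature, understand its labels, or complete the task without frustration; that evidence comes only from watching real users in a usability test.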

  7. What is usability?
  • “The extent to which a product can be used by specified users to achieve specified goals in a specified context of use with effectiveness, efficiency, and satisfaction.” (ISO 9241-11, International Organization for Standardization; a worked sketch of these three measures follows below)
  • “The measure of the quality of the user experience when interacting with something—whether a Web site, a traditional software application, or any other device the user can operate in some way or another.” (Nielsen, “What is ‘Usability’?”)
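The ISO definition points to three measurable quantities. As a minimal sketch, with invented session data, effectiveness can be read as the task completion rate, efficiency as time on task for successful completions, and satisfaction as a mean post-task rating:

    # Hypothetical data from one task in a small usability test.
    sessions = [
        {"completed": True,  "seconds": 140, "rating": 4},  # rating on a 1-5 scale
        {"completed": True,  "seconds": 210, "rating": 3},
        {"completed": False, "seconds": 300, "rating": 2},
    ]

    completions = [s for s in sessions if s["completed"]]
    effectiveness = len(completions) / len(sessions)                         # completion rate
    efficiency = sum(s["seconds"] for s in completions) / len(completions)   # mean time on task
    satisfaction = sum(s["rating"] for s in sessions) / len(sessions)        # mean rating

    print(f"effectiveness {effectiveness:.0%}, "
          f"time on task {efficiency:.0f}s, "
          f"satisfaction {satisfaction:.1f}/5")
    # -> effectiveness 67%, time on task 175s, satisfaction 3.0/5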

  8. HE is one tool
  • Heuristic Evaluation
    • definition
    • examples
  • Jakob Nielsen (www.useit.com/alertbox)
  • Quesenbery’s 5 E’s (www.wqusability.com)
  • Dick Miller (www.stcsig.org/usability)

  9. Personas – another tool
  • Definition
  • Examples
  • Cooper (www.cooper.com/content/insights/newsletters_personas.asp)
  • HE + personas = more powerful review
  • eLearn Magazine:
    • “Designing Usable, Self-Paced e-Learning Courses: A Practical Guide,” Michael Feldstein (2006)
    • “Want Better Courses? Just Add Usability,” Lisa Neal and Michael Feldstein (2006)

  10. Know thy user, for he is not thyself
  • “Egocentric intuition fallacy” – Tom Landauer
  • Who are the Millennials? Gen Next?
  • Gen Y? Gen Why??
  • Procrastinating 40s housewife transitioning back to the work force?
  • Is this you?

  11. The argument against utesting
  • time is money
  • money is money
  • HE is a cheap alternative:
    • Discount usability method
    • Uncovers violations against rules
    • Cleans up the interface
    • Satisfies “usability by design”

  12. HE is a tool, not a goal
  • HE benefits
  • HE processes
  • Personas
  • Scenarios of use

  13. Achieving usability – big picture
  Source: Shilwant, S. & Haggarty, A., “Usability Testing for E-Learning,” Chief Learning Officer Magazine, August 2005 (www.clomedia.com)

  14. Let’s hear it from the user
  • User experience cannot be imagined
  • What can the user show us?
    • How does the user navigate the online environment?
    • How does the user find content?
    • How does the user respond to content?
  • What can the user tell us?
    • think-aloud protocol
  • What are the user’s perceptions?
    • listen, observe, learn
    • evaluate survey responses with caution

  15. Build UX into process
  • How many users does it take?
    • cast of thousands – engineering model
    • five or fewer – Nielsen discount model (the arithmetic is sketched below)
    • RITE method – Rapid Iterative Testing and Evaluation – Microsoft gaming model
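The “five or fewer” claim traces to Nielsen and Landauer’s problem-discovery curve: if each test user independently uncovers a fraction L of the usability problems (about 0.31 in their published data, an average rather than a universal constant), then n users are expected to find 1 − (1 − L)^n of them. A quick Python check of the arithmetic:

    # Nielsen & Landauer problem-discovery model: expected share of
    # usability problems found by n users, each uncovering a fraction L.
    def share_found(n_users, L=0.31):
        return 1 - (1 - L) ** n_users

    for n in (1, 3, 5, 15):
        print(f"{n:2d} users -> {share_found(n):.0%} of problems")
    # 1 -> 31%, 3 -> 67%, 5 -> 84%, 15 -> ~100%: returns diminish quickly
    # after about five users, which is the economics behind the discount model.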

  16. Commonalities
  • Rapid
  • Iterative
  • Developmental
  • Affordable

  17. Now for show and tell . . .

  18. Heuristics suggest a test plan
  • General navigation within Vista and a class
  • Consistency with general web design and hyperlink conventions
  • Performing class-related tasks, such as posting assignments
  • Responding to discussion board messages
  • Using non-class-related tools, such as Campus Bookmarks, Calendar, To Do List

  19. [Annotated screenshot – callouts:]
  • Clicking the logo opens a new web page (webct.com). Users may think of this page as a home page, since it is the first page they see after submitting the Vista URL, so they may expect the logo to act as a “link to home.”
  • This important instruction is not significantly different from the adjacent text.
  • Users must scroll to see the complete listing.
  • These lines seem to clutter the space rather than delineate the listing; they reduce figure-ground contrast, making the text less discernible.
  • Extensive use of mouse-over links.
  • Not all the items in this list are institutions.

  20. [Annotated screenshot – callouts:]
  • This text does not have enough size contrast to be effective.
  • Mouse-over links.
  • Strangely, the logo is no longer an active link to webct.com.
  • Button links with a mouse-over effect.
  • Colored hypertext links.
  • Inconsistent link design may confuse users; they may not be able to readily distinguish what is a link and what is not.

  21. [Annotated screenshot – callouts:]
  • The “file tab” functions as a non-traditional “home” button; file tabs are not used to navigate anywhere else on the site. Inconsistent navigation may confuse users.
  • The purpose of these text links and their proximity to the iconic links is unclear.
  • The university identifier is now missing.
  • Users may not understand the meaning of these icons; some icons seem to represent their meaning better than others.
  • The relevance of some content is questionable.
  • Some of these tables have links and some do not; likewise, some have icons and some do not.
  • The meaning and relevance of some titles are unclear.
  • Introduction of iconic links; the adjacent text is not a link.

  22. [Annotated screenshot – callouts:]
  • Relevance.
  • Text size contrast.
  • “Go to” does not let you “go to” anywhere I’d like to; i.e., the user lacks control.
  • This location now features a tool bar of icons with text descriptors.
  • Though it says I’m on the “Home Page,” I expected the home page to be “MyWebCT.”
  • Some links now feature “alt” tags; some do not.
  • If this is a tool bar, why does it include navigation elements?

  23. [Annotated screenshot – callouts:]
  • Why is this text repeated here and above, too?
  • In terms of hierarchy, I’m not sure of the relevance of this feature or its placement here.
  • The word “search” inside the box, with the “go” arrow beside it, might suggest to some users that clicking the arrow starts a search (especially “search mail”), though you can’t really know for sure. Clicking the arrow sends you to an error page (see next slide).
  • Presumably a central task users want to perform is composing mail, yet this button does not significantly differentiate itself from any of the other buttons on the page.
  • The meaning of the bold and bracketed numbers may not be apparent to users.
  • The information appearing here does not add to most users’ experience, except perhaps to confuse and frighten them.

  24. The good, the bad, the ugly

  25. “The Good”
  • Found the e-mail icon quickly (3 users)
  • Remarked “one-stop shopping” and “pretty cool” (2 users)
  • Compared the discussion board positively to WebCT (2 users)
  • Found the address book of mail recipients “very handy” (2 users)
  • Liked the hover help that displays over buttons (3 users)

  26. “The bad” & “the ugly”
  • 95 separate instances of trouble, frustration, or inability to complete a task
  • Documented findings that affected 3+ users (from a sample of 6)
  • Noted other issues that affected fewer users
  Ranked in priority of severity (a tallying sketch follows):
  • 1 – Catastrophe: unable to complete task
  • 2 – Serious: caused confusion or delay
  • 3 – Minor: little effect on usability
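As a minimal sketch of how such findings might be tallied and reported, the severity scheme above maps naturally onto a small sorted report. The issues and counts below are invented placeholders, not the study’s actual data:

    SEVERITY = {1: "Catastrophe", 2: "Serious", 3: "Minor"}

    # (severity, description, number of the 6 users affected) -- placeholders
    findings = [
        (2, "Confused by inconsistent link styling", 5),
        (1, "Unable to locate the compose-mail function", 4),
        (3, "Overlooked the calendar tool", 2),
    ]

    # Report issues affecting 3+ of the 6 users, worst severity first,
    # mirroring the documentation rule on the slide above.
    for sev, issue, n in sorted(f for f in findings if f[2] >= 3):
        print(f"{SEVERITY[sev]} ({n}/6 users): {issue}")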

  27. Teaching moment
  • You are not your user
  • The discount model works (“Guerrilla HCI”)
  • Heuristic evaluation is one tool; its companion is usability testing
  • Let your users show and tell you about their experience

  28. Your guide: Carol Barnum, cbarnum@spsu.edu
