
HCI460: Week 1 Lecture


  1. HCI460: Week 1 Lecture • September 9, 2009

  2. Outline Course overview Overview of usability evaluation methods Heuristic evaluation Cognitive walkthrough Expert evaluation Presenting results Project 1a

  3. Course Overview

  4. Course Overview Basic Information • Instructors • Gavin Lew • Aga Bojko • Office hours • Wed 5pm – 5:45pm and 9pm – 9:45pm • Lewis 1111, Loop Campus (our classroom) • Email address • hci460@usercentric.com • Course Web page • http://www.usercentric.com/hci460-fall2009.html • Prerequisites • HCI 440 and basic statistics

  5. Course Overview Course Summary • This course surveys methods for evaluating usability of a wide range of products and interfaces. • We will discuss and practice methods: • Heuristic and expert evaluations • Cognitive walkthroughs • Usability testing (formative and summative) • Surveys • Eye tracking • Contextual inquiries • KLM-GOMS • Focus groups

  6. Course Overview Course Goals • To learn how to: • Establish appropriate evaluation objectives • Select evaluation methods that address evaluation objectives and take into account existing constraints • Articulate advantages and disadvantages of usability evaluation methods • Properly use various usability evaluation methods • Present results and prepare effective reports

  7. Course Overview Textbooks • Required Text • Handbook of Usability Testing by Rubin • 1st edition: ISBN 0-471-59403-2 • 2nd edition: ISBN 0-470-18548-1 • Task-Centered User Interface Design: A Practical Introduction by Lewis and Rieman (online text) • Optional Text • Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics by Tullis and Albert (ISBN 0-123-73558-0) • Usability Inspection Methods by Nielsen and Mack (ISBN 0-471-01877-5) • Various papers

  8. Course Overview Projects • Project 1: Expert evaluation • Individual notes • Report • Project 2: Formative usability study • Test plan • Participant screening questionnaire and moderator’s guide • Conducting a test • Report • Project 3: Quantitative comparison study • Test plan • Report

  9. Course Overview Grading • 15% Project 1: Expert evaluation • 25% Project 2: Formative usability study • 15% Project 3: Quantitative comparison study • 10% Take-home midterm quiz • 25% Final exam • 10% Individual contribution to projects (next slide) • Attendance? • Not required but strongly recommended • Projects and exams will cover both lecture and reading material

  10. Course Overview Grading: Individual Contribution to Projects

  11. Course Overview Schedule

  12. Overview of Usability Evaluation Methods

  13. Overview of Usability Evaluation Methods Usability Evaluation Methods in Context • [Diagram: usability evaluation methods shown as a subset of the broader set of user experience (UX) methods. Examples posed as questions: Usability testing? Participatory design? Survey?]

  14. Overview of Usability Evaluation Methods Usability Evaluation Methods in Context • [Diagram: UX methods grouped by whether they involve users. Do not involve users: cognitive walkthrough, heuristic evaluation, KLM-GOMS. Involve users: card sorting, participatory design, eye tracking, ethnographic research, summative usability testing, formative usability testing, surveys, focus groups.]

  15. Overview of Usability Evaluation Methods UX Methods Involving Users • [Chart plotting methods on two axes: behavior vs. attitude, and qualitative vs. quantitative. Behavioral: eye tracking, summative usability testing, formative usability testing, ethnographic research. Attitudinal: focus groups, participatory design, card sorting, surveys.]

  16. Overview of Usability Evaluation Methods Usability Evaluation Methods Involving Users • [Chart plotting the evaluation methods on the same behavior–attitude and qualitative–quantitative axes. Behavioral: eye tracking, summative usability testing, formative usability testing, ethnographic research. Attitudinal: focus groups, surveys.]

  17. Overview of Usability Evaluation Methods Usability Evaluation Methods Without Users • Usability inspection methods: • [Chart on a qualitative–quantitative axis. Qualitative: heuristic evaluation, expert evaluation, cognitive walkthrough. Quantitative: KLM-GOMS.]

  18. Overview of Usability Evaluation Methods Why No Users? • Involving users is expensive (time, money). • No users = “discount usability” • Usability inspection is quick, cheap, and useful. • It should be done before a usability test to clear obvious issues out of the interface. • If the interface is not cleaned up first, participants will be distracted by those issues and waste time.

  19. Heuristic Evaluation

  20. Heuristic Evaluation What Are Heuristics? • Heuristics = guidelines, principles, rules of thumb • There are many sets of usability heuristics: • Jakob Nielsen’s Heuristics – 1994 (link) • Tognazzini’s First Principles of Interaction Design – 2003 (link) • Jill Gerhardt-Powals’ Cognitive Engineering Principles – 1996 (link) • Research-Based Web Design and Usability Guidelines – 2004 (link)

  21. Heuristic Evaluation Nielsen’s Heuristics Visibility of System Status Match Between System and the Real World User Control and Freedom Consistency and Standards Error Prevention Recognition Rather than Recall Flexibility and Efficiency of Use Aesthetic and Minimalist Design Error Recovery Help and Documentation

  22. Heuristic Evaluation Nielsen’s Heuristics: #1 Clicking on Map History does not display anything if the history is empty. There is no indication of location within the application. • Visibility of System Status • The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

  23. Heuristic Evaluation Nielsen’s Heuristics: #2 The order of the controls is incorrect: making a selection in the dropdown depends on whether or not the Enable Auto Update checkbox is selected. “Processing weather data” is a system-oriented term that appears when the user clicks on “Update Weather Now.” • Match Between System and the Real World • The system should use phrases and concepts familiar to the user. Follow real-world conventions, making information appear in a natural and logical order.

  24. Heuristic Evaluation Nielsen’s Heuristics: #3 Processing can take a while and there is no way to cancel the action or move the box to the side. • User Control and Freedom • Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

  25. Heuristic Evaluation Nielsen’s Heuristics: #4 Placement of the Browse button does not follow standards. The Browse button generally appears to the right of the file textbox. • Consistency and Standards • Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

  26. Heuristic Evaluation Nielsen’s Heuristics: #5 The text field is too long and accepts 256 digits, suggesting that the required input should be longer than five digits. • Error Prevention • Even better than good error messages is a careful design which prevents a problem from occurring in the first place.

  27. Heuristic Evaluation Nielsen’s Heuristics: #6 Most of the icons are not intuitive and they are not labeled. Users have to remember what each icon means or hover over them, which negatively impacts efficiency. • Recognition Rather than Recall • Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

  28. Heuristic Evaluation Nielsen’s Heuristics: #7 The Close button is difficult to see and the clickable area associated with it is very small. Hitting the target area is difficult and may take a few tries. The window does not close when Alt-F4 is pressed, which is inconsistent with other Windows applications. • Flexibility and Efficiency of Use • Accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

  29. Heuristic Evaluation Nielsen’s Heuristics: #8 It is unnecessary to include the word “Map” on all buttons. It is clear that this is a list of maps and all the actions will be performed on maps. The tray tooltip is unnecessarily long for the amount of information it conveys. • Aesthetic and Minimalist Design • Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

  30. Heuristic Evaluation Nielsen’s Heuristics: #9 After entering an incorrect zip code, when users click on the “Update Weather Now” icon, they see an error message. The message provides no information on what the problem is and how to fix it. • Error Recovery • Help users recognize, diagnose, and recover from errors. Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

  31. Heuristic Evaluation Nielsen’s Heuristics: #10 The Help document is very long and requires lengthy scrolling to access certain sections. This specifically impacts users who are unaware of the Find feature in Notepad. Additionally, the sections are not labeled using the scheme established in the Table of Contents (A, B, C, etc.), making the document difficult to search. • Help and Documentation • Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

  32. Heuristic Evaluation Nielsen’s Heuristics: Exercise • Visibility of System Status • The system should always keep users informed about what is going on, through appropriate feedback within reasonable time. • Match Between System and the Real World • The system should use phrases and concepts familiar to the user. Follow real-world conventions, making information appear in a natural and logical order. • User Control and Freedom • Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo. • Consistency and Standards • Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions. • Error Prevention • Even better than good error messages is a careful design which prevents a problem from occurring in the first place. • Recognition Rather than Recall • Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate. • Flexibility and Efficiency of Use • Accelerators (unseen by the novices) – may speed up the interaction for the experts such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions. • Aesthetic and Minimalist Design • Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility. • Error Recovery • Help users recognize, diagnose, and recover from errors. Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution. 
• Help and Documentation • Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large. • Form groups of 3 in class. • Online students can do this exercise individually. • Each group will get a handout with Nielsen’s 10 heuristics and screenshots of two interfaces. • With the heuristics in mind, find a few usability issues with each interface. • Assign an appropriate heuristic to each issue. • There may be more than one heuristic per issue.

  33. Heuristic Evaluation Nielsen’s Heuristics: Exercise – Interface 1 • Dog age calculator • http://www.bowwow.com.au/calculator/index.asp

  34. Heuristic Evaluation Nielsen’s Heuristics: Exercise – Interface 2a • Currency converter: • http://www.oanda.com/convert/classic

  35. Heuristic Evaluation Nielsen’s Heuristics: Exercise – Interface 2b • The new converter: Have things improved? • http://www.oanda.com/currency/converter/

  36. Heuristic Evaluation Research-Based Guidelines • Developed by the National Cancer Institute in the US Department of Health and Human Services • Many contributors • Available on usability.gov and in a book • 2009 guidelines • Web-specific • Mostly for informational Web sites • Each guideline has two ratings

  37. Heuristic Evaluation Research-Based Guidelines: Ratings • Relative importance • How important is the guideline to the overall success of a Web site? • Based on opinions of 16 experts • Strength of evidence • Team of researchers evaluated existing research evidence for each guideline and rated it on a 5-point scale: • 5: Strong research support • 4: Moderate research support • 3: Weak research support • 2: Strong expert opinion support • 1: Weak expert opinion support

  38. Heuristic Evaluation Research-Based Guidelines: Usage • How to use the guidelines? • Pare them down by making a checklist of: • Top __ guidelines based on importance • Top __ guidelines based on research evidence • Guidelines most relevant to your application • E.g., 30 guidelines related to forms • Use guidelines with strength of evidence ratings of 4 – 5 to convince others who do not believe in usability. • Credibility
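The checklist idea above can be sketched as a simple filter over the rated guidelines; the guideline names and ratings below are invented for illustration:

```python
# Invented (guideline, strength-of-evidence) pairs on the 1-5 scale.
guidelines = [
    ("Use descriptive link labels", 5),
    ("Place the search box in a standard location", 3),
    ("Label each form field clearly", 4),
]

# Keep only guidelines backed by moderate-to-strong research (rating 4-5),
# i.e., the ones with the most credibility when convincing skeptics.
checklist = [name for name, evidence in guidelines if evidence >= 4]
```

The same pattern works for paring down by relative importance, or by relevance to a particular application such as forms.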

  39. Heuristic Evaluation Conducting an Evaluation: # of Evaluators • One person cannot find all usability problems. • Different people find different usability problems. • Heuristic evaluation is most effective with multiple evaluators. • Nielsen recommends 3 – 5 evaluators.
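The 3 – 5 recommendation reflects Nielsen and Landauer's problem-discovery model, Found(i) = N(1 − (1 − λ)^i), where λ is the probability that a single evaluator finds a given problem (about 0.31 in their data). A short sketch of the diminishing returns, with λ = 0.31 assumed:

```python
def proportion_found(i, detection_rate=0.31):
    """Expected fraction of all usability problems found by i independent
    evaluators, per Nielsen and Landauer's model: 1 - (1 - lambda)^i."""
    return 1 - (1 - detection_rate) ** i

# Diminishing returns: one evaluator finds ~31%, five find ~84%.
for evaluators in (1, 3, 5):
    print(f"{evaluators} evaluator(s): {proportion_found(evaluators):.0%}")
```

Beyond about five evaluators, each additional person mostly rediscovers known problems, which is why adding more is rarely worth the cost.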

  40. Heuristic Evaluation Conducting an Evaluation: Steps • Each evaluator examines the interface independently (no communication with others). • He/she can use tasks/scenarios. • At least two passes: • 1st pass: to get a feel for the flow and scope • 2nd pass: to focus on specifics • Output: list of problems and heuristics that they violate • Evaluators meet for a debriefing session. • Discuss each issue and the violated heuristic. • Agree on the final list of issues (ensure there is no redundancy). • Brainstorm recommendations. • Each evaluator independently assigns a severity rating to each issue. • One evaluator combines all severity ratings.
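The last two steps above (independent severity ratings, combined by one evaluator) can be sketched as follows; the issue names and ratings are hypothetical, and taking the mean is just one reasonable way to combine:

```python
from statistics import mean

# Hypothetical output of a debriefing session: each agreed-upon issue has
# one 0-4 severity rating per evaluator (three evaluators here).
ratings = {
    "No feedback after saving settings": [3, 4, 3],
    "Browse button left of the text field": [1, 2, 1],
}

# One evaluator combines the independent ratings (here: the mean),
# then orders the issues from most to least severe for the report.
combined = {issue: mean(scores) for issue, scores in ratings.items()}
prioritized = sorted(combined, key=combined.get, reverse=True)
```

Keeping the ratings independent before combining them avoids one vocal evaluator anchoring everyone else's severity judgments.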

  41. Heuristic Evaluation Conducting an Evaluation: Severity Ratings • Severity ratings help with prioritization of issues. • Nielsen’s 0 – 4 scale: • 0: I don’t agree that this is a usability problem at all • 1: Cosmetic problem only • Need not be fixed unless extra time is available • 2: Minor usability problem • Fixing this should be given low priority • 3: Major usability problem • Important to fix, so should be given high priority • 4: Usability catastrophe • Imperative to fix this

  42. Heuristic Evaluation Conducting an Evaluation: Severity Ratings • When assigning severity ratings, consider: • The frequency with which the problem occurs. • Is it common or rare? • The impact of the problem if it occurs. • Will it be easy or difficult for the users to overcome? • The persistence of the problem. • Is it a one-time problem that users can overcome once they know about it or will they be repeatedly bothered by it?
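One illustrative way to fold the three factors above into a single number; the equal weighting and the 1 – 3 scale per factor are assumptions, not part of the lecture:

```python
def severity_estimate(frequency, impact, persistence):
    """Average three 1-3 factor ratings into one 1-3 severity estimate.
    Equal weighting is an assumption; a real team might weight impact higher."""
    return round((frequency + impact + persistence) / 3)

# A hard-to-overcome, recurring problem rates as severe
# even if it is only moderately frequent.
severity = severity_estimate(frequency=2, impact=3, persistence=3)
```

In practice evaluators usually weigh the factors informally rather than computing a score, but making the factors explicit keeps ratings consistent across evaluators.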

  43. Heuristic Evaluation Conducting an Evaluation: Severity Ratings • Another scale: • HIGH: A high-severity issue is a major problem that prevents users from finding, recognizing, or using key features and completing critical tasks. It is critical to resolve this issue because it blocks the use of functionality. • MEDIUM: A medium-severity issue can be a common problem that has a negative impact on users’ overall efficiency. It often creates inconvenience for the user. It is important to resolve this issue; otherwise user confusion and frustration will result. • LOW: A low-severity issue typically does not affect user performance but causes irritation and has an impact on the user’s opinion of the product/company. Resolving the issue will generally improve the user experience.

  44. Cognitive Walkthrough

  45. Cognitive Walkthrough Overview • Walkthroughs are a formalized way of imagining people's thoughts and actions when they use an interface for the first time. • They are more structured than a heuristic evaluation. • Walkthroughs help you iterate on the interface; they do not validate it.

  46. Cognitive Walkthrough What’s Needed • Interface or its prototype • E.g., Weather Watcher 5.4B • Task description (representative tasks) • E.g., set the zip code for which you would like Weather Watcher to retrieve the weather. • Complete, written list of the actions needed to complete the task • E.g., click on View Options menu item, click on Active City link, type zip code, click on OK button… • User description • E.g., wide range of computer users (from novices to experts)

  47. Cognitive Walkthrough Steps • For each action, ask these four questions: • Will users try to produce the effect the action has? • Will users see the control (button, menu item etc.) for the action? • Will users recognize that the control produces the effect? • After the action is taken, will users understand the feedback they get, so that they know the action took place?
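A walkthrough record can be as simple as the four questions answered per action, where any "no" flags a likely problem. The action and answers below are hypothetical:

```python
# The four cognitive walkthrough questions from the slide.
QUESTIONS = (
    "Will users try to produce the effect the action has?",
    "Will users see the control for the action?",
    "Will users recognize that the control produces the effect?",
    "Will users understand the feedback after the action is taken?",
)

def flag_problems(action, answers):
    """Return the questions answered 'no' for one action in the sequence."""
    return [q for q, yes in zip(QUESTIONS, answers) if not yes]

# Hypothetical action: users would try it and understand the feedback,
# but might not see the control.
problems = flag_problems("Open the Options dialog", (True, False, True, True))
```

Running this over every action in the written action list yields the walkthrough's output: the points where a first-time user is likely to stumble.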

  48. Cognitive Walkthrough Example: Setting Zip Code in Weather Watcher

  49. Cognitive Walkthrough Example: Setting Zip Code in Weather Watcher

  50. Cognitive Walkthrough Keep in Mind • You should start with a list of correct actions needed to complete a given task. • You can explore the interface to identify those actions BUT that's not the walkthrough. • The walkthrough identifies what the user may have trouble with rather than predicting what the user will actually do.
