
Evaluation Using User Studies



  1. Evaluation Using User Studies

  2. Usability • Is it a “good” interface? • In what ways? • Usability: • How well users can use the system’s functionality • Dimensions of usability: • Learnability: is it easy to learn? • Efficiency: once learned, is it fast to use? • Memorability: is it easy to remember what you learned? • Errors: are errors few and recoverable? • Satisfaction: is it enjoyable to use?

  3. User and task requirements: cognitive models - keystroke level model • Predict performance times for common operations based on knowledge of the human motor system • 7 basic operators: • K - keystroking: actually striking keys • B - pressing a mouse button • P - pointing: moving the mouse to a target • H - homing: switching the hand between mouse and keyboard • D - drawing lines using the mouse • M - mentally preparing for a physical action • R - system response (may be ignored)

  4. M-operators in KLM • Initiating a task – pause while user considers what should be done • Making a strategy decision – which option to take? • Remembering something – e.g., a filename • Finding something on the screen (here the location is not well known) • Verifying that what has been done or is about to be done is correct

  5. Typical KLM times • K (press key): good typist (90 wpm) 0.12 s, average typist (40 wpm) 0.28 s, non-typist 1.20 s • B (mouse button): press down or up 0.10 s, click 0.20 s • P (point with mouse): specific movement per Fitts’ law, average movement 1.10 s • H (home hands to/from keyboard): 0.40 s • D (drawing): domain dependent • M (mentally prepare): 1.20 s • R (response from system): measure

  6. Example of KLM • Deleting a file from the desktop on a Mac • Method 1: drag to the wastebasket • Operator sequence: • Initiate the deletion (M) • Find the file icon (M) • Point to file icon (P) • Press and hold mouse button (B) • Drag file icon to wastebasket (P) • Release mouse button (B) • Total predicted time = 2M + 2P + 2B = 4.8 secs

  7. Example of KLM • Deleting a file from the desktop on a Mac • Method 2: using an accelerator key • Operator sequence: • Initiate the deletion (M) • Find the file icon (M) • Point to the file icon (P) • Click – i.e., press and release mouse button (BB) • Move hand to keyboard (H) • Press ‘Apple’ and ‘Delete’ keys (KK) • Move hand back to mouse (H) • Total predicted time = 1P + 2B + 2K + 2M + 2H = 5.1 seconds
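
A minimal sketch, in Python, that reproduces the two predictions above from the per-operator times on the “Typical KLM times” slide (K = 0.28 s for an average typist, P = 1.10 s for an average movement); the dictionary and function names are illustrative, not from the slides:

```python
# Illustrative sketch: per-operator times taken from the "Typical KLM times" slide.
KLM_TIMES = {
    "K": 0.28,  # keystroke, average typist (40 wpm)
    "B": 0.10,  # mouse button press or release
    "P": 1.10,  # point with mouse, average movement
    "H": 0.40,  # home hands between mouse and keyboard
    "M": 1.20,  # mentally prepare
}

def klm_predict(sequence: str) -> float:
    """Sum the operator times for a sequence such as 'MMPBPB'."""
    return sum(KLM_TIMES[op] for op in sequence)

# Method 1: drag the icon to the wastebasket -> 2M + 2P + 2B
print(round(klm_predict("MMPBPB"), 2))      # 4.8 seconds

# Method 2: click the icon, then use the accelerator keys -> 2M + 1P + 2B + 2H + 2K
print(round(klm_predict("MMPBBHKKH"), 2))   # about 5.1 seconds (5.06)
```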

  8. Design implications from Gestalt Psychology • Proximity – group related items close together and separate unrelated ones • Alignment – place related items along an imaginary line. Align items of equal importance and indent subordinate ones • Consistency – make related items look the same • Contrast – make unrelated items look different

  9. What do you see? • similarity • continuity • proximity • symmetry • closure

  10. Original

  11. Proximity

  12. Alignment

  13. Repetition

  14. Examples of Bad Design … and Why • Elevator controls and labels on the bottom row all look the same, so it is easy to push a label by mistake instead of a control button • People do not make the same mistake with the labels and buttons on the top row. Why not? From: www.baddesigns.com

  15. Visibility - Example • Control panel for an elevator • How does it work? • Push a button for the floor you want? • Nothing happens - Push any other button? Still nothing. • What do you need to do? • What you need to do is not visible!

  16. Visibility • …need to insert the room card in the slot by the buttons to get the elevator to work! • How would you make this action more visible? • Make the card reader more obvious • Provide an auditory message that says what to do (which language?) • Provide a big label next to the card reader that flashes when someone enters • Make relevant parts visible • Make what has to be done obvious

  17. Logical or ambiguous design? • Where do you plug the mouse? • Where do you plug the keyboard? • top or bottom connector? • Do the color coded icons help? From: www.baddesigns.com

  18. How to design more logically - A. provides direct adjacent mapping between icon and connector - B. provides color coding to associate the connectors with the labels

  19. Mapping • Relationship between controls and their movements and the results in the world • Why is this a poor mapping of control buttons?

  20. Mapping • Why is this a better mapping? • The control buttons are mapped better onto the sequence of actions of fast rewind, rewind, play and fast forward

  21. Mapping • Which controls go with which rings (burners)? A B C D

  22. Why is this a better design?

  23. Internal and external consistency • Internal consistency refers to designing operations to behave the same within an application • Difficult to achieve with complex interfaces • External consistency refers to designing operations, interfaces, etc., to be the same across applications and devices • Very rarely the case, given different designers’ preferences • Most successful in product families (e.g., MS Office) • Operating system vendors may define style guidelines

  24. External Inconsistency … • Keypad number layouts differ: (a) phones and remote controls put 1 2 3 on the top row (1 2 3 / 4 5 6 / 7 8 9 / 0), while (b) calculators and computer keypads put 7 8 9 on the top row (7 8 9 / 4 5 6 / 1 2 3 / 0)

  25. Usability Problem Example: Unexpected Occurrence of Events

  26. Usability Measures – 5 Often Used • Time to learn • How long does it take for typical members of the community to learn relevant task? • Speed of performance • How long does it take to perform relevant benchmarks? • Rate of errors by users • How many & what kinds of errors are made during benchmark tasks? • Retention over time • Frequency of use and ease of learning help make for better user retention • Subjective satisfaction • Do they like it? • Allow for user feedback via interviews, free-form comments and satisfaction scales
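
A hypothetical sketch, in Python, of how the five measures might be recorded per participant and benchmark task; the class and field names are assumptions for illustration, not from the slides:

```python
from dataclasses import dataclass

@dataclass
class UsabilityResult:
    participant: str
    task: str
    time_to_learn_min: float   # time to learn: minutes until first unassisted completion
    task_time_s: float         # speed of performance on the benchmark task
    errors: int                # rate of errors: errors made during the benchmark
    retention_score: float     # retention over time: re-test score after a delay (0-1)
    satisfaction: int          # subjective satisfaction, e.g. a 1-7 questionnaire rating

print(UsabilityResult("P01", "delete a file", 4.5, 12.3, 1, 0.9, 6))
```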

  27. State evident in mechanical buttons • Rotary knobs reveal internal state and can be controlled by both user and machine • Compliant interaction

  28. Evaluation Techniques • Evaluation • tests usability and functionality of system • occurs in laboratory, field and/or in collaboration with users • evaluates both design and implementation

  29. Cognitive Walkthrough • Proposed by Polson et al. • evaluates design on how well it supports the user in learning a task • usually performed by an expert in cognitive psychology • the expert ‘walks through’ the design to identify potential problems using psychological principles

  30. Cognitive Walkthrough (ctd) • For each task walkthrough considers • what impact will interaction have on user? • what cognitive processes are required? • what learning problems may occur? • Analysis focuses on goals and knowledge: does the design lead the user to generate the correct goals?

  31. Cognitive Walkthrough Questions • Is the next goal clear at this stage? • Is the appropriate action obvious? • Is it clear that this action leads to the goal? • What problems are there in performing the action?

  32. Cognitive Walkthrough: How (cont.) • Walk through the task while answering these questions: • Will the user know what to do? • Will the user see how to do it? • Will the user understand from feedback whether their action was correct?
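
A small sketch, in Python, of how the answers could be recorded per step so that every “no” becomes a potential learning problem; the question wording follows the slide, everything else is illustrative:

```python
QUESTIONS = [
    "Will the user know what to do?",
    "Will the user see how to do it?",
    "Will the user understand from feedback whether the action was correct?",
]

def walk_through(steps):
    """steps: list of (action, [yes/no answer per question]); return flagged problems."""
    problems = []
    for action, answers in steps:
        for question, answer in zip(QUESTIONS, answers):
            if answer == "no":
                problems.append((action, question))
    return problems

steps = [("insert room card in the slot", ["no", "no", "yes"]),
         ("press the floor button", ["yes", "yes", "yes"])]
print(walk_through(steps))
```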

  33. Heuristic Evaluation • Proposed by Nielsen and Molich • usability criteria (heuristics) are identified • design examined by experts to see if these are violated • Example heuristics: • system behaviour is predictable • system behaviour is consistent • feedback is provided • Heuristic evaluation ‘debugs’ the design

  34. The Procedure • Several independent evaluators • each uses the same checklist • each works alone • each makes a list of usability problems • Combine lists into a single list • works well as a group activity
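
A sketch, in Python, of the last two steps: combining the evaluators’ individual lists into one list and counting how many evaluators reported each problem (the example problems are made up):

```python
from collections import Counter

evaluator_lists = [
    ["no feedback after save", "inconsistent button labels"],
    ["inconsistent button labels", "unpredictable Back behaviour"],
    ["no feedback after save", "inconsistent button labels"],
]

combined = Counter(problem for problems in evaluator_lists for problem in problems)

# Problems reported by more evaluators are listed first for the group discussion.
for problem, n_evaluators in combined.most_common():
    print(f"{n_evaluators} evaluator(s): {problem}")
```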

  35. “Think Aloud” Protocols • “Single most valuable usability engineering method” • Get the user to continuously verbalize their thoughts • Find out why the user does things • What they thought would happen, why they got stuck or frustrated, etc. • Encourage users to expand on whatever is interesting • But it interferes with timings • May need to “coach” the user to keep talking • It is unnatural to describe what you are thinking • Ask general questions: “What did you expect?”, “What are you thinking now?” • Not: “What do you think that button is for?”, “Why didn’t you click here?” • These will “give away” the answer or bias the user • Alternative: have two users and encourage discussion

  36. Analyzing the data • Numeric data • Example: times, number of errors, etc. • Tables and plots using a spreadsheet • Look for trends and outliers • Organize problems by scope and severity • Scope: How widespread is the problem? • Severity: How critical is the problem?
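
A minimal sketch, in Python with illustrative data, of this kind of analysis: summary statistics for task times, a simple outlier check, and problems ordered by severity and scope:

```python
import statistics

task_times = [38.2, 41.5, 36.9, 95.0, 40.1, 39.4]   # seconds, one per participant

mean = statistics.mean(task_times)
stdev = statistics.stdev(task_times)
outliers = [t for t in task_times if abs(t - mean) > 2 * stdev]
print(f"mean={mean:.1f}s stdev={stdev:.1f}s outliers={outliers}")

# Each problem is tagged with scope (how many users hit it) and severity (1-4).
problems = [
    {"problem": "label pushed instead of control button", "scope": 5, "severity": 3},
    {"problem": "card reader not noticed", "scope": 8, "severity": 4},
]
for p in sorted(problems, key=lambda p: (p["severity"], p["scope"]), reverse=True):
    print(p)
```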

  37. Physiological measurements • emotional response linked to physical changes • these may help determine a user’s reaction to an interface • measurements include: • heart activity, including blood pressure, volume and pulse. • activity of sweat glands: Galvanic Skin Response (GSR) • electrical activity in muscle: electromyogram (EMG) • electrical activity in brain: electroencephalogram (EEG)

  38. 1. Visibility of system status • Keep users informed about what is going on • What page they are on and what part of a process • Provide appropriate feedback • About what system is doing, and how input is being interpreted • E.g. in XXX product,  • "really ungroup?" -- loses associated behavior

  39. Eye tracking • head or desk mounted equipment tracks the position of the eye • eye movement reflects the amount of cognitive processing a display requires • measurements include • fixations: eye maintains stable position. Number and duration indicate level of difficulty with display • saccades: rapid eye movement from one point of interest to another • scan paths: moving straight to a target with a short fixation at the target is optimal
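
One common way to separate fixations from saccades (not described on the slide) is a velocity threshold; a rough Python sketch with an assumed threshold and made-up gaze samples:

```python
SACCADE_THRESHOLD = 100.0   # degrees/second; an assumed, typical ballpark value

def classify(samples, dt=0.004):
    """samples: (x, y) gaze positions in degrees, one every dt seconds.
    Label each inter-sample movement as 'fixation' or 'saccade' by velocity."""
    labels = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        labels.append("saccade" if velocity > SACCADE_THRESHOLD else "fixation")
    return labels

gaze = [(10.0, 5.0), (10.1, 5.0), (10.1, 5.1), (14.0, 9.0), (14.1, 9.0)]
print(classify(gaze))   # ['fixation', 'fixation', 'saccade', 'fixation']
```

Runs of consecutive "fixation" labels can then be grouped to count fixations and measure their durations.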

  40. 2. Match between system and the real world • Terminology in user’s language • Not computer terminology • Language from user’s perspective • “You have bought…” not “We have sold you…” • Use common words, not “techno-jargon” • Error messages and feedback refer to user objects • Allow full-length names • E.g. “Hit any key to continue”

  41. 3. User control and freedom • Easy to abort: Cancel buttons • Cancel order, cancel changing a profile • Easy to Undo • Web issue: what does the “Back” button do? • Example: many sites can get confused if the user presses the Back button • Users (even experts) will make errors • E.g. in XXX product, • no way to get out of editing a text field

  42. 4. Consistency and standards • The same command always has the same effect • Locations for information, names of commands • Give the user a mental model of the system • Size, location, color, wording, function, sequencing, etc. • E.g., color purple? • Following standards helps • Web: use templates or CSS, style guides • Seems easy, but often not followed; e.g. in XXX • naming "F#1.C#1" vs. "F#1", "C#1" • consistent with industry standards: e.g., Copy purple?

  43. 5. Error prevention • Selection rather than entry • www.Expedia.com asks a clarifying question when the city is ambiguous (e.g. Columbus) • Remove or gray-out illegal choices • Not common for web pages • Confirmation • Avoid modes • Definition: the same user action has different results • Make unavoidable modes visible • E.g. Typing "daytime" to a mail program
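
A sketch, in Python, of “selection rather than entry” for the ambiguous-city case: when the typed name matches more than one city, the matching options are returned for the user to pick from instead of accepting free text (the city list is made up for illustration):

```python
CITIES = ["Columbus, OH", "Columbus, GA", "Columbus, IN", "Boston, MA"]

def resolve_city(typed: str):
    matches = [c for c in CITIES if c.lower().startswith(typed.lower())]
    if len(matches) == 1:
        return matches[0]                         # unambiguous: accept it
    return {"ask_user_to_choose_from": matches}   # ambiguous: force a selection

print(resolve_city("Boston"))    # 'Boston, MA'
print(resolve_city("Columbus"))  # three options to choose from
```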

  44. 6. Recognition rather than recall • Make objects, actions, options visible • See and pick it, not generate it • Short-term memory = 7 ± 2 items; 30 sec to 2 min • unless interrupted • Menus rather than type-in (but short enough) • Prompts provide format and limits • Don't require retyping of remembered information • Pervasive, generic rules (cut/paste) • E.g. in Aegis, remembering altitude

  45. Example: prompts • What is a DTIC user code and how to get one?

  46. Example: prompts (Print)

  47. Error Messages, cont. • Blame the system, not the user  • “Unrecognized” vs. “illegal” command  • No humor or snide comments  • Easy error recovery • Can have multiple levels of messages • E.g. in XXX product, “can't save file” — why not?
