Usability • Is it a “good” interface? • In what ways? • Usability: • How well users can use the system’s functionality • How well can the interface be used – it’s just that simple
Examples of Bad Design … and Why • Elevator controls and labels on the bottom row all look the same, so it is easy to push a label by mistake instead of a control button • People do not make the same mistake with the labels and buttons on the top row. Why not?
Logical or ambiguous design? • Where do you plug in the mouse? • Where do you plug in the keyboard? • Top or bottom connector? • Do the color-coded icons help? From: www.baddesigns.com
Internal and external consistency • Internal consistency refers to designing operations to behave the same within an application • Difficult to achieve with complex interfaces • External consistency refers to designing operations, interfaces, etc., to be the same across applications and devices • Very rarely the case, owing to different designers' preferences • Most successful in product families (e.g., MS Office) • Operating system vendors may define style guidelines
External Inconsistency … • Keypad number layouts
(a) phones, remote controls:
1 2 3
4 5 6
7 8 9
  0
(b) calculators, computer keypads:
7 8 9
4 5 6
1 2 3
  0
Usability Measures – 5 Often Used • Time to learn • How long does it take for typical members of the community to learn the relevant tasks? • Speed of performance • How long does it take to perform relevant benchmarks? • Rate of errors by users • How many and what kinds of errors are made during benchmark tasks? • Retention over time • Frequency of use and ease of learning help make for better user retention • Subjective satisfaction • Do they like it? • Allow for user feedback via interviews, free-form comments and satisfaction scales
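The five measures above can be aggregated from benchmark-session data. A minimal sketch, assuming hypothetical field names and sample values (not from any real study):

```python
# Aggregate benchmark sessions into the five usability measures.
# Field names and the two sample sessions are illustrative assumptions.

def summarize_sessions(sessions):
    """Average each usability measure across all sessions."""
    n = len(sessions)
    return {
        "time_to_learn_min": sum(s["learn_min"] for s in sessions) / n,
        "task_time_s": sum(s["task_s"] for s in sessions) / n,        # speed of performance
        "errors_per_task": sum(s["errors"] for s in sessions) / n,    # rate of errors
        "retention_score": sum(s["retention"] for s in sessions) / n, # retention over time
        "satisfaction_1to7": sum(s["satisfaction"] for s in sessions) / n,
    }

sessions = [
    {"learn_min": 12, "task_s": 40, "errors": 2, "retention": 0.9, "satisfaction": 6},
    {"learn_min": 18, "task_s": 55, "errors": 4, "retention": 0.7, "satisfaction": 4},
]
print(summarize_sessions(sessions))
```

In practice each measure comes from a different instrument (timing logs for speed, questionnaires for satisfaction); the point is only that all five can be reported side by side for a benchmark.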
State evident in mechanical controls • Buttons and rotary knobs reveal their internal state and can be controlled by both user and machine (compliant interaction)
Evaluation Techniques • Evaluation • tests usability and functionality of system • occurs in laboratory, field and/or in collaboration with users • evaluates both design and implementation
Cognitive Walkthrough • Proposed by Polson et al. • evaluates design on how well it supports user in learning task • usually performed by expert in cognitive psychology • expert 'walks through' design to identify potential problems using psychological principles
Cognitive Walkthrough Example: Code walkthrough in Software Engineering • A segment of program code is reviewed by an expert other than the programmer • Sequence of actions • Selecting a set of program code • Checking certain characteristics, such as coding style, spelling, variable-naming conventions, function declarations, system-wide invariants, etc.
Cognitive Walkthrough (ctd) • For each task walkthrough considers • what impact will interaction have on user? • what cognitive processes are required? • what learning problems may occur? • Analysis focuses on goals and knowledge: does the design lead the user to generate the correct goals?
Cognitive Walkthrough Questions • Is the next goal clear at this stage? • Is the appropriate action obvious? • Is it clear that this action leads to the goal? • What problems are there in performing the action?
Cognitive Walkthrough: How (cont.) • Walk through the task while answering these Questions: • Will the user know what to do? • Will the user see how to do it? • Will the user understand from feedback whether their action was correct?
Heuristic Evaluation • Proposed by Nielsen and Molich • usability criteria (heuristics) are identified • design examined by experts to see if these are violated • Example heuristics • system behaviour is predictable • system behaviour is consistent • feedback is provided • Heuristic evaluation "debugs" the design
The Procedure • Several independent evaluators • each uses the same checklist • each works alone • each makes a list of usability problems • Combine lists into a single list • works well as a group activity
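The combine step of the procedure can be sketched as follows. Each evaluator works alone and produces a list of (heuristic, problem) findings; merging deduplicates them and records how many evaluators hit each problem. The heuristic names and problem strings are illustrative assumptions:

```python
# Merge independent evaluators' problem lists into one deduplicated list,
# ordered by how many evaluators reported each problem.
from collections import Counter

def combine(evaluator_lists):
    counts = Counter()
    for findings in evaluator_lists:
        for finding in set(findings):   # count each evaluator at most once per problem
            counts[finding] += 1
    return counts.most_common()         # most widely reported problems first

lists = [
    [("consistency", "Save icon differs across dialogs")],
    [("consistency", "Save icon differs across dialogs"),
     ("feedback", "No progress indicator on export")],
]
for (heuristic, problem), n in combine(lists):
    print(f"{n}/{len(lists)} evaluators: [{heuristic}] {problem}")
```

Ranking by evaluator count is one plausible way to prioritize the combined list; severity ratings gathered in the group session are another.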
HE: 10 Usability Heuristics • Visibility of system status • The system should always keep users informed about what is going on, through appropriate feedback within reasonable time • Match between system and the real world • The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order • User control and freedom • Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo
HE: 10 Usability Heuristics 4. Consistency and standards • Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions 5. Error prevention • Even better than good error messages is a careful design which prevents a problem from occurring in the first place 6. Recognition rather than recall • Make objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate
HE: 10 Usability Heuristics 7. Flexibility and efficiency of use • Accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions 8. Aesthetic and minimalist design • Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility 9. Help users recognize, diagnose, and recover from errors • Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution
HE: 10 Usability Heuristics 10. Help and documentation • Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large • Jakob Nielsen originally developed the heuristics for heuristic evaluation in collaboration with Rolf Molich in 1990. Nielsen later refined them based on a factor analysis of 249 usability problems to derive a set of heuristics with maximum explanatory power, resulting in this revised set
“Think Aloud” Protocols • “Single most valuable usability engineering method” • Find out why the user does things • What they thought would happen, why they got stuck or frustrated, etc. • Encourage users to expand on whatever is interesting • But interferes with timings • May need to “coach” the user to keep talking • Unnatural to describe what you are thinking • Ask general questions: “What did you expect?”, “What are you thinking now?” • Not: “What do you think that button is for?”, “Why didn’t you click here?” • These “give away” the answer or bias the user • Alternative: have two users and encourage discussion
Physiological measurements • emotional response linked to physical changes • these may help determine a user’s reaction to an interface • measurements include: • heart activity, including blood pressure, volume and pulse. • electrical activity in brain: electroencephalogram (EEG)
Eye tracking • head or desk mounted equipment tracks the position of the eye • eye movement reflects the amount of cognitive processing a display requires • measurements include • fixations: eye maintains stable position. Number and duration indicate level of difficulty with display • saccades: rapid eye movement from one point of interest to another • scan paths: moving straight to a target with a short fixation at the target is optimal
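Fixations and saccades are commonly separated by a velocity threshold (the I-VT approach): consecutive gaze samples moving faster than the threshold are saccades, slower ones belong to fixations. A minimal sketch, assuming an illustrative 100 deg/s threshold and made-up gaze samples:

```python
# Velocity-threshold (I-VT) classification of gaze samples.
# The 100 deg/s threshold and the sample data are illustrative assumptions.
import math

def classify(samples, threshold_deg_per_s=100.0):
    """samples: list of (t_seconds, x_deg, y_deg) gaze positions.
    Returns one label per inter-sample interval."""
    labels = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        velocity = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)  # deg/s
        labels.append("saccade" if velocity > threshold_deg_per_s else "fixation")
    return labels

samples = [(0.00, 10.0, 5.0), (0.01, 10.1, 5.0),  # slow drift -> fixation
           (0.02, 14.0, 8.0),                     # fast jump  -> saccade
           (0.03, 14.1, 8.1)]                     # settled    -> fixation
print(classify(samples))  # ['fixation', 'saccade', 'fixation']
```

Counting the fixation runs and their durations then gives the "number and duration" measures mentioned above; the jumps between them are the saccades, and their ordering is the scan path.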
Example: prompts • What is a DTIC user code and how do I get one?
KLM Operators • K: press a key or button • P: point to a target on the display • H: home hands on input device • D: draw a line segment • M: mentally prepare for an action • R: system response time
Operator Estimates • Keystroke (K): determined by typing speed • 0.28 s for an average typist (40 wpm), 0.08 s for a fast typist (155 wpm), 1.20 s for the worst typist • Pointing (P): determined by Fitts' Law (or a general approximation) • T = a + b log2(d/s + 1), OR T = 1.1 s • Drawing (D): determined by the Steering Law • T = a + b (d/s)
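The Fitts' Law pointing estimate above is a one-line computation. A minimal sketch, where the coefficients a = 0.1 s and b = 0.1 s/bit are illustrative assumptions rather than calibrated values:

```python
# Fitts' Law pointing-time estimate: T = a + b * log2(d/s + 1).
# a, b must be calibrated per device; the defaults here are assumptions.
import math

def fitts_time(d, s, a=0.1, b=0.1):
    """d: distance to target, s: target size (same units). Returns seconds."""
    return a + b * math.log2(d / s + 1)

# A target 150 units away and 10 units wide: index of difficulty
# log2(150/10 + 1) = 4 bits, so T = 0.1 + 0.1 * 4 = 0.5 s.
print(fitts_time(150, 10))
```

Note how halving the distance or doubling the target size lowers the time only logarithmically, which is why the flat T = 1.1 s approximation is often good enough for quick KLM estimates.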
Operator Estimates (cont.) • Homing (H): estimated by measurement • T = 0.36 s (between keyboard and mouse) • Mental preparation (M): estimated by measurement • T = 1.35 s (estimated by taking the total task time, subtracting physical operator time, and dividing by the number of “chunks” of activity) • Adapted from Rob Miller
Example: Deleting a Word • Using Delete key • M, P [start of word], K [click], H, M, K [Del] × n [length of word] • Total: 2M + P + H + (n+1)K = 4.44 + 0.28n sec • Using Shift-Click • M, P [start of word], K [click], M, P [end of word], K [shift], K [click], H [to keyboard], M, K [Del] • Total: 3M + 2P + 4K + H = 7.73 sec
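The totals above can be checked mechanically by summing operator times over the operator string. A minimal sketch using the per-operator estimates from the preceding slides (K = 0.28 s average typist, P = 1.10 s, H = 0.36 s, M = 1.35 s; D and R omitted):

```python
# Evaluate a KLM operator string against the per-operator times above.
TIMES = {"K": 0.28, "P": 1.10, "H": 0.36, "M": 1.35}

def klm_time(ops):
    """ops: a string of KLM operators, e.g. 'MPKHMK'."""
    return sum(TIMES[op] for op in ops)

# Delete-key method for an n-letter word: M P K H M K*n = 2M + P + H + (n+1)K
n = 5
print(round(klm_time("MPKH" + "M" + "K" * n), 2))  # 4.44 + 0.28*5 = 5.84 s

# Shift-click method: M P K M P K K H M K = 3M + 2P + 4K + H
print(round(klm_time("MPKMPKKHMK"), 2))
```

For the five-letter word the delete-key method wins; as n grows, the extra 0.28 s per letter eventually makes shift-click the faster choice.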
In-class Exercise Generate a KLM model for deleting a file from your desktop Estimate the time it would take using the provided operator times Compare the predicted time with the actual time
Example • Consider searching a Word document for all occurrences of a four-letter word and replacing it with another four-letter word.
Error Messages • Blame the system, not the user • “Unrecognized” vs. “illegal” command • Easy error recovery • Can have multiple levels of messages • E.g. in XXX product, “can't save file” — why not?
Another Bad Example http://stinet.dtic.mil/