Presentation Transcript


  1. Human-Computer Interaction - Usability Evaluation 1: Introduction and Analytic Methods

  2. Lecture Overview
  • Definition and motivation
  • Industrial practice and interest
  • Types of evaluation
  • Analytic methods
    • Heuristic evaluation
    • Keystroke level model
    • Cognitive walkthrough

  3. Evaluation: Definition and Motivation
  • Definition: gathering information about the usability or potential usability of a system
  • Motivation:
    • Suggest improvements or confirm acceptability of the interface and/or supporting materials
    • Ensure competitive reputation
    • 'Users will evaluate your interface sooner or later' (Hix and Hartson, 1993)
    • Match or exceed the usability of competitors' products (and meet statutory requirements)

  4. MUSiC Project - Metrics for Usability Standards in Computing
  • Early 1990s European survey
  • Generally high appreciation of the importance of usability evaluation
  • Knowledge of evaluation methods limited
  • Lack of metrics a major problem (after limitations of time and money)
  • Intuitiveness of a product, and the ability to learn it quickly without manuals, is an increasingly important usability factor

  5. Reasons why Interface Evaluation is Often Omitted or Poorly Performed
  • Assumption that the designer's personal behaviour is 'representative'
  • Implicit, unsupported assumptions about human performance
  • Acceptance of traditional/standard interface design
  • Postponement of evaluation until 'a more convenient time'
  • Lack of expertise in analysing experiments

  6. What to Evaluate
  • Usability specifications at all lifecycle stages
  • Initial designs (pre-implementation)
    • Partial
    • Integrated
  • Prototype at various stages
  • Final(?) implementation
  • Documentation

  7. Formative Evaluation
  • Repeatedly, as development proceeds
  • Purpose: to support iterative refinement
  • Nature: structured, but fairly informal
  • Average of 3 major 'design-test-redesign' cycles, with many minor cycles to check minor changes
  The earlier poor design features or errors are detected, the easier and cheaper they are to correct.

  8. Summative Evaluation
  • Once, after implementation (or nearly so)
  • Important in field or 'beta' testing
  • Purpose: quality control - the product is reviewed to check it meets
    • Its own specifications
    • Prescribed standards, e.g. Health and Safety, ISO
  • Nature: formal, often involving statistical inference

  9. Where to Evaluate
  • Designer's mind
  • Discussion workshops
  • Representative workplace
  • Experimental laboratory

  10. How to Evaluate: Evaluation Methods

  Method          Interface development        User involvement
  Analytic        Specification                No users
  Expert          Specification or prototype   No users (role playing only)
  Observational   Simulation or prototype      Real users
  Survey          Simulation or prototype      Real users
  Experimental    Normally full prototype      Real users

  (The observational, survey and experimental methods are empirical; costs generally rise down the table.)

  11. Types of Data
  • Quantitative data
    • Objective measures, directly observed
    • E.g. time to complete, accuracy of recall
    • User performance or attitudes can be recorded in numerical form
  • Qualitative data
    • Subjective responses
    • Reports and opinions that may be categorised in some way but not reduced to numerical values

  12. Measurement Tools
  • Semi-structured interview
  • Questionnaire - personal/postal administration
  • Incident diary
  • Feature checklist
  • Focus group
  • Think-aloud
  • Interactive experiment
  Compare these on:
  • Cost
  • Number of subjects

  13. Analytic Evaluation
  Advantages:
  • Usable early in design
  • Little or no advance planning
  • Cheap
  • Quick
  • Focus on problems
  Disadvantages:
  • Lack of diagnostic output for redesign
  • Encourages strengthening of the existing solution
  • Broad assumptions about users' cognition
  • Can be difficult for the evaluator

  14. Analytic Evaluation: Heuristic Evaluation
  • Assess the design against known usability criteria (e.g. Brown, 1994):
    • Coffee break test • Data overload • Engineering model • Help
    • Mode test • Dead ends • Unable to complete • Stupid questions
    • Jotting test • Standalone test • Maintenance test • Consistency test
    • Reversibility • Functionality test • Knowledge of completion
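
In practice, a heuristic evaluation is recorded as a set of findings filed under the criterion each problem violates. The following is a minimal Python sketch of one way to structure such a record, assuming only the heuristic names from the slide above; the data structure and the names BROWN_HEURISTICS and record_finding are illustrative, not from the lecture.

    # Recording heuristic evaluation findings. The heuristic names follow
    # the slide's list (Brown, 1994); everything else is an assumption.
    BROWN_HEURISTICS = [
        "Coffee break test", "Data overload", "Engineering model", "Help",
        "Mode test", "Dead ends", "Unable to complete", "Stupid questions",
        "Jotting test", "Standalone test", "Maintenance test",
        "Consistency test", "Reversibility", "Functionality test",
        "Knowledge of completion",
    ]

    findings = {h: [] for h in BROWN_HEURISTICS}

    def record_finding(heuristic, screen, problem, severity):
        """File one observed usability problem under the criterion it violates."""
        findings[heuristic].append(
            {"screen": screen, "problem": problem, "severity": severity}
        )

    # Hypothetical finding on a hypothetical screen:
    record_finding("Dead ends", "Quote window",
                   "No route back after cancelling a quote", severity=3)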

  15. Analytic Evaluation: Keystroke Level Model (Card et al., 1980)
  • Best-known analytic evaluation technique
  • A simple way of analysing expert user performance, usually on unit tasks of around 20 seconds
  • Applies time constants to operations; the total gives the completion time for an error-free dialogue sequence
  • Proven predictive validity (within about ±20%)
  • Works because the human motor system is well understood
  • Assumes no high-level mental activity

  16. KLM Constants (averages - modify to suit)

  Operator   Meaning                                     Time (secs)
  K          Press key                                   0.12 (good) / 0.28 / 1.20 (poor)
  B          Mouse button press, down or up              0.10
             Mouse button click                          0.20
  P          Point with mouse                            1.10
             (Fitts' law: K log2(Distance/Size + 0.5))
  H          Hand to keyboard or mouse                   0.40
  M          Mental preparation for physical action      1.35 (one M per 'chunk')
  R          System response time                        Measure
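
The 1.10 s for P is an average; the slide's Fitts' law expression predicts pointing time from the distance to the target and its size. Below is a minimal Python sketch of that expression. The function name and the default gain k are assumptions, since the slide leaves the constant K unspecified.

    import math

    def pointing_time(distance, size, k=0.1):
        """Pointing time per the slide's form: K * log2(Distance/Size + 0.5).

        distance and size must share a unit; k (secs/bit) is an assumed
        device/user constant - the slide does not give a value for K.
        """
        return k * math.log2(distance / size + 0.5)

    # E.g. a 0.5 cm target 8 cm away: pointing_time(8, 0.5) ~ 0.40 s.
    # The table's 1.10 s average is used when distances are not known.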

  17. KLM - Worked Example (adapted from Browne, 1994)

  Operation                        Operator   Time
  Decide to deal                   M          1.35
  System response, Windows         R          2.00
  Locate and grasp mouse           H          0.40
  Point at option                  P          1.10
  Press mouse button               B          0.10
  Release mouse button             B          0.10
  Identify source (customer)       M          1.35
  Point at source on menu          P          1.10
  Press mouse button               B          0.10
  System response, window          R          1.00
  Release mouse button             B          0.10
  Ascertain product                M          1.35
  Press mouse button               B          0.10
  Release mouse button             B          0.10
  System response, window          R          1.00
  Calculate quote (complex)        M          2.70
  Return hand to keyboard          H          0.40
  Type 6 quote characters          K × 6      1.20
  Total                                       15.55
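
To make the bookkeeping explicit, here is a minimal Python sketch that sums operator constants for an error-free sequence and reproduces the 15.55 s total above. The names klm_time and quote_task are illustrative; the K value of 0.20 s is inferred from the worked example's 6 keystrokes totalling 1.20 s, within the 0.12-1.20 range on slide 16.

    # Average operator times in seconds (slide 16). The dict layout is an
    # illustrative assumption, not Card et al.'s notation.
    KLM_CONSTANTS = {
        "K": 0.20,   # press key (inferred from slide 17: 6 keys = 1.20 s)
        "B": 0.10,   # mouse button down or up (a full click is 0.20)
        "P": 1.10,   # point with mouse (average)
        "H": 0.40,   # hand to keyboard or mouse
        "M": 1.35,   # mental preparation, one per 'chunk'
    }

    def klm_time(sequence):
        """Sum operator times for an error-free dialogue sequence.

        Each step is (operator, count) for K/B/P/H/M, or ("R", seconds)
        for a measured system response time.
        """
        total = 0.0
        for operator, value in sequence:
            if operator == "R":
                total += value                        # R is measured, not a constant
            else:
                total += KLM_CONSTANTS[operator] * value
        return total

    # The quote-entry dialogue from slide 17, in order:
    quote_task = [
        ("M", 1), ("R", 2.00), ("H", 1), ("P", 1), ("B", 2),
        ("M", 1), ("P", 1), ("B", 1), ("R", 1.00), ("B", 1),
        ("M", 1), ("B", 2), ("R", 1.00), ("M", 2),  # complex quote = 2 M units
        ("H", 1), ("K", 6),
    ]

    print(f"Predicted completion time: {klm_time(quote_task):.2f} s")  # 15.55 s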

  18. Analytic Evaluation: Cognitive Walkthrough
  • Analyses a design in terms of exploratory learning, i.e. the user:
    • Has a rough plan
    • Explores the system for possible actions
    • Selects the apparently most appropriate action
    • Interprets the system's response and assesses whether it progresses the task
  • Suits systems primarily learned by exploration, e.g. walk-up-and-use
  • Overall question: how successfully does this design guide the unfamiliar user through the performance of the task?

  19. Analytic Evaluation: Cognitive Walkthrough - Key Questions
  Simulation of exploration, selection and interpretation at each state of the interaction:
  • Will the correct action be made sufficiently evident to the user?
  • Will the user connect the correct action's description with what he or she is trying to do?
  • Will the user interpret the system's response to the chosen action correctly, i.e. will the user know if he or she has made a right or a wrong choice?
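
A walkthrough amounts to asking these three questions at every step of the correct action sequence and treating each "no" as a predicted usability problem. The Python sketch below shows one way to structure that record; the step data and the names walkthrough and KEY_QUESTIONS are illustrative assumptions, not part of the method's definition.

    # Recording a cognitive walkthrough: for each step of the correct
    # action sequence, answer the key questions; "no" answers become
    # predicted problems for the unfamiliar user.
    KEY_QUESTIONS = [
        "Is the correct action sufficiently evident?",
        "Does the action's description match the user's goal?",
        "Will the system's response be interpreted correctly?",
    ]

    def walkthrough(steps):
        """steps: list of (action, [yes/no answer per key question])."""
        problems = []
        for action, answers in steps:
            for question, ok in zip(KEY_QUESTIONS, answers):
                if not ok:
                    problems.append((action, question))
        return problems

    # Illustrative steps for a hypothetical walk-up-and-use ticket machine:
    steps = [
        ("Touch 'Buy ticket'", [True, True, True]),
        ("Select destination from a scrolling list", [True, False, True]),
    ]
    for action, question in walkthrough(steps):
        print(f"Predicted problem at '{action}': fails '{question}'")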

  20. Lecture Review
  • Definition and motivation
  • Industrial practice and interest
  • Types of evaluation
  • Analytic methods
    • Heuristic evaluation
    • Keystroke level model
    • Cognitive walkthrough
