Human-Computer Interaction

Usability Evaluation: 1

Introduction and Analytic Methods

Lecture Overview
  • Definition and motivation
  • Industrial practice and interest
  • Types of evaluation
  • Analytic methods
    • Heuristic evaluation
    • Keystroke level model
    • Cognitive walkthrough
Evaluation: Definition and Motivation
  • Definition
    • Gathering information about the usability, or potential usability, of a system
  • Motivation
    • Suggest improvements or confirm acceptability of the interface and/or supporting materials
    • Ensure competitive reputation
      • ‘Users will evaluate your interface sooner or later’ (Hix and Hartson, 1993)
    • Match or exceed the usability of competitors’ products (and meet statutory requirements)
MUSiC Project - Metrics for Usability Standards in Computing
  • Early 1990s European survey
  • Generally high appreciation of the importance of usability evaluation
  • Knowledge of evaluation methods limited
  • Lack of metrics a major problem (after time and money limitations)
  • Intuitiveness of a product, and the ability to learn it quickly without manuals, is an increasingly important usability factor
Reasons why Interface Evaluation is Often Omitted or Poorly Performed
  • Assumption that the designer’s personal behaviour is ‘representative’
  • Implicit unsupported assumptions about human performance
  • Acceptance of traditional/standard interface design
  • Postponement of evaluation until ‘a more convenient time’
  • Lack of expertise in analysing experiments
What to Evaluate
  • Usability specifications at all lifecycle stages
  • Initial designs (pre-implementation)
    • Partial
    • Integrated
  • Prototype at various stages
  • Final(?) implementation
  • Documentation
Formative Evaluation
  • Repeatedly, as development proceeds
  • Purpose: to support iterative refinement
  • Nature: structured, but fairly informal
  • Average of 3 major ‘design-test-redesign’ cycles, with many minor cycles to check minor changes

The earlier poor design features or errors are detected, the easier and cheaper they are to correct

Summative Evaluation
  • Once, after implementation (or nearly so)
  • Important in field or ‘beta’ testing
  • Purpose: quality control - product is reviewed to check it meets
    • Own specifications
    • Prescribed standards, e.g. Health and Safety, ISO
  • Nature: formal, often involving statistical inferences
Where to Evaluate
  • Designer’s mind
  • Discussion workshops
  • Representative workplace
  • Experimental laboratory
How to Evaluate: Evaluation Methods

Method         Interface development        User involvement
Analytic       Specification                No users
Expert         Specification or prototype   No users (role playing only)
Observational  Simulation or prototype      Real users
Survey         Simulation or prototype      Real users
Experimental   Normally full prototype      Real users

The last three rows are the empirical methods; costs generally increase down the table.
Types of Data
  • Quantitative data
    • Objective measures
      • Directly observed
      • E.g. time to complete, accuracy of recall
    • User performance or attitudes can be recorded in numerical form
  • Qualitative data
    • Subjective responses
    • Reports and opinions that may be categorized in some way but not reduced to numerical values
Measurement Tools
  • Semi-structured interview
  • Questionnaire (personal or postal administration)
  • Incident diary
  • Feature checklist
  • Focus group
  • Think-aloud
  • Interactive experiment

Compare on:

    • Cost
    • Number of subjects
Analytic Evaluation

Advantages
  • Usable early in design
  • Little or no advance planning
  • Cheap
  • Quick
  • Focus on problems

Disadvantages
  • Lack of diagnostic output for redesign
  • Encourages strengthening of existing solution
  • Broad assumptions of users’ cognition
  • Can be difficult for evaluator
Analytic Evaluation: Heuristic Evaluation
  • Assess design against known usability criteria, e.g. Brown, 1994:
    • Coffee break test
    • Data overload
    • Engineering model
    • Help
    • Mode test
    • Dead ends
    • Unable to complete
    • Stupid questions
    • Jotting test
    • Standalone test
    • Maintenance test
    • Consistency test
    • Reversibility
    • Functionality test
    • Knowledge of completion
Analytic Evaluation: Keystroke Level Model (Card et al., 1980)
  • Best-known analytic evaluation technique
  • A simple way of analysing expert user performance, usually on unit tasks of roughly 20 seconds
  • Applies time constants to elementary operations; their total gives the completion time for an error-free dialogue sequence
  • Proven predictive validity (±20%)
    • The human motor system is well understood
    • No high-level mental activity is involved
KLM Constants

Averages - modify to suit

Operator  Meaning                                  Time (secs)
K         Press key                                0.12 (good), 0.28 (average), 1.20 (poor)
B         Mouse button press
            Down or up                             0.10
            Click                                  0.20
P         Point with mouse                         1.10 (Fitts’ law: K log2(Distance/Size + 0.5))
H         Move hand to keyboard or mouse           0.40
M         Mental preparation for physical action   1.35 (1 M per ‘chunk’)
R         System response time                     Measure
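The P constant comes from Fitts’ law in the form quoted above. A minimal sketch of that formula (the function name and the 0.1 s/bit coefficient are illustrative assumptions, not values given on the slide):

```python
import math

# Fitts' law in the Welford form quoted above: T = K * log2(Distance/Size + 0.5).
# K is a per-user/per-device coefficient; 0.1 s/bit is an assumed typical value.
def fitts_pointing_time(distance, size, k=0.1):
    return k * math.log2(distance / size + 0.5)

# e.g. an 8 cm movement to a 1 cm target:
print(round(fitts_pointing_time(8.0, 1.0), 2))  # 0.31
```

In the KLM itself this per-target detail is usually folded into the single average P = 1.10 s.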
KLM - Worked Example (adapted from Browne, 1994)
Operation                     Operator  Time
Decide to deal                M         1.35
System response, Windows      R         2.00
Locate and grasp mouse        H         0.40
Point at option               P         1.10
Press mouse button            B         0.10
Release mouse button          B         0.10
Identify source (customer)    M         1.35
Point at source on menu       P         1.10
Press mouse button            B         0.10
System response, window       R         1.00
Release mouse button          B         0.10
Ascertain product             M         1.35
Press mouse button            B         0.10
Release mouse button          B         0.10
System response, window       R         1.00
Calculate quote (complex)     M         2.70
Return hand to keyboard       H         0.40
Type 6 quote characters       K × 6     1.20
Total                                   15.55
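The arithmetic in the worked example can be reproduced mechanically. A minimal KLM calculator sketch (operator constants from the slides; the sequence encoding and names are my own; the complex ‘calculate quote’ step is counted as two M operators to match its 2.70 s, and K is taken as 0.20 s/key as implied by 6 keys = 1.20 s):

```python
# Per-operator time constants (seconds), from the KLM constants slide.
KLM_TIMES = {
    "K": 0.20,   # press key (as implied by the worked example; 0.12 good / 1.20 poor typist)
    "B": 0.10,   # mouse button down or up
    "P": 1.10,   # point with mouse
    "H": 0.40,   # move hand to keyboard or mouse
    "M": 1.35,   # mental preparation, one per 'chunk'
}

# (operator, count) pairs; "R" carries a measured response time instead of a count.
sequence = [
    ("M", 1), ("R", 2.00), ("H", 1), ("P", 1), ("B", 1), ("B", 1),
    ("M", 1), ("P", 1), ("B", 1), ("R", 1.00), ("B", 1),
    ("M", 1), ("B", 1), ("B", 1), ("R", 1.00),
    ("M", 2),            # 'calculate quote (complex)' = 2 x M = 2.70 s
    ("H", 1), ("K", 6),  # type 6 quote characters
]

def klm_total(seq):
    """Sum operator times for an error-free dialogue sequence."""
    return sum(n if op == "R" else KLM_TIMES[op] * n for op, n in seq)

print(round(klm_total(sequence), 2))  # 15.55
```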
Analytic Evaluation: Cognitive Walkthrough
  • Analyses a design in terms of exploratory learning, i.e. the user
    • Has a rough plan
    • Explores the system for possible actions
    • Selects the apparently most appropriate action
    • Interprets the system’s response and assesses whether it progresses the task
  • Suits systems primarily learned by exploration, e.g. walk-up-and-use systems
  • Overall question: how successfully does this design guide the unfamiliar user through the performance of the task?
Analytic Evaluation: Cognitive Walkthrough - Key Questions

Simulation of exploration, selection and interpretation at each state of interaction

  • Will the correct action be made sufficiently evident to the user?
  • Will the user connect the correct action’s description with what he or she is trying to do?
  • Will the user interpret the system’s response to the chosen action correctly, that is, will the user know if he or she has made a right or a wrong choice?
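As a sketch of how this simulation proceeds, the three questions can be asked of every action in the task and any ‘no’ answer recorded as a usability problem. All names below are illustrative, not part of any standard walkthrough tooling:

```python
# The three cognitive-walkthrough questions, asked at each step of the interaction.
QUESTIONS = (
    "Is the correct action sufficiently evident to the user?",
    "Will the user connect the action's description with their goal?",
    "Will the user interpret the system's response correctly?",
)

def walkthrough(actions, judge):
    """Return the (action, question) pairs the evaluator answered 'no' to.

    judge(action, question) -> bool is the evaluator's yes/no answer.
    """
    return [(a, q) for a in actions for q in QUESTIONS if not judge(a, q)]

# e.g. an evaluator who doubts that 'select tariff menu item' is evident:
problems = walkthrough(
    ["open quote dialog", "select tariff menu item"],
    lambda a, q: not (a == "select tariff menu item" and q == QUESTIONS[0]),
)
print(problems)  # one problem flagged: ('select tariff menu item', QUESTIONS[0])
```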
Lecture Review
  • Definition and motivation
  • Industrial practice and interest
  • Types of evaluation
  • Analytic methods
    • Heuristic evaluation
    • Keystroke level model
    • Cognitive walkthrough