Evaluation in HCI

Angela Kessell

Oct. 13, 2005


Evaluation

  • Heuristic Evaluation

    • “Discount usability engineering method”

  • Measuring API Usability

    • Usability applied to APIs

  • Methodology Matters: Doing Research in the Behavioral and Social Sciences

    • Designing, carrying out, and evaluating human subjects studies


Heuristic Evaluation (Jakob Nielsen)

Most usability engineering methods will contribute substantially to the usability of an interface …

…if they are actually used.



Heuristic Evaluation

  • What is it?

    A discount usability engineering method

    - Easy (can be taught in a half-day seminar)

    - Fast (about a day for most evaluations)

    - Cheap (e.g. $(4,000 + 600i) for i evaluators)
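
To make the cost figure concrete, here is a minimal Python sketch of the $(4,000 + 600i) rule of thumb, where i is the number of evaluators; the dollar amounts are the slide's illustrative estimates, not fixed prices:

```python
# Sketch of the slide's cost rule of thumb for heuristic evaluation:
# roughly $4,000 in fixed cost plus $600 per evaluator (illustrative
# figures; real costs vary by project).
def heuristic_eval_cost(num_evaluators: int) -> int:
    """Estimated total cost in dollars for i evaluators."""
    return 4_000 + 600 * num_evaluators

for i in (3, 4, 5):
    print(f"{i} evaluators: ${heuristic_eval_cost(i):,}")
# 3 evaluators: $5,800
# 4 evaluators: $6,400
# 5 evaluators: $7,000
```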


Heuristic Evaluation

  • How does it work?

    • Evaluators use a checklist of basic usability heuristics

    • Evaluators go through an interface twice

      • 1st pass: get a feel for the flow and general scope

      • 2nd pass: refer to the checklist of usability heuristics and focus on individual elements

    • The findings of the evaluators are combined and assessed
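
The reason the findings are pooled is that each evaluator tends to find a different subset of the interface's problems, so the union is larger than any one evaluator's list. A minimal sketch with invented findings:

```python
# Sketch of pooling findings across evaluators: the combined set of
# distinct problems exceeds what any single evaluator found.
# All problem names below are invented for illustration.
evaluator_findings = {
    "evaluator_1": {"no undo", "unclear icons", "slow feedback"},
    "evaluator_2": {"no undo", "jargon in error messages"},
    "evaluator_3": {"unclear icons", "no exit from wizard"},
}

all_problems = set().union(*evaluator_findings.values())
print(f"{len(all_problems)} distinct problems from "
      f"{len(evaluator_findings)} evaluators: {sorted(all_problems)}")
```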


Heuristic Evaluation: Usability Heuristics (original, unrevised list)

  • Simple and natural dialogue

  • Speak the users’ language

  • Minimize the users’ memory load

  • Consistency

  • Feedback

  • Clearly marked exits

  • Shortcuts

  • Precise and constructive error messages

  • Prevent errors

  • Help and documentation

COMMENTS?


Heuristic Evaluation

  • One expert won’t do

  • Need 3 - 5 evaluators

  • Exact number needed depends on cost-benefit analysis
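
One common way to frame this cost-benefit analysis (not spelled out on the slide) is Nielsen and Landauer's problem-discovery model, in which the proportion of problems found by i evaluators is 1 − (1 − λ)^i, where λ is the probability that a single evaluator finds a given problem (about 0.31 on average in their studies). A sketch pairing that curve with the earlier cost rule of thumb:

```python
# Cost-benefit table for choosing the number of evaluators.
# Discovery model: found(i) = 1 - (1 - lam)**i  (Nielsen & Landauer);
# lam ~= 0.31 is their reported average, and the dollar figures reuse
# the slide's $(4,000 + 600i) estimate -- both are illustrative.
LAM = 0.31

def proportion_found(i: int, lam: float = LAM) -> float:
    return 1 - (1 - lam) ** i

for i in range(1, 8):
    cost = 4_000 + 600 * i
    print(f"{i} evaluators: ~{proportion_found(i):.0%} of problems, ~${cost:,}")
# With lam = 0.31, five evaluators find roughly 84% of the problems,
# which is why 3-5 evaluators is the usual recommendation.
```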


Heuristic Evaluation

  • Who are these evaluators?

    • Typically not domain experts / real users

    • No official “usability specialist” certification exists

  • Optimal performance requires “double experts” (evaluators with both usability and domain expertise)


Heuristic Evaluation

  • Debriefing session

    • Conducted in brainstorming mode

    • Evaluators rate the severity of all problems identified

    • Use a 0 – 4 absolute scale:

      • 0 I don’t agree that this is a problem at all

      • 1 Cosmetic problem only

      • 2 Minor problem – low priority

      • 3 Major problem – high priority

      • 4 Usability catastrophe – imperative to fix

COMMENTS?
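
A sketch of the debriefing arithmetic: each evaluator rates every identified problem on the 0 – 4 scale, and the ratings are averaged, since a single evaluator's severity judgment is unreliable on its own. The problems and numbers below are invented for illustration:

```python
# Average per-problem severity across evaluators (0-4 scale from the
# slide), then sort so the worst problems surface first.
from statistics import mean

# problem description -> one rating per evaluator (hypothetical data)
ratings = {
    "No undo after delete":     [4, 3, 4],
    "Jargon in error dialog":   [2, 2, 1],
    "Logo slightly off-center": [1, 0, 1],
}

for problem, scores in sorted(ratings.items(),
                              key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{mean(scores):.1f}  {problem}")
# Items near 4 are usability catastrophes to fix first; items near
# 0-1 are cosmetic and can wait.
```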


Heuristic Evaluation

  • How does H.E. differ from User Testing?

    • Evaluators have checklists

    • Evaluators are not the target users

    • Evaluators decide on their own how they want to proceed

    • Observer can answer evaluators’ questions about the domain or give hints for using the interface

    • Evaluators say what they didn’t like and why; observer doesn’t interpret evaluators’ actions


Heuristic Evaluation

  • What are the shortcomings of H.E.?

    • Identifies usability problems without indicating how they are to be fixed.

      • “Ideas for appropriate redesigns have to appear magically in the heads of designers on the basis of their sheer creative powers.”

    • Cannot expect it to address all usability issues when evaluators are not domain experts / actual users



Measuring API Usability (Steven Clarke)

  • User-centered design approach

    • Understanding both your users and the way they work

  • Scenario-based design approach

    • Ensures API reflects the tasks that users want to perform

  • Use Cognitive Dimensions Framework


Measuring API Usability

The Cognitive Dimensions framework describes:

  • What users expect

  • What the API actually provides

The Cognitive Dimensions framework provides:

  • A common vocabulary for developers

  • Attention to the important aspects of an API

The Dimensions:

  • Abstraction level

  • Learning style

  • Working framework

  • Work-step unit

  • Progressive evaluation

  • Premature commitment

  • Penetrability

  • API elaboration

  • API viscosity

  • Consistency

  • Role expressiveness

  • Domain correspondence


COMMENTS?


Measuring API Usability

  • Use Personas:

    • Profiles describing the stereotypical behavior of three main developer groups (Opportunistic, Pragmatic, Systematic)

  • Compare API evaluation with the profile requirements

COMMENTS?
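
To make the persona comparison concrete, here is a hypothetical sketch: rate the API on a few cognitive dimensions, then compare against what each persona expects. The dimension names come from the slides, but the numeric scale, the ratings, and the persona profiles are invented for illustration, not taken from Clarke's paper:

```python
# Hypothetical persona-vs-API comparison. Ratings use an invented
# 1 (low) .. 5 (high) scale on three of the cognitive dimensions.
API_PROFILE = {  # ratings from a (hypothetical) evaluation session
    "Abstraction level": 4,
    "Penetrability": 2,
    "Domain correspondence": 3,
}

PERSONAS = {  # what each developer group is assumed to expect
    "Opportunistic": {"Abstraction level": 4, "Penetrability": 2, "Domain correspondence": 4},
    "Systematic":    {"Abstraction level": 2, "Penetrability": 5, "Domain correspondence": 3},
}

for name, wanted in PERSONAS.items():
    gaps = {dim: API_PROFILE[dim] - want
            for dim, want in wanted.items()
            if API_PROFILE[dim] != want}
    print(name, "mismatches:", gaps or "none")
# A mismatch (e.g. low Penetrability for the Systematic persona) flags
# where the API will frustrate that group of developers.
```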



Methodology Matters: Doing Research in the Behavioral and Social Sciences

Key points:

  • All methods are valuable, but all have limitations/weaknesses

  • Offset the weaknesses by using multiple methods


Methodology Matters: Doing Research in the Behavioral and Social Sciences

In conducting research, try to maximize:

  • Generalizability

  • Precision

  • Realism

    - You cannot maximize all three simultaneously.


Methodology Matters: Doing Research in the Behavioral and Social Sciences

From http://pages.cpsc.ucalgary.ca/%7Esaul/hci_educ_papers/bgbg95/mcgrath-summary.pdf


So…

  • The first two papers focus on computer programs / GUIs

  • The third paper presents the whole gamut of methodologies available for studying any human behavior


But… what’s missing?

  • Where are the statistics?

  • Are there objective “right” answers in HCI?

  • How do we evaluate other kinds of interfaces?

  • Other thoughts on what’s missing?


How do we evaluate…

  • “Embodied virtuality” / ubiquitous computing “interfaces”

  • (Aura video… http://www.cs.cmu.edu/~aura/)

  • Try to pick out one capability presented, and think about how you might evaluate it


Evaluating Aura

  • Do we evaluate the whole system at once? Or bit by bit?

  • Where / What is the interface?

  • Is anyone not a target user?


