
Usability with Project Lecture 10 – 19/03/10


Presentation Transcript


  1. Usability with Project Lecture 10 – 19/03/10 Dr. Simeon Keates

  2. Exercise – part 1 • Consider sending an SMS or e-mail • Look at one of your mobile phones … • And a laptop, etc. • Perform exclusion calculations on each product using the data on: • http://www.eng.cam.ac.uk/inclusivedesign/

  3. Exercise – part 2 • Identify the common methods of interacting with the product • Identify which of the 7 DFS capability scales are involved in the interaction • Based on the DFS scales, estimate the limiting capability demand for each scale

  4. Exercise – part 3 • Report the number and %age of people excluded by each capability demand • For 16+ and 75+ • Report the total number and %age of people excluded by the product • For 16+ and 75+ • Prepare a 5 minute presentation to discuss: • Your exclusion calculation assumptions • Your exclusion calculation results • What were the principal causes of exclusion? • What do you think should be done to reduce the exclusion for each product?
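To make the exercise concrete, the sketch below (Python) shows one way the exclusion calculation could be scripted. The capability scales, demand levels, prevalence figures and population total are illustrative placeholders, not the real data; the actual figures come from the 1996/7 Disability Follow-Up Survey data on the inclusive design site linked above, and an exact total exclusion needs the joint capability data rather than single-scale percentages, so the sketch only brackets the answer with lower and upper bounds.

# Hypothetical sketch of an exclusion calculation for one product.
# The capability scales and demand levels follow the Disability
# Follow-Up Survey (DFS) structure, but the prevalence figures below are
# PLACEHOLDERS - real values come from the data at
# http://www.eng.cam.ac.uk/inclusivedesign/.

# Fraction of the 16+ population unable to meet each demand level
# (illustrative numbers only).
PREVALENCE_16_PLUS = {
    "vision":        {0: 0.0, 1: 0.02, 2: 0.05},
    "hearing":       {0: 0.0, 1: 0.02, 2: 0.06},
    "dexterity":     {0: 0.0, 1: 0.03, 2: 0.07},
    "reach_stretch": {0: 0.0, 1: 0.01, 2: 0.03},
    # ... the remaining DFS scales (locomotion, intellectual function,
    # communication) would be listed here.
}

# Limiting capability demand of the product on each scale, e.g. for
# sending an SMS on a particular phone (assumed values for illustration).
sms_demands = {"vision": 2, "hearing": 0, "dexterity": 2, "reach_stretch": 1}


def exclusion_bounds(demands, prevalence):
    """Return (lower, upper) bounds on the fraction of people excluded.

    Without the joint DFS capability data, exact exclusion cannot be
    computed from single-scale figures: the lower bound assumes the
    excluded groups overlap completely (max), the upper bound assumes
    they are disjoint (sum, capped at 1).
    """
    per_scale = [prevalence[scale][level] for scale, level in demands.items()]
    return max(per_scale), min(sum(per_scale), 1.0)


lo, hi = exclusion_bounds(sms_demands, PREVALENCE_16_PLUS)
population_16_plus = 46_000_000  # rough UK 16+ population, assumed
print(f"Excluded (16+): between {lo:.1%} and {hi:.1%} "
      f"({lo * population_16_plus:,.0f} to {hi * population_16_plus:,.0f} people)")

The same calculation would be repeated with the 75+ prevalence data to produce the second set of figures asked for in part 3.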

  5. Assessment of DTT STBs...

  6. A systems overview (diagram): service provider, set top box (STB), television, user, remote control(s)

  7. Motivations for study • Commissioned by UK Department of Trade and Industry • Wanted to find out who could not access DTT • Original focus on ‘the disabled’ • Definition broadened... Why?

  8. Typical assessment methods used in this research • Expert assessment • Exclusion analysis • User observation • Questionnaires • Interviews • Focus Groups (diagram groups these under two headings: assessment of STBs and customer expectations)

  9. Methodology - Choice of STBs 2 STBs chosen for study • STB1 - marketed as “easy to use” • STB2 - market leader 1 digital satellite system chosen as comparison • STB3 – developed by content provider

  10. Expert assessment • 4 assessors • 2 with DTV experience, 2 without • STB protocol only • Aims to identify most likely sources of problems • Define protocol for following assessments

  11. Methodology - Analogue TV protocol • 6 activities • Switch on • Change channel • Change volume • Teletext (find local weather) • Subtitles (on/off) • Switch off

  12. Methodology - STB protocol • 8 activities • Installation • Switch on • Change channels (direct + EPG) • Change volume • Teletext (find local weather) • Subtitles (on/off) • BBCi (find local weather) • Switch off

  13. Expert Assessment - Results 13 major sources of difficulty found • 4 – Installation and set-up • e.g. instruction manual, initial tuning • 5 – Operation • e.g. multiple modes, subtitles • 4 – Remote controls • e.g. labelling, layout

  14. Exclusion analysis • Systematic analysis • Combined with data from the Office for National Statistics • Population data from the 1996/7 UK Disability Follow-Up Survey • Aims to calculate how many people have the difficulties highlighted by the expert assessment • How many people in the user observation should have those difficulties?
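As a rough illustration of the last bullet, the small sketch below converts a population prevalence into the number of people one would expect to show each difficulty in a randomly drawn sample the size of this study's user observation group. The prevalence figures are made up for illustration, not the DFS values, and the actual observation group was deliberately skewed towards older users, so real expectations would need age-specific data.

# Hypothetical follow-on to the exclusion analysis: given the fraction of
# the population estimated to have each difficulty (placeholder values),
# how many of the 13 observed users would we expect to show it?

difficulty_prevalence = {
    "cannot read on-screen font": 0.08,
    "cannot press small r/c buttons": 0.05,
    "cannot follow initial tuning": 0.12,
}

n_observed = 13  # size of the user observation group in this study

for difficulty, p in difficulty_prevalence.items():
    expected = p * n_observed
    print(f"{difficulty}: expect ~{expected:.1f} of {n_observed} users")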

  15. Example – breakdown of installation

  16. Exclusion analysis - Results (bar chart): population excluded (,000s), Analogue vs DTV, for the 16+ and 75+ age groups

  17. Exclusion analysis - Results (bar charts): population excluded, shown both in thousands (,000s) and as a percentage, Analogue vs DTV, for the 16+ and 75+ age groups

  18. User observations - Overview • 13 users - 12 aged 60+, 1 aged 24 • 9 no DTV experience • 2 owned STBs • 1 owned satellite box • 1 owned iDTV • 7 PC users, 6 non-users • All ‘independent’ living

  19. User observations - Methodology 2 hour sessions comprising: • 30 minutes briefing • 60-75 minutes with equipment • 15-20 minutes analogue • 40-60 minutes DTV • 2 STBs • 15 minutes debriefing

  20. User observations - Set-up

  21. User observations - Example visual problems • Finding buttons on r/c • Especially POWER • Switching between r/c and screen • Different pairs of glasses • Reading on-screen font • No zoom facility • Reading instruction manual • Small print (one of these flagged as a new difficulty)

  22. User observations - Example motor problems • Pressing buttons on r/c • Size and shape • Time-outs • e.g. on EPG (040 -> 004) • Arrow button overshoot • Oscillating cursor (two of these flagged as new difficulties)

  23. STB2 remote control

  24. STB1 remote control

  25. User observations - Example ‘cognitive’ problems • Use of OK/SELECT • Inconsistent language (OK=SELECT?) • Which r/c to use / which mode am I in? • How to start/navigate BBCi/Teletext? • How to call up/navigate on-screen menus? • How to operate/navigate the EPG? • Inconsistent layout • e.g. LHS on screen, RHS on r/c

  26. User observations - Summary of results

  27. Summary • Cognitive/experience issues most important • Many of the problems easily avoidable

  28. Classic case of “designers designing for themselves” Origins of the problems for older users • New language / terminology • Jargon • New input paradigms • Part TV, part PC • New interaction concept • Interacting with STB, not TV • Inadequate explanation

  29. Implications of prevalence of cognitive difficulties • What does this mean for assessment methods? • Single assessment methods vs. multiple? • In what order should they be used? • What does this mean for designers? • How to design for different experience?

  30. What is “reasonable accommodation”?

  31. Defining “reasonable accommodation” • Must offer “reasonable accommodation” • BUT what is reasonable? • Not defined explicitly • Companies left guessing • Will be defined in courts • Major risk/headache for companies

  32. Attitudes to “reasonable accommodation”: the ideological divide (Pragmatists vs Idealists) • Equitable access scale • MINIMUM (compliance): access to functionality • IDEAL: access to functionality in the same time

  33. Interesting questions for companies • Is the equitable access ideal possible? • Is the equitable access minimum possible? • “Equal, but different” problem • Users with functional impairments => longer times • Can technology always make up the difference in user capabilities? 3 case studies…

  34. Case study 1: The personal information point

  35. The information point accessibility assessment Sensory assessment: • Screen too high and not adjustable • Audio output not duplicated • Visual output not duplicated Motor assessment: • Need to stand • Reaching and dexterity demands • 53% of target users excluded Is this “reasonable”?

  36. Case study 2 – Cursor assistance for motor-impaired users Symptoms that can affect cursor control: • Tremor • Spasm • Restricted motion • Reduced strength • Poor co-ordination

  37. User group behaviours (charts): peak velocities, target activation times, no. of incorrect clicks

  38. Summarising the differences • Younger adults (IBM interns) • Shortest (1), fastest (1), more errors (3) - slapdash • “I can fix it” • Games culture? • Adults (IBM regulars) • Shorter (2), faster (2), fewest errors (1) • Best compromise between speed and accuracy? • Parkinson’s users • Longer (3), slowest (4), fewer errors (2) • Slow, but sure • Older adults • Longest (4), slower (3), most errors (4) • Vision difficulties? • Lack of experience

  39. A method of cursor assistance • Haptic gravity wells (diagram: a gravity well around the target exerts an attractive force on the cursor)
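A minimal sketch of the gravity-well idea follows, assuming a circular well of fixed radius and a constant-magnitude attractive force inside it. The class name, coordinates, radius and force profile are illustrative assumptions, not the parameters used in the study, which delivered the force through a haptic force-feedback device.

import math
from dataclasses import dataclass


@dataclass
class GravityWell:
    x: float          # target centre, screen coordinates (assumed)
    y: float
    radius: float     # radius of the well's influence
    max_force: float  # pull strength inside the well, arbitrary units

    def force_on(self, cx: float, cy: float) -> tuple[float, float]:
        """Attractive force applied to a cursor at (cx, cy)."""
        dx, dy = self.x - cx, self.y - cy
        dist = math.hypot(dx, dy)
        if dist == 0 or dist > self.radius:
            return (0.0, 0.0)  # at the centre or outside the well: no pull
        # Constant-magnitude pull towards the target centre (assumed profile).
        return (self.max_force * dx / dist, self.max_force * dy / dist)


well = GravityWell(x=400, y=300, radius=50, max_force=1.5)
print(well.force_on(430, 310))   # inside the well: pulled towards (400, 300)
print(well.force_on(600, 300))   # outside the well: (0.0, 0.0)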

  40. Experimental set-up

  41. The effect of gravity wells (figure showing the target)

  42. Motor impairment in practice…

  43. Results - Throughput

  44. Case study 2 summary • Haptic gravity wells are clearly very helpful • Motor-impaired (MI) users “with” gravity wells perform on a similar level to able-bodied (AB) users “without” BUT: • AB users also improve “with” • Is this “equal” time? • Is this “reasonable”???

  45. Case study 3 – Paperless office • AN Other wants to move to a paperless office • Currently receives 3.5 million pages per day • Paper documents are stored as TIFFs • Section 508 accessibility requirements • Sight-impaired • Low vision • Current solution – employ readers • “Equal, but different.” • Is this reasonable?

  46. The study documents • Almost fully unconstrained • Content: • Unconstrained vocabulary • Text: • Typed • Handwritten • Annotated • Stamps • Graphical content: • Diagrams • Charts • Graphs

  47. Examples of the study documents

  48. Examples of the study documents (cont.)

  49. Examples of the study documents (cont.)

  50. Readability metrics (text) • Translation rates: • Character-by-character • Word-by-word • Number and %ages of errors: • Level 1 - Minor • Level 2 - Moderate • Level 3 - Serious
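The sketch below shows how the character-by-character and word-by-word rates could be computed, assuming a human-checked reference transcription to compare the machine-read text against. The severity grading (minor/moderate/serious) needs human judgement and is not reproduced; the function names and example strings are illustrative.

# Sketch of text readability metrics: character-level and word-level
# error rates between a reference transcription and machine-read output,
# using edit (Levenshtein) distance.

def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        cur = [i]
        for j, h in enumerate(hyp, start=1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (r != h)))  # substitution
        prev = cur
    return prev[-1]


def error_rates(reference: str, hypothesis: str):
    """Return (character error rate, word error rate) as fractions."""
    cer = edit_distance(reference, hypothesis) / max(len(reference), 1)
    ref_words, hyp_words = reference.split(), hypothesis.split()
    wer = edit_distance(ref_words, hyp_words) / max(len(ref_words), 1)
    return cer, wer


cer, wer = error_rates("quarterly sales rose 4%", "quarterly sa1es r0se 4%")
print(f"character errors: {cer:.1%}, word errors: {wer:.1%}")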
