Universal Design - Presentation Transcript

  1. Universal Design
     Material from the authors of Human Computer Interaction, Alan Dix et al.

  2. Overview
     • Universal design is about designing systems so they can be used by anyone in any circumstance.
     • Multi-modal systems use more than one human input channel:
       • speech and non-speech sound
       • touch
       • handwriting
       • gestures
     • Universal design means designing for diversity:
       • people with sensory, physical or cognitive impairments
       • people of different ages
       • people from different cultures or backgrounds

  3. Universal Design
     • Practical?
       • We may not be able to design everything to be accessible to everyone so that they have the same experience, but we try to provide an equivalent experience.
       • It does not have to be complex or costly.
     • Many everyday examples take diversity into account:
       • lowered curbs with a different texture at intersections
       • help people in wheelchairs and blind people
       • also help mothers pushing carriages, people lugging suitcases, ...

  4. Seven Universal Design Principles
     1. Equitable use
        • provide equivalent use where identical use is not possible
        • safety, security and privacy should be available to all
     2. Flexibility in use
        • provide a choice of methods and adapt to the user's pace
     3. Simplicity and intuitiveness of use
        • support the user's expectations
        • accommodate different languages and literacy skills
        • provide prompting and feedback

  5. Seven Universal Design Principles
     4. Perceptible information
        • redundancy of information: use different forms/modes
        • emphasize essential information
     5. Tolerance for error
        • minimize the impact caused by mistakes
        • remove potentially dangerous situations or make them hard to reach
        • hazards should be shielded by warnings

  6. Seven Universal Design Principles
     6. Low physical effort
        • comfort; minimize fatigue and effort
        • repetitive or sustained actions should be avoided
     7. Size and space for approach and use
        • the system should be placed so that all users can reach it
        • consider line of sight for standing and seated users
        • allow for variation in hand size
        • provide room for assistive devices
     • Principles 6 and 7 apply less to software.

  7. Multi-modal Interaction
     • Provides access to information through more than one mode of interaction.
     • Sight is the predominant sense, and most interactive systems use the visual channel as the primary means of presentation:
       • graphics
       • text
       • video
       • animation

  8. Multi-modal interaction
     • Sound is important:
       • keeps us aware of our surroundings
       • provides clues and cues to switch our attention
     • Music is also auditory:
       • conveys and alters moods
       • conjures up visual images
       • evokes atmospheres
     • Touch:
       • tactile feedback to operate tools
       • hold and move tools, instruments, pens

  9. Multi-modal interaction
     • Taste and smell
       • less appreciated in interfaces
       • check whether food has gone bad, detect early signs of fire, ...

  10. Multi-modal interaction
     • Everyday human-human interaction is multi-modal.
     • Each sense provides different information that contributes to the whole.
     • We want human-computer interaction to be multi-modal too:
       • the visual channel can get overloaded
       • provides richer interaction
       • provides redundancy, giving an equivalent experience to all

  11. Sound in the interface
     • Contributes to usability:
       • audio confirmation, e.g. changes in key clicks
       • signaling error occurrences
     • Provides information when visual attention is elsewhere ...
       • ... or the environment has visual limitations
     • Dual presentation through sound and vision supports universal design:
       • enables access for visually impaired and hearing impaired users
     • Two kinds of sound: speech and non-speech.

  12. Sound in the interface: Speech
     • Language is complex:
       • structure
       • pronunciation
       • phonemes - atomic elements of speech (40 in English)
       • prosody - alteration in tone and quality
       • co-articulation - phonemes sound different next to other phonemes
       • allophones - variations in the sound of phonemes
       • morphemes - smallest unit of language that has meaning
       • grammar

  13. Sound in the interface: Speech
     • Speech recognition (see the sketch below)
       • useful when the hands are occupied
       • an alternative means of input for users with visual, physical and cognitive impairments
       • single-user systems; require training
       • barriers:
         • background noise
         • redundant and meaningless noise ('uh')
         • variation between individuals and regional accents
     • Examples:
       • speech-based word processors
       • telephone-based systems
       • interactive systems that give feedback
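     As an illustration of speech input, here is a minimal sketch assuming the third-party Python speech_recognition package (and PyAudio for microphone access); the package and the free Google web recognizer it calls are assumptions, not anything named in the slides. It also shows where the barriers above (noise, accents, filler words) surface in practice.

     ```python
     # A minimal speech-input sketch, assuming the third-party
     # speech_recognition package is installed; any recognition
     # engine could stand in for the one used here.
     import speech_recognition as sr

     recognizer = sr.Recognizer()

     with sr.Microphone() as source:
         # Calibrate against background noise - one of the barriers
         # mentioned on the slide.
         recognizer.adjust_for_ambient_noise(source, duration=1)
         print("Speak a command...")
         audio = recognizer.listen(source)

     try:
         # Hand the audio to a recognizer (here, the Google web API).
         text = recognizer.recognize_google(audio)
         print("Heard:", text)
     except sr.UnknownValueError:
         print("Could not understand the audio (noise, accent, 'uh', ...).")
     except sr.RequestError as err:
         print("Recognition service unavailable:", err)
     ```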

  14. Sound in the interface: Speech
     • Speech synthesis (see the sketch below)
       • complementary to speech recognition
     • Problems:
       • monotonic - doesn't sound natural
       • canned messages - not too bad, prosody can be hand coded
       • spoken output cannot be reviewed or browsed easily
       • intrusive (more noise or more equipment)
     • Application areas:
       • blind or partially sighted users - an accessible output medium (screen readers)
       • assisting those with disabilities affecting their speech - predefined messages can be stored
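     A minimal text-to-speech sketch, assuming the third-party pyttsx3 package, which wraps whatever synthesizer the operating system provides; the spoken message is just an example. It illustrates both the appeal (eyes-free output) and the canned, monotonic quality the slide warns about.

     ```python
     # A minimal speech-synthesis sketch, assuming the third-party
     # pyttsx3 package; it drives the operating system's synthesizer.
     import pyttsx3

     engine = pyttsx3.init()
     engine.setProperty("rate", 150)    # words per minute; slower is often clearer
     engine.setProperty("volume", 0.9)

     # A canned, predefined message - prosody cannot easily be tuned here,
     # which is why synthetic speech tends to sound monotonic.
     engine.say("Your document has been saved.")
     engine.runAndWait()
     ```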

  15. Sound in the interface: Speech
     • Un-interpreted speech
       • speech does not have to be recognized by the computer to be useful
     • Examples:
       • fixed pre-recorded messages
         • human prosody and pronunciation, although quality is often low
         • example: announcements in an airport
       • voice mail
       • audio annotations
     • Recordings can be digitally sped up without changing pitch (see the sketch below).
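     A sketch of the last point, speeding up recorded speech without raising its pitch, assuming the third-party librosa and soundfile packages; "voicemail.wav" is a hypothetical input file.

     ```python
     # A minimal time-stretching sketch, assuming the third-party librosa
     # and soundfile packages; "voicemail.wav" is a hypothetical recording.
     import librosa
     import soundfile as sf

     audio, sample_rate = librosa.load("voicemail.wav", sr=None)

     # A phase-vocoder time stretch shortens duration while keeping pitch
     # constant, so sped-up speech stays intelligible.
     faster = librosa.effects.time_stretch(audio, rate=1.5)

     sf.write("voicemail_fast.wav", faster, sample_rate)
     ```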

  16. Sound in the interface: Non-speech sound
     • Assimilated quickly
     • Learned regardless of language
     • Requires less attention
     • Uses:
       • indicating changes or errors in an interactive system
       • signaling status changes
       • sound representations of actions and objects
       • providing confirmation
       • giving redundant information
     • Two kinds: auditory icons and earcons.

  17. Sound in the interface: Non-speech sound
     • Auditory icons
       • use natural sounds to represent types of objects and actions
       • example: the Mac's SonicFinder - the sound of crumpling paper when putting a file in the wastebasket
       • problem: some objects or actions don't have a natural sound

  18. Sound in the interface: Non-speech sound
     • Earcons (see the sketch below)
       • use structured combinations of notes (motives) to represent actions and objects
       • vary according to rhythm, pitch, timbre, scale and volume
       • hierarchically structured:
         • compound earcons - combine motives, e.g. 'create' and 'file'
         • family earcons - e.g. an 'error' family
       • this structure makes learning easier
       • even a lack of musical ability has little effect on the ability to remember earcons
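     To make the compound/family distinction concrete, here is a minimal sketch of earcons as data. The motive names ('create', 'file', 'error') follow the slide, but the note choices, MIDI encoding and function names are illustrative assumptions, and actual audio playback is left to an audio library.

     ```python
     # A minimal sketch of earcons as structured note sequences; notes are
     # (MIDI pitch, duration) pairs and playback is handled elsewhere.
     from dataclasses import dataclass
     from typing import Tuple

     Note = Tuple[int, float]          # (MIDI pitch, duration in seconds)

     @dataclass(frozen=True)
     class Motive:
         name: str
         notes: Tuple[Note, ...]
         timbre: str = "sine"

     # Motives for individual actions and objects.
     CREATE = Motive("create", ((60, 0.2), (64, 0.2)))   # C4, E4
     FILE   = Motive("file",   ((67, 0.4),))             # G4

     def compound(*motives: Motive) -> Motive:
         """Compound earcon: concatenate motives, e.g. 'create' + 'file'."""
         notes = tuple(n for m in motives for n in m.notes)
         return Motive("-".join(m.name for m in motives), notes)

     def family_member(base: Motive, label: str, pitch_shift: int) -> Motive:
         """Family earcon: keep the base rhythm but shift the pitch, so all
         members of the family (e.g. different errors) sound related."""
         shifted = tuple((p + pitch_shift, d) for p, d in base.notes)
         return Motive(f"{base.name}-{label}", shifted)

     ERROR = Motive("error", ((72, 0.1), (72, 0.1), (65, 0.3)))
     disk_error    = family_member(ERROR, "disk", 0)
     network_error = family_member(ERROR, "network", -5)
     create_file   = compound(CREATE, FILE)

     print(create_file.name, create_file.notes)
     ```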

  19. Touch in the interface
     • Touch both sends and receives information.
     • Touch in the interface is haptic interaction.
     • Two areas:
       • cutaneous - tactile sensations through the skin
         • vibrations against the skin; temperature, texture
       • kinesthetics - perception of movement and position
         • resistance or force feedback
     • Used in entertainment or training.
     • Tactile devices:
       • electronic braille displays
       • force feedback devices in VR equipment

  20. Handwriting Recognition
     • Handwriting provides textual and graphical input.
     • Technology for recognition:
       • digitizing tablet - sampling problems
       • electronic paper - a thin screen on top of the display
     • Recognizing handwriting is difficult:
       • variation among individuals (even day to day)
       • co-articulation - letters look different next to other letters
       • cursive writing is more difficult still

  21. Gesture recognition
     • A recent subject in multi-modal systems
     • Involves controlling the computer with movements: "put that there"
     • Good situations:
       • no possibility of typing (VR)
       • supports people with hearing loss (sign language)
     • Technology is expensive:
       • computer vision
       • data glove (intrusive)

  22. Gesture recognition
     • Problems:
       • gestures are user dependent
         • variation
         • co-articulation
       • segmenting gestures is difficult

  23. Designing for Diversity
     • Interfaces are usually designed for the 'average' user.
     • Universal design indicates that we should take many factors into account; here we focus on three:
       • disability
       • age
       • culture

  24. Designing for users with disabilities
     • 10% of the population has a disability that will affect their interaction with computers.
     • There is a moral and legal responsibility to provide accessible products.
     • We look at the following kinds of impairments:
       • sensory
       • physical
       • cognitive

  25. Visual impairment
     • Screen readers using synthesized speech, or braille output devices, can provide complete access to text-based interactive applications.
     • Ironically, the rise in the use of graphical interfaces reduces the possibilities for visually impaired users.
     • To extend access, use:
       • sound
       • touch

  26. Visual impairment
     • Sound
       • speech
       • attach earcons and auditory icons to graphical objects
     • Example: Outspoken
       • a Macintosh application
       • uses synthetic speech to make other Mac applications available to visually impaired users

  27. Visual impairment
     • More recent is the use of touch in the interface.
     • Tactile interaction:
       • electronic braille displays
       • force feedback devices - elements of the interface can be touched: edges, textures and behavior (e.g. pushing a button)
       • requires specialist software
     • It is likely that major applications will become 'haptic enabled' in the future.

  28. Hearing impairment
     • Hearing impairment may appear to have little impact on the use of an interface, especially a graphical interface.
     • To an extent this is true, but the increase in multimedia applications changes that.

  29. Hearing impairment
     • Computer technology can enhance communication opportunities for people with hearing loss:
       • email and instant messaging
       • gesture recognition to translate signing into speech or text
       • captioning audio content
     • These also enhance the experience of all users - good universal design.

  30. Physical impairment
     • Users with physical disabilities vary in the amount of control and movement they have in their hands.
     • Precise mouse control may be difficult.
     • Speech input and output is an option (if the user can speak without difficulty).

  31. Physical impairment
     • Alternatives:
       • eyegaze systems - track eye movements to control the cursor
       • keyboard drivers attached to the user's head
       • gesture and movement tracking
       • predictive systems (e.g. the Reactive Keyboard) can anticipate commands from the current context, as sketched below
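     A minimal sketch of command prediction in the spirit of such predictive systems (not the Reactive Keyboard's actual algorithm): past commands are counted, and the most frequent ones matching the typed prefix are offered as completions, so a user with limited movement can accept a whole command with one action.

     ```python
     # A minimal prefix-based command predictor; class and method names
     # are illustrative, not from any existing system.
     from collections import Counter

     class PrefixPredictor:
         def __init__(self):
             self.history = Counter()

         def record(self, command: str) -> None:
             """Remember a command the user has entered."""
             self.history[command] += 1

         def predict(self, prefix: str, n: int = 3):
             """Return the n most frequent past commands starting with prefix."""
             matches = ((cmd, count) for cmd, count in self.history.items()
                        if cmd.startswith(prefix))
             return [cmd for cmd, _ in sorted(matches, key=lambda x: -x[1])[:n]]

     predictor = PrefixPredictor()
     for cmd in ["open file", "open folder", "open file", "print document"]:
         predictor.record(cmd)

     print(predictor.predict("open"))   # ['open file', 'open folder']
     ```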

  32. Speech impairment
     • Multimedia systems provide a number of tools for communication:
       • text-based communication and conferencing systems (slow)
       • synthetic speech - can be pre-programmed
       • predictive algorithms - anticipate words and fill them in
     • Conventions can help provide context:
       • a smiley face :) for a joke

  33. Dyslexia
     • Textual information is difficult for dyslexic users.
     • More severe forms:
       • idiosyncratic word construction methods; spelling phonetically
       • speech input and output devices can alleviate the need to read and write
     • Less severe forms:
       • spell-correction facilities help (see the sketch below)
     • A consistent navigation structure and clear signposting cues are important.
     • Use color coding and graphical information.
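     A minimal sketch of such a spell-correction facility using Python's standard difflib; the small vocabulary and the misspelling are illustrative assumptions, and a real interface would draw on a full dictionary plus the user's own terms.

     ```python
     # A minimal spell-suggestion sketch using the standard library; it
     # offers close matches for phonetic or idiosyncratic spellings.
     from difflib import get_close_matches

     VOCABULARY = ["navigate", "document", "download", "settings", "search"]

     def suggest(word: str, max_suggestions: int = 3):
         """Return dictionary words that closely resemble the input."""
         return get_close_matches(word.lower(), VOCABULARY,
                                  n=max_suggestions, cutoff=0.6)

     print(suggest("dokument"))   # ['document']
     ```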

  34. Autism
     • Affects a person's ability to communicate, interact with people and make sense of their environment.
     • Triad of impairments:
       • social interaction - relating to others and responding appropriately to social situations
       • communication - problems in understanding verbal and textual language (including gestures and expressions)
       • imagination - rigidity of thought processes

  35. Autism
     • Universal design can assist in two main areas:
       • Communication
         • computers are motivating (consistent and impersonal)
         • problems with language may be aided by graphical representations of information
       • Education
         • VR and games enable an autistic person to experience social situations and learn appropriate responses
         • provides a secure and consistent environment in which they are in control

  36. Designing for different age groups
     • Older people and children have specific needs when it comes to interactive technology.
     • Older people:
       • a growing proportion of the population
       • have more leisure time and disposable income
       • no evidence that they are averse to new technologies

  37. Designing for different age groups: Older people
     • Requirements:
       • the proportion of disabilities increases with age; over 50% of people over 65 have one
       • failing vision, hearing, speech and mobility
       • age-related memory loss
       • some older users lack familiarity with technology and fear learning to use it
     • New tools:
       • email and instant messaging provide social interaction where mobility or speech difficulties get in the way
       • mobile technologies can provide memory aids

  38. Designing for different age groups: Older people
     • Manuals and terminology can be difficult, so use redundancy and support multiple means of access.
     • Designs must be clear, simple and forgiving of mistakes.
     • Provide sympathetic and relevant training.

  39. Designing for different age groups: Children
     • Children have specific needs, and they are diverse:
       • different ages
       • their own goals, likes and dislikes
     • Involve children in the design of interactive systems (intergenerational design teams).
     • They may not have developed the hand-eye coordination that keyboards demand, so consider:
       • pen-based interfaces
       • multiple modes of input involving touch and handwriting
       • redundant displays

  40. Designing for cultural differences
     • Nationality
     • Age
     • Gender
     • Race
     • Sexuality
     • Class
     • Religion
     • Political persuasion
     All of these influence an individual's response to a system, but may not be relevant to the design of a given system.

  41. Designing for cultural differences
     • Key factors to consider:
       • language
       • cultural symbols
       • gestures
       • use of color

  42. Designing for cultural differences
     • Language
       • toolkits for designing systems provide language resource databases to translate menu items, text, error messages, etc. (see the sketch below)
       • layouts for languages that don't read the same way are a problem: left to right vs top to bottom
     • Symbols have different meanings
       • ticks and crosses - interchangeable in some cultures
       • rainbow - covenant with God, diversity, hope and peace
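     A minimal sketch of a language resource database: interface strings are looked up by key and locale rather than hard-coded, so menus and error messages can be translated without changing the interface logic. The locale codes, keys and translations here are illustrative assumptions, not from any particular toolkit.

     ```python
     # A minimal message-catalog lookup with a fallback locale; the catalog
     # contents below are example data only.
     MESSAGES = {
         "en": {"menu.file": "File", "menu.open": "Open", "error.not_found": "File not found"},
         "fr": {"menu.file": "Fichier", "menu.open": "Ouvrir", "error.not_found": "Fichier introuvable"},
         "de": {"menu.file": "Datei", "menu.open": "Öffnen", "error.not_found": "Datei nicht gefunden"},
     }

     def translate(key: str, locale: str, fallback: str = "en") -> str:
         """Look up an interface string, falling back to the default locale."""
         catalog = MESSAGES.get(locale, MESSAGES[fallback])
         return catalog.get(key, MESSAGES[fallback].get(key, key))

     print(translate("menu.open", "fr"))         # Ouvrir
     print(translate("error.not_found", "de"))   # Datei nicht gefunden
     ```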

  43. Designing for cultural differences
     • Use of gestures
       • common in video and animation
       • becoming more common in virtual reality and game avatars
     • Color
       • red for danger
       • red also represents life (India), happiness (China) and royalty (France)
       • it is difficult to assume a universal interpretation of color
       • support and clarify color coding with redundant cues