
Interaction



Presentation Transcript


  1. Interaction Gavin Sim HCI Lecture 4 - 2010/11

  2. Aims of this lecture • Last week focused on persona and scenario creation. • This week's aims are: • To introduce interaction as a concept that links the human and the computer • To begin to consider designing interaction HCI Lecture 4 - 2010/11

  3. Inter - Action • A system is INTERACTIVE if a human user acts with the system in such a way that the system responds in an ACTIVE way depending on the ACT of the human user HCI Lecture 4 - 2010/11

  4. Humans INTERACT with computers in different ways • Touch – key presses, touch gestures • Voice – speech recognition • Actions – body recognition (in order of increasing complexity) HCI Lecture 4 - 2010/11

  5. Multi-Modal Interaction • Is where the user can interact using more than one mode – a classic example is the combination of speech and touch, e.g. pressing keys and talking at the same time • Designing multi-modal interaction is difficult as there are ‘mode’ errors HCI Lecture 4 - 2010/11

  6. Mode Errors • Where the computer (and the user) are not sure what ‘mode’ a system is in • Example – the pen as a writing device in a tablet system and the pen as a pointing device (opening up menus) in the same system • Saying “save” in speech recognition whilst doing word processing – is it a command or dictated text? HCI Lecture 4 - 2010/11
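A minimal sketch of a mode error in Python (the function and mode names are hypothetical, not from the lecture): the same recognised word, “save”, is a command in one mode and dictated text in another, so if the user and the system disagree about the current mode, the outcome is wrong.

```python
# Minimal sketch (hypothetical names) of how one input produces different
# results depending on the current mode -- the root of a mode error when the
# user's belief about the mode and the system's actual mode diverge.

def handle_speech(word: str, mode: str) -> str:
    """Interpret a recognised word according to the system's current mode."""
    if mode == "command":
        # In command mode, "save" triggers an action.
        return f"executing command: {word}"
    # In dictation mode, the same word is inserted into the document as text.
    return f"inserting text: '{word}'"

# The user believes they are issuing a command...
print(handle_speech("save", mode="command"))    # executing command: save
# ...but if the system is still in dictation mode, the word lands in the text.
print(handle_speech("save", mode="dictation"))  # inserting text: 'save'
```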

  7. Interaction is complex • Norman’s model • Goals: What we want to happen • Execution: Execute action in the world • World: Manipulate objects • Evaluate: validate action and compare results with our goal [Diagram: Goal → Execution → World → Evaluation → Goal] HCI Lecture 4 - 2010/11

  8. Norman’s model • Execution: intention, task sequence, physical action • Evaluation: see, evaluate, check HCI Lecture 4 - 2010/11

  9. Execution • Forming the Intention • Goals must be transformed into intentions, i.e.,  specific statements of what has to be done to satisfy the goal.  E.g., "Make a cup of tea using a Tetley tea bag." • Specifying an Action Sequence • What is to be done to the World.  The precise sequence of operators that must be performed to effect the intention.  E.g., “Boil the kettle....." • Executing an Action • Actually doing something.  Putting the action sequence into effect on the world.  E.g., actually boiling the kettle. HCI Lecture 4 - 2010/11

  10. Evaluation • Perceiving the State of the World • Perceiving what has actually happened. E.g., the experience of the taste of the tea. • Interpreting the State of the World • Trying to make sense of the perceptions available. E.g., putting those perceptions together to form the sensory experience of a cup of tea. • Evaluating the Outcome • Comparing what happened with what was wanted. E.g., did the cup of tea match up to the requirement of 'a nice drink'? HCI Lecture 4 - 2010/11
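Slides 9 and 10 together cover Norman's seven stages of action. A compact listing of the full cycle, as an illustrative Python sketch only (the stage names paraphrase the slides above):

```python
# Norman's seven stages of action, in the order they occur in one
# goal -> execution -> world -> evaluation cycle (illustrative listing only).
SEVEN_STAGES = [
    "Form the goal",                        # what we want to happen
    "Form the intention",                   # what must be done to satisfy the goal
    "Specify the action sequence",          # execution side
    "Execute the action",                   # act on the world
    "Perceive the state of the world",      # evaluation side
    "Interpret the state of the world",
    "Evaluate the outcome against the goal",
]

for number, stage in enumerate(SEVEN_STAGES, start=1):
    print(number, stage)
```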

  11. Gulfs of Interaction • Norman talks about two • THE GULF OF EXECUTION: does the system provide actions that correspond to the intentions of the user? • THE GULF OF EVALUATION: does the system provide a physical representation that can be directly perceived and that is directly interpretable in terms of the intentions and expectations of the user? • Execution problems: we don’t know what to do, or what we do doesn’t take us towards our goal • Evaluation problems: we don’t see any feedback, or the feedback we get doesn’t tell us we are making progress HCI Lecture 4 - 2010/11

  12. Examples… HCI Lecture 4 - 2010/11

  13. Example 2 • The problem statement is clear, but the supplemental explanation is Martian.

  14. Interaction is about Goals • Understand the person => Understand the goals => Understand the interaction HCI Lecture 4 - 2010/11

  15. Keyboards and Keypads • Primary mode of text entry • Beginners: 1 keystroke per second • Average office worker: 5 keystrokes per second (about 50 words a minute) • Rapid data entry can be achieved if more than one key can be pressed simultaneously (chording) • Chords can represent entire words • Used in court rooms (around 300 words a minute)
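As a rough check of the rates above, assuming a “word” averages about six keystrokes (five letters plus a space, an assumption rather than a figure from the slides), 5 keystrokes per second works out to roughly 50 words a minute:

```python
# Back-of-envelope conversion of keystroke rates to words per minute,
# assuming ~6 keystrokes per word (five letters plus a space).

def wpm(keystrokes_per_second: float, keystrokes_per_word: float = 6) -> float:
    """Convert a keystroke rate into approximate words per minute."""
    return keystrokes_per_second * 60 / keystrokes_per_word

print(wpm(1))  # beginner: ~10 wpm
print(wpm(5))  # average office worker: ~50 wpm
```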

  16. Interacting with computers • Interact with computers in a variety of different ways • Today we will focus on keyboard and pointing devices

  17. Keyboard Layout • QWERTY keyboard designed by Christopher Latham Sholes in the 1870s to prevent keys getting jammed • Placed common letter pairs far apart, thereby increasing finger travel distance • Computer keyboards that keep this layout are thus inefficient

  18. Keyboard Layout • DVORAK can increase typing speed from 150 words per minute to 200 for advanced users and reduce errors • ABCDE layout places keys in alphabetical order, so novices can find keys easily
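One rough way to see why Dvorak reduces finger travel is to compare how many keystrokes stay on the home row under each layout. The sketch below is illustrative only (letters only, ignoring punctuation, shift keys, and the staggered key positions); the sample sentence is arbitrary:

```python
# Fraction of letter keystrokes that fall on the home row for two layouts.
# Dvorak places the most common English letters on the home row, so its
# share should come out noticeably higher than QWERTY's.

HOME_ROW = {
    "qwerty": set("asdfghjkl"),
    "dvorak": set("aoeuidhtns"),
}

def home_row_share(text: str, layout: str) -> float:
    """Fraction of the letters in `text` typed on the layout's home row."""
    letters = [c for c in text.lower() if c.isalpha()]
    return sum(c in HOME_ROW[layout] for c in letters) / len(letters)

sample = "the quick brown fox jumps over the lazy dog while typing a sentence"
for layout in HOME_ROW:
    print(layout, round(home_row_share(sample, layout), 2))
```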

  19. Fitts’ Law • Can be used to calculate the effectiveness of pointing interaction from target distance and size • Word example
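Fitts' Law predicts the time needed to hit a target from its distance and width; in the Shannon formulation commonly used in HCI, MT = a + b·log2(D/W + 1), where a and b are constants fitted from experimental data. A small Python sketch (the constants below are illustrative values, not fitted measurements):

```python
import math

# Fitts' Law, Shannon formulation: MT = a + b * log2(D / W + 1)
# D = distance to the target, W = target width, a and b = fitted constants.

def movement_time(distance: float, width: float,
                  a: float = 0.1, b: float = 0.15) -> float:
    """Predicted movement time in seconds (a and b here are illustrative)."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A small, distant target takes longer to hit than a large, nearby one.
print(movement_time(distance=400, width=20))  # higher index of difficulty
print(movement_time(distance=100, width=80))  # lower index of difficulty
```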

  20. Keys • Concave surface and matte finish reduce finger slip • Key presses require a force of 40 to 125 grams and a displacement of 1 to 4 millimeters • FORCE is important • A key pressed far enough emits a light click; this tactile and audible feedback is important • Audible clicks matter in surface computing, where you do not have tactile feedback

  21. Keys on mobile devices • Some come with a full QWERTY keyboard • Input of 60 words per minute can be reached with both thumbs when auto-correction is used • Numeric keypads • Multitap: a key is pressed multiple times to cycle through its letters, with a pause to confirm • Predictive techniques such as T9 are dictionary based • LetterWise uses prefix probabilities – for example, if you type “th”, the most probable next letter is “e” MacKenzie, I. S., Kober, H., Smith, D., Jones, T., & Skepner, E. (2001). LetterWise: Prefix-based disambiguation for mobile text input. Proceedings of the ACM Symposium on User Interface Software and Technology – UIST 2001, pp. 111–120. New York: ACM.
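A minimal multitap decoder sketch in Python: the keypad mapping is the standard phone letter grouping, while the decode helper and its input format (groups of repeated presses separated by pauses, written here as spaces) are illustrative assumptions, not from the lecture.

```python
# Multitap text entry: each digit key carries several letters; repeated presses
# of the same key cycle through them, and a pause commits the current letter.

KEYPAD = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

def multitap(presses: str) -> str:
    """Decode groups of repeated key presses, e.g. '44 33 555 555 666' -> 'hello'."""
    letters = []
    for group in presses.split():
        key, count = group[0], len(group)
        letters.append(KEYPAD[key][(count - 1) % len(KEYPAD[key])])
    return "".join(letters)

print(multitap("44 33 555 555 666"))  # hello
```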

  22. Pointing Devices • Useful for 7 types of interaction (Foley et al., 1984) • Select – from a menu • Position – drag a picture next to text • Orient – a picture, create motion • Path – create a curve • Quantify – specify a numeric value, e.g. volume in music • Gestures – indicate an action to perform • Text – enter, edit, move

  23. Pointing Devices • Grouped into: • Direct control • Indirect control • Direct control takes place on the screen surface, such as a touchscreen or stylus • Indirect control takes place away from the screen, e.g. mouse, graphics tablets, etc.

  24. Stylus • Drawing • Handwriting on a touch-sensitive device • A natural way to interact • The device needs training to recognise handwriting • We recently had Vision Objects collecting samples of handwriting

  25. iPhone vs Wii – text input • Which is faster for text input?

  26. iPhone vs Wii – text input • iPhone 18.5 wpm • Wii 9.2 wpm • However – error rates • 7.7% for the iPhone • 2.8% for the Wii • Errors on the iPhone were predominantly caused by hitting an adjacent key

  27. Summary • Understanding how users interact is important • Explored different forms of interaction • Keyboard • Pointing
