Lecture: Interaction Models and Evaluation


Presentation Transcript

1. Lecture: Interaction Models and Evaluation. Howell Istance, UTA/DMU

2. This lecture: Interaction tasks and techniques; Predictive models: Fitts' Law; Descriptive models: Card, Mackinlay and Robertson, Buxton's 3-state model; Using interaction techniques with gaze: Two-handed input and Toolglasses, PieCursors, Real World Props

3. Comparing input devices. Aggregating research on interaction techniques and input device performance (benchmarking) requires standardised metrics and procedures for obtaining these. An issue is the extent to which such metrics reflect actual usability in situ by diverse user groups.

4. Reasoning about interaction. We need to be clear about the boundaries between input devices and applications. We want to build interaction techniques that are suited to input from an eye tracker, and to use gaze input with existing applications and their interfaces.

5. Interaction tasks and techniques. Foley (1990) distinguishes 'interaction task' and 'interaction technique'. Interaction techniques are ways to use input devices to enter information; interaction tasks classify the different types of information entered with those techniques. Many different interaction techniques can be used for the same interaction task.

6. Interaction tasks (Foley, 1990). Interaction task: "entry of a unit of information by the user". 4 basic interaction tasks (BITs): position => location (x,y or x,y,z); text => string; select => element from choice set (command, attribute value, object); quantify => numeric value.

7. Users, input devices and applications

8. What is the scope of an interaction technique? Is a pull-down menu an interaction technique?

9. What is the scope of an interaction technique? Or is it the combination of the user moving the mouse + button press + menu?

10. What is the scope of an interaction technique? Is keyboard activation of a menu a different 'technique' from mouse activation of the same menu?

11. What is the scope of an interaction technique? Does an interaction technique include feedback?

12. What is the scope of an interaction technique? Does an interaction technique include feedback?

13. Scope of an interaction technique

14. Gaze architecture 1: Application knows about gaze events. A supporting display specific to gaze behaviour (allows 'gaze aware' interaction techniques).

15. Gaze architecture 2: Application knows nothing about gaze events. The normal input device is replaced by an eye tracker.

16. Gaze architecture 2: An event emulator generates mouse and keyboard events from gaze behaviour.

17. Gaze architecture 2 extended: the application notifies the emulator about its objects using a pre-compiled DLL, and events are modified accordingly.

18. Components of an interaction technique: user action; input device response and associated events and feedback to the user; supporting visual (and auditory) display of the action, range of values or options available; feedback of the current action, value or option via the supporting display. (Feedback through the display of the application object is not included as part of the 'interaction technique'.)

19. Models of Interaction. Predictive models enable metrics of performance to be determined analytically, without undertaking experiments. Descriptive models provide a structure for thinking about a situation or problem (for example, a taxonomy).

20. Logical devices and interaction techniques. Input devices = various physical objects which produce input data. Logical devices = generic classes of input in interactive graphics: locator – indicate position or orientation; pick – select a displayed entity; valuator – input a single real number; keyboard – input a character string; choice – select from a set of possible actions or choices.

21. Fitts' Law. Fitts (1954) likened the human psychomotor system to a noisy communications channel: there is a linear relationship between movement time and task difficulty. MT = a + b log2(D/W + 1.0), where log2(D/W + 1.0) is the index of task difficulty, 1/b is the index of performance, and MT is the movement time.

22. Forms of Fitts' Law: MT = a + b log2(2A/W) – Fitts' original formulation; MT = a + b log2(D/W + 0.5) – Welford's modification; MT = a + b log2(D/W + 1.0) – MacKenzie's modification (compare the signal-to-noise ratio (S + N)/N).
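The three formulations above differ only in the index-of-difficulty term. A minimal sketch comparing them on the same movement; the amplitude, width and regression coefficients below are invented for illustration, not taken from the lecture's data:

```python
from math import log2

def movement_time(a, b, index_of_difficulty):
    """Fitts' Law: predicted movement time = intercept + slope * ID (bits)."""
    return a + b * index_of_difficulty

# Same movement under the three ID formulations (D = amplitude, W = target width)
D, W = 128.0, 16.0                     # arbitrary units
id_fitts   = log2(2 * D / W)           # Fitts (1954) original
id_welford = log2(D / W + 0.5)         # Welford's modification
id_shannon = log2(D / W + 1.0)         # MacKenzie's Shannon formulation

# Hypothetical regression coefficients: a = 50 ms intercept, b = 150 ms/bit slope
for name, ident in [("Fitts", id_fitts), ("Welford", id_welford), ("Shannon", id_shannon)]:
    print(f"{name}: ID = {ident:.2f} bits, MT = {movement_time(50, 150, ident):.0f} ms")
```

For a given D and W the original formulation always yields the largest ID, and the Shannon form sits slightly above Welford's, which is why fitted a and b coefficients are only comparable across studies that used the same formulation.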

23. Throughput and Effective Target Width. To overcome the problem of combining speed and accuracy, the actual target width (W) is replaced by the effective width (We) that accommodates 96% of selection end-coordinates: We = 4.133 × (SD of the distribution of selection coordinates). Throughput = log2(D/We + 1.0)/MT = IDe/MT.
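A sketch of the throughput computation from raw selection data. The click coordinates, movement times and target distance below are made up for illustration; 4.133 is the z-range that covers 96% of a normal distribution:

```python
from math import log2
from statistics import mean, stdev

def throughput(distance, endpoint_xs, movement_times):
    """Throughput in bits/s using the effective target width We = 4.133 * SD."""
    we = 4.133 * stdev(endpoint_xs)      # width capturing ~96% of end-points
    ide = log2(distance / we + 1.0)      # effective index of difficulty (bits)
    return ide / mean(movement_times)

# Hypothetical trial data: x-coordinates of clicks on a target D = 200 units away,
# and the corresponding movement times in seconds
hits  = [198.0, 203.5, 196.2, 201.1, 199.4, 204.8, 197.3, 200.6]
times = [0.52, 0.48, 0.55, 0.50, 0.47, 0.53, 0.51, 0.49]
tp = throughput(200.0, hits, times)      # roughly 8 bits/s for this made-up data
```

Because We is derived from the spread a user actually produced, a sloppy-but-fast strategy and a careful-but-slow strategy yield comparable throughput, which is what makes the metric usable across participants.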

24. Characterizing Devices. Different input devices add more or less noise to the human motor channel: the better the device, the less noise is added and the higher the throughput. Throughput is a task-independent means of describing device performance, and provided a scientific basis for the popularity of the mouse following work by Card, English and Burr.

25. Input Device Comparisons

26. Fitts' Law parameters across studies (MacKenzie)

27. Scott MacKenzie (www.yorku.ca/mack). Wrote the classic paper on the application of Fitts' Law to HCI (1992), and has done a great deal to advance thinking about input devices and performance measurement.

28. Differences in muscle groups. Langolf, Chaffin and Foulke (1976): a small-amplitude task under a microscope and Fitts' reciprocal tapping task. Fingers – 38 bits/s; wrist – 23 bits/s; arm – 10 bits/s. Balakrishnan and MacKenzie (1997) repeated the experiments using a digitizing tablet and stylus.

29. Big question: does Fitts' Law model eye pointing? ...and so is a measure of throughput a useful metric? The hand-eye motor control system uses visual feedback to control movement, but saccades are (essentially) ballistic.

30. ISO 9241. Fitts' original task (above); the task used in the standard (right).

31. Outcomes with eye pointing. Zhang and MacKenzie (2007) applied the ISO standard to eye pointing.

32. Early (milestone) objective performance comparisons of gaze-based interaction techniques (CHI 1987)

33. Ware and Mikaelian

34. Sibert and Jacob. They found that Fitts' Law did not model gaze pointing performance (in experiments comparing the use of gaze and mouse for pointing): (eye) MT = 484.5 + 1.7(ID), r2 = 0.02; (mouse) MT = 155.3 + 117.7(ID), r2 = 0.86. They concluded that the cost of eye pointing is independent of distance (saccades over the distances tested take about the same time), so it is very good for large displays.
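The distance-independence is easy to see by evaluating the two fitted models side by side. A sketch using the regression coefficients quoted on the slide (units are milliseconds; the ID values chosen are illustrative):

```python
def mt_eye(ID):
    """Sibert and Jacob's fitted eye-pointing model (ms); r^2 = 0.02."""
    return 484.5 + 1.7 * ID

def mt_mouse(ID):
    """Their fitted mouse-pointing model (ms); r^2 = 0.86."""
    return 155.3 + 117.7 * ID

# Doubling task difficulty barely changes eye-pointing time, but adds
# hundreds of milliseconds to mouse movement time
for ID in (2.0, 4.0):
    print(f"ID={ID}: eye {mt_eye(ID):.1f} ms, mouse {mt_mouse(ID):.1f} ms")
```

The near-zero slope (1.7 ms/bit) and near-zero r2 for the eye model are two views of the same finding: ID explains almost none of the variance in eye-pointing time.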

35. Design space of input devices. Basically, an input device is a transducer that maps from the physical properties of the world into the logical values of an application (Baecker and Buxton, 1987).

36. Design space of input devices. Card, Mackinlay and Robertson model input devices in terms of a primitive movement vocabulary and a set of composition operators, and evaluate them with respect to expressiveness and effectiveness.

37. Design space representation

38. Movement Vocabulary: M = manipulation operator (which quantity and dimension); In = input domain; S = current state of the device; R = resolution function that maps from the input set to the output set; Out = output domain set; W = general set of device properties.
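The six-tuple can be written down directly as a data structure. A sketch modelling one axis of a mouse; the 400 counts-per-inch resolution and the domain ranges are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Tuple

@dataclass
class InputDevice:
    """Card, Mackinlay and Robertson's device six-tuple <M, In, S, R, Out, W>."""
    manipulation: str                     # M: operator (quantity and dimension)
    in_domain: Tuple[float, float]        # In: range of physical input values
    state: float                          # S: current state of the device
    resolution: Callable[[float], float]  # R: maps the input set to the output set
    out_domain: Tuple[float, float]       # Out: output domain set
    properties: Dict[str, object] = field(default_factory=dict)  # W

# Hypothetical x-axis of a mouse: senses linear position at 400 counts per inch
mouse_x = InputDevice(
    manipulation="linear position, x dimension",
    in_domain=(0.0, 10.0),                # inches of desk space
    state=0.0,
    resolution=lambda inches: inches * 400.0,
    out_domain=(0.0, 4000.0),             # counts
    properties={"counts_per_inch": 400},
)
```

Writing devices this way makes the composition operators concrete: a two-axis mouse is the merge of two such tuples, and connecting Out of one device to In of another gives the device chains discussed below.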

39. Example: controlling a radio

40. Relation to interaction techniques. C, M and R regard the 'supporting displays' in an interaction technique as virtual input devices: the output from one device serves as the input to another device, so we have a chain of input devices.

41. Comparing Interaction techniques

42. Use a head mouse instead. The precision of a head mouse meets the requirements for field-of-view control, but not for object selection.

43. Representing translators as soft devices. Mappings between input characteristics and interaction technique requirements can be shown as translations (Bates and Istance, 2000).

44. Representing translators as soft devices

45. Pro’s and Con’s of CMR’s design space Advantages makes (many) device characteristics explicit make mapping from one device space to another explicit Disadvantages doesn’t capture a sequence of actions needed to complete an input task doesn’t represent feedback 45

46. Capturing sequences of events. Buxton (1990) proposed a 3-state model of graphical input: a device can be out of range (state 0), tracking (state 1), or dragging/active (state 2). He proposed categories of interface transactions in terms of these states, and matched device states to transaction states.

47. State transition diagram (Buxton). The actions shown use the mouse as an example only.
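The state transition diagram can be sketched as a small transition table. The event names here are invented for illustration, and the mouse mapping omits state 0 because a mouse resting on the desk is never out of range:

```python
# Buxton's three states: 0 = out of range, 1 = tracking, 2 = dragging/active
MOUSE = {
    (1, "button_down"): 2,           # tracking -> dragging
    (2, "button_up"): 1,             # dragging -> tracking
}

STYLUS = {                           # a stylus can also leave the tablet's sensing range
    (0, "enter_range"): 1,
    (1, "leave_range"): 0,
    (1, "tip_down"): 2,
    (2, "tip_up"): 1,
}

def step(transitions, state, event):
    """Next state for an event; events with no transition leave the state unchanged."""
    return transitions.get((state, event), state)

state = 1                                  # mouse starts in state 1, tracking
state = step(MOUSE, state, "button_down")  # -> state 2, dragging
state = step(MOUSE, state, "button_up")    # -> state 1, tracking
```

Comparing the two tables shows the model's use for matching devices to transactions: a transaction that needs state 0 (e.g. hover-free repositioning) simply has no realisation on a device whose table never reaches it.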

48. Transactions and devices

49. Pro’s and Con’s of Buxton’s model Advantages captures idea of a sequence of actions matches transactions to device behaviour Disadvantages limited in its expressiveness doesn’t represent feedback 49

50. Characteristics of interaction techniques: range of device movement needed; number of input actions needed; duration of the action sequence; accuracy and precision; footprint of the physical device and supporting display/soft device; amount/quality/location of feedback; support for mappings from input domain to output domain; memory load.

51. Location of feedback and gaze interaction. Separation of the input point and the feedback point requires checking whether the values entered are those the user intended.

52. ExperiScope (Guimbretière et al., 2007). An interaction visualisation technique: command area (top) and task area (bottom).

53. PieCursors. A collection of tools arranged as a radial tracking menu and shrunk to a cursor (Fitzmaurice et al., 2008); the intention is to merge pointing and command selection.

54. PieCursor compared with marking menus. Extended ExperiScope notation.

55. Characteristics of PieCursors: range of device movement needed; number of input actions needed; duration of the action sequence; accuracy and precision; footprint of the physical device and supporting display/soft device; amount/quality/location of feedback; support for mappings from input domain to output domain; memory load.

56. Two-handed interaction. Guiard's Kinematic Chain Theory; the general idea (for right-handed people): 1. Right-to-left reference: the right hand performs its motion relative to the frame of reference set by the left hand. 2. Left-hand precedence: the left hand precedes the right; for example, the left hand first positions the paper, then the right hand begins to write. 3. Left-right scale differentiation: the granularity of action of the left hand is coarser than that of the right.

57. Toolglasses… (ACM SIGGRAPH 1993)

58. Real World Props. The intention is to reduce the cognitive mapping between control intent and the required input action: instrument real-world objects (props) with spatial trackers; make the position and orientation of the real-world objects drive corresponding virtual objects; let the user manipulate the object of interest with the non-preferred hand and apply a control action with a prop held in the preferred hand.

59. Visualising brain scans (Hinckley)

60. Summing up. We have looked at two representations of the design space of input devices, and at an interaction visualisation technique. We need to be clear about how a new interaction technique differs from existing ones, and what benefits in performance we think it will offer.
