
Eye Movement-Based Interaction: "What You Look At Is What You Get" (WYLAIWYG)






Presentation Transcript


  1. Eye Movement-Based Interaction: "What You Look At Is What You Get" (WYLAIWYG). Tampere University Computer-Human Interaction Group. Aulikki Hyrskykari, 19th January 2000

  2. Eye Movement-Based Interaction • Eye on/in the interface (2) • Problems and research issues: technological/HCI issues (2) • Processing the eye movement data (5) • Eye as a control device (2) • Command-based gaze interaction (9) • Noncommand gaze interaction (3) • References • Project ideas

  3. Eye on/in the interface • A constrained interface between two powerful information processors • Goal: to increase the bandwidth across the channel • Eyes are extremely rapid • Target acquisition usually requires the user to look at the target before actuating the cursor control

  4. Eye on/in the interface • Need to keep the hands free (or the hands cannot be used for other reasons) • An increasing number of computer users suffer from RSI (repetitive strain injury) • Eye movements are natural and require little conscious effort • The direction of gaze implicitly indicates the focus of attention

  5. Problems and Research Issues 1) Technological issues • Usability of the hardware • head-mounted systems are more reliable but somewhat awkward • floor-mounted systems are more comfortable but more constrained • Accuracy - need for calibration • for every user at the beginning of a task • also during the task • Costs of eye tracking equipment

  6. Problems and Research Issues 2) HCI issues • Need to design and study new interaction techniques • Eyes are a perceptual device; they did not evolve into a control organ • people are not used to operating things by simply looking at them - if done poorly, it can be very annoying • Noisy data - must be refined in order to get useful dialogue information (fixations, input tokens, intentions) • accuracy is restricted by the biological characteristics of the eye

  7. Processing the EM data (1/5) • Scanpaths (3 min of raw eye data) recorded while a subject answered three different questions concerning a painting [Yarbus67] • The data contains jitter, errors (originating from the limited accuracy), tracking failures, … • At the lowest level the raw eye position data must be filtered and the fixations identified • When analyzing eye movement data off-line, the noisy data can be refined with different filtering algorithms before counting the fixations; in real time the analysis must be simpler

  8. Processing the EM data - Filtering the noisy data (2/5) [Figure: eye position x-coordinates over ~3 seconds] A simple algorithm for identifying fixations in real time [Siebert00]: 1) a fixation starts when the eye position stays within 0.5° for more than 100 ms (the spatial and temporal thresholds filter out the jitter) 2) the fixation continues as long as the position stays within 1° 3) failures to track the eye shorter than 200 ms do not terminate the fixation (see the sketch below)
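For concreteness, a minimal sketch of these three rules in Python; the sample format, field names, and edge-case handling are assumptions for illustration, not the actual code of [Siebert00]:

```python
import math
from dataclasses import dataclass

@dataclass
class Sample:
    t: float      # timestamp, ms
    x: float      # horizontal gaze position, degrees
    y: float      # vertical gaze position, degrees
    valid: bool   # False when the tracker lost the eye

def detect_fixations(samples, start_r=0.5, hold_r=1.0,
                     min_dur=100.0, max_gap=200.0):
    """Return (t_start, t_end, cx, cy) tuples, one per fixation."""
    def center(pts):
        return (sum(p.x for p in pts) / len(pts),
                sum(p.y for p in pts) / len(pts))

    fixations, window = [], []
    in_fix, gap_start = False, None
    for s in samples:
        if not s.valid:
            if in_fix:
                gap_start = gap_start if gap_start is not None else s.t
                if s.t - gap_start > max_gap:     # rule 3: >200 ms gap ends it
                    fixations.append((window[0].t, gap_start, *center(window)))
                    window, in_fix, gap_start = [], False, None
            continue
        gap_start = None
        if in_fix:
            cx, cy = center(window)
            if math.hypot(s.x - cx, s.y - cy) <= hold_r:   # rule 2: within 1°
                window.append(s)
            else:                                          # fixation ends
                fixations.append((window[0].t, window[-1].t, cx, cy))
                window, in_fix = [s], False
        else:
            if window and math.hypot(s.x - window[0].x,
                                     s.y - window[0].y) > start_r:
                window = []            # jitter beyond 0.5°: restart the window
            window.append(s)
            if window[-1].t - window[0].t > min_dur:       # rule 1: >100 ms
                in_fix = True
    if in_fix:
        fixations.append((window[0].t, window[-1].t, *center(window)))
    return fixations
```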

  9. Processing the EM data - Scanpaths with fixations identified (3/5) • Visualization of scanpaths: • circles are the fixations (the center is the point of gaze during the fixation) • the radius depicts the duration of the fixation • lines are the saccades between fixations (see the sketch below)
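An illustrative sketch of this visualization convention using matplotlib (an assumed choice); it consumes the output of the `detect_fixations` sketch above, with marker area standing in for fixation duration:

```python
import matplotlib.pyplot as plt

def plot_scanpath(fixations, scale=0.1):
    """fixations: (t_start, t_end, x, y) tuples."""
    xs = [f[2] for f in fixations]
    ys = [f[3] for f in fixations]
    durations = [f[1] - f[0] for f in fixations]
    plt.plot(xs, ys, "-", linewidth=1, zorder=1)            # saccade lines
    plt.scatter(xs, ys, s=[d * scale for d in durations],   # fixation circles,
                alpha=0.4, zorder=2)                        # sized by duration
    plt.gca().invert_yaxis()    # screen coordinates grow downwards
    plt.show()
```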

  10. Processing the EM data - Input tokens (4/5) [Siebert00] • The fixations are then turned into input tokens • start of fixation • continuation of fixation (every 50 ms) • end of fixation • failure to locate eye position • entering monitored regions • The tokens form eye events • which are multiplexed into the event queue with the other input events • The eye events also carry information about the fixated screen object (found with a nearest neighbor approach), as sketched below
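A hedged sketch of this tokenization: the token names follow the list above, while the object representation, queue, and function names are assumptions made for illustration:

```python
import math
from enum import Enum, auto

class EyeToken(Enum):
    FIXATION_START = auto()
    FIXATION_CONTINUE = auto()   # re-issued every 50 ms
    FIXATION_END = auto()
    TRACKING_FAILURE = auto()
    REGION_ENTERED = auto()

def nearest_object(x, y, objects):
    """objects: {name: (cx, cy)}; returns the closest screen object."""
    return min(objects, key=lambda name: math.hypot(objects[name][0] - x,
                                                    objects[name][1] - y))

def emit_eye_event(event_queue, token, x, y, objects):
    # The eye event joins the ordinary event queue alongside mouse and
    # keyboard events, carrying the fixated screen object with it.
    event_queue.append((token, nearest_object(x, y, objects)))

# Example: a fixation starting near a hypothetical OK button
queue = []
emit_eye_event(queue, EyeToken.FIXATION_START, 310, 190,
               {"ok_button": (300, 200), "cancel_button": (500, 200)})
```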

  11. Processing the EM data - Deducing the user's intentions (5/5) • Objectives • to refine the data further in order to recognize the user's intentions • to implement a higher-level programming interface for gaze-aware applications • Eye Interpretation Engine: the objective is to identify behaviors such as [Edwards98] • the user is reading • just "looking around" • starting or stopping a search for an object (e.g. a button) • wanting to select an object (a toy illustration follows below)
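As a toy illustration of behavior recognition in this spirit (the thresholds and the heuristic itself are invented here; the real Eye Interpretation Engine of [Edwards98] is far more elaborate), reading tends to show up as runs of short fixations stepping rightwards along roughly one line:

```python
def looks_like_reading(fixations, min_run=4,
                       line_tol=20.0, max_fix_ms=400.0):
    """fixations: (t_start, t_end, x, y) tuples in screen pixels."""
    run = 0
    for prev, cur in zip(fixations, fixations[1:]):
        rightward = cur[2] > prev[2]                  # stepping right
        same_line = abs(cur[3] - prev[3]) < line_tol  # px tolerance, assumed
        short = (cur[1] - cur[0]) < max_fix_ms        # ms threshold, assumed
        run = run + 1 if (rightward and same_line and short) else 0
        if run >= min_run:
            return True
    return False
```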

  12. Eye as a control device • Gaze behavior is very different from the other modalities used for controlling a computer (hand, voice, feet) • intentional control of the eyes is difficult and stressful; the gaze is easily driven by external events • precise control of the eyes is difficult • the "Midas touch" problem • Most of the time the eyes are used for obtaining information, with no intent to initiate commands • Users easily become afraid of looking at the "eye active" objects or areas of the window

  13. Eye as a control device - Jacob's taxonomy. Jacob's taxonomy of possible approaches for using gaze input in the user interface: • A. Command-based interfaces: unnatural (learned) eye movement, unnatural response • B. Noncommand interfaces: natural eye movement, natural response • C. Virtual environments: natural eye movement, unnatural response

  14. Command-based gaze interaction • Even though eye movements are an old research area, gaze-aware applications practically do not exist • Exception: applications for the disabled © Erica, Inc. http://www.ericainc.com

  15. Command-based gaze interaction - Applications for the disabled © LC Technologies, Inc. http://www.lctinc.com/doc/ecs.html

  16. Command-based gaze interaction - Applications for the disabled © LC Technologies, Inc. http://www.lctinc.com/doc/ecs.html

  17. Command-based gaze interaction - Applications for the disabled © LC Technologies, Inc. http://www.lctinc.com/doc/ecs.html

  18. Command-based gaze interaction - Selection • The most obvious task is the selection of an object • The Midas touch problem must be resolved; among the different solutions, we may use • dwell time • screen buttons • an eye movement (e.g. a wink) for selection • hardware buttons (e.g. the space bar or a mouse button) for performing the selection at the position of gaze • Experiments have shown that gaze selection is faster than mouse selection [Ware87, Jacob94, Jacob99] • Accuracy problem - target objects must not be small © LC Technologies, Inc. http://www.lctinc.com/doc/ecs.html

  19. Command-based gaze interaction - Selection • In separate experiments on eye selection [Ware87, Jacob98], dwell time was found the most convenient technique • Dwell time > 150 ms • if too long, selection feels sticky (especially to expert users) • if too short, wrong selections occur • Winks have been used to implement selection for disabled users who cannot use additional control devices (see the sketch below)
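A minimal dwell-time selection sketch; the class shape and per-frame API are invented for illustration, with the 150 ms threshold taken from the discussion above:

```python
class DwellSelector:
    """Fires a selection once gaze has rested on one object for longer
    than the dwell threshold."""

    def __init__(self, dwell_ms=150.0):    # threshold discussed above
        self.dwell_ms = dwell_ms
        self.target = None
        self.enter_t = None

    def update(self, hit_object, t_ms):
        """Call every frame with the object under gaze (or None);
        returns the selected object, or None."""
        if hit_object != self.target:
            self.target, self.enter_t = hit_object, t_ms   # gaze moved on
            return None
        if hit_object is not None and t_ms - self.enter_t >= self.dwell_ms:
            self.enter_t = float("inf")    # fire only once per dwell
            return hit_object
        return None
```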

  20. Command-based gaze interaction - Selection • Screen buttons • EyeCon - visual feedback of the selection [Glenstrup95] • Quick glance menu selection method [Ohno98] (a menu command shows a command name, a selection mark, and a selection area) • faster than the mouse • more errors than with the mouse • lacks a good canceling method

  21. Command-based gaze interaction - Selection • MAGIC pointing [Zhai99] • combined use of gaze and mouse • gaze is used to warp the cursor to the vicinity of the target object • inside a threshold circle the gaze does not affect the cursor • the fine adjustment is done with the mouse [Figure: the gaze position reported by the eye tracker, the cursor warped to it, and the eye tracking boundary within which the target lies with 99% confidence] • Two different approaches were experimented with • the cursor warps to every new object the user looks at ("liberal") • the cursor does not warp until the user actuates it ("conservative") • The conservative way was slightly slower than plain mouse selection, but the liberal way was faster than the mouse • Test persons' reactions were positive (see the sketch below)
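A sketch of the two warping policies described above; the threshold radius and function shape are illustrative assumptions, not the implementation of [Zhai99]:

```python
import math

def magic_update(cursor, gaze, mouse_moved, liberal, threshold_r=120.0):
    """One input frame of MAGIC-style cursor warping.

    cursor, gaze: (x, y) in pixels
    liberal=True: warp towards every new gaze position
    liberal=False ("conservative"): warp only once the mouse moves
    """
    dist = math.hypot(gaze[0] - cursor[0], gaze[1] - cursor[1])
    if dist <= threshold_r:
        return cursor    # inside the circle, gaze never moves the cursor
    if liberal or mouse_moved:
        return gaze      # warp to the vicinity of the target
    return cursor        # conservative: wait for manual actuation
```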

  22. Command-based gaze interaction - Menus, dragging, scrolling, window manipulation [Jacob98] • Gaze-controlled pull-down menus • using dwell time did not work out very well; the time was either too long or too error-prone • gaze + a hardware button worked better • Dragging of objects (with gaze only, and with gaze + a hardware button) • performed better than most of the other experiments • using gaze + a hardware button felt natural • Scrolling text in a window [Figure: a text window whose scroll arrows react to gaze - "Usually we have to grab the mouse and click in the scrollbar when we want to read the text on the next page; now just look at the arrows"] • Listener window control (a scrolling sketch follows below)
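An illustrative gaze-scrolling sketch matching the text-window example above; it reuses the `DwellSelector` sketch from slide 19, and the region names and `view` interface are invented for illustration:

```python
def gaze_scroll(selector, hit_region, t_ms, view):
    """hit_region: "arrow_up", "arrow_down", or None, depending on where
    the gaze currently falls; view: any object with scroll_pages(n)."""
    fired = selector.update(hit_region, t_ms)
    if fired == "arrow_down":
        view.scroll_pages(+1)    # dwell on the lower arrow: next page
    elif fired == "arrow_up":
        view.scroll_pages(-1)    # dwell on the upper arrow: previous page
```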

  23. Noncommand gaze interfaces • Multimodal interfaces are heading towards task-oriented (and user-oriented) interfaces instead of command-oriented ones • In noncommand interfaces the computer monitors the user's actions instead of waiting for the user's commands [Nielsen93] • In most cases natural eye movement information could be valuable to the application

  24. Noncommand gaze interfaces • The Little Prince application [Starker90] • an example of "IES media" (interest and emotion sensitive) • iEye project (University of Tampere, SMI/Germany, Giunti Ilabs/Italy, Conexor/Espoo and University of Nottingham/England; started in January 2000) • Ship database example [Jacob98][Siebert00]

  25. Noncommand gaze interfaces (3/3) - Eye position recognition • 3D displays exploiting eye position recognition

  26. References
[Glenstrup95] Glenstrup, Arne John and Engell-Nielsen, Theo, Eye Controlled Media: Present and Future State. Thesis, University of Copenhagen, Institute of Computer Science, 1995. Available at http://www.diku.dk/~panic/eyegaze/article.html
[Jacob98] Jacob, R.J.K., "The Use of Eye Movements in Human-Computer Interaction Techniques: What You Look At is What You Get." ACM Transactions on Information Systems, 9 (3), 152-169, 1991. Reprinted with commentary in Readings in Intelligent User Interfaces, ed. M.T. Maybury and W. Wahlster, Morgan Kaufmann, San Francisco, 1998, 65-83.
[Nielsen93] Nielsen, Jakob, Noncommand User Interfaces. CACM 36 (4), 1993, 83-99.
[Ohno98] Ohno, Takehiko, Features of Eye Gaze Interface for Selection Tasks. Proc. APCHI'98, Japan, 1998, 176-181.
[Siebert00] Sibert, Linda E. and Jacob, Robert J. K., "Evaluation of Eye Gaze Interaction," submitted to Proc. of ACM CHI 2000. Available at http://www.eecs.tufts.edu/~jacob/papers/chi00.sibert.pdf
[Starker90] Starker, India and Bolt, Richard A., A Gaze-Responsive Self-Disclosing Display. Proc. ACM CHI'90, 3-9.
[Ware87] Ware, Colin and Mikaelian, Harutun H., An Evaluation of an Eye Tracker as a Device for Computer Input. Proc. ACM CHI'87, 183-188.
[Yarbus67] Yarbus, A. L., Eye Movements During Perception of Complex Objects. In L. A. Riggs, ed., Eye Movements and Vision, Plenum Press, New York, 1967, chapter VII, 171-196.
[Zhai99] Zhai, Shumin, Morimoto, Carlos and Ihde, Steven, Manual and Gaze Input Cascaded (MAGIC) Pointing. Proc. of ACM CHI 1999, 246-253.

  27. Project ideas • Combined programming and research projects • comparison of fixation algorithms • implementation and evaluation of MAGIC pointing • evaluating (some) gaze-controlled interaction components (implementation and tests) • eye behavior when watching stereograms • Programming projects • gaze-assisted word processing (selection of text) • gaze control in window management • implementation of a real-time gaze-trail visualization and playback environment • Research projects • gaze control in virtual environments • 3D vision
