
Eyes Alive


Presentation Transcript


  1. Eyes Alive Sooha Park Lee • Jeremy B. Badler • Norman I. Badler University of Pennsylvania • The Smith-Kettlewell Eye Research Institute Presentation Prepared By: Chris Widmer, CSE 4280

  2. Outline • Introduction • Motivation • Background • Overview of System • Descriptions • Results • Conclusions

  3. Introduction • Eye movement is an important expressive technique • Statistical eye movement model based on empirical data

  4. Motivation • Natural-looking eye movement for animations of close-up face views • Accurate eye movement has traditionally been difficult to attain in animations • No prior proposals for saccadic eye movement suited to speaking/expressive faces • Recent interest in the construction of human facial models

  5. Background • To build a realistic face model: • Geometry modeling • Muscle behavior • Lip synchronization • Text synthesis • Research has traditionally not focused on eye movement.

  6. Background • Eyes are essential for non-verbal communication • Regulate the flow of conversation • Search for feedback • Express emotion • Influence behavior • New approach based on statistical data and empirical studies

  7. Saccades • Rapid movements of both eyes from one gaze position to another • The only eye movement that can be executed consciously • Balance the conflicting demands of speed and accuracy • Magnitude – angle the eyeball rotates to change position • Direction – 2D angle of the rotation, with 0 degrees pointing right • Duration – time of the movement • Inter-saccadic interval – time between saccades

  8. Saccades • Example: magnitude 10 degrees, direction 45 degrees • Rotating 10 degrees, right-upward • Initial/final acceleration: 30,000 deg/sec² • Peak velocity: 400–600 deg/sec • Reaction time: 180–220 msec • Duration and velocity are functions of magnitude • Duration approximation (sketched below): D = D0 + d · A, where D = duration, A = amplitude (deg), d = increment in duration per degree (2–2.7 msec/deg), D0 = intercept (20–30 msec) • Often accompanied by head rotation
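A minimal Python sketch of the duration model above; the values d = 2.35 msec/deg and D0 = 25 msec are illustrative midpoints of the ranges quoted on the slide, not values fixed by the paper:

```python
def saccade_duration_ms(amplitude_deg, d=2.35, d0=25.0):
    """Linear duration model D = D0 + d * A.

    d  -- increment in duration per degree (2-2.7 msec/deg)
    d0 -- intercept (20-30 msec)
    """
    return d0 + d * amplitude_deg

# A 10-degree saccade: 25 + 2.35 * 10 = 48.5 msec.
print(saccade_duration_ms(10.0))
```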

  9. Background • Three Functions of Gaze • Sending Social Signals • Open Channel to Receive Information • Regulate Flow of Conversation

  10. Overview of System • Eye-tracking images analyzed, statistically based model generated using Matlab • Lip movements/Eye Blinks/Head Rotation analyzed by alterEGO face motion analysis system

  11. Overview of System • Face Animation Parameter (FAP) File • Eye Movement Synthesis System (EMSS) • Adds eye movement data to FAP file • Modified from face2face’s animator plug-in for 3D Studio Max

  12. Analysis of Data • Eye movements recorded with an eye-tracking visor (ISCAN) – a monocle and two miniature cameras • One camera views the environment from the left eye's perspective; the other records a close-up image of the left eye • The device tracks gaze by comparing the corneal reflection of the light source to the location of the pupil center • The reflection acts as a fixed reference point while the pupil position changes during movement

  13. Analysis of Data • Pupil position found using pattern mapping • Grey level thresholded at a default value using the Canny edge-detection operator • Positional histograms along the X and Y axes calculated • The two center points with maximum correlation are chosen (sketched below)
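A simplified sketch of the positional-histogram idea, assuming a grayscale eye image as a NumPy array; the threshold value and the peak-picking step stand in for the real system's threshold and correlation search:

```python
import numpy as np

def pupil_center(eye_img, threshold=60):
    """Estimate the pupil center from a grayscale eye image (H x W).

    Pixels below the grey-level threshold are treated as pupil; the
    positional histograms along X and Y are their column/row sums, and
    the histogram peaks give the center estimate.
    """
    mask = eye_img < threshold      # pupil is darker than iris/sclera
    hist_x = mask.sum(axis=0)       # positional histogram along X
    hist_y = mask.sum(axis=1)       # positional histogram along Y
    return int(hist_x.argmax()), int(hist_y.argmax())

# Synthetic test: a dark disc centered at (40, 25) in a bright image.
img = np.full((60, 80), 200, dtype=np.uint8)
yy, xx = np.mgrid[:60, :80]
img[(xx - 40) ** 2 + (yy - 25) ** 2 < 8 ** 2] = 20
print(pupil_center(img))            # -> (40, 25)
```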

  14. Analysis of Data (figure slide; image not reproduced in the transcript)

  15. Analysis of Data (figure slide; image not reproduced in the transcript)

  16. Analysis of Data • Saccade magnitude • Frequency of each specific magnitude modeled with a least-squares fit • d = distance traversed by the pupil center • r = radius of the eyeball (half of x_max) • P = % chance of occurrence • A = saccade magnitude (degrees)
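The fitted frequency function itself is not reproduced on this slide, so the sketch below covers only the geometric conversion from pupil displacement to saccade magnitude, under the assumption of a frontal camera view (so the in-image displacement is d ≈ r · sin A):

```python
import math

def saccade_magnitude_deg(d, r):
    """Convert a pupil-center displacement to a rotation angle A.

    d -- distance traversed by the pupil center in the image (pixels)
    r -- eyeball radius, approximated as half the image width (x_max / 2)
    Assumes a frontal view, so d ~= r * sin(A).
    """
    return math.degrees(math.asin(min(d / r, 1.0)))

# Pupil moved 30 px in a 320-px-wide image, so r ~= 160 px.
print(saccade_magnitude_deg(30, 160))   # ~10.8 degrees
```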

  17. Analysis of Data • Saccade duration measured with a 40 deg/sec velocity threshold • Used to derive an instantaneous velocity curve for every saccade • Duration of each movement normalized to 6 frames (sketched below) • Two classes of gaze: • Mutual • Away
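A sketch of the 6-frame normalization, assuming each saccade's instantaneous velocity trace is available as a sequence of samples; resampling onto a common normalized time axis lets saccades of different durations be averaged into one canonical velocity curve:

```python
import numpy as np

def normalize_velocity(velocity_trace, n_frames=6):
    """Resample one saccade's instantaneous velocity (deg/sec) onto
    n_frames equally spaced points of normalized time [0, 1]."""
    t_src = np.linspace(0.0, 1.0, len(velocity_trace))
    t_dst = np.linspace(0.0, 1.0, n_frames)
    return np.interp(t_dst, t_src, velocity_trace)

# Two saccades of different lengths mapped to the same 6-frame profile.
v1 = normalize_velocity([0, 120, 380, 520, 340, 90, 0])
v2 = normalize_velocity([0, 200, 480, 410, 60])
print(((v1 + v2) / 2).round(1))      # averaged canonical curve
```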

  18. Talking vs. Listening (figure slide; statistics not reproduced in the transcript)

  19. Synthesis of Eye Movement • Attention Monitor (AttMon) • Parameter Generator (ParGen) • Saccade Synthesizer (SacSyn)

  20. Head Rotation Monitoring (figure slide; image not reproduced in the transcript)

  21. Synthesis of Natural Eye Movement • AttMon determines mode, changes in head rotation, gaze state • ParGen determines saccade magnitude, direction, duration, and instantaneous velocity • SacSyn synthesizes and codes movements into FAP values

  22. Synthesis of Natural Eye Movement • Magnitude determined by the inverse of the fitting function shown earlier (Slide 16) • The mapping guarantees the same probability distribution as the empirical data • Direction determined by head rotation (threshold) and a distribution table • A uniform random number from 0 to 100 is drawn • 8 non-uniform intervals are assigned to the respective directions (sketched below)
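A sketch of the interval lookup; the cumulative percentages in the table below are placeholders for illustration only, since the real interval boundaries come from the recorded direction frequencies:

```python
import random

# Hypothetical cumulative upper bounds (0-100) for the 8 directions.
DIRECTION_TABLE = [
    (15.5, "right"), (27.0, "up-right"), (38.0, "up"), (49.5, "up-left"),
    (65.0, "left"), (76.5, "down-left"), (88.0, "down"), (100.0, "down-right"),
]

def sample_direction(rng=random):
    """Map a uniform draw in [0, 100) through the non-uniform intervals,
    reproducing the empirical direction frequencies."""
    u = rng.uniform(0.0, 100.0)
    for upper, direction in DIRECTION_TABLE:
        if u < upper:
            return direction
    return DIRECTION_TABLE[-1][1]   # guard against u == 100.0

print([sample_direction() for _ in range(5)])
```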

  23. Synthesis of Natural Eye Movement • Duration determined by the first equation, with the respective values for d and D0 • Velocity determined using the fitted instantaneous velocity curve (sketched below) • The SacSyn system calculates a sequence of coordinates for the eye centers • Translated into FAP values and rendered in 3D Studio MAX • The face2face animation plug-in renders the animations with the correct parameters
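Putting the pieces together, a sketch of how one sampled saccade could be turned into per-frame rotation offsets before FAP encoding; the frame rate, the d/D0 midpoints, and the placeholder velocity shape are assumptions, and the final translation to FAP values is omitted:

```python
import numpy as np

def synthesize_saccade(magnitude_deg, velocity_curve, fps=30.0,
                       d=2.35, d0=25.0):
    """Per-frame cumulative rotation (degrees) for one saccade.

    Duration comes from D = D0 + d * A; the fitted instantaneous
    velocity curve is resampled to the frame count and scaled so the
    increments accumulate to the requested magnitude.
    """
    duration_ms = d0 + d * magnitude_deg
    n_frames = max(2, round(duration_ms / 1000.0 * fps))
    t = np.linspace(0.0, 1.0, n_frames)
    v = np.interp(t, np.linspace(0.0, 1.0, len(velocity_curve)),
                  velocity_curve)
    steps = v / v.sum() * magnitude_deg   # per-frame increments
    return np.cumsum(steps)               # ends exactly at magnitude_deg

curve = [0.05, 0.30, 0.35, 0.20, 0.10]   # placeholder velocity shape
print(synthesize_saccade(15.0, curve).round(2))
```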

  24. Results • 3 Different Methods Tested • Type 1 -> No Saccadic Movements • Type 2 -> Random Eye Movement • Type 3 -> Sampled from Estimated Distributions (synchronized with head movements) • Tests were subjective

  25. Results • Q1: Did the character on the screen appear interested in (5) or indifferent to (1) you? • Q2: Did the character appear engaged (5) or distracted (1) during the conversation? • Q3: Did the personality of the character look friendly (5) or not (1)? • Q4: Did the face of the character look lively (5) or deadpan (1)? • Q5: In general, how would you describe the character?

  26. Results (figure slide; score charts not reproduced in the transcript)

  27. Conclusions • Saccade model for talking and listening modes • 3 different eye movements compared: stationary, random, model-based • Model-based scored significantly higher • Eye-tracking data recorded from a subject • No new recordings are needed for each character: this model allows any number of unique eye movement sequences

  28. Drawbacks and Improvements • Aliasing with small movements • Difficulty separating eye movement from head movement during data gathering • Future enhancements: • Eye/eyelid data • More modeled gaze patterns • More subjects for data • A scan-path model for close-up images

  29. Developments • N. Badler, Director, Center for Human Modeling and Simulation • Digital human modeling/behavior • "Jack" software • Simulation of workflow using virtual people

  30. References • Lee, Sooha Park, Badler, Jeremy B., and Badler, Norman I., "Eyes Alive," ACM SIGGRAPH 2002. • http://www.cis.upenn.edu/~sooha/home.html • http://www.cis.upenn.edu/~badler/ • http://cg.cis.upenn.edu/hms/research.html
