
Presentation for IEEE VR’13 Estimating the Gaze of a Virtuality Human


Presentation Transcript


  1. P319 Presentation for IEEE VR’13: Estimating the Gaze of a Virtuality Human • David J. Roberts (Presenting), John Rae, Tobias W. Duckworth, Carl M. Moore, and Rob Aspin

  2. Motivation • Video Conferencing: faithful communication of appearance ✔ • Immersive Collaborative Virtual Environments: faithful communication of attention ✔ • Immersive Virtuality Telepresence: faithful communication of both appearance and attention?

  3. Shape from silhouette • Source image from camera 1 • Reconstruct shape from silhouettes • Texture shape from original images • A typical method for creating full 3D avatars from live video streams (a rough sketch of the reconstruction step follows).
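The slide summarizes the pipeline at a high level. As a rough illustration of the shape-from-silhouette step only (a minimal voxel-carving sketch, not the authors' implementation), the code below assumes each camera supplies a 3×4 projection matrix and a binary silhouette mask:

```python
import numpy as np

def carve_visual_hull(cameras, grid_min, grid_max, resolution=64):
    """Voxel-based shape from silhouette (visual hull).

    cameras: list of (P, mask) pairs, where P is a 3x4 projection matrix
             and mask is a binary silhouette image (H x W).
    Returns a boolean occupancy grid: a voxel survives only if it projects
    inside the silhouette of every camera.
    """
    # Regular grid of voxel centres in world coordinates (homogeneous).
    axes = [np.linspace(grid_min[i], grid_max[i], resolution) for i in range(3)]
    X, Y, Z = np.meshgrid(*axes, indexing="ij")
    pts = np.stack([X, Y, Z, np.ones_like(X)], axis=-1).reshape(-1, 4)

    occupied = np.ones(len(pts), dtype=bool)
    for P, mask in cameras:
        proj = pts @ P.T                       # N x 3 homogeneous image points
        u = proj[:, 0] / proj[:, 2]
        v = proj[:, 1] / proj[:, 2]
        h, w = mask.shape
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h) & (proj[:, 2] > 0)
        # Voxels projecting outside the image or outside the silhouette are carved away.
        hit = np.zeros(len(pts), dtype=bool)
        ui = np.clip(u[inside].astype(int), 0, w - 1)
        vi = np.clip(v[inside].astype(int), 0, h - 1)
        hit[inside] = mask[vi, ui] > 0
        occupied &= hit

    return occupied.reshape(resolution, resolution, resolution)
```

Texturing then reprojects the original camera images onto the carved shape, which is where the camera-arrangement effects discussed on the following slides come in.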

  4. However, the camera arrangement impacts the reconstructed shape and how texture is applied to it (slide figure contrasts one bad and two good arrangements).

  5. Motivation Sketch by Wollaston in 1824

  6. Related Work – Eye-gaze in Telepresence • Attention can be faithfully communicated in ICVEs and appearance in video conferencing. • Speculation that video-based reconstruction could faithfully communicate both what someone is looking at and what they look like (Roberts et al., IEEE VR’09). • Video-based reconstruction used to capture and reconstruct the occupant of a CAVE (M. Gross et al., 2003). • Empirical evidence that eye gaze can be supported through video-based reconstruction, but only tested in unrealistically favourable conditions (Kim et al., 2012). • For other related work see the article.

  7. Samples of our previous work

  8. Related Work – Psychology • Estimation of gaze relies on a comparison of eyes and head/face (W. H. Wollaston, 1824). • Most social eye-gaze behaviour is between people < 4 m apart (E. Hall, 1966). • The white of the eyes being especially prominent in humans might be linked to its social importance (M. Argyle and M. Cook, 1976). • For other related work see the article.

  9. Point of departure • Both the framing of the eyes in the head and the relative turn of each impact real-world gaze estimation. • The clarity with which the relative turn of eye and head is represented in a virtuality human is affected by the relationship between the camera arrangement and the VBR process. • The impact of display parallax in IVT on head-gaze estimation had been studied, but only under unrealistically favourable conditions of camera placement and eye orientation.

  10. Theory – Impact of technology on shared space • Access Grid • Narrow-baseline tele-immersion • Collaborative Virtual Environments • Immersive Collaborative Virtual Environments

  11. Core Principle • To communicate gaze between people moving around a teleshared place (or places) requires that both the body, head and eyes of the avatar and the viewpoint into the shared space move with each person.

  12. Research Questions • It is harder to estimate someone’s gaze when their eyes are turned: is this also the case for virtuality humans? • Determining whether one is being looked at is important in social interaction: is it possible to accurately determine if a virtuality human is looking at you? • Fewer texturing cameras = less bandwidth = lower latency: is it harder to estimate gaze when texture is taken from fewer cameras? • Cameras often go above displays in telepresence settings: is it harder to estimate gaze when cameras look down steeply?

  13. Hypotheses • H1: The relative orientation of the eyes, but not of the body, to the head will significantly impact the accuracy of gaze estimation from a virtuality human. • H2: Gaze can be estimated through VBR to the accuracy underpinning social gaze in the natural world, i.e. 4° at ≈4 m. • H3: Reducing the number of texture cameras will significantly impact gaze estimation. • H4: Increasing the steepness of the texture camera to the face will significantly impact gaze estimation.
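To put the 4° at ≈4 m figure in H2 in concrete terms, a quick sanity check (my arithmetic, not from the slides): at a 4 m viewing distance, a 4° gaze error corresponds to a lateral offset of roughly tan(4°) × 4 m ≈ 0.28 m.

```python
import math

# Lateral offset that a 4 degree gaze-estimation error corresponds to
# at the ~4 m social-gaze distance cited in H2.
distance_m = 4.0
error_deg = 4.0
offset_m = math.tan(math.radians(error_deg)) * distance_m
print(f"{offset_m:.2f} m")   # ~0.28 m
```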

  14. Approach • Task: n = 22 participants rotated the virtuality human until feeling most looked at. • Independent variables: relative orientations of head, body and eyes; cameras used for texturing. • Dependent variable: accuracy of estimation. • Analysis – Quantitative: statistical significance at p < .05; practical significance: a quartile moves across the 4° line by > 1°. Qualitative: examining reconstructions and comparing them to source images.
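The slide states the significance criteria but not the specific test. As one plausible sketch of the quantitative step (both the per-participant error values and the choice of a paired Wilcoxon test are my assumptions, not taken from the paper):

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical per-participant absolute estimation errors (degrees)
# for two camera arrangements; real values would come from the experiment.
errors_surround = np.array([2.1, 3.5, 2.8, 4.9, 3.2, 2.4, 5.1, 3.8])
errors_steep_single = np.array([6.2, 7.1, 4.8, 9.3, 5.5, 6.9, 8.2, 7.4])

# Paired nonparametric comparison at the alpha = .05 level used on the slide.
stat, p = wilcoxon(errors_surround, errors_steep_single)
print(f"W = {stat:.1f}, p = {p:.3f}, significant = {p < 0.05}")

# Practical-significance check from the slide: does a quartile move across
# the 4 degree line by more than 1 degree between conditions?
q_surround = np.percentile(errors_surround, [25, 50, 75])
q_steep = np.percentile(errors_steep_single, [25, 50, 75])
print("quartiles (surround):", q_surround)
print("quartiles (steep single):", q_steep)
```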

  15. Software environment – Analysis mode

  16. Gaze Poses • Subject captured while orientating body, head and eyes toward three respective targets. • All combinations of eyes, head and body × left, ahead and right. • Captured subject chosen due to outstanding physical appearance.
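The slide implies a 3 × 3 × 3 pose design. As a small illustration (mine, not the authors' tooling), the snippet below enumerates the 27 combinations and shows one way to read the wildcard shorthand such as "x0x" that later slides use for the eyes-centred subset:

```python
from itertools import product

# Three articulators x three orientations = 27 gaze poses,
# matching "all combinations of eyes, head and body; left, ahead and right".
ORIENTATIONS = ("L", "0", "R")        # left, ahead (centred), right
ARTICULATORS = ("eyes", "head", "body")

poses = [dict(zip(ARTICULATORS, combo))
         for combo in product(ORIENTATIONS, repeat=len(ARTICULATORS))]
print(len(poses))                      # 27

# Later slides group poses with a wildcard shorthand such as "x0x";
# for example, the "eyes centred" subset (my reading of that notation):
eyes_centred = [p for p in poses if p["eyes"] == "0"]
print(len(eyes_centred))               # 9
```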

  17. Camera Arrangements

  18. H1: Estimations significantly better when eyes centred in head • Over all gaze poses, a median accuracy of 4° is achieved by Surround and Arc, with both Shallow Single and Pair within 1° of this limit. • Steep Single stands out as clearly the worst performer.

  19. H2: Estimations to an accuracy underpinning social gaze • Across all camera arrangements, the median of median estimations is always, and only, below 4° when the eyes are centred (x0x).

  20. H3: Accuracy of estimation proportional to number of cameras • Median accuracy improves with the number of texturing cameras, both when the eyes are centred and when turned, and is within 4° when centred.

  21. H4: Steepness of texture camera to face impacts accuracy of estimation • Estimations with the texture camera close to eye level outperform those with a texture camera looking down at a steeper angle. • Worst when the eyes are turned and the camera angle is steeper.

  22. Qualitative Analysis • Ordered in terms of accuracy of estimation: Surround, Arc, Pair, Single Shallow, Single Steep. • Interestingly, estimations are less accurate from more “life-like” reproductions than from those where borders between textures are clearly visible. • It is also interesting that estimation is less accurate from a steeper texture camera.

  23. Qualitative Analysis • Surround → dark of the eyes keeps a constant shape across poses. • Shallow (single front) → dark of the eyes is stretched and the nose twists in some gaze poses.

  24. Stretching of the dark of the eyes • The dark of the eye appears stretched when the viewpoint is a long way from the texture camera. • Resolved when texturing from surround cameras (0,0,0).

  25. Causes of stretching of the dark of the eye • For (L',L,R): a camera to the side of the face struggles to resolve the shadow in the corner of the eye from the dark of the eye. • Textures viewed from a very different angle to that at which they were captured and projected can appear to stretch (L',L,R). • Less stretch with (R',R,L): the corner of the eye is not in shadow.
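A minimal sketch of the quantity this slide points to: the angle at a surface point between the direction to the observer's viewpoint and the direction to the texturing camera (the larger it is, the more a projected texture tends to appear stretched). Coordinates and function names are illustrative, not from the paper:

```python
import numpy as np

def view_texture_angle(point, observer_pos, texture_cam_pos):
    """Angle (degrees) at a surface point between the direction to the
    observer's viewpoint and the direction to the texturing camera."""
    to_obs = observer_pos - point
    to_cam = texture_cam_pos - point
    cos_a = np.dot(to_obs, to_cam) / (np.linalg.norm(to_obs) * np.linalg.norm(to_cam))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example: eye region roughly at the origin, observer straight ahead at 4 m,
# texturing camera off to the side (illustrative coordinates, in metres).
print(view_texture_angle(np.array([0.0, 0.0, 0.0]),
                         np.array([0.0, 0.0, 4.0]),
                         np.array([3.0, 0.0, 2.0])))
```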

  26. Impact of steepness of texture camera

  27. Impact of steepness of texture camera • The face appears to droop when viewed from below the texture camera. • The scale of the droop is evident when both shallow and steep textures are used.

  28. Tested Hypotheses • H1 proved to be clearly true: the relative orientation of the eyes, but not of the body, to the head significantly impacts the accuracy of gaze estimation. Practically significant: when the eyes are centred, 3° better and within the 4° limit of social gaze perception. Statistically significant, p = .001. • H2 proved to be clearly true: gaze can be estimated through VBR to the accuracy underpinning social gaze in the natural world. The Surround camera arrangement allows participants to estimate gaze to within 4° for the upper quartile of samples.

  29. Tested Hypotheses • H3 proved to be partially (barely) true: reducing the number of texture cameras significantly impacts gaze estimation. Arguably of practical significance: the median crossed the 4° line, but by < 1°. Statistical significance between Surround and Pair only, p = .031. • H4 proved to be clearly true: increasing the steepness of the texture camera to the face significantly impacts gaze estimation. Practically significant: the lower quartile moved across the 4° line and the median by 3°. Statistically significant, p = .031.

  30. Conclusions • Eye gaze can be estimated from a virtuality human to accuracies underpinning social eye-gaze behaviour in the natural world. • However, unless cameras are placed correctly with respect to likely gaze poses, erroneous attributes of form and texture, and in particular the relationship between the two, significantly impact the accuracy of estimation. • Specifically, estimation can achieve accuracies that allow a participant to determine whether they are being looked at, provided the angle between the texture camera and the observer’s viewpoint is shallow; however, when the eyes are turned, a comprehensive array of cameras is preferable.

  31. Thank you and Questions • Collaborators in Telepresence Research • UCL, London: Anthony Steed, Will Steptoe & Wole Oyekoya • Salford, Manchester: Nadim Adi, Oliver Otto, Norman Murray & Terrance Fernando • DLR - German Space Science: Robin Wolff (formerly Salford) • PERCRO, Pisa: Franco Tecchia, Paolo & Giuseppe • Reading: Paul Sharkey & Alessio Murgia • Roehampton, London: Estefania Guimaraes, Paul Dickenson & Penny Stribling • Chalmers, Göteborg: Ilona Heldal • BBC R&D: Bruce Wier & Oliver Grau • UPdM, Madrid: Adriana Pena • LRZ Munich: Dieter Kranzlmüller & Christoph Anthes • NUIM, Dublin: Damien Marshall & Thomas Ward • Acknowledgements: the authors wish to thank the EPSRC and OMG VICON for funding of PhD students, HEFCE for funding the equipment under SRIF, and John O’Hare of Salford for technical support • More About Me: https://www.researchgate.net/profile/David_Roberts15 www.cve.salford.ac.uk
