
Contribution to the study of visual, auditory and haptic rendering of information of contact in virtual environments

  1. Contribution to the study of visual, auditory and haptic rendering of information of contact in virtual environments 9/12/2008 Jean Sreng Advisors: Claude Andriot, Anatole Lécuyer Director: Bruno Arnaldi

  2. Introduction • Context: Manipulation of solid objects in Virtual Reality • Example applications: industrial virtual assembly / disassembly / maintenance • Focus: perception, simulation, and rendering of contacts between virtual objects

  3. Outline • State of the art on perception, simulation and rendering of contact • Contributions: • Integrated 6DOF multimodal rendering approach • Visual rendering of multiple contacts • Spatialized haptic rendering of contact • Conclusion

  4. Human perception of contact • Visual perception of contact: • Stereoscopy (Hu et al. 2000) • Motion parallax (Wanger et al. 1992) • Shadows (Wanger et al. 1992, Hu et al. 2002) • Auditory perception of contact: • Contact properties can be directly perceived (Gaver 1993) • Contact sounds convey information about shape and material (Klatzky 2000, Rocchesso 2001)

  5. Haptic perception • Haptic perception provides an intuitive way to feel the contact (Loomis et al. 1993): • Tactile perception (patterns at the surface of the skin) • Kinesthetic perception (positions and forces) • Physical properties can be perceived through the contact (Klatzky et al. 2003): • Shape / Texture / Temperature • Weight / Contact forces • Perception of contact features through vibrations: • Material (Okamura et al. 1998) • Texture (Lederman et al. 2001)

  6. Multimodal perception of contact • Known interactions between modalities: • Visual-auditory interaction • Ex: Sound can shift the perception of impact (Sekuler et al. 1997) • Visual-haptic interaction • Ex: Pseudo-haptic feedback (Lécuyer et al. 2002) • Auditory-haptic interaction • Ex: Sound can modulate the perception of roughness (Peeva et al. 1997)

  7. Simulation of contact • Multiple contact models (impact/friction): • Rigid (Newton/Huygens impact law, Coulomb/Amontons friction law) • Locally deformable (Hunt-Crossley impact law, LuGre friction law) • Multiple simulation methods: • Collision detection: VPS (McNeely 1999), LMD (Johnson 2003) • Physical simulation: Constraint-based (Baraff 1989), Penalty (Moore 1988)
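
To illustrate the penalty approach mentioned on the slide (Moore 1988), here is a minimal Python sketch of a penalty-based contact force: a stiff spring-damper acts along the contact normal while the objects interpenetrate. The stiffness and damping gains are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def penalty_contact_force(depth, normal, rel_velocity, k=2000.0, c=5.0):
    """Penalty-based contact force: a stiff spring-damper pushes the
    objects apart along the contact normal while they interpenetrate.
    `depth` is the penetration depth (> 0 when in contact) and `normal`
    points from the penetrated object toward the moving one."""
    if depth <= 0.0:
        return np.zeros(3)                      # no penetration, no force
    # Spring term resists penetration; damper term dissipates impact energy.
    vn = float(np.dot(rel_velocity, normal))    # normal relative velocity
    magnitude = k * depth - c * vn
    return max(magnitude, 0.0) * np.asarray(normal, dtype=float)

# 1 cm penetration, approaching at 0.1 m/s along -y:
f = penalty_contact_force(0.01, [0.0, 1.0, 0.0], [0.0, -0.1, 0.0])
```

The force magnitude is clamped at zero so the penalty never pulls the objects together, a common guard in penalty schemes.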

  8. Visual rendering • Visual rendering of the information of contact: • Proximity: Color (McNeely et al. 2006) • Contact: Color (Kitamura et al. 1998), Glyph (Redon et al. 2002) • Force: Glyph (Lécuyer et al. 2002)

  9. Auditory rendering • Realistic rendering of contact sounds • Specific: Impact / Friction / Rolling • Different techniques: FEM (O’Brien 2003), modal synthesis (Van den Doel 2001) • Symbolic rendering (Richard et al. 1994, Massimino 1995, Lécuyer et al. 2002) • Associate information with a sound effect • Information: Distances / Contact / Forces • Sound effect: Amplitude / Frequency
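
A minimal sketch of the symbolic rendering idea above: contact information (distance, force) is mapped onto sound-effect parameters (amplitude, frequency). The specific mapping and the parameter ranges below are illustrative assumptions, not the ones used in the thesis.

```python
def symbolic_contact_sound(distance, force, d_max=0.1, f_max=50.0):
    """Symbolic auditory rendering: instead of synthesizing a physically
    realistic contact sound, map contact information onto sound parameters.
    Illustrative mapping:
      - proximity distance -> amplitude (closer = louder, silent beyond d_max)
      - contact force      -> pitch (stronger = higher, 220-880 Hz)"""
    amplitude = max(0.0, 1.0 - distance / d_max)            # 1 at contact, 0 far away
    frequency = 220.0 + 660.0 * min(force, f_max) / f_max   # clamp force at f_max
    return amplitude, frequency

amp, freq = symbolic_contact_sound(distance=0.0, force=0.0)   # touching, no force
```

The returned pair would then drive a simple oscillator, which is what makes this rendering "symbolic": the sound carries the information without imitating the physics.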

  10. Haptic display of contact • Haptic devices (Burdea 1996): • Force feedback • Tactile feedback • Haptic rendering of contact: • Closed loop (McNeely et al. 1999, Johnson 2003): tradeoff between stability and stiffness • Open loop (Kuchenbecker et al. 2006): improves the realism of impact

  11. Objectives of this thesis • Improve the simulation, rendering, and perception of contacts in virtual environments • Protocol: • Integrated 6DOF approach for multisensory rendering techniques • Study of rendering techniques to improve the perception of contact position • Hypothesis of improvement • Experimental implementation • Experimental evaluation

  12. Outline • Integrated 6DOF multimodal rendering • Visual rendering of multiple contacts • Spatialized haptic rendering

  13. Objectives • Multiplicity of techniques: • Contact simulation • Sensory rendering • How can we integrate all these techniques seamlessly, independently of the simulation? • Contribution / Overview: • Contact formulation (states / events) • Examples of contact rendering based on this formulation: Visual, Auditory, Tactile, Force feedback

  14. Contact formulation • Simple contact formulation based on: • Proximity points a, b • Force f

  15. Contact formulation • From this formulation: • Contact states (Free motion / Contact) • Temporal evolution of contact states as events (Impact / Detachment) • Higher-level information • Adapted to many specific rendering techniques

  16. Determination of states and events • The contact condition is defined from the proximity points • The events are defined from the local linear velocity and the contact normal: • Impact: transition from free motion to contact • Detachment: transition from contact to free motion
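
The state/event logic above can be sketched as follows; the epsilon threshold on the proximity-point distance is an assumption of this sketch (the slides define the exact conditions).

```python
import numpy as np

# Contact states of the formulation; events are transitions between them.
FREE, CONTACT = "free motion", "contact"

def contact_state(a, b, eps=1e-4):
    """In contact when the proximity points a and b (almost) coincide.
    The eps threshold is an illustrative assumption."""
    return CONTACT if np.linalg.norm(np.subtract(a, b)) < eps else FREE

def contact_event(prev_state, new_state):
    """Impact = entering contact, detachment = leaving contact."""
    if prev_state == FREE and new_state == CONTACT:
        return "impact"
    if prev_state == CONTACT and new_state == FREE:
        return "detachment"
    return None

s0 = contact_state([0, 0, 0], [0, 0, 0.05])   # points 5 cm apart: free motion
s1 = contact_state([0, 0, 0], [0, 0, 0.0])    # points coincide: contact
event = contact_event(s0, s1)                 # the transition is an impact
```

In the architecture described next, these states and events are what the visual, auditory, tactile, and force-feedback renderers subscribe to.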

  17. Multimodal rendering architecture

  18. Multimodal rendering architecture • Superimpose state and event information

  19. Example of visual rendering (Contact / Impact / Detachment)

  20. Example of auditory rendering

  21. Multimodal rendering platform • Multimodal: • Visual • Auditory • Tactile • Force feedback © Hubert Raguet / CNRS photothèque

  22. Preliminary conclusion • We proposed a contact formulation (proximity/force): • Contact states and events • We developed a multimodal rendering architecture: • Visual (particles / pen) • Auditory (modal synthesis / spatialized) • Tactile (modal synthesis) • 6DOF haptic enhancement (open loop)

  23. Outline • Integrated 6DOF multimodal rendering • Visual rendering of multiple contacts • Spatialized haptic rendering

  24. Objectives • Context: complex-shaped objects • Multiple contacts • Difficult interaction • Help the user by providing position information • Contribution / Overview: • Display the information of proximity / contact / forces • Glyphs • Lights • Subjective evaluation

  25. Visual rendering of multiple contacts • Visualizing multiple proximity / contact / force positions (legend: proximity FAR → NEAR, contact forces LOW → HIGH)

  26. Visual rendering using glyphs (legend: proximity FAR → NEAR, contact forces LOW → HIGH)

  27. Visual rendering using glyphs

  28. Glyph filtering • Reduce the number of displayed glyphs • Determine “relevance” based on the movement

  31. Glyph filtering • The relevance is determined by comparing: • The local velocity v • The local normal d
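
A sketch of this comparison: ranking contacts by how directly the object is moving toward them, via a normalized dot product between the local velocity v and the local normal d. The exact relevance function and sign convention are assumptions of this sketch.

```python
import numpy as np

def glyph_relevance(v, d):
    """Relevance of a contact glyph from the local velocity v and the
    local normal d (assumed to point out of the surface toward the
    moving object): 1 when moving straight into the contact, 0 when
    moving away, tangentially, or not at all."""
    v, d = np.asarray(v, float), np.asarray(d, float)
    speed = np.linalg.norm(v)
    if speed == 0.0:
        return 0.0
    # Moving into the contact means v opposes d, hence the minus sign.
    return max(0.0, -float(np.dot(v, d)) / (speed * np.linalg.norm(d)))

def filter_glyphs(contact_normals, v, keep=3):
    """Keep only the `keep` most relevant contacts (one normal each)."""
    ranked = sorted(contact_normals,
                    key=lambda d: glyph_relevance(v, d), reverse=True)
    return ranked[:keep]
```

Filtering this way keeps the display readable on complex-shaped objects while still highlighting the contacts the user is about to push against.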

  32. Visual rendering using lights • Two types of lights: • Spherical lights • Conical lights

  33. Visual rendering using lights

  34. Subjective evaluation • Objective: Determine users’ preferences about the different techniques • Procedure: Participants were asked to perform an industrial assembly operation • Without visual cues • With each visual cue • Conducted on 18 subjects • They had to fill in a subjective questionnaire • Which effect: glyph / light / color change / size change / deformation • For which information: forces / distances / blocking / focus of attention

  35. Results • The visual effects were globally well appreciated (mean ranking; lower is better) • Significant effects: • Glyph size effect globally appreciated (distance / force) • Glyph deformation effect appreciated to provide force information • Light effect appreciated to attract visual attention

  36. Preliminary conclusion • We proposed a visual rendering technique to display multiple contact information: • Display of proximity / contact / force • Using glyphs / lights • We presented a filtering technique to reduce the number of displayed glyphs • We conducted a subjective evaluation: • Glyph size: proximity / force • Glyph deformation: force • Lights: focus the visual attention

  37. Outline • Integrated 6DOF multimodal rendering • Visual rendering of multiple contacts • Spatialized haptic rendering

  38. Objectives • Context: complex-shaped objects • Help the user by providing position information • Provide contact position information using: • Visual rendering (particles / glyphs / lights) • Auditory rendering (spatialized sound) • Can we provide this position information using haptic rendering? • Contribution / Overview: • Haptic rendering technique based on vibrations • Perceptive evaluation in a 1DOF case • 6DOF haptic rendering technique • Perceptive evaluation to determine rendering parameters • Subjective evaluation in a 6DOF case

  39. Haptic rendering of contact position • The impact between objects produces: • A reaction force of contact • High-frequency transient vibrations • These high-frequency transient vibrations depend on: • The object’s material (Okamura et al. 1998) • The object’s geometry • The impact position • Is it possible to perceive the impact position through these vibrations?

  40. Vibrations depending on impact position • Examine the vibrations produced by a simple object: a cantilever beam • The vibrations depend on the impact position

  41. Simulation of vibrations (Euler-Bernoulli) • General solution: Euler-Bernoulli model (EB)
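
As a sketch of this kind of simulation, the cantilever beam's response to an impact can be synthesized as a sum of damped modal sinusoids whose weights depend on the impact and sensing positions through the standard Euler-Bernoulli mode shapes. The beam length, fundamental frequency, and damping ratio below are illustrative assumptions, not the thesis's values.

```python
import numpy as np

# First eigenvalues beta_n * L of a clamped-free (cantilever) beam.
BETA_L = [1.8751, 4.6941, 7.8548]

def mode_shape(x, L, beta_l):
    """Euler-Bernoulli mode shape of a cantilever beam at position x."""
    b = beta_l / L
    sigma = (np.cosh(beta_l) + np.cos(beta_l)) / (np.sinh(beta_l) + np.sin(beta_l))
    return (np.cosh(b * x) - np.cos(b * x)
            - sigma * (np.sinh(b * x) - np.sin(b * x)))

def impact_response(x_impact, x_sensor, L=0.3, f1=100.0, zeta=0.01,
                    duration=0.05, sr=44100):
    """Vibration sensed at x_sensor for an impact at x_impact: each mode
    contributes a damped sinusoid weighted by its shape at both points."""
    t = np.arange(int(duration * sr)) / sr
    y = np.zeros_like(t)
    for bl in BETA_L:
        fn = f1 * (bl / BETA_L[0]) ** 2      # modal frequencies scale as (beta_n L)^2
        wn = 2 * np.pi * fn
        gain = mode_shape(x_impact, L, bl) * mode_shape(x_sensor, L, bl)
        y += gain * np.exp(-zeta * wn * t) * np.sin(wn * t)
    return y
```

With the sensor at the free end, an impact at the tip excites the modes much more strongly than an impact near the clamped root, which is exactly the position dependence the slide points out.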

  42. Simplified vibration patterns • Simplified patterns based on the physical behavior: • Possibly easier to perceive • Simplified computation • Chosen model: exponentially damped sinusoid • Amplitude changes with impact position • Frequency changes with impact position • Both amplitude and frequency change

  43. Simplified vibration patterns • Am: amplitude only • Fr: frequency only • AmFr: both (consistent) • AmCFr: both (conflicting)
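
The four patterns can be sketched as one exponentially damped sinusoid whose amplitude and/or frequency vary with the normalized impact position; the linear mappings and parameter ranges below are illustrative assumptions.

```python
import numpy as np

def damped_sinusoid(p, model, duration=0.05, sr=44100, tau=0.01):
    """Exponentially damped sinusoid y(t) = A * exp(-t/tau) * sin(2*pi*f*t)
    for a normalized impact position p in [0, 1] (0 = near the hand,
    1 = far). Models: Am (amplitude varies), Fr (frequency varies),
    AmFr (both, consistent), AmCFr (both, conflicting)."""
    t = np.arange(int(duration * sr)) / sr
    A, f = 1.0, 300.0                      # defaults when a cue is kept fixed
    if model in ("Am", "AmFr", "AmCFr"):
        A = 1.0 - 0.8 * p                  # farther impact -> weaker vibration
    if model in ("Fr", "AmFr"):
        f = 300.0 - 200.0 * p              # consistent: farther -> lower pitch
    elif model == "AmCFr":
        f = 100.0 + 200.0 * p              # conflicting: farther -> higher pitch
    return A * np.exp(-t / tau) * np.sin(2 * np.pi * f * t)
```

In AmCFr the two cues deliberately disagree, which is what lets the evaluation separate whether subjects rely on amplitude or on frequency.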

  44. Evaluation • Objective: “Determine whether it is possible to perceive the impact position using vibrations” • Population: 15 subjects • Apparatus: • Virtuose6D device • Noise-blocking headphones

  45. Procedure • Task: Two successive impacts. “Between these two impacts, which one was closest to the hand?” • 6 models: • 2 realistic models (Euler-Bernoulli): EB1, EB2 • 4 simplified models: Am, Fr, AmFr, AmCFr • 4 impact positions • 8 random repetitions • Total of 576 trials (40 min)

  46. Results of the quantitative evaluation • “How well was the subject able to determine the impact position by sensing the vibrations?” • Overall performance: • ANOVA significant (p < 0.007) • Paired t-tests (p < 0.05): • Am vs. EB1, EB2, AmCFr • Fr vs. EB1, EB2

  47. Results of the qualitative evaluation • “How was the subjective feeling of realism?” • Rated impact realism: • Paired t-tests (p < 0.05): • Am vs. EB1, EB2, Fr • AmFr vs. EB2, Fr, AmCFr

  48. Quantitative evaluation and inversion • Many participants inverted their interpretation of the vibrations (sensed vs. perceived position: normal / inverted)

  49. Discussion • Globally weak inter-subject correlation: • Each subject seems to have his/her own interpretation (inverted or not) • Strong intra-subject consistency: • Each subject seems very consistent within his/her own interpretation • Several strong inter-subject correlations between models: • Several models are interpreted the same way • Vibrations can be used to convey impact position information

  50. 6DOF spatialized haptic rendering • Generalize the previous result to 6DOF manipulation: • Virtual beam model
