
A Framework for Haptic Broadcasting


Presentation Transcript


  1. A Framework for Haptic Broadcasting Jongeun Cha, Ian Oakley, Yo-Sung Ho, Yeongmi Kim, and Jeha Ryu Presented by Cong Ly CMPT-820 March 16, 2009

  2. Overview • Introduction • Types of Haptic • Proposed Framework • Content Creation • Transmission • Viewing & Interactions • Implementation • Demonstration

  3. Overview • Introduction • Types of Haptic • Proposed Framework • Content Creation • Transmission • Viewing & Interactions • Implementation • Demonstration

  4. What is Haptic? • What is haptic? • There is no agreement among researchers on a precise definition. • In this paper, haptic refers to two sub-categories of feedback • Tactile • Kinesthetic

  5. Motivations • Broadcast programs are generally linear • A beginning, a middle, and an end • Entertainment is a multi-billion-dollar industry • Consumers are actively seeking interactive content • We have the technology • MPEG-4 BIFS (Binary Format for Scenes) • Reachin API - VRML

  6. Overview • Introduction • Types of Haptic • Proposed Framework • Content Creation • Transmission • Viewing & Interactions • Implementation • Demonstration

  7. Passive Haptic • Passive Haptic • No direct interaction

  8. Active Haptic • Active Haptic • Semi-interactive • Tactile and kinesthetic feedback

  9. Types of Haptic • Two types of Haptic Media • Linear and Non-Linear • Linear Haptic • Sequential progression • Human touches, impacts, sounds, etc…

  10. Types of Haptic • Non-linear Haptic • Interactivity, tactile information • Able to feel the surfaces • Dynamic content • Kinesthetic devices • PHANToM

  11. Overview • Introduction • Types of Haptic • Proposed Framework • Content Creation • Transmission • Viewing & Interactions • Implementation • Demonstration

  12. Framework

  13. Content Creation • Audio and Video • Standard video camera • Microphone for audio • Three Approaches for capturing Haptic data • Physical sensors • Modeling tools • Analysis of other associated media

  14. Physical Sensors • Capturing haptic surfaces • Piezoelectric resonance • Touch sensors • Movement data • 3D robotic arm • Accelerometer • Force-torque sensors
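
As a rough illustration of the sensor-capture approach, the minimal Python sketch below samples a force-torque sensor at a fixed rate and timestamps each reading so the haptic track can later be synchronized with the audio/video streams. The ForceTorqueSensor class is a hypothetical stand-in; a real capture pipeline would go through the vendor SDK of whatever sensor or accelerometer is used.

    import time

    class ForceTorqueSensor:
        """Hypothetical stand-in for a vendor sensor API."""
        def read(self):
            # Returns (fx, fy, fz, tx, ty, tz); stubbed here.
            return (0.0, 0.0, 0.0, 0.0, 0.0, 0.0)

    def capture_haptic_track(sensor, duration_s=10.0, rate_hz=1000):
        """Sample the sensor at a fixed rate, timestamping each sample
        so the haptic track can be aligned with the video timeline."""
        samples = []
        period = 1.0 / rate_hz
        start = time.monotonic()
        while (now := time.monotonic()) - start < duration_s:
            samples.append((now - start, sensor.read()))
            time.sleep(period)
        return samples

    track = capture_haptic_track(ForceTorqueSensor(), duration_s=1.0)
    print(f"captured {len(track)} haptic samples")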

  15. Content Creation • Audio and Video • Standard video camera • Microphone for audio • Three Approaches for capturing Haptic data • Physical sensors • Modeling tools • Analysis of other associated media

  16. Modeling Tools • Capturing 3D scenes • 3D scanner to capture objects • ZCam, depth video camera (2.5D) • 3D Modeling tool • K-HapticModel • HAMLAT • Motion capturing
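
To make the 2.5D idea concrete: a depth camera like the ZCam yields per-pixel depth, which can be treated as a height field that a haptic renderer probes. The sketch below assumes an 8-bit depth image and an illustrative height scale; both are assumptions, not details from the paper.

    import numpy as np

    def depth_to_heightmap(depth_frame: np.ndarray, max_height_m: float = 0.05):
        """Map 8-bit depth values (0 = far, 255 = near) to surface
        heights in meters above a reference plane."""
        return (depth_frame.astype(np.float32) / 255.0) * max_height_m

    def surface_height(heightmap, x_norm, y_norm):
        """Look up the surface height under a haptic probe whose position
        is given in normalized [0, 1] image coordinates."""
        h, w = heightmap.shape
        i = min(int(y_norm * h), h - 1)
        j = min(int(x_norm * w), w - 1)
        return heightmap[i, j]

    frame = np.random.randint(0, 256, (240, 320), dtype=np.uint8)  # fake depth frame
    hm = depth_to_heightmap(frame)
    print(surface_height(hm, 0.5, 0.5))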

  17. Content Creation • Audio and Video • Standard video camera • Microphone for audio • Three Approaches for capturing Haptic data • Physical sensors • Modeling tools • Analysis of other associated media

  18. Automatic Generation • Automatic Generation • Extract trajectory of object from video • Dr. Greg Mori’s work • SFU Vision and Media Lab
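
A crude sketch of the idea, assuming OpenCV: track the centroid of moving pixels via frame differencing to recover a per-frame (x, y) trajectory that could then drive haptic motion cues. A research-grade system such as the vision work cited here would use far more robust tracking; this is only to show the shape of the pipeline.

    import cv2

    def extract_trajectory(video_path):
        cap = cv2.VideoCapture(video_path)
        ok, prev = cap.read()
        prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        trajectory = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            diff = cv2.absdiff(gray, prev_gray)
            _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
            m = cv2.moments(mask)
            if m["m00"] > 0:  # centroid of the moving pixels
                trajectory.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
            prev_gray = gray
        cap.release()
        return trajectory  # one (x, y) point per frame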

  19. Transmission

  20. MPEG-4 BIFS • BIFS (Binary Format for Scenes) • Scenes are encoded and transmitted separately • Local and remote animations • User-object interaction • Enables different points of view (3D) • Scene description • Consists of information about the objects • Time and place • Relations between the objects

  21. MPEG-4 BIFS • Proposed extended BIFS nodes

  22. MPEG-4 BIFS • BIFS node content • Stores data gathered during content creation • E.g. piezoelectric sensors, modeling tools • DepthMovie Node • Identical to DepthImage • Adds a MovieTexture for tactile content
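
To give a feel for what such a node carries, here is a hypothetical Python dataclass mirroring the paper's idea of a DepthImage-like node extended with a MovieTexture for tactile data. The exact field set and URLs are assumptions for illustration, not the proposed BIFS syntax.

    from dataclasses import dataclass, field

    @dataclass
    class MovieTexture:
        url: str            # where the (tactile) video stream lives
        loop: bool = False

    @dataclass
    class DepthMovieNode:
        position: tuple = (0.0, 0.0, 0.0)        # viewpoint of the depth camera
        orientation: tuple = (0.0, 0.0, 1.0, 0.0)
        near_plane: float = 0.1                   # depth range being encoded
        far_plane: float = 10.0
        depth_texture: MovieTexture = field(      # per-pixel geometry stream
            default_factory=lambda: MovieTexture("rtsp://server/depth.mp4"))
        tactile_texture: MovieTexture = field(    # added: per-pixel tactile stream
            default_factory=lambda: MovieTexture("rtsp://server/tactile.mp4"))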

  23. Viewing & Interaction

  24. Viewing & Interaction • Haptic Compositor • Routes elements to renderers • Haptic Renderer • Decodes object positions • Generates interaction forces • Tactile Renderer • Decodes tactile information • Thermal perception, tactile intensities
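
The slides don't give the renderers' math, but a toy sketch helps fix intuition. Penalty-based spring forces and a high-rate (about 1 kHz) haptic loop are standard haptic-rendering practice rather than details taken from this paper; the stiffness value below is illustrative.

    STIFFNESS = 800.0  # N/m, plausible for a PHANToM-class device

    def haptic_force(probe_depth_m: float) -> float:
        """Penalty-based kinesthetic force: push back proportionally to
        how far the device tip has penetrated the rendered surface."""
        return STIFFNESS * probe_depth_m if probe_depth_m > 0 else 0.0

    def tactile_intensity(pixel_value: int) -> float:
        """Map a decoded tactile-texture pixel (0-255) to an actuator
        drive level in [0, 1], e.g. for a vibrotactile or air-jet array."""
        return pixel_value / 255.0

    print(haptic_force(0.002))     # 2 mm penetration -> 1.6 N
    print(tactile_intensity(128))  # ~0.5 drive level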

  25. Overview • Introduction • Types of Haptic • Proposed Framework • Content Creation • Transmission • Viewing & Interactions • Implementation • Demonstration

  26. Implementation • Implementation by the authors

  27. Implementation • Components • GPAC - Project on Advanced Content • Multimedia framework • BIFS Broadcaster • Encodes MPEG-4 BIFS • Darwin Streaming Server • Apple's QuickTime Streaming Server • Standard RTP and RTSP protocols • Osmo4 Player • From the GPAC framework
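
As a sketch of the authoring-to-streaming path, GPAC's MP4Box CLI can encode a textual BIFS scene into MP4 and add RTP hint tracks for streaming; the Python wrapper and file names below are placeholders. MP4Box's -mp4 and -hint options are real, but check the GPAC documentation for the version in use, and note that the haptic-extended nodes would require the authors' modified tools.

    import subprocess

    scene = "haptic_scene.bt"  # textual BIFS scene with the extended nodes
    subprocess.run(["MP4Box", "-mp4", scene], check=True)               # -> haptic_scene.mp4
    subprocess.run(["MP4Box", "-hint", "haptic_scene.mp4"], check=True) # add RTP hint tracks

    # The hinted MP4 is then placed in Darwin Streaming Server's media
    # folder and played over RTSP, e.g. in Osmo4:
    #   Osmo4 rtsp://<server>/haptic_scene.mp4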

  28. Overview • Introduction • Types of Haptic • Proposed Framework • Content Creation • Transmission • Viewing & Interactions • Implementation • Demonstration

  29. Demonstration • Home Shopping Scenario

  30. Demonstration • Movie with Tactile Feeling

  31. Industry • D-Box - http://www.d-box.com • Pneumatically actuated chairs • Used for movies and simulations

  32. Industry • PHANToM - Sensable Technologies • Developed by a student at MIT

  33. Conclusions • Haptic Media enhances existing multimedia • Such as movies • Haptic can be used in • Surgical Training • Military • Commercial • Proposed Framework is feasible • Tools needed are readily available

  34. That's all, folks! Questions?

  35. References • Jongeun Cha, Ian Oakley, Yo-Sung Ho, Yeongmi Kim, and Jeha Ryu, "A Framework for Haptic Broadcasting," IEEE MultiMedia • G. M. Krishna and K. Rajanna, "Tactile Sensor Based on Piezoelectric Resonance," IEEE Sensors Journal, vol. 4, no. 5, 2004, pp. 691-697. • Y. Kim, S. Kim, T. Ha, I. Oakley, W. Woo, and J. Ryu, "Air-Jet Button Effects in AR," Int'l Conf. Artificial Reality and Telexistence, LNCS 4282, 2006, pp. 384-391. • SFU Vision and Media Lab, http://www.cs.sfu.ca/research/groups/VML/index.html • MIT TechTV, "Robotic Gripper with Phantom Sensable Technologies," http://techtv.mit.edu/videos/467-robotic-gripper-with-phantom-sensable-technologies • Sensable Technologies, "PHANToM," http://www.sensable.com/haptic-phantom-premium-6dof.htm • Fraunhofer Institute, "MPEG-4 BIFS Binary Format for Scenes," http://www.iis.fraunhofer.de/Images/MPEG-4%20BIFS_tcm389-67584.pdf
