Haptic Rendering

Presentation Transcript


  1. Haptic Rendering Max Smolens COMP 259 March 26, 2003

  2. What is haptics? • Using the sense of touch to interact with computers and virtual environments

  3. What is haptic rendering? • The process of computing and generating forces in response to user interactions with virtual objects

  4. Why use haptics? • Increases the information flow between the computer and the user • Intrinsically bilateral • When we push on an object, it pushes back on us

  5. Why use haptics? (2) • Our sensing of forces is closely tied to our visual system and sense of three-dimensional space • Information and intent can be conveyed in a physically direct and primal way

  6. Haptic Applications • Medicine • Surgical simulators for training • Manipulating robots for minimally invasive surgery • Telemedicine, remote diagnosis • Accessibility for the disabled

  7. Haptic Applications (2) • Entertainment • Video games, simulators that enable the user to feel and manipulate objects in the environment • Education • Feel phenomena at a variety of spatial and temporal scales • Studying complex data sets

  8. Haptic Applications (3) • Industry • CAD systems • Virtual prototyping • Assembly and disassembly can guide final design • Shape sculpting • Expressive, free-form shape generation and modification

  9. Haptic Applications (4) • The arts • Virtual painting, sculpting • Virtual musical instruments

  10. Haptic interaction

  11. Human haptics • Mechanical, sensory, motor and cognitive components • Two classes of sensory information: • Tactile • Kinesthetic

  12. Human haptics (2) • Tactile information • From skin in contact with an object • Spatial and temporal variations of forces within the contact region • Slipping, fine textures, small shapes, and softness

  13. Human haptics (3) • Kinesthetic information • Net forces along with position and motion of limbs • Coarse properties of object • Large shapes, spring-like compliances

  14. Human haptics (4) • Kinesthetic resolution: • 2 degrees for fingers and wrist • 1 degree for shoulder • Force exerted by a finger: • 50 to 100 N maximum • 5-15 N typically during exploration and manipulation

  15. Haptic interfaces

  16. What makes a good interface? • Must work with human abilities and limitations • Approximations of real-world haptic interactions determined by limits of human performance

  17. A good haptic interface • Free motion must feel free • Low back-drive inertia and friction • No motion constraints • Ergonomics and comfort • Pain, discomfort and fatigue will detract from the experience

  18. A good haptic interface (2) • Suitable range, resolution and bandwidth • User should not be able to go through rigid objects by exceeding force range • No unintended vibrations • Solid objects must feel stiff

  19. Haptic rendering • Two parts: collision detection and collision response

  20. Two types of interactions • Point-based haptic interactions • Only end point of device, or haptic interface point (HIP), interacts with virtual object • When moved, collision detection algorithm checks to see if the end point is inside the virtual object • Depth calculated as distance between HIP and closest surface point

  21. Two types of interactions (2) • Ray-based haptic interactions • Probe of haptic device modeled as a line-segment whose orientation matters • Can touch multiple objects simultaneously • Torque interactions

  22. Collision detection • Detect collisions between the haptic probe and virtual objects • Bounding volume hierarchies, spatial partitioning • H-COLLIDE, a hybrid technique: • Partition the virtual workspace into a uniform grid • For each grid cell containing primitives, compute an OBBTree
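
  As a rough illustration of the uniform-grid stage, the sketch below (Python; the cell size and helper names are illustrative assumptions, not the H-COLLIDE code) hashes triangles into grid cells so that only the cell containing the probe tip needs an exact test. A full implementation would also build an OBBTree for each non-empty cell.

    from collections import defaultdict

    CELL_SIZE = 0.05  # assumed workspace cell size, in meters

    def cell_of(point):
        # Map a 3D point to integer grid-cell coordinates.
        return tuple(int(c // CELL_SIZE) for c in point)

    def build_grid(triangles):
        # Hash each triangle (a tuple of three vertex tuples) into the cells
        # touched by its vertices; a fuller scheme would rasterize the
        # triangle's bounding box and build an OBBTree per non-empty cell.
        grid = defaultdict(list)
        for tri in triangles:
            for v in tri:
                cell = cell_of(v)
                if tri not in grid[cell]:
                    grid[cell].append(tri)
        return grid

    def candidate_triangles(grid, probe_point):
        # Only triangles hashed to the probe's cell need an exact collision test.
        return grid.get(cell_of(probe_point), [])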

  23. Simple collision response • Haptic rendering of a 3D sphere

  24. Simple collision response (2) • Reaction force calculated using the linear spring law F=kx • k: stiffness of object • x: depth of penetration • Direction of force along surface normal
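
  A minimal Python sketch of this response for a sphere, assuming an illustrative stiffness value: outside the sphere no force is applied; inside, the force acts along the outward surface normal with magnitude proportional to the penetration depth.

    import numpy as np

    def sphere_contact_force(hip, center, radius, k=800.0):
        # hip, center: 3-vectors; radius in meters; k: assumed stiffness in N/m.
        offset = np.asarray(hip, float) - np.asarray(center, float)
        dist = np.linalg.norm(offset)
        if dist >= radius or dist == 0.0:
            return np.zeros(3)              # no contact (or degenerate hit at the center)
        normal = offset / dist              # outward surface normal at the closest point
        penetration = radius - dist         # x: depth of penetration
        return k * penetration * normal     # F = kx, directed along the normal

    # Example: HIP 1 cm inside a 5 cm sphere centered at the origin.
    print(sphere_contact_force([0.04, 0.0, 0.0], [0.0, 0.0, 0.0], 0.05))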

  25. Penalty methods • Subdivide object and associate each subvolume with a surface • Determine feedback force directly from penetration • Works well for simple geometric shapes

  26. Penalty methods (2) • There are some problems • Two different paths can reach the same interior location, so which path was taken and which surface should push back?

  27. Penalty methods (3) • Force summation for multiple objects • Compute the net force by adding the individual penalty forces • Correct when the surfaces are perpendicular • For obtuse angles between the surfaces, the force vector becomes too large • When the surfaces are almost parallel, the force vector is too large by a factor of 2 (illustrated below)
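
  A tiny numeric illustration of the near-parallel case (Python; unit forces chosen only for the example):

    import numpy as np

    f1 = np.array([0.0, 1.0, 0.0])                    # 1 N along surface A's normal
    angle = np.radians(5.0)                           # surface B tilted 5 degrees from A
    f2 = np.array([np.sin(angle), np.cos(angle), 0.0])
    print(np.linalg.norm(f1 + f2))                    # about 2.0 N, nearly double a single force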

  28. Penalty methods (4) • Problems with thin objects • If the probe is pushed more than halfway through, it is pulled through the rest of the way

  29. Solution? God-object • Zilles, Salisbury (1995) • Cannot stop HIP from penetrating virtual objects • Define additional variables to represent the virtual location of the haptic interface (god-object, IHIP, proxy)

  30. God-object (2) • In free space, HIP and IHIP are collocated • When HIP moves into an object, the IHIP remains on the surface • IHIP computed such that its distance from the HIP is minimized • Correct force vector is unambiguous
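
  A minimal Python sketch of the proxy idea for a single infinite plane (the plane point, normal, and stiffness value are assumed inputs); the restoring force simply pulls the HIP back toward the IHIP.

    import numpy as np

    def update_proxy(hip, plane_point, plane_normal, k=800.0):
        # Returns (ihip, force). In free space the IHIP is collocated with the HIP;
        # once the HIP penetrates, the IHIP is the closest point on the surface,
        # which makes the force direction unambiguous.
        hip = np.asarray(hip, float)
        n = np.asarray(plane_normal, float)
        n = n / np.linalg.norm(n)
        signed_dist = np.dot(hip - np.asarray(plane_point, float), n)
        if signed_dist >= 0.0:
            return hip, np.zeros(3)         # free space: no constraint, no force
        ihip = hip - signed_dist * n        # closest surface point minimizes |IHIP - HIP|
        return ihip, k * (ihip - hip)       # spring from HIP toward the proxy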

  31. God-object (3) • Infinite surface: • Active if the old IHIP is a positive distance from the surface and the HIP is a negative distance from the surface • Finite extent: • If a line traced from the old IHIP to new HIP passes through the facet, then consider the facet active
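
  These activity tests can be sketched as follows (Python; the finite-extent test uses a standard segment/triangle intersection, which may differ in detail from the original formulation):

    import numpy as np

    def plane_active(old_ihip, hip, plane_point, plane_normal):
        # Infinite surface: old IHIP on the positive side, HIP on the negative side.
        n = np.asarray(plane_normal, float)
        p = np.asarray(plane_point, float)
        d_old = np.dot(np.asarray(old_ihip, float) - p, n)
        d_new = np.dot(np.asarray(hip, float) - p, n)
        return d_old > 0.0 and d_new < 0.0

    def facet_active(old_ihip, hip, v0, v1, v2, eps=1e-9):
        # Finite facet: the segment old IHIP -> new HIP crosses the triangle (v0, v1, v2).
        o = np.asarray(old_ihip, float)
        d = np.asarray(hip, float) - o                 # segment direction
        e1 = np.asarray(v1, float) - np.asarray(v0, float)
        e2 = np.asarray(v2, float) - np.asarray(v0, float)
        p = np.cross(d, e2)
        det = np.dot(e1, p)
        if abs(det) < eps:
            return False                               # segment parallel to the facet plane
        inv = 1.0 / det
        tvec = o - np.asarray(v0, float)
        u = np.dot(tvec, p) * inv
        if u < 0.0 or u > 1.0:
            return False
        q = np.cross(tvec, e1)
        v = np.dot(d, q) * inv
        if v < 0.0 or u + v > 1.0:
            return False
        t = np.dot(e2, q) * inv
        return 0.0 <= t <= 1.0                         # crossing lies within the segment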

  32. God-object (4) • When touching convex portion of an object, only one surface should be active at a time

  33. God-object (5) • When touching concave portion of an object, multiple surfaces can be active • 2 surfaces: constrain IHIP to a line • 3 surfaces: constrain IHIP to a point • IHIP might cross another surface before HIP • Solution: iterate the process, until no new constraints found

  34. God-object (6) • Location computation using Lagrange multipliers • x, y, z: coordinates of IHIP • xp, yp, zp: coordinates of HIP • Constraints added as planes of the form Anx + Bny + Cnz - Dn = 0 • Lagrangian to minimize: L = 1/2[(x - xp)^2 + (y - yp)^2 + (z - zp)^2] + l1(A1x + B1y + C1z - D1) + l2(A2x + B2y + C2z - D2) + l3(A3x + B3y + C3z - D3)

  35. God-object (7) • Minimize L by setting its six partial derivatives equal to 0, solvable with 65 multiplies and divides
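
  The same constrained minimization can be sketched as a small linear (KKT) solve in Python with numpy, rather than the hand-derived 65-operation solution; each active surface contributes one plane constraint n_i . x = d_i.

    import numpy as np

    def ihip_location(hip, normals, offsets):
        # normals: (m, 3) unit plane normals of the active surfaces (m <= 3);
        # offsets: (m,) values d_i, each plane satisfying n_i . x = d_i.
        hip = np.asarray(hip, float)
        N = np.atleast_2d(np.asarray(normals, float))
        d = np.atleast_1d(np.asarray(offsets, float))
        m = N.shape[0]
        # Stationarity of the Lagrangian gives:  [[I, N^T], [N, 0]] [x; lam] = [hip; d]
        A = np.block([[np.eye(3), N.T],
                      [N, np.zeros((m, m))]])
        rhs = np.concatenate([hip, d])
        return np.linalg.solve(A, rhs)[:3]   # first three unknowns are the IHIP coordinates

    # Example: HIP below the plane z = 0; the IHIP is its projection onto that plane.
    print(ihip_location([0.1, 0.2, -0.05], [[0.0, 0.0, 1.0]], [0.0]))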

  36. Rendering surface details • Smoothing • Friction • Textures

  37. Force shading • Render objects as smooth and continuous, even if underlying representation is not • Compute force vector for each vertex, interpolate over polygonal surfaces (like Phong shading)
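
  A minimal Python sketch of this idea: interpolate the per-vertex normals with barycentric weights at the contact point and use the interpolated direction for the force, in direct analogy to Phong normal interpolation.

    import numpy as np

    def shaded_force_direction(contact, v0, v1, v2, n0, n1, n2):
        # contact lies in triangle (v0, v1, v2); n0..n2 are unit vertex normals.
        v0, v1, v2 = (np.asarray(v, float) for v in (v0, v1, v2))
        p = np.asarray(contact, float)

        def area(a, b, c):
            return 0.5 * np.linalg.norm(np.cross(b - a, c - a))

        total = area(v0, v1, v2)
        w0 = area(p, v1, v2) / total       # barycentric weight opposite v0
        w1 = area(p, v2, v0) / total
        w2 = area(p, v0, v1) / total
        n = w0 * np.asarray(n0, float) + w1 * np.asarray(n1, float) + w2 * np.asarray(n2, float)
        return n / np.linalg.norm(n)       # smoothly varying force direction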

  38. Surface friction • Without friction, virtual objects feel “icy-smooth” • Coulomb friction: sticking and sliding • Forces tangential to surface, direction opposite of motion
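
  A minimal Python sketch of a sliding Coulomb friction force, with an assumed friction coefficient; the sticking state, typically handled by letting the proxy lag behind the HIP, is omitted here.

    import numpy as np

    def friction_force(normal_force, velocity, surface_normal, mu=0.3):
        # mu: assumed friction coefficient; velocity: HIP velocity at the contact.
        n = np.asarray(surface_normal, float)
        n = n / np.linalg.norm(n)
        v = np.asarray(velocity, float)
        v_tangent = v - np.dot(v, n) * n        # project velocity into the surface plane
        speed = np.linalg.norm(v_tangent)
        if speed < 1e-6:
            return np.zeros(3)                  # treat as sticking: no sliding friction
        magnitude = mu * np.linalg.norm(normal_force)
        return -magnitude * v_tangent / speed   # tangential, opposing the motion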

  39. Haptic texturing • Force perturbation • Modify the direction and magnitude of the force vector • Max and Becker (1994)

  40. Haptic texturing (2) • Image-based: • Construct texture field from 2D image data • Map heights onto the object surface • Procedural: • Generate synthetic texture fields using mathematical functions
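
  A minimal Python sketch of the image-based variant that perturbs only the force magnitude from a sampled height field (a fuller force-perturbation scheme would also tilt the force direction using the height gradient; the map and scale factor are assumed inputs).

    import numpy as np

    def textured_force(base_force, height_map, u, v, scale=0.5):
        # height_map: 2D array of heights in [0, 1]; (u, v): texture coordinates in [0, 1].
        h, w = height_map.shape
        i, j = int(v * (h - 1)), int(u * (w - 1))
        # Finite-difference estimate of the local height variation.
        di = height_map[min(i + 1, h - 1), j] - height_map[i, j]
        dj = height_map[i, min(j + 1, w - 1)] - height_map[i, j]
        bump = 1.0 + scale * (abs(di) + abs(dj))
        return np.asarray(base_force, float) * bump   # magnitude perturbation only

    # Procedural example: a synthetic sinusoidal height field.
    grid = np.fromfunction(lambda i, j: 0.5 + 0.5 * np.sin(0.5 * i) * np.sin(0.5 * j), (64, 64))
    print(textured_force([0.0, 1.0, 0.0], grid, 0.3, 0.7))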

  41. Haptic texturing (3)

  42. Challenges • Graphics update rate must be between 20 and 30 Hz • Haptic update rate must be around 1 kHz • Decouple the simulation and haptic loops using multiple processors or a dedicated machine
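
  A minimal Python sketch of this decoupling with two threads sharing the latest computed force (the rates, names, and the commented-out device call are illustrative, not a specific device API; sleep-based timing only approximates the real servo rates).

    import threading, time

    latest_force = [0.0, 0.0, 0.0]
    lock = threading.Lock()
    running = True

    def haptic_loop():                          # target: ~1 kHz servo rate
        while running:
            with lock:
                force = list(latest_force)      # read the most recent simulation result
            # device.apply_force(force)         # hypothetical device call
            time.sleep(0.001)

    def simulation_loop():                      # target: ~30 Hz physics/graphics rate
        global latest_force
        while running:
            new_force = [0.0, 1.0, 0.0]         # placeholder result of the slow simulation
            with lock:
                latest_force = new_force
            time.sleep(1.0 / 30.0)

    threads = [threading.Thread(target=haptic_loop), threading.Thread(target=simulation_loop)]
    for t in threads:
        t.start()
    time.sleep(0.1)
    running = False
    for t in threads:
        t.join()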

  43. 6-DOF haptics challenges • Detect all surface contacts, not just contact at a single point • Calculate a reaction force and torque at every point or region of contact • Maintain the 1 kHz refresh rate

  44. Examples

  45. References • Basdogan, C. and Srinivasan, M.A. “Haptic rendering in virtual environments.” http://network.ku.edu.tr/~cbasdogan/-Papers/VRbookChapter.pdf • Chen, E. “Six degree-of-freedom haptic system for desktop virtual prototyping applications.” Proc. First International Workshop on Virtual Reality and Prototyping, p. 97-106, 1999. • Gregory, A., Lin, M., Gottschalk, S. and Taylor, R. “A framework for fast and accurate collision detection for haptic interaction.” Proc. IEEE Virtual Reality (VR '99), p. 38-45, 1999. • Mark, W. et al. “Adding force feedback to graphics systems: issues and solutions.” Proc. ACM SIGGRAPH 1996. • Massie, T.H. and Salisbury, J.K. “The PHANTOM haptic interface: a device for probing virtual objects.” Proc. ASME Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 1994. • McNeely, W., Puterbaugh, K. and Troy, J. “Six degree-of-freedom haptic rendering using voxel sampling.” Proc. ACM SIGGRAPH 1999.

  46. References (2) • Ruspini, D., Kolarov, K. and Khatib, O. “The haptic display of complex graphical environments.” Proc. ACM SIGGRAPH 1997. • Salisbury, J.K. et al. “Haptic rendering: programming touch interaction with virtual objects.” Proc. ACM SIGGRAPH 1995. • Salisbury, J.K. and Srinivasan, M.A. “Phantom-based haptic interaction with virtual objects.” IEEE Computer Graphics and Applications, 17(5), p. 6-10, 1997. • Salisbury, J.K. “Making graphics physically tangible.” Communications of the ACM, 42(8), p. 74-81, 1999. • Srinivasan, M.A. and Basdogan, C. “Haptics in virtual environments: taxonomy, research status, and challenges.” Computers & Graphics, 21(4), p. 393-404, 1997. • Zilles, C.B. and Salisbury, J.K. “A constraint-based god-object method for haptic display.” Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, Human Robot Interaction, and Cooperative Robots, Vol. 3, p. 146-151, 1995.
