
Collaborative Mixed Reality Visualization of an Archaeological Excavation


Presentation Transcript


  1. Collaborative Mixed Reality Visualization of an Archaeological Excavation. Hrvoje Benko, Department of Computer Science, Columbia University. MIT talk – January 12, 2005

  2. Multidisciplinary Team

  3. Motivation • Excavation is a destructive and physically “unreconstructable” process • Need to preserve as much data as possible for analysis • Data interpretation happens off-site • Current tools focus on 2D data and do not incorporate 3D information • Many experts are involved, so collaboration is a must!

  4. Archaeological Excavation at Monte Polizzo, Sicily, Summer 2003. Ian Morris, Director (Stanford University)

  5. Working in the field!

  6. Meshed 3D Model (13 scans)

  7. We have a 3D site model, now what? (Real vs. virtual views)

  8. Field Notes • Drawings • High Resolution Images • GIS Data • Panoramic Images • Videos

  9. Field Notes • Drawings • High Resolution Images • GIS Data • Panoramic Images • Videos • 3D Object Models • 3D Site Model, all aggregated in a Database
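The slides do not detail how the database is organized; the following is a minimal sketch, assuming a single SQLite store that keys every record type to shared site coordinates. All table and column names are hypothetical, not the actual VITA schema.

```python
# Hypothetical sketch: one SQLite store keying heterogeneous excavation
# records (notes, drawings, images, panoramas, videos) to site coordinates
# shared with the 3D model. Names are illustrative, not the VITA schema.
import sqlite3

conn = sqlite3.connect("excavation.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS finds (
    id INTEGER PRIMARY KEY,
    context TEXT,            -- stratigraphic unit / locus label
    x REAL, y REAL, z REAL   -- site coordinates shared with the 3D site model
);
CREATE TABLE IF NOT EXISTS media (
    id INTEGER PRIMARY KEY,
    find_id INTEGER REFERENCES finds(id),
    kind TEXT,               -- 'note', 'drawing', 'photo', 'panorama', 'video'
    path TEXT                -- file path or URL of the asset
);
""")

# Register a find and attach a high-resolution photo to it.
cur = conn.execute("INSERT INTO finds (context, x, y, z) VALUES (?, ?, ?, ?)",
                   ("A1-003", 12.4, 7.9, 0.6))
conn.execute("INSERT INTO media (find_id, kind, path) VALUES (?, ?, ?)",
             (cur.lastrowid, "photo", "images/a1-003.jpg"))
conn.commit()
```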

  10. Two Problems • How to combine all this data in one seamless environment? • How to make it easy to interact with?

  11. VITA: Visual Interaction Tool for Archaeology • Multiple users • Multiple displays • Projected tabletop • Handheld • High-resolution monitor • See-through head-worn • Multiple interaction devices • MERL DiamondTouch table • Essential Reality P5 gloves • Speech input • 6DOF tracker

  12. Design Considerations • Use the most appropriate display for the given data • Facilitate both human-system and human-human interaction

  13. Modular Architecture • 3D modules: AR Modules • 2D modules: DT Module, SCREEN Module, HANDHELD Module • Message Facilitator • Database …
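The slide's diagram shows the 2D and 3D modules communicating through a Message Facilitator; below is a minimal sketch of that idea as a topic-based publish/subscribe hub. Class and method names are hypothetical, not the actual VITA API.

```python
# Hypothetical sketch of a Message Facilitator as a topic-based
# publish/subscribe hub; names are illustrative, not the VITA API.
from collections import defaultdict
from typing import Callable, Dict, List

class MessageFacilitator:
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        """Register a module callback for a topic (e.g. 'selection')."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        """Deliver a message from one module to every subscribed module."""
        for handler in self._subscribers[topic]:
            handler(message)

# Example: the AR module announces a selection; the SCREEN module reacts.
hub = MessageFacilitator()
hub.subscribe("selection", lambda m: print("SCREEN module shows object", m["object_id"]))
hub.publish("selection", {"object_id": "find-042", "source": "AR"})
```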

  14. AR Module Components • Sony Head-Worn Display (LDI-D100B) • Head Tracker • Hand Tracker (InterSense IS-900) • Microphone • P5 Glove (Essential Reality) • DiamondTouch Table Connector (MERL)

  15. Life-size Immersive Exploration

  16. 3D Multimodal Interaction • Provide a natural interaction mechanism for our 3D environment • Modalities • Speech: IBM ViaVoice 10 • Gestures: Essential Reality P5 glove • Selection statistics: SenseShapes • Focus on selection • Based on collaboration with Phil Cohen et al. (ICMI 2003) and SenseShapes (ISMAR 2003)
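A minimal sketch of the fusion idea behind this slide: a recognized speech command is combined with per-object statistics gathered from a pointing volume, in the spirit of SenseShapes. The scoring formula, weights, and all names are illustrative assumptions, not the published algorithm.

```python
# Hypothetical sketch of multimodal selection: combine a recognized speech
# command with per-object statistics from a selection volume attached to the
# glove. The scoring weights and names are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class ObjectStats:
    object_id: str
    time_in_volume: float   # seconds the object spent inside the selection volume
    center_distance: float  # offset from the volume's center ray (metres)

def rank_candidates(stats: list[ObjectStats]) -> list[str]:
    """Order candidates: longer dwell time and smaller offset score higher."""
    scored = [(s.time_in_volume - 0.5 * s.center_distance, s.object_id) for s in stats]
    return [oid for _, oid in sorted(scored, reverse=True)]

def resolve_selection(speech_command: str, stats: list[ObjectStats]) -> str | None:
    """Commit a selection only when speech indicates one and a candidate exists."""
    if speech_command.startswith("select") and stats:
        return rank_candidates(stats)[0]
    return None

print(resolve_selection("select that", [
    ObjectStats("find-042", time_in_volume=1.8, center_distance=0.05),
    ObjectStats("find-017", time_in_volume=0.4, center_distance=0.20),
]))  # -> find-042
```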

  17. VirtualTray

  18. Desktop Components • High-resolution Display • Touch-sensitive Projective Display (MERL DiamondTouch) • Handheld Display

  19. World-In-Miniature

  20. Harris Matrix

  21. Enhanced Harris Matrix

  22. Enhanced Harris Matrix

  23. Cross-Dimensional Hybrid Gestures • Synchronized 2D and 3D gestures • Facilitate seamless transitions across dimensions (To appear in IEEE VR 2005)

  24. Cross-Dimensional Hybrid Gestures • Pull • Push (To appear in IEEE VR 2005)

  25. Cross-Dimensional Hybrid Gestures • Pull • Pin • Drag • Rotate • Push (To appear in IEEE VR 2005)

  26. Cross-Dimensional Hybrid Gestures • Pull • Connect • Scale • Disconnect • Push (To appear in IEEE VR 2005)
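A minimal sketch of the synchronization idea behind these hybrid gestures: a 2D touch on the table and a 3D glove gesture are fused into a pull (2D to 3D) or push (3D to 2D) only when they coincide in time and place. The thresholds, event fields, and grab test are assumptions, not the system's actual recognizer.

```python
# Hypothetical sketch: fuse a 2D table touch and a 3D glove gesture into a
# cross-dimensional "pull" (2D -> 3D) or "push" (3D -> 2D) only when the two
# events are synchronized in time and co-located. Thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class TouchEvent:            # from the DiamondTouch surface
    t: float                 # timestamp (s)
    x: float
    y: float                 # table coordinates (m)

@dataclass
class GloveEvent:            # from the tracked P5 glove
    t: float
    x: float
    y: float                 # glove position projected onto the table plane (m)
    grabbing: bool           # True when the fingers close into a fist

def hybrid_gesture(touch: TouchEvent, glove: GloveEvent,
                   max_dt: float = 0.3, max_dist: float = 0.1) -> str | None:
    """Return 'pull', 'push', or None for a candidate touch/glove pair."""
    synchronized = (abs(touch.t - glove.t) <= max_dt and
                    ((touch.x - glove.x) ** 2 + (touch.y - glove.y) ** 2) ** 0.5 <= max_dist)
    if not synchronized:
        return None
    # Grabbing above the touched spot pulls the 2D object into 3D;
    # an open hand over the touch pushes the 3D object back onto the table.
    return "pull" if glove.grabbing else "push"

print(hybrid_gesture(TouchEvent(10.00, 0.42, 0.31),
                     GloveEvent(10.12, 0.45, 0.33, grabbing=True)))  # -> pull
```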

  27. Handheld Focus-in-Context Display • Movable high-resolution inset • Tracked by DiamondTouch • Projection suppressed in its bounds • Physical magic lens
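A minimal sketch of the projection-suppression idea on this slide: the handheld's tracked footprint is blacked out of the projected tabletop image, so the high-resolution handheld acts as a physical magic lens over that region. The axis-aligned rectangle model and all names are simplifying assumptions.

```python
# Hypothetical sketch: suppress the tabletop projection inside the tracked
# handheld's footprint so the handheld works as a physical magic lens.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float   # footprint of the tracked handheld, in projector pixels

def mask_projection(frame: list[list[int]], lens: Rect) -> list[list[int]]:
    """Black out the projected pixels that fall under the handheld display."""
    masked = [row[:] for row in frame]
    for py in range(int(lens.y), min(len(masked), int(lens.y + lens.h))):
        for px in range(int(lens.x), min(len(masked[py]), int(lens.x + lens.w))):
            masked[py][px] = 0   # suppressed: the handheld shows this region instead
    return masked

# 8x8 toy frame; the lens covers a 3x3 patch reported by the tracker.
frame = [[255] * 8 for _ in range(8)]
print(mask_projection(frame, Rect(x=2, y=2, w=3, h=3))[3])  # row with a masked gap
```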

  28. Tabletop Interaction

  29. User Feedback • Overall very positive reaction • Archaeologists benefited from: • Temporal–Spatial connection • Aggregated collection of all data • Accurate 3D model • Simple touch-based interactions • Potential for increased collaboration

  30. Room for Improvement • Reduce wires • Reduce weight • Eye occlusion hinders communication • Missing data: • More objects, features, notes and pictures • More scans during excavation (time-lapse spatial record) • Missing features: • Virtual scale measure (implemented since) • Variable site model scaling • Improved selection in world-in-miniature

  31. Current and Future Work • Larger Site: • Summer 2004 - Thulamela, South Africa • Personalized user experience based on expertise • Environment management

  32. Acknowledgments • Special thanks to: • Peter Allen, Alejandro Troccoli, and Ben Smith (CU Robotics Lab) • Ian Morris and Trinity Jackman (Stanford Archaeology Center) • Lynn Meskell and James Conlon (CU Dept. of Archaeology) • Sajid Sadi and Avinanindra Utukuri (P5 glove) • Shezan Baig • Mitsubishi Electric Research Labs (DiamondTouch table) • Alias Systems • Microsoft Research • Funded in part by NSF ITR Grants IIS-0121239 and IIS-00-82961, and Office of Naval Research Contracts N00014-99-1-0394, N00014-99-1-0683, N00014-99-1-0249, and N00014-04-1-0005.

  33. Questions?
