
Presentation Transcript


  1. X3D Extension for (Mobile) AR Contents
     International AR Standards Workshop, Seoul, Korea, Oct 11-12, 2010
     Gerard J. Kim (WG 6 AR Standards Study Group Coordinator), Korea University

  2. Approach
     • Extensibility to existing frameworks
       • X3D (scene graph): because AR is implemented as VR!
       • KML, OpenGIS, …: we need location representation
     • Generality/flexibility to accommodate
       • Different AR platforms (~platform independence): mobile, desktop, HMD, …
       • Sensors and devices: vision based, marker based, location based, …
     • Focused on file format (scene graph based?)
       • vs. contents representation
       • Machine consumption

  3. Various display types and platforms [R. Azuma, 1997]
     [Diagrams: video see-through (camera, display, video combiner), optical see-through (display, optical combiner), mobile, and desktop configurations]

  4. AR/MR Implementation

  5. Various sensing

  6. Various sensing

  7. MR/AR Contents
     • Context: condition or situation that triggers an augmentation and mixing of real and virtual objects
     • Resource: raw data or information used for augmentation
     • Content: one or more pairings of contexts and resources, plus behaviors that use the resources

  8. Related work
     • Jung et al. (InstantReality Suite)
       • Extension of Sensor nodes: physical contexts
       • Extension of Viewpoint nodes: specification of camera parameters
       • Layers: one layer served as background video
       • Extension of X3DLightNode: lighting effects

     SFImageSensor : X3DDirectSensorNode {
       SFImage  [in/out] value …
       SFBool   []       out   False
       SFString []       label …
     }

     DEF frame SFImageSensor { label "Video Frames" }
     ROUTE frame.value_changed TO surfaceTex.set_image

  9. Major proposals
     • Extend "View" node: resolution between "live" camera and virtual camera
     • Define "Live" camera node (G. Lee / ETRI)
       • Not necessarily for "AR" contents (e.g. video textures)
       • Parameters set by user
     • More detailed parameter specification for "View"
       • Set by user
       • Routed from "Live" camera node, with possibility of behavioral manipulation
       • "Routed" from sensor: camera could be tracked separately
       • Default: same as the world; note that the view can be relative to anything

  10. Major proposal
      • Extending movie texture node (for AR background)
        • Also proposed by G. Lee / Instant Reality
      • Extend existing virtual "Sensor" nodes
        • New X3DARNodes for target real object description: ImagePatch, 3DObject, GPSLocation, SingleValue, …
        • Existing: e.g. Visibility, Proximity, Touch sensor, …
        • New: RangeSensor, UIClickSensor, …

  11. Not included in this proposal
      • Lighting and rendering issues
      • Depth sensing and occlusion effects
      • Extended points of interest (e.g. paths, hierarchical POIs)
      • Platform type specification (e.g. resolution differences)

  12. [Architecture diagram: within the X3D (virtual) world, the View (virtual camera), other X3D nodes, MovieTexture, and AR Node + Sensor (virtualized physical contexts) connect via ROUTEs to the real/physical side, i.e. the Live Camera and the AR contents]

  13. Abstraction of MR/AR contents as a collection of contexts and resources connected by event-ins and event-outs.

  14. <Scene>
        <Group>
          <TouchSensor DEF='TOUCH' description='touch to activate'/>
          <TimeSensor DEF='TIME' cycleInterval='3'/>
          <PositionInterpolator DEF='INTERP_POS' key='0 0.25 0.5 0.75 1'
            keyValue='0 0 0 1 0 0 0 0 0 -1 0 0 0 0 0'/>
          <Transform DEF='BALL'>
            <Shape>
              <Appearance> <Material/> </Appearance>
              <Sphere/>
            </Shape>
          </Transform>
        </Group>
        <ROUTE fromField='touchTime' fromNode='TOUCH' toField='startTime' toNode='TIME'/>
        <ROUTE fromField='fraction_changed' fromNode='TIME' toField='set_fraction' toNode='INTERP_POS'/>
        <ROUTE fromField='value_changed' fromNode='INTERP_POS' toField='translation' toNode='BALL'/>
      </Scene>

  15. <Scene>
        <Group>
          <Marker DEF='HIRO' enabled='TRUE' filename='C:\hiro.patt'/>
          <VisibilitySensor DEF='Visibility' enabled='TRUE'/>
          <Transform DEF='BALL'>
            <Shape>
              <Appearance> <Material/> </Appearance>
              <Sphere/>
            </Shape>
          </Transform>
        </Group>
        <ROUTE fromNode='Visibility' fromField='visible' toNode='BALL' toField='visible'/>
      </Scene>

  16. • Vision based feature recognition and tracking (e.g. fiducials, markers, 3D points)
      • Non-vision based environmental sensor events and values (e.g. RFID, GPS, distance)
      • User interaction device events and values (e.g. buttons, touch screen, jog dial)
      • Context information (e.g. user age)

  17. X3DARNode: placeholder for physical objects within the AR/MR world "implementation"

      X3DARNode : X3DNode {
        SFNode   [in, out] metadata
        SFNode   [in, out] parent
        SFString [in, out] description
        SFBool   [in, out] enabled
      }

      X3DARNode is the base type for the Marker, Location and General Event nodes, …

  18. ImagePatch (Marker) & VisibilitySensor

      ImagePatch : X3DARNode {
        SFNode     [in, out] metadata
        SFNode     [in, out] parent
        SFString   [in, out] description
        SFBool     [in, out] enabled
        SFString   [in, out] filename
        SFVec3f    [in, out] position
        SFRotation [in, out] orientation
      }

      VisibilitySensor : X3DEnvironmentalSensorNode <!-- Existing --> {
        SFVec3f [in, out] center
        SFBool  [in, out] enabled
        SFNode  [in, out] metadata
        SFVec3f [in, out] size
        SFTime  [out]     enterTime
        SFTime  [out]     exitTime
        SFBool  [out]     isActive
      }
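      A minimal usage sketch pairing the two nodes, in the style of the marker example on slide 15; the ImagePatch attribute values and the routed 'visible' field on the Transform follow that example and are illustrative assumptions, not part of the proposal text:

      <Scene>
        <Group>
          <!-- Illustrative: an image patch standing in for a tracked real-world picture -->
          <ImagePatch DEF='POSTER' description='poster target' enabled='TRUE' filename='poster.jpg'/>
          <VisibilitySensor DEF='POSTER_VIS' enabled='TRUE'/>
          <Transform DEF='LABEL'>
            <Shape> <Appearance> <Material/> </Appearance> <Box/> </Shape>
          </Transform>
        </Group>
        <!-- Show the virtual label only while the patch is detected, as in the marker example -->
        <ROUTE fromNode='POSTER_VIS' fromField='visible' toNode='LABEL' toField='visible'/>
      </Scene>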

  19. Location & RangeSensor

      GPSLocation : X3DSensorNode {
        SFNode   [in, out] metadata
        SFNode   [in, out] parent
        SFString [in, out] description
        SFBool   [in, out] enabled
        SFInt32  [in, out] device_description
        SFBool   [out]     status
        MFString [out]     values
      }

      RangeSensor : X3DEnvironmentalSensorNode {
        SFVec3f  [in, out] center
        SFBool   [in, out] enabled
        SFNode   [in, out] metadata
        SFVec3f  [in, out] size
        SFTime   [out]     enterTime
        SFTime   [out]     exitTime
        SFBool   [out]     isActive
        SFInt32  [in, out] sequence
        SFString [in, out] lBound
        SFString [in, out] uBound
        SFString [in, out] value
      }
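      A hedged sketch of how a location context might drive content, analogous to the marker example; the bound values, the route from the GPSLocation 'values' field into the RangeSensor 'value' field (which glosses over the MFString-to-SFString adaptation a browser would need), and the routed 'visible' field are illustrative assumptions:

      <Scene>
        <!-- Illustrative: show site information when the GPS reading falls inside the range -->
        <GPSLocation DEF='GPS' description='device GPS' enabled='TRUE'/>
        <RangeSensor DEF='NEAR_SITE' enabled='TRUE' lBound='37.58 127.02' uBound='37.59 127.03'/>
        <Transform DEF='SITE_INFO'>
          <Shape> <Appearance> <Material/> </Appearance> <Box/> </Shape>
        </Transform>
        <ROUTE fromNode='GPS' fromField='values' toNode='NEAR_SITE' toField='value'/>
        <ROUTE fromNode='NEAR_SITE' fromField='isActive' toNode='SITE_INFO' toField='visible'/>
      </Scene>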

  20. Live camera

      LiveCamera {
        SFString   [in, out] label    "default"
        SFString   [out]     parent
        SFImage    [out]     image
        SFMatrix4f [out]     projmat  "1 0 0 … "
        SFBool     [out]     on       FALSE
        SFBool     [out]     tracking FALSE
        SFVec3f    [out]     position
        SFRotation [out]     orientation
      }

      • LiveCamera = MR/AR capture camera
      • Within the Scene node
      • The image field carries the captured frames (out value)
      • Camera internal parameters → projmat field
      • Camera external parameters → set to world, but can be tracked

  21. Routing from LiveCam (see the sketch below)
      • From: Live Camera node "image" field
      • To:
        • Background (LiveURL field)
        • Shape (MovieTexture field)
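      Slides 22 and 25 use the declarative liveSource attribute for this connection; an explicit ROUTE form, mirroring the SFImageSensor route in the related-work slide, might look like the sketch below. The 'set_image' input on MovieTexture is an assumption borrowed from that slide, not a field defined in this proposal:

      <Scene>
        <LiveCamera DEF='USBCam1' source='dev#'/>
        <Shape>
          <Appearance>
            <MovieTexture DEF='LIVE_TEX'/>
          </Appearance>
          <Box/>
        </Shape>
        <!-- Illustrative: push each captured frame from the live camera into the texture -->
        <ROUTE fromNode='USBCam1' fromField='image' toNode='LIVE_TEX' toField='set_image'/>
      </Scene>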

  22. Live video → Background

      <Scene>
        <Background groundAngle='1.309 1.571'
                    groundColor='0.1 0.1 0 0.4 0.25 0.2 0.6 0.6 0.6'
                    skyAngle='1.309 1.571'
                    skyColor='0 0.2 0.7 0 0.5 1 1 1 1'
                    backUrl='mountns.png' frontUrl='mountns.png'
                    leftUrl='mountns.png' rightUrl='mountns.png'/>
      </Scene>

      <Scene>
        <LiveCamera DEF='USBCam1' source='dev#'/>
        <Background liveSource='USBCam1'/>
      </Scene>

      <Scene>
        <Background videoUrl='bgvideo.mpg'/>
      </Scene>

  23. MovieTexture node
      • Add MovieTexture to the X3DTextureNode hierarchy
      • Used for TextureBackground
      • Fix TextureBackground relative to the camera
      • Allow connection to a live camera (not just through a streaming server)

  24. MovieTexture node

      <Shape>
        <Appearance>
          <MovieTexture loop='true'
            url='"wrlpool.mpg" "http://www.web3d.org/x3d/content/examples/Vrml2.0Sourcebook/wrlpool.mpg"'/>
        </Appearance>
        <IndexedFaceSet ccw='false' coordIndex='0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16'>
          <Coordinate point='2.00 0.6 0.00 1.85 0.6 0.67 1.41 0.6 1.41 0.67 0.6 1.85
                             0.00 0.6 2.00 -0.67 0.6 1.85 -1.41 0.6 1.41 -1.85 0.6 0.67
                             -2.00 0.6 0.00 -1.85 0.6 -0.67 -1.41 0.6 -1.41 -0.67 0.6 -1.85
                             0.00 0.6 -2.00 0.67 0.6 -1.85 1.41 0.6 -1.41 1.85 0.6 -0.67
                             2.00 0.6 0.00'/>
        </IndexedFaceSet>
      </Shape>

  25. Live Camera → MovieTexture

      <Scene>
        <Shape>
          <Appearance>
            <MovieTexture loop='true' url='wrlpool.mpg'/>
          </Appearance>
          <IndexedFaceSet ccw='false' coordIndex='0 1 2 ... 15 16'>
            <Coordinate point='2.00 0.6 0.00 ... 2.00 0.6 0.00'/>
          </IndexedFaceSet>
        </Shape>
      </Scene>

      <Scene>
        <LiveCamera DEF='USBCam1' source='dev#'/>
        <Shape>
          <Appearance>
            <MovieTexture liveSource='USBCam1' keyColor='0 0 1'/>
          </Appearance>
          <IndexedFaceSet ccw='false' coordIndex='0 1 2 ... 15 16'>
            <Coordinate point='2.00 0.6 0.00 ... 2.00 0.6 0.00'/>
          </IndexedFaceSet>
        </Shape>
      </Scene>

  26. Live Camera and Virtual Camera
      • Calibrating the virtual camera according to the parameters of the live capture camera
        • Internal parameter = projection matrix
        • External parameter = camera pose
      • Manual: direct specification
      • Routing: from the live camera, or from the sensor

  27. Method 1

      Viewpoint : X3DViewpointNode {
        SFMatrix4f [in]      projmat
        SFVec3f    [in, out] position
        SFRotation [in, out] orientation
        SFNode     [in, out] liveCamera
        # (add distortion parameters here)
      }

      <Scene>
        <LiveCamera DEF='USBCam1' source='dev#'/>
        <Viewpoint liveCamera='USBCam1'/>
        <Shape> … </Shape>
      </Scene>

  28. Method 2

      <Scene>
        <LiveCamera DEF='USBCam1' source='dev#'/>
        <Viewpoint DEF='MRView'/>
        <Shape> … </Shape>
        <ROUTE fromNode='USBCam1' fromField='projmat'     toNode='MRView' toField='projmat'/>
        <ROUTE fromNode='Tracker' fromField='position'    toNode='MRView' toField='position'/>
        <ROUTE fromNode='Tracker' fromField='orientation' toNode='MRView' toField='orientation'/>
      </Scene>

  29. Other activities
      • Draft document
      • Teleconferences with Web3D
      • Implementation: k-MART
      • Domestic workshops
        • April: POSTECH, Pohang, Korea
        • June: KIST, Seoul, Korea

  30. Future
      • More extensions, examples, and implementations
      • International consensus
