
Integrating Active Tangible Devices with a Synthetic Environment for Collaborative Engineering


Presentation Transcript


  1. Integrating Active Tangible Devices with a Synthetic Environment for Collaborative Engineering. Sandy Ressler, Brian Antonishek, Qiming Wang, Afzal Godil. National Institute of Standards and Technology. Jared Freeland, DAS FA, CIS 4930

  2. Abstract of the Abstract • This paper describes the creation of an environment for collaborative engineering whose goal is to improve the user interface by combining haptic manipulation with synthetic environments. • The system outlined here combines elements of what Dr. Fishwick discussed on Wednesday with ideas from Scott’s presentation on “Real Reality”.

  3. Introduction • The immediate goal: to determine the feasibility of using a tangible interface with a multiuser VRML environment as applied to collaborative engineering. • By “tangible” we refer to the ability to pick up and interact with actual physical objects that are also represented in the virtual environment.

  4. Introduction • A secondary goal of the project was to use as much off-the-shelf software and hardware as possible, to facilitate transfer of the technology into the commercial world. • The mediation hub is written in Java • The VE uses a commercial system, the blaxxun Community Platform • The tangible devices are off-the-shelf, configurable LEGO Mindstorms robots

  5. System Overview • The overall environment is conceptually simple. • Two collaborating engineers in geographically separate areas wish to manipulate and discuss a construction project • Recent work at NIST has demonstrated that VRML can be used to represent rich construction environments, but manipulation of elements such as a virtual excavator is awkward.

  6. System Overview • Control panels with many buttons and sliders are functional but can be difficult to manipulate. • The answer: direct haptic manipulation should make interaction more intuitive. • Users move the tangible excavator and adjust its rotatable arm, causing the virtual “mirror” to update.

  7. System Overview • The core of the system is the Java-based Virtual Environment Device Integration server, or JVEDI, which acts as a hub between all the system’s components. • The server runs as a stand-alone Java application on the host computer.
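
The slides do not show JVEDI's internals, so the following is only a minimal sketch of the hub pattern they describe: a stand-alone Java relay that forwards every message from one connected component (vision tracker, robot controller, VRML client) to all the others. The port number and the line-based message format are assumptions made for illustration.

```java
import java.io.*;
import java.net.*;
import java.util.*;

// Minimal sketch of a JVEDI-style hub: every line received from one
// client is relayed to all other connected clients. Port and protocol
// are hypothetical; the slides do not specify JVEDI's wire format.
public class DeviceHub {
    private static final List<PrintWriter> clients =
            Collections.synchronizedList(new ArrayList<>());

    public static void main(String[] args) throws IOException {
        try (ServerSocket server = new ServerSocket(4444)) { // port is assumed
            while (true) {
                Socket socket = server.accept();
                PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
                clients.add(out);
                new Thread(() -> relay(socket, out)).start();
            }
        }
    }

    // Read lines from one client and forward each to every other client.
    private static void relay(Socket socket, PrintWriter self) {
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(socket.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                synchronized (clients) {
                    for (PrintWriter client : clients) {
                        if (client != self) client.println(line);
                    }
                }
            }
        } catch (IOException e) {
            // client disconnected; fall through to cleanup
        } finally {
            clients.remove(self);
        }
    }
}
```

A hub like this keeps the components decoupled: the vision system never needs to know how many VRML browsers or robot controllers are listening.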

  8. System Overview The Real Environment • Two work surfaces (a.k.a. tables) • A LEGO Mindstorms robot on each surface • Above each surface is a video camera looking down at the surface, providing 2D position/orientation

  9. System Overview The Virtual Environment • A multi-user blaxxun environment displays the sum of both (or all) physical environments • A simplified user interface consisting of buttons and arrows is included for collaborators without access to an actual robot

  10. Interesting Points of the System • Unlike “graspable” interfaces, this system does not use a haptic glove or data glove of any kind. • Instead, by using a camera to track the movement of the robots, the system gives the user complete, unrestricted control of them.

  11. Interesting Points of the System • The virtual and real environments are kept synchronized; they always mirror each other. • This is accomplished by treating the position values reported by the video system as the authoritative source.

  12. Integration Issues • The most challenging aspect of creating the work environment was integrating all the processing elements. • Functionality for controlling the robots and for exchanging position-tracker data had to be built on top of VRML’s External Authoring Interface. • A fully configured version of the environment requires up to six separate computers.
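
As a sketch of that EAI plumbing, here is how a Java applet might push vision-reported poses into the VRML scene through the classic External Authoring Interface. The DEF name "RED_ROBOT", the eventIn names, and the mapping of the table plane onto the scene's XZ plane are hypothetical; the slides do not give the actual scene-graph names used at NIST.

```java
import java.applet.Applet;
import vrml.external.Browser;
import vrml.external.Node;
import vrml.external.field.EventInSFVec3f;
import vrml.external.field.EventInSFRotation;

// Sketch: mirror vision-reported 2D poses into the VRML world via the
// classic EAI. Node and field names below are illustrative assumptions.
public class RobotMirror extends Applet {
    private EventInSFVec3f setTranslation;
    private EventInSFRotation setRotation;

    public void start() {
        Browser browser = Browser.getBrowser(this);   // attach to the VRML plug-in
        Node robot = browser.getNode("RED_ROBOT");    // DEF name in the world file
        setTranslation = (EventInSFVec3f) robot.getEventIn("set_translation");
        setRotation    = (EventInSFRotation) robot.getEventIn("set_rotation");
    }

    // Called whenever the vision system reports a new pose (x, y, theta).
    public void onPose(float x, float y, float theta) {
        setTranslation.setValue(new float[] { x, 0.0f, y });           // table -> XZ plane
        setRotation.setValue(new float[] { 0.0f, 1.0f, 0.0f, theta }); // spin about Y axis
    }
}
```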

  13. Major Components Vision Processing • The position and orientation of the LEGO robots are computed in real time using a computer vision method based on color tracking. • The vision program uses an inexpensive camera and can track multiple robots at 10 frames/sec

  14. Major Components Vision Processing • To track the LEGO robots, two differently colored cards were attached to each robot. • The computer vision program uses a color probability distribution to find the centers of the two cards; the mean of the two centers is the robot’s position. • The orientation is the arctangent of the vector between the two centers
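
The pose arithmetic just described is small enough to show directly. A minimal version, with illustrative variable names:

```java
// Position is the mean of the two colored-marker centers; orientation is
// the arctangent of the back-to-front vector between them.
public class RobotPose {
    public final double x, y, theta;

    public RobotPose(double frontX, double frontY, double backX, double backY) {
        x = (frontX + backX) / 2.0;          // position: mean of the two centers
        y = (frontY + backY) / 2.0;
        theta = Math.atan2(frontY - backY,   // orientation: angle of the vector
                           frontX - backX);  // from back marker to front marker
    }

    public static void main(String[] args) {
        // Front marker at (12, 8), back marker at (10, 6):
        RobotPose p = new RobotPose(12, 8, 10, 6);
        System.out.printf("x=%.1f y=%.1f theta=%.2f rad%n", p.x, p.y, p.theta);
        // Prints: x=11.0 y=7.0 theta=0.79 rad (45 degrees)
    }
}
```

Using atan2 rather than a plain arctangent keeps the heading correct in all four quadrants.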

  15. Major Components Speech Input • A user who is moving robots cannot easily reach a keyboard. • It can also be necessary to move robots on two surfaces simultaneously. • Voice commands such as “forward”, “backward”, “left”, “right”, “select red”, and “select blue” were therefore built in.
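
The slides list only the vocabulary, not the dispatch code, so the following is a hypothetical sketch of how recognized phrases might be routed to the currently selected robot. The RobotController interface and the differential-drive convention are assumptions:

```java
import java.util.Map;

// Sketch: map recognized voice commands onto robot actions. All names
// here are illustrative; the paper does not describe this layer.
public class VoiceDispatcher {
    interface RobotController { void drive(double left, double right); }

    private final Map<String, RobotController> robots;
    private RobotController selected;

    VoiceDispatcher(Map<String, RobotController> robots) {
        this.robots = robots;
        this.selected = robots.values().iterator().next(); // default selection
    }

    void onCommand(String phrase) {
        switch (phrase) {
            case "forward":     selected.drive( 1,  1); break;
            case "backward":    selected.drive(-1, -1); break;
            case "left":        selected.drive(-1,  1); break; // pivot in place
            case "right":       selected.drive( 1, -1); break;
            case "select red":  selected = robots.get("red");  break;
            case "select blue": selected = robots.get("blue"); break;
            default: break; // unrecognized phrase: ignore
        }
    }
}
```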

  16. Major Components Multiuser VRML • The multiuser aspect of the VRML world was accomplished with commercially available software, the blaxxun Community Platform. • The first part is the virtual world itself, which includes the two LEGO robots (red and blue), the floor, and a cylinder. • The second part is the control panel

  17. Auxiliary Processing Collision Detection • Suppose the robots on two separate work surfaces collide virtually • The collision detection is performed by the VRML world, and knowledge of it exists only in the VE. • The robots light up and beep when they hit something in the virtual world.
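
A sketch of what such a virtual-only collision test could look like, assuming both robots are placed in one shared virtual coordinate frame; the bounding radius and the feedback behavior are invented for illustration:

```java
// Sketch: overlap test in the shared virtual frame. When it fires, the
// hub would forward feedback commands so the physical robots light up
// and beep. Radius and command strings are assumptions.
public class VirtualCollision {
    static final double ROBOT_RADIUS = 0.15; // assumed bounding radius (virtual units)

    // True when the robots' bounding circles overlap in the virtual world.
    static boolean collides(double x1, double y1, double x2, double y2) {
        return Math.hypot(x1 - x2, y1 - y2) < 2 * ROBOT_RADIUS;
    }

    public static void main(String[] args) {
        // Robots 0.2 units apart, closer than 2 * 0.15, so they "collide":
        if (collides(1.0, 1.0, 1.2, 1.0)) {
            // e.g. hub.send("red beep"); hub.send("blue beep");  (hypothetical)
            System.out.println("virtual collision: light up and beep both robots");
        }
    }
}
```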

  18. Auxiliary Processing Task Commands and Recording • Small programmatic tasks, such as movement patterns, were created for the robots. • Additionally, functionality was added that lets users record the movements of the robots for later playback.
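
The slides do not describe the recording mechanism, but a minimal sketch, assuming poses arrive as timestamped updates from the vision system, might look like this (all names illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: store timestamped poses while the user drives a robot, then
// replay them at the original pace. Not from the paper; names assumed.
public class MovementRecorder {
    static class Sample {
        final long timeMs; final double x, y, theta;
        Sample(long t, double x, double y, double theta) {
            this.timeMs = t; this.x = x; this.y = y; this.theta = theta;
        }
    }

    public interface PoseSink { void onPose(double x, double y, double theta); }

    private final List<Sample> samples = new ArrayList<>();

    // Called for every pose update while recording.
    public void record(double x, double y, double theta) {
        samples.add(new Sample(System.currentTimeMillis(), x, y, theta));
    }

    // Replay the recording, preserving the original timing between samples.
    public void play(PoseSink sink) throws InterruptedException {
        for (int i = 0; i < samples.size(); i++) {
            if (i > 0) Thread.sleep(samples.get(i).timeMs - samples.get(i - 1).timeMs);
            Sample s = samples.get(i);
            sink.onPose(s.x, s.y, s.theta);
        }
    }
}
```

During playback the sink could be the same path that normally carries live vision data, so recorded motion drives both the virtual mirror and the physical robots.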

  19. Links • The JVEDI code is publicly available at http://ovrt.nist.gov/jvedi • More on the robots at legomindstorms.com • More on the blaxxun Community at blaxxun.com

  20. Discussion • How does this project relate or compare to what Scott discussed on Wednesday? • What are the advantages and disadvantages? • What can be improved, and what needs to be?
