
Use of Virtual Reality for Teleoperation of Autonomous Vehicles


Presentation Transcript


  1. Use of Virtual Reality for Teleoperation of Autonomous Vehicles Michael A. Steffen Jeffrey D. Will Noriyuki Murakami 2007 National Conference on Undergraduate Research, April 12-14, 2007

  2. About the Authors • Michael A. Steffen • BSME from Valparaiso University, May 2007 • BSEE from Valparaiso University, May 2007 • Dr. Jeffrey D. Will • Assistant Professor of Electrical and Computer Engineering • Dr. Noriyuki Murakami • Japan's National Agriculture Research Center

  3. Location of Tests • Vehicle at Japan's National Agriculture Research Center, operator at Valparaiso University, roughly 10,000 km apart

  4. The Vehicle • Global Positioning System (GPS) sensor • Hydrostatic transmission • Fiber-optic gyro sensor • Wireless network • On-board computer • Camera

  5. Introduction to Teleoperation • Ability to control a vehicle from a remote location • Removes the operator from the vehicle itself • How does teleoperation fit into the future of vehicle automation? • Fleet management • Human control for special cases not covered by automation

  6. Current System (2D) • Provides location of the vehicle • Bird's-eye view • Uses satellite images • Provides control of vehicle implements and movement

  7. Targeted Improvements • Provide the operator with more information about the environment and vehicle location • Move to a 3-dimensional virtual environment • Further detail about land contours and object shapes and locations • Large display system • View the environment and vehicle at 1:1 scale • Stereo vision • Give the operator the feeling of being on site with the vehicle

  8. Virtual Reality System at Valparaiso University Scientific Visualization Laboratory (SVL) • VisBox-X2 • 12’ x 9’ screen • Passive stereo • Wireless head tracker • 6-DOF tracked input device

  9. Method Used • Visualization requires two computer models: an environment model and a vehicle model • The vehicle model contains details of the vehicle

  10. Environment Model • Contains details of Japan's National Agriculture Research Center • The texture map applied to the virtual ground plane was created from Google Maps imagery (2048 x 2048) • Building and landscaping objects were placed in the model using Google Maps for their locations
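Placing objects by reading their locations off Google Maps implies converting map coordinates into the model's local frame. A minimal sketch of one way to do that, using a flat-earth (equirectangular) approximation; the function name and reference point are illustrative assumptions, not taken from the original model.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in metres

def latlon_to_local_xy(lat, lon, ref_lat, ref_lon):
    """Convert latitude/longitude (degrees) into metres east (x) and
    north (y) of a reference point. The flat-earth approximation is
    adequate over a research-center site a few kilometres across."""
    x = math.radians(lon - ref_lon) * EARTH_RADIUS_M * math.cos(math.radians(ref_lat))
    y = math.radians(lat - ref_lat) * EARTH_RADIUS_M
    return x, y
```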

  11. Creating a Virtual Environment • OpenSceneGraph (OSG) is a 3-D computer graphics toolkit that supports stereo-vision visualization • OSG allows models to be loaded and visualized in a VR environment • A transform matrix is applied to the vehicle model, allowing for translations and rotations
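The slides do not show the transform itself, but the idea is a standard 4 x 4 homogeneous matrix that rotates the vehicle model about the vertical axis by its heading and translates it to its current position. A minimal NumPy sketch of that math, as an illustration only and not the OpenSceneGraph API used in the project:

```python
import numpy as np

def vehicle_transform(x, y, z, heading):
    """4 x 4 matrix: rotate the model about the vertical (z) axis by
    `heading` radians, then translate to (x, y, z) in the environment.
    Column-vector convention: world_point = T @ model_point."""
    c, s = np.cos(heading), np.sin(heading)
    return np.array([
        [c,  -s,  0.0, x],
        [s,   c,  0.0, y],
        [0.0, 0.0, 1.0, z],
        [0.0, 0.0, 0.0, 1.0],
    ])

# Example: a point on the model, placed 12 m east, 5 m north, heading 90 degrees.
point_on_model = np.array([1.0, 0.0, 0.0, 1.0])        # homogeneous coordinates
world_point = vehicle_transform(12.0, 5.0, 0.0, np.pi / 2) @ point_on_model
```

Each frame, the visualization rebuilds such a matrix from the latest position and heading received from the vehicle and hands it to the scene graph's transform node.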

  12. Heading Calculations • Initially computed from the previous and current positions • Problems arose when the vehicle turned about its center, since its position barely changes • Solution: use the fiber-optic gyro sensor on the vehicle and include its output in the communication protocol
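A minimal sketch of the two heading estimates described above, assuming positions are already in a local metric frame; the function names and threshold are illustrative, not the project's code. Differencing positions breaks down exactly when the vehicle turns about its center, which is why the gyro-based estimate was added to the protocol.

```python
import math

def heading_from_positions(prev_xy, curr_xy, min_move=0.05):
    """Heading in radians (east = 0, counter-clockwise positive) from two
    position fixes. Returns None when the vehicle has barely moved, e.g.
    while turning about its own center, the failure case noted above."""
    dx = curr_xy[0] - prev_xy[0]
    dy = curr_xy[1] - prev_xy[1]
    if math.hypot(dx, dy) < min_move:   # too little motion to trust
        return None
    return math.atan2(dy, dx)

def heading_from_gyro(prev_heading, yaw_rate, dt):
    """Propagate heading using the fiber-optic gyro's yaw rate (rad/s)
    over a dt-second interval; immune to the turning-in-place problem."""
    return (prev_heading + yaw_rate * dt) % (2 * math.pi)
```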

  13. Communication Protocol • Transmission Control Protocol (TCP) • Allows the Internet to be used as the network
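The slides state only that the link runs over TCP so the public Internet can carry the traffic; the message format is not documented here. A minimal sketch of a TCP client that sends one newline-terminated text command and reads a one-line reply; the host, port, and command strings are hypothetical placeholders, not the project's actual protocol.

```python
import socket

VEHICLE_HOST = "192.0.2.10"   # placeholder address of the on-board computer
VEHICLE_PORT = 5000           # placeholder port

def send_command(command, timeout=2.0):
    """Open a TCP connection, send one newline-terminated command,
    and return the vehicle's single-line reply as a string."""
    with socket.create_connection((VEHICLE_HOST, VEHICLE_PORT), timeout=timeout) as sock:
        sock.sendall((command + "\n").encode("ascii"))
        reply = sock.makefile("r", encoding="ascii").readline()
    return reply.strip()

# Hypothetical implement commands in the spirit of the later test slides:
# send_command("HORN ON"); send_command("ENGINE OFF"); send_command("BLINKER LEFT")
```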

  14. User Input

  15. Tests and Results • Tests were set up to exercise specific functions of the design • Visualization • Implement Control • Driving • High-Level Functions • Field Test

  16. Visualization Test • Tests visualization of the vehicle in the VR environment • A test route was planned (see picture) • The vehicle was driven by an on-board operator

  17. Results from test • Communication protocol is working • Location of the vehicle is visually correct • Stereo-vision model is working

  18. Implement Control Test • Test communication to the vehicle • Test operation of • Left/Right blinker • Horn • Engine On/Off • Biter On/Off

  19. High-Level Function Test • GoTo function • Specify an (X, Y) location relative to the base point • The vehicle automatically travels to that location
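A hedged sketch of the kind of guidance step behind a GoTo command: the target is an (X, Y) offset from the base point, and the vehicle keeps steering toward it until it is within a tolerance. This illustrates the idea only; it is not the vehicle's actual controller.

```python
import math

def goto_step(vehicle_xy, vehicle_heading, target_xy, tolerance=0.5):
    """One control step toward a target given in metres from the base point.
    Returns (done, heading_error): done is True once within `tolerance`
    metres; heading_error (radians) is the steering correction to apply."""
    dx = target_xy[0] - vehicle_xy[0]
    dy = target_xy[1] - vehicle_xy[1]
    if math.hypot(dx, dy) < tolerance:
        return True, 0.0
    desired = math.atan2(dy, dx)
    # wrap the error into [-pi, pi) so the vehicle turns the short way round
    error = (desired - vehicle_heading + math.pi) % (2 * math.pi) - math.pi
    return False, error
```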

  20. Results from GoTo Function • The vehicle did not reach the target location • The vehicle received the correct location and began traveling in the target direction • The vehicle would stop halfway to the target location

  21. Driving Test • Tests control of vehicle from SVL • Test determined that heading is not reliable when calculated from position

  22. Field Test • Drive to a specified point and back • Field tests could not be performed due to snow conditions in the fields

  23. Conclusion • This research has demonstrated the feasibility of real-time teleoperation of a semi-autonomous vehicle • The immersive environment allows for • Increased sense of realism • More accurate control

  24. Thank You For Your Time Questions?
