Virtual Imaging Peripheral for Enhanced Reality

Presentation Transcript


  1. Virtual Imaging Peripheral for Enhanced Reality Aaron Garrett, Ryan Hannah, Justin Huffaker, Brendon McCool

  2. Project Overview Our project, code-named Virtual Imaging Peripheral for Enhanced Reality (VIPER), is an augmented/virtual reality system. It will track a handheld unit’s location and perspective and use this information to place a camera at the corresponding position in a virtual environment. Through an LCD screen on the handheld unit, the user will see the virtual environment from that camera’s location, as if the handheld unit were a window into the virtual world. As the user moves the handheld unit around a tabletop-sized environment, the handheld unit’s actual and virtual perspectives change, allowing for different viewing angles of the virtual space.

  3. Project-Specific Success Criteria
  • An ability to communicate timestamp data over RF between the base unit and the handheld unit.
  • An ability to display images on the LCD display.
  • An ability to estimate the angle and position of the handheld unit with respect to an origin point using accelerometer, gyroscope, compass, visual, and ultrasonic data.
  • An ability to find the angular displacement of the handheld unit’s front face relative to the IR beacon origin using the mounted camera.
  • An ability to find the distance from the base to the handheld unit using an ultrasonic emitter and receiver.
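The last criterion reduces to a time-of-flight calculation, which is also why the first criterion (exchanging timestamp data over RF) matters: the handheld unit needs to know when the base emitted the ultrasonic burst. A minimal C sketch, assuming microsecond timestamps on a shared (synchronized) clock; the function name and constant are illustrative, not taken from the project code:

#include <stdint.h>

#define SPEED_OF_SOUND_MM_PER_US 0.343f   /* ~343 m/s at room temperature */

/*
 * Convert an ultrasonic time of flight into a distance.
 * emit_time_us - timestamp of the ultrasonic burst (received over RF)
 * recv_time_us - local timestamp when the receiver detected the burst
 * Returns the base-to-handheld distance in millimeters.
 */
static float ultrasonic_distance_mm(uint32_t emit_time_us, uint32_t recv_time_us)
{
    uint32_t time_of_flight_us = recv_time_us - emit_time_us;  /* unsigned math handles timer wraparound */
    return (float)time_of_flight_us * SPEED_OF_SOUND_MM_PER_US;
}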

  4. Block Diagram

  5. Beacon Unit – Software Design Considerations
  • Requires use of
    • PWM for
      • LED
      • Ultrasonic
    • UART
      • Xbee
    • Timer
    • Control Interrupt
  • Global Variables
    • timeToStartPWM_flag
    • timeToStopPWM_flag
    • pwmActive_flag
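A minimal sketch of how the three global flags listed above might coordinate the control-timer interrupt and the main loop, keeping the ISR short; the pwm_start/pwm_stop helpers and the ISR name are placeholders, since the slides do not show the actual microcontroller or PWM driver registers:

#include <stdbool.h>

/* Global flags shared between the timer ISR and the main loop (slide 5). */
volatile bool timeToStartPWM_flag = false;
volatile bool timeToStopPWM_flag  = false;
volatile bool pwmActive_flag      = false;

/* Placeholder hardware helpers - the real code would drive the
 * LED / ultrasonic PWM channels here. */
static void pwm_start(void) { /* enable PWM output */ }
static void pwm_stop(void)  { /* disable PWM output */ }

/* Control-timer interrupt: only sets flags, so the ISR stays short. */
void control_timer_isr(void)
{
    if (!pwmActive_flag) {
        timeToStartPWM_flag = true;
    } else {
        timeToStopPWM_flag = true;
    }
}

/* Main loop polls the flags and does the actual work. */
void beacon_main_loop(void)
{
    for (;;) {
        if (timeToStartPWM_flag) {
            timeToStartPWM_flag = false;
            pwm_start();
            pwmActive_flag = true;
        }
        if (timeToStopPWM_flag) {
            timeToStopPWM_flag = false;
            pwm_stop();
            pwmActive_flag = false;
        }
    }
}

Marking the flags volatile keeps the compiler from caching their values across the ISR/main-loop boundary.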

  6. Beacon Unit Software FlowChart

  7. Beacon Unit Software Hierarchy

  8. VPChip – Software Design Considerations
  • Requires use of
    • I2C (TWI)
      • Camera Commands
    • SPI
      • Transmit IR displacement angle
    • ISI (Image Sensor Interface)
    • AIC (Advanced Interrupt Controller)
  • Global Variables
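The SPI item above carries the IR displacement angle that the VPChip derives from the camera image. One way to compute such an angle from the beacon's pixel coordinate, assuming a pinhole camera model; the resolution, field-of-view value, and function name are assumptions, not project constants:

#include <math.h>
#include <stdint.h>

#define IMAGE_WIDTH_PX 640        /* assumed sensor resolution */
#define HORIZ_FOV_DEG  60.0f      /* assumed horizontal field of view */
#define PI_F           3.14159265f

/*
 * Convert the IR beacon's horizontal pixel coordinate into a signed
 * angular displacement from the camera's optical axis, in degrees.
 */
static float ir_displacement_deg(uint16_t beacon_px)
{
    float half_width   = IMAGE_WIDTH_PX / 2.0f;
    float focal_len_px = half_width / tanf((HORIZ_FOV_DEG / 2.0f) * PI_F / 180.0f);

    float offset_px = (float)beacon_px - half_width;
    return atan2f(offset_px, focal_len_px) * 180.0f / PI_F;
}

The same calculation applies vertically with the sensor's height and vertical field of view.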

  9. VPChip Software FlowChart

  10. VPChip Software Hierarchy

  11. USB HUB – Software Design Considerations
  • Requires use of
    • SPI
      • IR displacement angle input
    • UART
      • Xbee
    • AtD
      • Ultrasonic Receiver input
    • I2C
      • Sensor Control and Data bus
    • USB
    • External Interrupts
    • Timer
      • Elapsed Timer
  • Flags
    • accelerometerDataReady
    • gyroscopeDataReady
    • magnetometerDataReady
    • atdReady
    • serviceFlagsQueue
    • serviceFlagsCount
  • Variables
    • newDataFlagArray
    • dataArray
    • elapsedTimerCounter
    • recievingUltrasonic
    • currentSPIInputByte
    • tempDataArray
    • RS-232 circular buffer
      • Head/Tail buffer pointers
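The RS-232 circular buffer with head/tail pointers listed under Variables is a standard single-producer/single-consumer queue between the UART receive interrupt and the main loop. A minimal sketch, with an assumed buffer size and illustrative names:

#include <stdbool.h>
#include <stdint.h>

#define RS232_BUF_SIZE 64   /* assumed size */

static volatile uint8_t rs232_buf[RS232_BUF_SIZE];
static volatile uint8_t rs232_head = 0;   /* next free slot (written by UART ISR)   */
static volatile uint8_t rs232_tail = 0;   /* next unread byte (read by main loop)   */

/* Called from the UART receive interrupt with each incoming byte. */
static bool rs232_put(uint8_t byte)
{
    uint8_t next = (uint8_t)((rs232_head + 1) % RS232_BUF_SIZE);
    if (next == rs232_tail) {
        return false;                     /* buffer full, byte dropped */
    }
    rs232_buf[rs232_head] = byte;
    rs232_head = next;
    return true;
}

/* Called from the main loop; returns true if a byte was available. */
static bool rs232_get(uint8_t *out)
{
    if (rs232_head == rs232_tail) {
        return false;                     /* buffer empty */
    }
    *out = rs232_buf[rs232_tail];
    rs232_tail = (uint8_t)((rs232_tail + 1) % RS232_BUF_SIZE);
    return true;
}

Because only the ISR writes the head and only the main loop writes the tail, no locking is needed beyond the volatile qualifiers on a microcontroller where single-byte writes are atomic.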

  12. USB HUB Software FlowChart

  13. USB HUB Software Hierarchy

  14. Beagle – Software Design Considerations
  • Requires use of
    • USB
    • Graphics/OpenGL
    • Running the Ångström Linux distribution
  • Global Variables
    • State Vector
    • External angle quaternion
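The external angle quaternion in the Beagle's global state has to become an OpenGL transform before the virtual camera can be oriented. A minimal sketch of that conversion, assuming a normalized quaternion and the struct layout shown; the names are illustrative:

/* Unit quaternion describing the handheld unit's orientation (assumed layout). */
typedef struct { float w, x, y, z; } quat_t;

/*
 * Fill a column-major 4x4 matrix (OpenGL convention) with the rotation
 * described by a unit quaternion.
 */
static void quat_to_gl_matrix(quat_t q, float m[16])
{
    float xx = q.x * q.x, yy = q.y * q.y, zz = q.z * q.z;
    float xy = q.x * q.y, xz = q.x * q.z, yz = q.y * q.z;
    float wx = q.w * q.x, wy = q.w * q.y, wz = q.w * q.z;

    m[0] = 1.0f - 2.0f * (yy + zz);  m[4] = 2.0f * (xy - wz);         m[8]  = 2.0f * (xz + wy);         m[12] = 0.0f;
    m[1] = 2.0f * (xy + wz);         m[5] = 1.0f - 2.0f * (xx + zz);  m[9]  = 2.0f * (yz - wx);         m[13] = 0.0f;
    m[2] = 2.0f * (xz - wy);         m[6] = 2.0f * (yz + wx);         m[10] = 1.0f - 2.0f * (xx + yy);  m[14] = 0.0f;
    m[3] = 0.0f;                     m[7] = 0.0f;                     m[11] = 0.0f;                     m[15] = 1.0f;
}

The resulting matrix can be combined with a translation built from the position portion of the state vector and loaded with glMultMatrixf, or passed to a shader as a uniform.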

  15. Beagle Software FlowChart

  16. Beagle Software Hierarchy
