
Visual Servoing of an Autonomous Helicopter in Urban Areas Using Feature Tracking

Luis Mejias, Srikanth Saripalli, Pascual Campoy and Gaurav Sukhatme. Presented by Wen Li.


Presentation Transcript


  1. Luis Mejias, Srikanth Saripalli, Pascual Campoy and Gaurav Sukhatme. Visual Servoing of an Autonomous Helicopter in Urban Areas Using Feature Tracking. Presented by Wen Li

  2. Outline • Introduction • Related work • Testbed • Visual preprocessing • Control Architectures • Experiments • Conclusion

  3. Outline • Introduction • Related work • Testbed • Visual preprocessing • Control Architectures • Experiments • Conclusion

  4. Introduction • Goal: • vision-guided autonomous flying robots • Application: • Law enforcement, search and rescue, inspection and surveillance • Technique: • Object detection, tracking, inertial navigation, GPS and nonlinear system modeling

  5. Introduction • In this paper: • Two UAVs: AVATAR and COLIBRI • Visual tracking => control commands

  6. Outline • Introduction • Related work • Testbed • Visual preprocessing • Control Architectures • Experiments • Conclusion

  7. Related Work • Hummingbird (A. Conway, 1995) • Model-scale helicopter • Uses GPS only, with 4 GPS antennas • Precision: position 1 cm, attitude 1 degree

  8. Related Work • AVATAR (Jun, 1999) • Onboard INS & GPS • Kalman Filter for State Estimation • Simulation

  9. Related Work • Vision-guided Helicopter (Amidi, 1996, 1997) • Onboard DSP-based vision processor • Combine GPS and IMU data

  10. Related Work • Vision-augmented navigation system (Bosse, 1997) • Uses vision in the loop to control a helicopter • Visual odometer (Amidi, 1998) • A notable vision-based technique for autonomous helicopters • (Wu et al., 2005) • Vision is used as an additional sensor and fused with inertial and heading measurements for control

  11. Outline • Introduction • Related work • Testbed • Visual preprocessing • Control Architectures • Experiments • Conclusion

  12. Autonomous Helicopter Testbed • AVATAR • Gas-powered radio-controlled model helicopter • RT-2 DGPS system provides positional accuracy of 2 cm • ISIS-IMU provides rate information to the onboard computer, where it is fused using a 16-state Kalman filter • Ground station: a laptop that sends high-level control commands and differential GPS corrections • Autonomous flight is achieved using a behavior-based control architecture

  13. Autonomous Helicopter Testbed • COLIBRI • Gas-powered model helicopter • Fitted with an XScale-based flight computer augmented with GPS, IMU and magnetometer, fused with a Kalman filter • Onboard VIA mini-ITX 1.25 GHz computer with 512 MB RAM, wireless interface and a FireWire color camera • Ground station: a laptop to send high-level control commands and for visualization

  14. Outline • Introduction • Related work • Testbed • Visual preprocessing • Control Architectures • Experiments • Conclusion

  15. Visual Preprocessing -- AVATAR • Image segmentation and thresholding • Convert the image to grayscale • Use the value of the "target color" as the threshold • Segment the image into a binary image where the object of interest is represented by 1s and the background by 0s
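
The segmentation step lends itself to a short illustration. Below is a minimal sketch, assuming OpenCV and a single user-supplied "target color" intensity; the tolerance band is a hypothetical tuning parameter, not a value from the slides or the paper.

```python
# Minimal sketch of grayscale thresholding around a "target color" value.
# Assumes OpenCV (cv2) and NumPy; the tolerance band is illustrative only.
import cv2
import numpy as np

def segment_target(frame_bgr, target_gray_value, tolerance=25):
    """Return a binary image: 1 for pixels near the target intensity, 0 elsewhere."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    lower = max(0, target_gray_value - tolerance)
    upper = min(255, target_gray_value + tolerance)
    # Object of interest -> 1, background -> 0, as described on the slide.
    mask = ((gray >= lower) & (gray <= upper)).astype(np.uint8)
    return mask
```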

  16. Visual Preprocessing -- AVATAR • Square Finding • Find contours (represented by polylines) in the binary image • Apply a polyline-approximation algorithm to reduce the number of points in each polyline • Result: simplified squares
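
One plausible way to implement this step is with OpenCV's contour extraction and polygon approximation; the sketch below is written under that assumption and is not the authors' code. The area threshold and approximation tolerance are illustrative.

```python
# Sketch of the square-finding step: contours from the binary image, simplified
# polylines, and a filter that keeps convex 4-vertex shapes. OpenCV 4.x API assumed.
import cv2

def find_squares(binary_mask, min_area=100.0):
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_SIMPLE)
    squares = []
    for contour in contours:
        # Reduce the number of points in the polyline (Douglas-Peucker style).
        epsilon = 0.02 * cv2.arcLength(contour, True)
        approx = cv2.approxPolyDP(contour, epsilon, True)
        if (len(approx) == 4 and cv2.contourArea(approx) > min_area
                and cv2.isContourConvex(approx)):
            squares.append(approx.reshape(4, 2))
    return squares
```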

  17. Visual Preprocessing -- AVATAR • Template Matching • The user selects a detected window (a target) from the GUI • A patch is selected around the location of the target • A local search window is used to find the best match between the target and the detected contours, deciding which window to track
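
A hedged sketch of the matching step follows, assuming normalized cross-correlation restricted to a local search window around the previous target position; the search radius and the use of cv2.matchTemplate are assumptions, not details given on the slide.

```python
# Sketch of template matching restricted to a local search window.
# The search radius is a hypothetical parameter.
import cv2

def match_target(gray_frame, template, last_xy, search_radius=60):
    """Return (x, y, score): best match location in full-image coordinates."""
    x0, y0 = last_xy
    th, tw = template.shape[:2]
    # Clip a search window around the last known target position.
    x_min, y_min = max(0, x0 - search_radius), max(0, y0 - search_radius)
    x_max = min(gray_frame.shape[1], x0 + search_radius + tw)
    y_max = min(gray_frame.shape[0], y0 + search_radius + th)
    roi = gray_frame[y_min:y_max, x_min:x_max]
    result = cv2.matchTemplate(roi, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, max_loc = cv2.minMaxLoc(result)
    # Convert from window coordinates back to full-image coordinates.
    return x_min + max_loc[0], y_min + max_loc[1], score
```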

  18. Visual Preprocessing -- AVATAR • Kalman Filter • Once a suitable match is found, a Kalman filter is used to track the feature positions • Input: x and y coordinates of the features • Output: estimates of these coordinates in the next frame
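
The slide only states the filter's inputs and outputs. Below is a minimal sketch, assuming a constant-velocity model over the (x, y) feature coordinates; the noise levels are illustrative, not the values used on AVATAR.

```python
# Sketch of a Kalman filter tracking a feature's (x, y) position with a
# constant-velocity model. State: [x, y, vx, vy]; measurement: [x, y].
import numpy as np

class FeatureKalman:
    def __init__(self, x, y, dt=1.0, q=1e-2, r=1.0):
        self.s = np.array([x, y, 0.0, 0.0])
        self.P = np.eye(4)
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = q * np.eye(4)   # process noise (illustrative)
        self.R = r * np.eye(2)   # measurement noise (illustrative)

    def predict(self):
        """Estimate the feature coordinates in the next frame."""
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.s[:2]

    def update(self, x, y):
        """Correct the estimate with the measured (x, y) coordinates."""
        z = np.array([x, y])
        residual = z - self.H @ self.s
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.s = self.s + K @ residual
        self.P = (np.eye(4) - K @ self.H) @ self.P
```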

  19. Visual Preprocessing -- COLIBRI • The user selects the object of interest from the GUI • The location of the object is used to generate the visual references

  20. Visual Preprocessing -- COLIBRI • Lateral visual reference

  21. Visual Preprocessing -- COLIBRI • Vertical visual reference
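
Slides 19-21 show the visual references only as plots. As a hedged sketch, one plausible mapping from the tracked object's pixel position to lateral and vertical velocity references is a saturated proportional law on the offset from the image centre; the gains and saturation limit below are hypothetical, not the authors' values.

```python
# Hedged sketch: pixel offset from image centre -> velocity references.
# k_lat, k_vert and v_max are hypothetical tuning parameters.
import numpy as np

def visual_references(u, v, img_w, img_h, k_lat=0.5, k_vert=0.5, v_max=1.0):
    """Return (vy_ref, vz_ref) from the target's pixel coordinates (u, v)."""
    # Normalized offsets in [-1, 1]; positive when the target is right of / below centre.
    ex = (u - img_w / 2.0) / (img_w / 2.0)
    ey = (v - img_h / 2.0) / (img_h / 2.0)
    vy_ref = np.clip(k_lat * ex, -v_max, v_max)    # lateral (east) reference
    vz_ref = np.clip(k_vert * ey, -v_max, v_max)   # vertical (down) reference
    return vy_ref, vz_ref
```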

  22. Outline • Introduction • Related work • Testbed • Visual preprocessing • Control Architectures • Experiments • Conclusion

  23. Control Architectures -- AVATAR • A hierarchical behavior-based control architecture • The output of the Kalman filter is compared with the desired values to give an error signal to the controller
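
A minimal sketch of the comparison the slide describes, assuming the desired value is a fixed image location (for example the image centre) and that the resulting error is turned into set-points handed down the behavior hierarchy; the class name and gain are hypothetical.

```python
# Sketch of the vision behavior on the AVATAR side: compare the Kalman-filter
# estimate with the desired image location and emit set-points for the
# lower-level behaviors. Names and the gain are hypothetical.
class VisionTrackBehavior:
    def __init__(self, desired_xy, gain=0.005):
        self.desired_xy = desired_xy
        self.gain = gain

    def step(self, kf_estimate_xy):
        """Error signal -> lateral/longitudinal set-points for lower behaviors."""
        err_x = self.desired_xy[0] - kf_estimate_xy[0]
        err_y = self.desired_xy[1] - kf_estimate_xy[1]
        return self.gain * err_x, self.gain * err_y
```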

  24. Control Architectures -- COLIBRI • The controller is based on decoupled PID control
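
A short sketch of what decoupled PID control can look like in practice: one independent PID loop per axis, each driven by its own visual velocity reference. The gains, time step and output limit are illustrative, not the authors' tuning.

```python
# Sketch of one decoupled PID axis; a separate instance is used per channel.
class AxisPID:
    def __init__(self, kp, ki, kd, dt=0.02, limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.dt, self.limit = dt, limit
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, reference, measurement):
        """Return a saturated control output for this axis."""
        err = reference - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(-self.limit, min(self.limit, out))

# Decoupled use: one loop per controlled channel (gains are illustrative).
lateral_pid = AxisPID(kp=0.8, ki=0.05, kd=0.1)
vertical_pid = AxisPID(kp=0.6, ki=0.05, kd=0.1)
```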

  25. Outline • Introduction • Related work • Testbed • Visual preprocessing • Control Architectures • Experiments • Conclusion

  26. Experimental results • At the Del Valle Urban Search and Rescue training site in Santa Clarita, California • AVATAR, four trials • First, the helicopter is commanded to fly autonomously to a given GPS waypoint • As soon as it detects the target window, the controller switches from GPS-based to vision-based control

  27. Location of the features in the image

  28. Helicopter position in meters: left figure, vertical axis is easting; right figure, vertical axis is northing

  29. Experimental Results • At ETSII Campus in Madrid, Spain • COLIBRI • Seven experimental trials on two different days

  30. Velocity reference (vyr) compared with the helicopter velocity (vy): lateral displacement (east)

  31. Velocity reference (vzr) compared with the helicopter velocity (vz): altitude displacement (down)

  32. Helicopter displacements during the entire flight trial

  33. Video demonstration • colibrivideoWeb.wmv

  34. Outline • Introduction • Related work • Testbed • Visual preprocessing • Control Architectures • Experiments • Conclusion

  35. Conclusion -- Authors • Demonstrated an approach to visually control an autonomous helicopter: a visual algorithm commands the UAV when GPS has dropouts • Experimentally demonstrated by performing vision-based window-tracking tasks on two different platforms, at different locations and under different conditions

  36. Conclusion -- Personal • The topic is interesting • The visual algorithm is shown to be effective in the experiments • But the writing is weak: • Poor explanation of features, templates and matching • Incomplete explanation of figures
