
CS 5 for all!

CS 5 for all! Bridgette Eichelburger ’14, David Lingenbrink ’14, Yael Mayer ’11, Obosa Obazuaye ’14, Becca Thomas ’14, Maia Valcarce ’13, Joshua Vasquez ’14, Garrett Wong ’14, Tim Yee ’13, Christine Alvarado, Zachary Dodds, Geoff Kuenning, and Ran Libeskind-Hadas. Improving CS 5.

Presentation Transcript


  1. CS 5 for all! Bridgette Eichelburger ’14, David Lingenbrink ’14, Yael Mayer ’11, Obosa Obazuaye ’14, Becca Thomas ’14, Maia Valcarce ’13, Joshua Vasquez ’14, Garrett Wong ’14, Tim Yee ’13, Christine Alvarado, Zachary Dodds, Geoff Kuenning, and Ran Libeskind-Hadas. Improving CS 5 MyCS: Middle School CS 5 Python, plus…

Ground and aerial robots offer complementary strengths. Flying vehicles offer novel sensing vantage points, while wheeled platforms carry sensors with higher power demands; they also permit more accurate odometry and control. This project focused on the aerial facets of coordinated ground/aerial autonomy. We used off-the-shelf hardware and Willow Garage’s freely available ROS software, including several project-specific contributions. The demonstrations and algorithms at left were implemented using sliding-scale autonomy, so that a human can intervene, if necessary or desired, at almost any step of the process.

[Figures: hopping interface (Brad/Lilian); Galileo picture (Nick)]

Tagless Localization? Ideally, the April Tags and other markers used in these tasks could be partially or wholly replaced with uncontrived image cues. We explored the system’s ability to localize itself within a visual map using the drone’s forward-facing camera.

Hardware. This project explored the capabilities of the inexpensive ($300) Parrot ARDrone, a quadrotor helicopter. The iRobot Create was the foundation for our ground robots. The ARDrone is sensitive to its surroundings and requires careful scaffolding to succeed in making deliberate movements autonomously. The Create offers an extensible base for a Kinect, laptop, and shelf.

Hula-hoop hopping: (right) a two-node graph of locations indicated by April tags within Hula Hoops.
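The hopping demonstration treats tagged locations as nodes in a graph and plans a sequence of hops between them. The following is a minimal pure-Python sketch of that idea, not the project’s actual ROS code; the graph contents and names such as `LOCATION_GRAPH` and `plan_hops` are illustrative.

```python
from collections import deque

# Hypothetical location graph: nodes are April-tag IDs marking Hula-Hoop
# waypoints; edges connect locations the drone can hop between directly.
LOCATION_GRAPH = {
    0: [1],
    1: [0, 2],
    2: [1, 3],
    3: [2],
}

def plan_hops(graph, start, goal):
    """Breadth-first search for a shortest hop sequence between two tags."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None  # goal unreachable from start

def next_hop(path, human_override=None):
    """Sliding-scale autonomy: a human override replaces the planned hop."""
    return human_override if human_override is not None else path[1]
```

Under sliding-scale autonomy, the planner proposes each hop (`next_hop(path)`), but an operator can substitute a different target tag at any step.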
(left) our visualization of a larger graph of locations.

[Figures: drone picture (Nick); table of SURF data or bar charts, some quantitative comparison; the control-panel / SURFmaster interface (Brad/Lilian); Create with Kinect (Josh/David); April Tags output with pose and tag ID (Kim/Malen); SURF feature matching: two pairs and their respective scores (Brad/Lilian)]

A summary of the interfaces used and the accuracy attained with SURF-based (tagless) localization. (top) the $300 ARDrone; (bottom) Mudd’s ground vehicle, based on Michael Ferguson’s and Willow Garage’s designs.

Image matching: (left) April Tags incorporated into ROS; (right) OpenCV’s SURF features.

Outreach. This project’s resources are accessible enough to support many research and educational objectives. Western State colleagues joined us to develop Kinect-based control of their drone and Create. Interactive demos for visitors offer hands-on insight into the challenges underlying aerial autonomy.

Software. We use Willow Garage’s Robot Operating System (ROS) for its many drivers, the OpenCV vision library, and communications. We contributed a Python-based ROS ARDrone driver with support for both of the copter’s cameras. In addition, OpenCV made it easy to create a Python-based video-processing module that supports live and captured streams equally well. The navigation and localization routines used ROS’s C++ APIs and interface.

[Figures: factorial finding (Kim/Malen); outreach detail (Zach); detail from GCER (Zach); wall segmentation (Lilian/Brad); architecture diagram, white background (Brad/Lilian); ROSsified flying]

Acknowledgments. Mutual support: (left) an iRobot Create leads an ARDrone up to an obstacle; the drone’s overhead view then provides the correct direction to navigate; (right) factorial-finding and image segmentation. This project both leveraged and contributed to ROS’s vision+control codebase.
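Tagless localization hinges on matching image features from the drone’s camera against a visual map. The sketch below shows the general idea of descriptor matching with a nearest-neighbor ratio test, using plain Euclidean distance on small toy vectors rather than OpenCV’s actual SURF descriptors; the function names and the 0.7 threshold are illustrative assumptions, not the project’s code.

```python
import math

def l2(a, b):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_features(query, reference, ratio=0.7):
    """Return (query_idx, ref_idx) pairs whose best match is clearly
    better than the second-best (a Lowe-style ratio test)."""
    matches = []
    for qi, qdesc in enumerate(query):
        # Rank all reference descriptors by distance to this query descriptor.
        dists = sorted((l2(qdesc, r), ri) for ri, r in enumerate(reference))
        best, second = dists[0], dists[1]
        if best[0] < ratio * second[0]:
            matches.append((qi, best[1]))
    return matches
```

An unambiguous descriptor (one clear nearest neighbor) passes the test; a descriptor nearly equidistant from two map features is discarded, which is what makes ratio-filtered matches usable for localization.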
We gratefully acknowledge support from HHMI, The Rose Hills Foundation, NSF CPATH project #0939149, and funds provided by Harvey Mudd.
