
Toward Learning about the Controllability of Containers and their Contents

Shane Griffith and Alexander Stoytchev, Developmental Robotics Lab
{shaneg and alexs}@iastate.edu
http://www.ece.iastate.edu/~shaneg/

1. Motivation

Robotic control of containers and their contents has challenged roboticists for a number of years. Container manipulation is traditionally framed as a control problem (Feddema et al., 1997; Yano et al., 2001; Kemp & Edsinger, 2006; Edsinger & Kemp, 2007; Okada et al., 2009). Little work, however, has addressed how robots can learn about the controllability of arbitrarily sized containers and their contents.

An algorithm for solving this problem does exist, because some children demonstrate it every chance they get [figure: "cookie monsters" retrieving a jar of cookies from the cupboard and fetching a freshly poured glass of milk from the table]. Infants learn how to control many different types of containers over a period of development: at 9 months they begin to insert objects into containers, and their container play peaks at 15 months [Largo & Howard, 1979]. Robots could learn about containers the way infants do.

2. Experimental Setup

The experiments used an upper-torso robot with two Barrett WAMs [figure: the robot and its range of motion].

Objects: 10 containers, 10 non-containers, and 5 blocks.
Behaviors: grasp, wave, reset, setup, position hand, and drop block.
Sensory modalities: video, audio, and proprioception.

3. Data Collection and Feature Extraction

A sudden change in the robot's position will shift the contents of a container, produce sound, and create anomalies in the visual movement patterns of the objects. Changes in sound and color therefore occur after a specific time delay following the robot's sudden movements, and objects that are directly grasped move with the robot more than objects that are inside containers.

From each modality the robot extracts a feature stream: red, green, and blue color histograms from video; a spectrogram from audio; and acceleration from proprioception. [Figure: the features extracted from the robot's three sensory modalities over six consecutive executions of the wave behavior, with the start of each wave and the delay of each subsequent event marked.]

4. Methodology

Movement is detected using optical flow: the flow field is thresholded, and the resulting mask is applied to the color image. [Figure: thresholded optical flow and the masked color image from one execution of the wave behavior; the masked images show the regions that moved with the robot.]

Events in the multimodal data are marked with red lines [see the feature figure above]. The robot produced these events during the wave behavior and then learned about object controllability from the data it captured across multiple modalities. In particular, the robot can learn from the mutual information between the time of its own sudden movement and the delay of the events that follow it. Sketches of the feature extraction, movement detection, and event-delay extraction follow this section.
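The poster's feature streams are red, green, and blue color histograms from video, a spectrogram from audio, and acceleration from proprioception. Below is a minimal sketch of that extraction with OpenCV, NumPy, and SciPy; the bin count, sample rate, and window size are assumptions (the poster does not give them), and the function names are ours.

```python
import numpy as np
import cv2
from scipy.signal import spectrogram

def video_features(frames, bins=16):
    """One row of concatenated color histograms per frame.

    `frames` is an iterable of BGR images; 16 bins per channel is an
    assumption -- the poster does not state a bin count.
    """
    rows = []
    for frame in frames:
        hists = [cv2.calcHist([frame], [channel], None, [bins], [0, 256]).ravel()
                 for channel in range(3)]  # OpenCV channel order is B, G, R
        rows.append(np.concatenate(hists))
    return np.asarray(rows)

def audio_features(samples, rate=44100):
    """Spectrogram of the microphone signal; the window size is assumed."""
    freqs, times, spec = spectrogram(samples, fs=rate, nperseg=1024)
    return times, spec  # spec has shape (frequency bins, time steps)

def proprio_features(joint_positions, dt):
    """Acceleration as the second time derivative of sampled joint positions."""
    velocity = np.gradient(joint_positions, dt, axis=0)
    return np.gradient(velocity, dt, axis=0)
```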
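Movement detection via thresholded optical flow can be sketched with OpenCV's dense Farneback flow. The threshold value and the Farneback parameters below are assumptions; the poster shows only the thresholded flow and the masked color image.

```python
import numpy as np
import cv2

FLOW_THRESHOLD = 2.0  # pixels per frame; assumed -- the poster gives no value

def movement_mask(prev_bgr, curr_bgr, threshold=FLOW_THRESHOLD):
    """Return (mask, masked_image) marking pixels whose optical-flow
    magnitude exceeds the threshold, i.e. regions that moved."""
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_bgr, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    magnitude = np.linalg.norm(flow, axis=2)       # per-pixel flow speed
    mask = (magnitude > threshold).astype(np.uint8)
    masked = cv2.bitwise_and(curr_bgr, curr_bgr, mask=mask)
    return mask, masked
```

Applied over a wave, a directly grasped object produces a large, consistently co-moving masked region, while a block inside a container lags the arm and appears only intermittently; that difference is what makes the masks informative about controllability.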
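The poster marks events, the sudden changes in sound and color that follow the robot's sudden movements, with red lines, but does not say how they are detected. One plausible sketch: flag time steps where the z-scored frame-to-frame change in a 1-D feature signal spikes, then measure each event's delay from the start of the wave behavior.

```python
import numpy as np

def detect_events(signal, times, z_thresh=3.0):
    """Times at which a 1-D feature signal changes abruptly.

    `signal` could be the per-frame histogram change from video or the
    summed spectrogram energy from audio; z_thresh = 3.0 is an assumption.
    """
    times = np.asarray(times)
    change = np.abs(np.diff(signal))
    z = (change - change.mean()) / (change.std() + 1e-9)
    return times[1:][z > z_thresh]

def event_delays(event_times, wave_start):
    """Delay of each detected event relative to the start of the wave."""
    delays = np.asarray(event_times) - wave_start
    return delays[delays >= 0]
```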
5. Continuing Work

Research question: Can robots learn about the controllability of containers and their contents by interacting with objects and identifying patterns in multimodal events?

- The robot may be able to learn the mutual information between the event delays observed in its different sensory modalities (a sketch follows this list).
- The robot could learn to predict that an event is about to occur in one modality if an event has already occurred in a different modality.
- The robot could learn that the presence or absence of certain events in each of its sensory modalities is determined by whether or not a block is inside a container.
- The robot may even be able to learn a behavior-grounded visual representation of the "inside" spatial relationship, based on when certain events occur. This visual model would be useful for identifying whether or not a block fell inside a container when it was dropped.
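The first direction above, learning the mutual information between event delays in different modalities, could start from a simple histogram-based estimate. This is a sketch under assumptions: the poster names mutual information but no estimator, and the bin count is ours.

```python
import numpy as np

def mutual_information(delays_a, delays_b, bins=10):
    """Estimate MI (in bits) between paired event delays from two
    modalities, one pair per execution of the wave behavior.

    Discretizing into a fixed number of bins is an assumption.
    """
    joint, _, _ = np.histogram2d(delays_a, delays_b, bins=bins)
    joint /= joint.sum()                   # joint distribution p(a, b)
    pa = joint.sum(axis=1, keepdims=True)  # marginal p(a)
    pb = joint.sum(axis=0, keepdims=True)  # marginal p(b)
    nz = joint > 0                         # avoid log(0)
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (pa @ pb)[nz])))
```

High mutual information between, say, audio and visual event delays would be evidence that the events share a common cause, the robot's own movement, which is exactly the signal the robot would need to predict one modality's events from another's.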
