
Dynamic Color Recognition for the Aibo


Presentation Transcript


  1. Dynamic Color Recognition for the Aibo
     Nick Lahens, David Puehn

  2. What we did
  • Our goal was to develop the AIBO's ability to learn and recognize color
  • We investigated Tekkotsu's built-in vision algorithms and classes
  • Using these vision classes, we added functionality to allow the AIBO to learn colors dynamically

  3. AIBO Vision
  • The AIBO sees color in YUV
  • YUV defines the luminance and chrominance of a color
  • Tekkotsu allows us to interface with the AIBO's vision by sending us events from the raw camera
  • These events provide us with image data such as pixel values, image resolution, and layers (a rough sketch of that data follows below)
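  To make that image data concrete, here is a minimal C++ sketch of the kind of frame such a camera event might deliver. This is purely illustrative: the name YUVFrame and all of its fields are assumptions of ours, not part of Tekkotsu's actual interface.

    #include <cstdint>
    #include <vector>

    // Hypothetical container for one raw camera frame: a byte per pixel
    // for each of the Y, U and V channels, plus the image resolution and
    // the resolution layer the frame was taken from.
    struct YUVFrame {
        int width  = 0;                // image resolution (columns)
        int height = 0;                // image resolution (rows)
        int layer  = 0;                // resolution layer of this frame
        std::vector<std::uint8_t> y;   // luminance channel, width * height values
        std::vector<std::uint8_t> u;   // chrominance channel (U)
        std::vector<std::uint8_t> v;   // chrominance channel (V)
    };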

  4. Why?
  • Robot vision is extremely sensitive to changes in the environment
  • Color values in the real world are different from those in ideal lab conditions
  • For example, a robot might see the color blue differently depending upon the time of day. This doesn't even include problems arising from the attributes of the color's physical surface, such as reflectivity.
  • To counter this problem, we sought to develop code capable of re-teaching the AIBO colors in new lighting conditions

  5. The ColorRecog Architecture
  • Consists of two main components:
    • Color-Learning
    • Color-Recognition
  • These two components communicate through Color structs (sketched below) containing:
    • The color's YUV values
    • A unique ID
    • A threshold
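  As a minimal sketch, a Color struct along those lines might look like the following in C++. The field names and the default threshold value are assumptions; the name field is included because a later slide mentions storing the color's name in the struct.

    #include <string>

    // Hypothetical Color struct shared by the learning and recognition
    // components: average YUV values, a unique ID, and a matching threshold.
    struct Color {
        double y = 0.0;           // average luminance
        double u = 0.0;           // average chrominance (U)
        double v = 0.0;           // average chrominance (V)
        int id = -1;              // unique ID assigned when the color is learned
        double threshold = 10.0;  // tolerance used during recognition (assumed value)
        std::string name;         // human-readable label, e.g. "yellow"
    };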

  6. Perceiving Colors
  • We begin by capturing an image from the AIBO's camera
  • Next, we calculate the average Y, U, and V values by iterating through each pixel in the image
  • Finally, we return the average color of the image

  7. PerceiveColor Algorithm

    foreach pixel in image.Y_channel
        sumY += pixel.value
    avgY = sumY / numPixels

    foreach pixel in image.U_channel
        sumU += pixel.value
    avgU = sumU / numPixels

    foreach pixel in image.V_channel
        sumV += pixel.value
    avgV = sumV / numPixels

    color.Y = avgY
    color.U = avgU
    color.V = avgV
    return color
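  A compilable C++ version of the pseudocode above could look like this. The YUVFrame input type and the function signature are our own illustration, not the Tekkotsu API; trimmed-down versions of the hypothetical types sketched earlier are repeated so the snippet stands alone, and all three channels are assumed to be stored at full resolution.

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    struct YUVFrame { std::vector<std::uint8_t> y, u, v; };  // one value per pixel, per channel
    struct Color    { double y = 0, u = 0, v = 0; };

    // Average each channel over every pixel, yielding the single
    // "perceived" color of the whole image.
    Color perceiveColor(const YUVFrame& image) {
        Color color;
        const std::size_t numPixels = image.y.size();
        if (numPixels == 0) return color;          // empty frame: return defaults

        double sumY = 0, sumU = 0, sumV = 0;
        for (std::size_t i = 0; i < numPixels; ++i) {
            sumY += image.y[i];
            sumU += image.u[i];
            sumV += image.v[i];
        }
        color.y = sumY / numPixels;
        color.u = sumU / numPixels;
        color.v = sumV / numPixels;
        return color;
    }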

  8. Learning Colors
  • First, we perceive the current color viewed by the AIBO
    • For best results, this requires the AIBO's view to be filled completely with the desired color
  • Next, we assign the color a unique ID
    • Used to differentiate between different Color structs
  • Finally, we add the color to a collection of learned colors
    • If the color already exists, we update its values (see the sketch below)
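  The learning step might then be sketched as follows. The container, the ID counter, and the optional existingId parameter (used when re-teaching a color that was already learned) are all assumptions made for illustration.

    #include <vector>

    // Trimmed-down version of the hypothetical Color struct.
    struct Color { double y = 0, u = 0, v = 0; int id = -1; double threshold = 10.0; };

    std::vector<Color> learnedColors;   // collection of learned colors
    int nextColorId = 0;                // source of unique IDs

    // Store a newly perceived color. If existingId names a color we already
    // know, its YUV values are updated (re-teaching under new lighting);
    // otherwise the color gets a fresh unique ID and is appended.
    int learnColor(Color perceived, int existingId = -1) {
        for (Color& c : learnedColors) {
            if (c.id == existingId) {
                c.y = perceived.y;
                c.u = perceived.u;
                c.v = perceived.v;
                return c.id;
            }
        }
        perceived.id = nextColorId++;
        learnedColors.push_back(perceived);
        return perceived.id;
    }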

  9. Recognition of Colors
  • We begin again by perceiving the current color viewed by the AIBO
  • Then we compare this color to each color we have learned thus far
  • The threshold property provides a level of tolerance for slight environmental variations
  • If we have a match, we return the color's ID
  • Else, we return the No-Color-Found ID (a sketch of this comparison follows below)
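  A sketch of that comparison, assuming a per-channel tolerance check against each learned color's threshold (the slides do not specify the exact distance measure, so this is one plausible reading):

    #include <cmath>
    #include <vector>

    struct Color { double y = 0, u = 0, v = 0; int id = -1; double threshold = 10.0; };

    const int NO_COLOR_FOUND = -1;   // sentinel ID returned when nothing matches

    // Compare the perceived color against every learned color; a match means
    // every channel lies within that learned color's threshold.
    int recognizeColor(const Color& perceived, const std::vector<Color>& learned) {
        for (const Color& c : learned) {
            if (std::fabs(perceived.y - c.y) <= c.threshold &&
                std::fabs(perceived.u - c.u) <= c.threshold &&
                std::fabs(perceived.v - c.v) <= c.threshold) {
                return c.id;             // first color within tolerance wins
            }
        }
        return NO_COLOR_FOUND;
    }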

  10. Demo

  11. Learning and Recognition
  [Diagram: stimuli are mapped to the actions Recognize color, Learn Yellow, and Learn Blue]

  12. Conclusion
  • Using ColorRecog allows the AIBO to adapt effectively to an ever-changing visual environment
  • Dynamic color recognition allows someone to teach a robot colors without having to modify any code

  13. A Better Approach
  • Calculate the average value from the largest color segment
  • This would allow the AIBO to perceive and learn colors in situations where its view is not completely saturated with a single color
  • Only compare U and V values, since they are the real determiners of color, while luminance (Y) is just brightness (see the sketch below)
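  One way the U/V-only comparison could look, using a Euclidean distance in the U-V plane (our choice of distance measure, not necessarily the authors'):

    #include <cmath>

    struct Color { double y = 0, u = 0, v = 0; double threshold = 10.0; };

    // Distance in the U-V (chrominance) plane only: two colors that differ
    // just in brightness (Y) are treated as the same color.
    double chromaDistance(const Color& a, const Color& b) {
        const double du = a.u - b.u;
        const double dv = a.v - b.v;
        return std::sqrt(du * du + dv * dv);
    }

    bool matchesIgnoringLuminance(const Color& perceived, const Color& learned) {
        return chromaDistance(perceived, learned) <= learned.threshold;
    }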

  14. Learning Colors - Usage
  • Activate the ColorBehavior
  • Place an object such that it takes up the AIBO's entire field of vision
  • Give the AIBO a stimulus mapped to the desired color
  • The behavior adds the color's name to the struct and stores it in the color array (a toy end-to-end run follows below)
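  As a toy end-to-end run of that flow, the following self-contained program stands in for the behavior: a synthetic solid-color frame replaces the AIBO camera, and the stimuli are reduced to plain function calls. All names are the hypothetical ones used in the earlier sketches.

    #include <cmath>
    #include <cstddef>
    #include <cstdint>
    #include <iostream>
    #include <string>
    #include <vector>

    struct Color    { double y = 0, u = 0, v = 0; int id = -1; double threshold = 10.0; std::string name; };
    struct YUVFrame { std::vector<std::uint8_t> y, u, v; };

    // Average the three channels of a frame (same idea as the earlier sketch).
    Color perceiveColor(const YUVFrame& f) {
        Color c;
        double sy = 0, su = 0, sv = 0;
        for (std::size_t i = 0; i < f.y.size(); ++i) { sy += f.y[i]; su += f.u[i]; sv += f.v[i]; }
        c.y = sy / f.y.size(); c.u = su / f.y.size(); c.v = sv / f.y.size();
        return c;
    }

    int main() {
        std::vector<Color> learned;

        // "Place an object so it fills the field of vision": fake a frame
        // that is uniformly one color, then teach it under the name "yellow".
        YUVFrame yellowFrame{std::vector<std::uint8_t>(100, 210),
                             std::vector<std::uint8_t>(100, 16),
                             std::vector<std::uint8_t>(100, 146)};
        Color yellow = perceiveColor(yellowFrame);
        yellow.id = 0;
        yellow.name = "yellow";
        learned.push_back(yellow);

        // "Give a stimulus mapped to recognition": perceive a slightly
        // different frame and check it against the stored colors.
        YUVFrame laterFrame{std::vector<std::uint8_t>(100, 205),
                            std::vector<std::uint8_t>(100, 20),
                            std::vector<std::uint8_t>(100, 150)};
        Color seen = perceiveColor(laterFrame);
        for (const Color& c : learned) {
            if (std::fabs(seen.y - c.y) <= c.threshold &&
                std::fabs(seen.u - c.u) <= c.threshold &&
                std::fabs(seen.v - c.v) <= c.threshold) {
                std::cout << "Recognized: " << c.name << " (id " << c.id << ")\n";
            }
        }
        return 0;
    }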

  15. Perceiving Colors
  • PerceiveColor retrieves a representation of the AIBO's vision as three 2-dimensional arrays
    • Each array represents a different vision channel (Y, U, or V)
    • Each element is a pixel's value for the given vision channel
  • Mean values for all pixels in each channel are calculated, yielding the average YUV values for the entire image
  • Returns a Color struct containing those YUV values
