
Caught in Motion


Presentation Transcript


  1. Caught in Motion By: Eric Hunt-Schroeder EE275 – Final Project - Spring 2012

  2. Kalman Filter

  3. Kalman Filter Equations Summary

  4. • Un – Control vector: the magnitude of any control system's or user's influence on the state
     • Zn – Measurement vector: the real-world measurement received
     • Xn – Newest estimate of the current true state
     • Pn – Newest estimate of the average error (error covariance)
     • A – State transition matrix
     • B – Control matrix
     • H – Observation matrix
     • Q – Estimated process error covariance
     • R – Estimated measurement error covariance
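The slide lists the variables but the equations themselves are not reproduced in the transcript, so here is a minimal sketch of the standard predict/update cycle written in terms of those names (a generic textbook formulation, not code from the presentation):

```python
import numpy as np

def kalman_step(x, P, u, z, A, B, H, Q, R):
    """One predict/update cycle using the variables defined on slide 4."""
    # Predict: propagate the state estimate and its error covariance
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q
    # Update: blend the prediction with the measurement z
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(P.shape[0]) - K @ H) @ P_pred
    return x_new, P_new
```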

  5. Example
     • Let's suppose we fire a tennis ball at a 45° angle with a velocity of 100 m/s. We take measurements from a camera inside the tennis ball. This camera acts as our sensor, taking position measurements at each time step (∆t). The camera adds error to our position measurements. Velocity in the x and y directions is known exactly throughout the example.
     • We therefore have error in position but no error in velocity.

  6. Expected Results using Newton's Kinematic Equations

  7. Kinematic Equations
     • x(t) = x0 + V0x·t  (position in x direction)
     • Vx(t) = V0x  (velocity in x direction, assumed constant)
     • y(t) = y0 + V0y·t − (1/2)g·t²  (position in y direction)
     • Vy(t) = V0y − g·t  (velocity in y direction)
     • Where x0 is the initial displacement and g is the acceleration due to gravity (≈ 9.81 m/s²)
     • ∆t represents a time step of 1

  8. Converting to a Recurrence Relation (discrete time)
     • xn = xn−1 + Vxn−1·∆t  (position in x direction)
     • Vxn = Vxn−1  (velocity in x direction)
     • yn = yn−1 + Vyn−1·∆t − (1/2)g·∆t²  (position in y direction)
     • Vyn = Vyn−1 − g·∆t  (velocity in y direction)
     • g is the acceleration due to gravity (≈ 9.81 m/s²)
     • ∆t represents a time step of 1
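A small sketch of this recurrence for the tennis-ball example, assuming ∆t = 1 s, g = 9.81 m/s², a 45° launch at 100 m/s, and no air resistance:

```python
import math

g, dt = 9.81, 1.0
vx = vy = 100 * math.cos(math.radians(45))   # equal components at 45°
x = y = 0.0

trajectory = []
while y >= 0.0:                 # step until the ball returns to the ground
    trajectory.append((x, y))
    x = x + vx * dt
    y = y + vy * dt - 0.5 * g * dt ** 2
    vy = vy - g * dt            # vx stays constant, as on the slide
```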

  9. Putting into Matrix form
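The slide's matrices are not reproduced in the transcript; one plausible arrangement, assuming the state vector ordering [x, Vx, y, Vy], is:

```python
import numpy as np

dt, g = 1.0, 9.81
# State transition matrix A applies the recurrence from slide 8
A = np.array([[1.0,  dt, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0,  dt],
              [0.0, 0.0, 0.0, 1.0]])
# Gravity enters through the control term B·u
B = np.array([[0.0], [0.0], [-0.5 * dt ** 2], [-dt]])
u = np.array([[g]])
# The camera observes only the x and y positions
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
```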

  10. Giving our Kalman Filter some initial information:
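The actual initial values are not shown in the transcript; continuing the sketch above, an assumed, illustrative initialization might look like:

```python
# Assumed values for illustration only – the slide's numbers are not available.
x0 = np.array([[0.0],
               [100 * np.cos(np.radians(45))],
               [0.0],
               [100 * np.sin(np.radians(45))]])
P0 = np.eye(4)              # modest initial uncertainty
Q  = np.zeros((4, 4))       # process noise: none, since the motion model is exact here
R  = 0.25 * np.eye(2)       # camera (measurement) noise covariance – assumed value
```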

  11. Simulation Results
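Tying the sketches above together, a hedged version of the simulation loop (the 0.5 m measurement noise standard deviation is an assumed value):

```python
rng = np.random.default_rng(0)
x_est, P_est = x0, P0
estimates = []
for x_true, y_true in trajectory:
    # The camera adds noise to the position readings only
    z = np.array([[x_true], [y_true]]) + rng.normal(0.0, 0.5, size=(2, 1))
    x_est, P_est = kalman_step(x_est, P_est, u, z, A, B, H, Q, R)
    estimates.append((x_est[0, 0], x_est[2, 0]))   # filtered x and y positions
```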

  12. Introduction – Why do we want to do Motion Tracking?
      • Track and detect objects moving across a given space
      • Locate and identify objects – e.g. detecting a robbery in a bank or a car crash at an intersection
      • Understand the behavior of an animal being studied in the lab

  13. Object Detection – Process
      • We must differentiate between the foreground and the background image.
      • We assign each pixel of an image a distribution of typical values → our background image.
      • The background should be constantly updated over subsequent frames.
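As a rough illustration of the per-pixel comparison described above (the threshold value is an assumption, and a simple absolute difference stands in for the per-pixel distribution):

```python
import numpy as np

def foreground_mask(frame, background, threshold=30):
    """Flag pixels that deviate strongly from the per-pixel background model."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > threshold).any(axis=-1)   # True where the pixel looks like foreground
```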

  14. Problems with Outlier Detection and the Algorithm
      • Drastic changes to our background – no easy fix.
      • Gradual changes to the background, on the other hand, can be handled: we let our background be an accumulation of previous backgrounds. This allows minor changes to take place, such as the shift from sunlight to dusk.
      • Time for an example!
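Before moving to the example, here is a minimal sketch of the accumulation idea, assuming a simple running average with a small blending factor:

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Blend the new frame into the accumulated background.

    A small alpha absorbs gradual changes (sunlight to dusk) while a
    fast-moving pedestrian barely affects the model."""
    blended = (1.0 - alpha) * background.astype(np.float32) + alpha * frame.astype(np.float32)
    return blended.astype(background.dtype)
```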

  15. Example – Background vs. Foreground
      Foreground image: pedestrian detected. Background image: with a pedestrian passing through (outlier).
      On the left we see our frame with a background image already developed. We also notice that an object, in this case a pedestrian, has entered the frame. This pedestrian causes a drastic change in the pixels we expected, and the frame on the right has detected this change, shown vividly in white (the foreground image). This process is completed by lumping large connected foreground regions into blobs, allowing one to detect an object.

  16. Object Detected Incorrectly
      Foreground image: pedestrian detected – unwanted blob shown. Background image: with a pedestrian passing through (outlier).
      We notice an unwanted blob in the foreground image. When we evaluate the original image we see that this was a street light; excessive wind may have caused it to be incorrectly detected as an outlier. Even with minor errors in object detection, our main goal remains MOTION TRACKING → linking object blobs together across successive image frames, commonly referred to as blob tracking.
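A hedged sketch of the blob-lumping step using SciPy's connected-component labelling; the minimum-area filter is one plausible way to discard small outliers such as the wind-blown street light:

```python
import numpy as np
from scipy import ndimage

def find_blobs(mask, min_area=200):
    """Group connected foreground pixels into blobs and keep only sizeable ones.

    mask: boolean foreground mask (e.g. from foreground_mask above).
    min_area: assumed minimum blob size in pixels; smaller blobs are dropped."""
    labels, num_blobs = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=range(1, num_blobs + 1))
    return [label for label, size in zip(range(1, num_blobs + 1), sizes) if size >= min_area]
```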

  17. Step 2: Object Tracking
      • We are able to track an object with a bounding box by estimating the trajectories of two (x, y) coordinates at opposite corners using the Kalman filter.
      • This is quite similar to the tennis-ball example, except we now keep track of two points.
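One way to realize this, sketched below, is to reuse kalman_step from the tennis-ball example with two independent constant-velocity filters (the gravity control term dropped); the matrices and noise values are carried over as assumptions, not taken from the presentation:

```python
B_cv, u_cv = np.zeros((4, 1)), np.zeros((1, 1))      # no control input for corner motion
corners = [(np.zeros((4, 1)), np.eye(4)),            # (state, covariance) of the top-left corner
           (np.zeros((4, 1)), np.eye(4))]            # ... and of the bottom-right corner

def track_box(corners, corner_measurements):
    """corner_measurements: two (x, y) readings per frame, one for each corner."""
    new_corners = []
    for (x_est, P_est), (mx, my) in zip(corners, corner_measurements):
        z = np.array([[mx], [my]])
        new_corners.append(kalman_step(x_est, P_est, u_cv, z, A, B_cv, H, Q, R))
    return new_corners
```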

  18. Another Problem… Occlusion
      • Another difficulty faced by this motion-tracking algorithm is occlusion.
      • When two objects pass each other, we lose track of the object.
      • Future algorithms and research may learn how to better deal with this common problem.

  19. Sources/References
      • http://en.wikipedia.org/wiki/K-means_clustering
      • http://greg.czerniak.info/node/5
      • http://www.cs.berkeley.edu/~flw/tracker/
      • http://www.cs.ubc.ca/~murphyk/Software/Kalman/kalman.html
      • http://www.mathworks.com/help/toolbox/control/ref/kalman.html
