
Formation et Analyse d’Images Session 7

Presentation Transcript


  1. Formation et Analyse d’Images Session 7 Daniela Hall 7 November 2005

  2. Course Overview • Session 1 (19/09/05) • Overview • Human vision • Homogeneous coordinates • Camera models • Session 2 (26/09/05) • Tensor notation • Image transformations • Homography computation • Session 3 (3/10/05) • Camera calibration • Reflection models • Color spaces • Session 4 (10/10/05) • Pixel based image analysis • The 17/10/05 course is replaced by Modelisation surfacique

  3. Course overview • Session 5 + 6 (24/10/05) 9:45 – 12:45 • Contrast description • Hough transform • Session 7 (7/11/05) • Kalman filter • Session 8 (14/11/05) • Tracking of regions, pixels, and lines • Session 9 (21/11/05) • Gaussian filter operators • Session 10 (5/12/05) • Scale Space • Session 11 (12/12/05) • Stereo vision • Epipolar geometry • Session 12 (16/01/06): exercises and questions

  4. Session overview • Kalman filter • Robust tracking of targets

  5. Kalman filter • The Kalman filter is an optimal recursive estimator. • Kalman filtering has been applied in areas such as • aerospace, • marine navigation, • nuclear power plant instrumentation, • manufacturing, and many others. • A typical problem is to estimate position and speed from noisy measurements. Tutorial: http://www.cs.unc.edu/~welch/kalman/kalmanIntro.html G. Welch, G. Bishop: An Introduction to the Kalman Filter, TR 95-041, University of North Carolina, USA

  6. Kalman filter • The Kalman filter estimates the state x of a discrete-time controlled process that is governed by the linear difference equation xk = A xk-1 + B uk-1 + wk-1 • with measurement zk = H xk + vk • Process noise wk, assumed zero-mean Gaussian with covariance Q • Measurement noise vk, assumed zero-mean Gaussian with covariance R • The process noise covariance Q and the measurement noise covariance R might in practice change over time, but are assumed constant • Matrix A relates the state at the previous time k-1 to the state at time k, in the absence of noise. In practice A might change over time, but is assumed constant. • Matrix B relates the optional control input u to the state x. We set it aside for the moment. • Matrix H relates the state to the measurement.
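As a concrete illustration of these quantities (not part of the original slides), a hypothetical 1D constant-velocity process could be set up as follows; the time step and all noise values are assumptions:

```python
import numpy as np

dt = 1.0                      # time step between measurements (assumed)

# State x = (position, velocity); process model x_k = A x_{k-1} + B u_{k-1} + w_{k-1}
A = np.array([[1.0, dt],
              [0.0, 1.0]])
B = np.zeros((2, 1))          # optional control input, ignored as in the slides
H = np.array([[1.0, 0.0]])    # measurement z_k = H x_k + v_k (position only)

Q = 1e-4 * np.eye(2)          # process noise covariance (assumed constant)
R = np.array([[0.1]])         # measurement noise covariance (assumed constant)
```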

  7. Notations • Measurement zk • Measurement noise covariance R • Process noise wk • Process noise covariance Q • Kalman gain K

  8. Kalman filter notations • A priori state estimate ^xk- (estimate before the measurement at step k) • A posteriori state estimate ^xk (estimate after incorporating measurement zk) • A priori estimate error ek- = xk – ^xk- • A posteriori estimate error ek = xk – ^xk • A priori estimate error covariance Pk- = E[ek- (ek-)^T] • A posteriori estimate error covariance Pk = E[ek ek^T]

  9. Kalman filter • Goal: find the a posteriori state estimate ^xk as a linear combination of the a priori estimate ^xk- and a weighted difference between the actual measurement zk and the measurement prediction H^xk-: ^xk = ^xk- + K(zk – H^xk-) • The difference (zk – H^xk-) is called the innovation or residual • K is the gain or blending factor that minimizes the a posteriori error covariance.

  10. Kalman gain K • Matrix K is the gain that minimizes the a posteriori error covariance Pk = E[ek ek^T], with ek = xk – ^xk. • How to minimize: substitute ^xk = ^xk- + K(zk – H^xk-) into the definition of Pk, take the derivative of the trace of Pk with respect to K, set it to zero, and solve for K.

  11. Kalman gain K • One form of the result is Kk = Pk- H^T (H Pk- H^T + R)^-1 • When the measurement covariance R is small, the residual is weighted more heavily • When the a priori estimate error covariance Pk- is small, the residual is weighted only little

  12. Kalman gain K • When the measurement noise covariance R approaches 0, the actual measurement zk is trusted more and more, while the predicted measurement H^xk- is trusted less and less • When the a priori estimate error covariance Pk- approaches 0, the actual measurement zk is trusted less and less, while the predicted measurement H^xk- is trusted more and more.
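Written out, the two limiting cases follow directly from the gain formula of the previous slide (as in the Welch & Bishop tutorial):

```latex
K_k = P_k^- H^T \left( H P_k^- H^T + R \right)^{-1},
\qquad
\lim_{R \to 0} K_k = H^{-1},
\qquad
\lim_{P_k^- \to 0} K_k = 0 .
```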

  13. Discrete Kalman filter algorithm • The Kalman filter estimates a process by using a form of feedback control. • The filter estimates the process state at some time and then obtains feedback in the form of (noisy) measurements. • The Kalman filter equations fall into 2 groups • time update equations • measurement update equations • The time update equations project the current state and error covariance estimates forward in time to obtain a priori estimates. • The measurement update equations implement the feedback. They incorporate a new measurement into the a priori estimate to form an improved a posteriori estimate.

  14. Kalman filter algorithm • Time update (« Predict ») • Measurement update (« Correct ») • The time update projects the current state estimate ahead in time. • The measurement update adjusts the projected estimate by an actual measurement. The filter cycles between the two steps.

  15. Kalman filter • Time update equations (predict): project the state and covariance estimates forward in time: ^xk- = A ^xk-1 + B uk-1, Pk- = A Pk-1 A^T + Q • Measurement update equations (correct): compute the Kalman gain Kk = Pk- H^T (H Pk- H^T + R)^-1, measure the process zk, compute the a posteriori estimate ^xk = ^xk- + Kk(zk – H^xk-), compute the a posteriori error covariance estimate Pk = (I – Kk H) Pk- • Initial estimates for ^xk-1 and Pk-1 start the recursion.
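A minimal numpy sketch of these two steps, assuming the constant A, B, H, Q, R of the previous slides (the function names are illustrative, not from the course):

```python
import numpy as np

def kalman_predict(x_post, P_post, A, Q, B=None, u=None):
    """Time update: project the state estimate and error covariance ahead."""
    x_prior = A @ x_post
    if B is not None and u is not None:
        x_prior = x_prior + B @ u
    P_prior = A @ P_post @ A.T + Q
    return x_prior, P_prior

def kalman_correct(x_prior, P_prior, z, H, R):
    """Measurement update: compute the gain, correct the estimate and covariance."""
    S = H @ P_prior @ H.T + R              # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_post = x_prior + K @ (z - H @ x_prior)
    P_post = (np.eye(P_prior.shape[0]) - K @ H) @ P_prior
    return x_post, P_post, K
```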

  16. Filter parameters and tuning • R: measurement noise covariance; can be measured prior to operating the filter (off-line) • Q: process noise covariance; cannot be measured, because we cannot directly observe the process we are estimating. If we choose Q large enough (lots of uncertainty), a poor process model can still produce acceptable results. • Parameter tuning: we can increase filter performance by tuning the parameters R and Q. We can even use a distinct Kalman filter for the tuning. • If R and Q are constant, the estimation error covariance Pk and the Kalman gain Kk stabilize quickly and then stay constant. In this case, Pk and Kk can be precomputed off-line.
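Because Pk and Kk do not depend on the actual measurement values, this precomputation can be sketched by iterating the covariance equations alone; the model matrices below are the assumed ones from the earlier sketch:

```python
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])   # assumed constant-velocity model
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)
R = np.array([[0.1]])

P = np.eye(2)                            # initial error covariance (assumed)
for _ in range(100):
    P = A @ P @ A.T + Q                              # time update of P
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    P = (np.eye(2) - K @ H) @ P                      # measurement update of P
print(K)   # steady-state gain that can be reused at run time
```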

  17. Session overview • Kalman filter • Robust tracking of targets

  18. Robust tracking of objects [Diagram: tracking loop with the stages Predict, Detection, and Correct, linking the list of targets, the list of predictions, the trigger regions, the measurements, and the new targets.]

  19. Robust tracking of objects • Measurement • State vector • State equation • Source: M. Kohler: Using the Kalman Filter to Track Human Interactive Motion, Research Report No. 629, February 1997, University of Dortmund, Germany

  20. Robust tracking of objects • Measurement noise error covariance • Temporal matrix • Process noise error covariance • The parameter a affects the computation speed (a large a increases the uncertainty and therefore the size of the search regions)
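The report's exact matrices are not reproduced in the transcript; a hedged sketch of what they could look like for a state (x, y, vx, vy) with a measured image position (x, y) is given below. The values of sigma, a, and dt are assumptions:

```python
import numpy as np

dt = 1.0        # frame interval (assumed)
sigma = 2.0     # measurement noise standard deviation in pixels (assumed)
a = 0.5         # process noise scale; a larger a enlarges the search regions

R = sigma**2 * np.eye(2)                 # measurement noise covariance

A = np.array([[1, 0, dt, 0],             # temporal matrix for (x, y, vx, vy)
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)

Q = a * np.eye(4)                        # process noise covariance scaled by a

H = np.array([[1, 0, 0, 0],              # the measurement is the position (x, y)
              [0, 1, 0, 0]], dtype=float)
```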

  21. Form of the temporal matrix A • Matrix A relates the a posteriori state estimate ^xk-1 to the a priori state estimate ^xk- • The new a priori state estimate requires the temporal derivative of the position • According to a Taylor series we can write, to first order, x(t + Δt) ≈ x(t) + Δt x'(t)
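For a one-dimensional position-velocity state (x, x'), this first-order expansion gives the temporal matrix below (a sketch consistent with the slide's Taylor argument; the original slide's matrix is not reproduced verbatim):

```latex
x(t+\Delta t) \approx x(t) + \Delta t\, \dot{x}(t)
\quad\Longrightarrow\quad
A = \begin{pmatrix} 1 & \Delta t \\ 0 & 1 \end{pmatrix},
\qquad
\hat{x}_k^- = A\, \hat{x}_{k-1} .
```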

  22. Kalman filter notations • A priori state estimate ^xk- (estimate before the measurement at step k) • A posteriori state estimate ^xk (estimate after incorporating measurement zk) • A priori estimate error ek- = xk – ^xk- • A posteriori estimate error ek = xk – ^xk • A priori estimate error covariance Pk- = E[ek- (ek-)^T] • A posteriori estimate error covariance Pk = E[ek ek^T]

  23. Kalman filter • Time update equations (predict): project the state and covariance estimates forward in time: ^xk- = A ^xk-1 + B uk-1, Pk- = A Pk-1 A^T + Q • Measurement update equations (correct): compute the Kalman gain Kk = Pk- H^T (H Pk- H^T + R)^-1, measure the process zk, compute the a posteriori estimate ^xk = ^xk- + Kk(zk – H^xk-), compute the a posteriori error covariance estimate Pk = (I – Kk H) Pk- • Initial estimates for ^xk-1 and Pk-1 start the recursion.

  24. Example • A 1D point moves with a certain speed on a continuous scale • We have a sensor that gives only integer values • Compute a Kalman filter for the process.

  25. Example results • true position p: 2.2, 6.4, 10.6, 14.8, 19 • true speed p': 4.2, 4.2, 4.2, 4.2 • measured position z: 2, 6, 11, 15, 19 • measured gradient z': 4, 5, 4, 4 • estimated position ^x: 2, 5.6, 10.9, 15.1 • estimated gradient ^x': 0, 4.2, 5.3, 4.2 • K and P converge quickly
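A runnable sketch of this example follows. The initial estimate (position 2, speed 0) is taken from the table above, but the noise covariances are assumptions, so the output will not reproduce the slide's results exactly:

```python
import numpy as np

dt = 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity model
H = np.array([[1.0, 0.0]])              # the sensor measures position only
Q = 1e-4 * np.eye(2)                    # assumed process noise covariance
R = np.array([[1.0 / 12.0]])            # variance of rounding to integers

true_positions = [2.2, 6.4, 10.6, 14.8, 19.0]   # from the slide (speed 4.2)
x = np.array([[2.0], [0.0]])            # initial estimate: position 2, speed 0
P = np.eye(2)                           # initial error covariance (assumed)

for p in true_positions[1:]:
    z = np.array([[float(round(p))]])   # integer sensor reading
    # time update (predict)
    x = A @ x
    P = A @ P @ A.T + Q
    # measurement update (correct)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    print(f"z={z[0, 0]:.0f}  pos={x[0, 0]:5.2f}  speed={x[1, 0]:5.2f}")
```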
