Primer on tracking



1. Primer on tracking
Sen-ching S. Cheung, March 26, 2004

2. An object tracking system
[Block diagram: Sensor Data Processing, Gating Computations, Data Association, Track Maintenance, and Prediction and Update connected in a processing loop]

3. Outline
• Prediction and update
• Tracking for point targets or segmented objects
• Tracking for unsegmented objects (feature tracking)
  • Mean-shift tracking
• Data association in multi-object tracking
  • Global nearest neighbor
  • Joint Probabilistic Data Association
  • Multiple Hypothesis Tracking

4. Prediction and Update
• Assume single-object tracking
• Given: track observations (e.g. positions) $y_1, y_2, \dots, y_t$, and new observations at time $t+1$: $z_1, z_2, \dots, z_N$
• Goal: which $z_i$ should become the new $y_{t+1}$?
• Answer: the Maximum A Posteriori or the Bayesian (posterior-weighted) estimate,
  $y_{t+1} = \arg\max_{z_i,\ i=1,\dots,N} P(z_i \mid y_1, y_2, \dots, y_t)$  or  $y_{t+1} = \sum_{i=1}^{N} z_i\, P(z_i \mid y_1, y_2, \dots, y_t)$

5. State-space model
• Key is to compute $P(y_{t+1} = z \mid y_1, y_2, \dots, y_t)$
• Introduce a hidden state $X_t$
• Given:
  • Dynamics: $P(X_{t+1} \mid X_t)$
  • Measurement: $P(Y_t \mid X_t)$
  • Markov assumption: $P(y_{t+1} \mid x_{t+1}, y_1, y_2, \dots, y_t) = P(y_{t+1} \mid x_{t+1})$
• Why?
[Graphical model: chain $X_{t-1} \rightarrow X_t \rightarrow X_{t+1}$ with transitions $P(X_{t+1} \mid X_t)$; each $X_t$ emits an observation $Y_t$ through $P(Y_t \mid X_t)$]

6. Prediction, time and measurement update
Answer: a simple recursion computes $P(y_{t+1} = z \mid y_1, \dots, y_t)$.

Time update: $P(x_t \mid y_1,\dots,y_t) \rightarrow P(x_{t+1} \mid y_1,\dots,y_t)$
$P(x_{t+1} \mid y_1,\dots,y_t) = \int P(x_{t+1}, x_t \mid y_1,\dots,y_t)\, dx_t = \int P(x_{t+1} \mid x_t)\, P(x_t \mid y_1,\dots,y_t)\, dx_t$

Prediction: $P(x_{t+1} \mid y_1,\dots,y_t) \rightarrow P(y_{t+1} = z \mid y_1,\dots,y_t)$
$P(y_{t+1}=z \mid y_1,\dots,y_t) = \int P(y_{t+1}=z, x_{t+1} \mid y_1,\dots,y_t)\, dx_{t+1} = \int P(y_{t+1}=z \mid x_{t+1}, y_1,\dots,y_t)\, P(x_{t+1} \mid y_1,\dots,y_t)\, dx_{t+1} = \int P(y_{t+1}=z \mid x_{t+1})\, P(x_{t+1} \mid y_1,\dots,y_t)\, dx_{t+1}$

Measurement update: $P(x_{t+1} \mid y_1,\dots,y_t) \rightarrow P(x_{t+1} \mid y_1,\dots,y_{t+1})$
$P(x_{t+1} \mid y_1,\dots,y_{t+1}) = P(x_{t+1}, y_{t+1} \mid y_1,\dots,y_t) \,/\, P(y_{t+1} \mid y_1,\dots,y_t) \propto P(y_{t+1} \mid x_{t+1})\, P(x_{t+1} \mid y_1,\dots,y_t)$
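A minimal sketch of this recursion on a discrete (gridded) state space, where the integrals become matrix products; the function and argument names are illustrative, not from the slides:

```python
import numpy as np

def bayes_filter_step(belief, z, trans, meas_lik):
    """One time update + prediction + measurement update on a state grid.

    belief   : P(x_t | y_1..y_t), shape (S,)
    z        : index of the new observation y_{t+1}
    trans    : trans[j, i] = P(x_{t+1}=j | x_t=i), shape (S, S)
    meas_lik : meas_lik[z, j] = P(y=z | x=j), shape (M, S)
    """
    predicted = trans @ belief            # time update: sum_i P(x'|x_i) P(x_i|y_1..t)
    pred_obs = meas_lik @ predicted       # prediction: P(y_{t+1}=z' | y_1..t) for every z'
    posterior = meas_lik[z] * predicted   # measurement update (unnormalized)
    posterior /= pred_obs[z]              # divide by P(y_{t+1}=z | y_1..t)
    return posterior, pred_obs
```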

7. Kalman Filter
• Linear system and Gaussian noise:
  $x_{t+1} = Ax_t + Gw_t$, $w_t \sim N(0, Q)$  ⇒  $x_{t+1} \mid x_t \sim N(Ax_t,\, GQG^T)$
  $y_t = Cx_t + v_t$, $v_t \sim N(0, R)$  ⇒  $y_t \mid x_t \sim N(Cx_t,\, R)$
• Time update:
  $\mu_{t+1|t} \equiv E(x_{t+1} \mid y_1,\dots,y_t) = A\mu_{t|t}$
  $\Sigma_{t+1|t} \equiv \mathrm{Cov}(x_{t+1} \mid y_1,\dots,y_t) = A\Sigma_{t|t}A^T + GQG^T$
• Prediction:
  $E(y_{t+1} \mid y_1,\dots,y_t) = C\mu_{t+1|t}$
  $\mathrm{Cov}(y_{t+1} \mid y_1,\dots,y_t) = C\Sigma_{t+1|t}C^T + R$
• Measurement update:
  $\mu_{t+1|t+1} = \mu_{t+1|t} + K_{t+1}(y_{t+1} - C\mu_{t+1|t})$
  $\Sigma_{t+1|t+1} = \Sigma_{t+1|t} - K_{t+1}C\Sigma_{t+1|t}$
  where the Kalman gain is $K_{t+1} \equiv \Sigma_{t+1|t}C^T(C\Sigma_{t+1|t}C^T + R)^{-1}$
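As a sketch, these equations translate line for line into code (a minimal implementation under the slide's linear-Gaussian assumptions; the names are illustrative):

```python
import numpy as np

def kalman_step(mu, Sigma, y, A, G, Q, C, R):
    """One Kalman filter cycle: time update, prediction, measurement update.

    (mu, Sigma) summarize P(x_t | y_1..y_t); y is the new measurement y_{t+1}.
    """
    # Time update
    mu_pred = A @ mu                               # mu_{t+1|t}
    Sigma_pred = A @ Sigma @ A.T + G @ Q @ G.T     # Sigma_{t+1|t}
    # Prediction: innovation and its covariance
    innov = y - C @ mu_pred
    S = C @ Sigma_pred @ C.T + R
    # Measurement update via the Kalman gain
    K = Sigma_pred @ C.T @ np.linalg.inv(S)
    mu_new = mu_pred + K @ innov
    Sigma_new = Sigma_pred - K @ C @ Sigma_pred
    return mu_new, Sigma_new
```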

8. Simple example: constant velocity
Dynamics model: acceleration $a_t$ is white noise, $E(a_t) = 0$, $E(a_t a_s) = k\,\delta(s - t)$.
Kalman filter implementation:
  $x_t = (p_t\;\; \dot{p}_t)^T$
  $A = \begin{pmatrix} 1 & T \\ 0 & 1 \end{pmatrix}$, where $T$ is the sampling period
  $G = I$; $Q = k\begin{pmatrix} T^3/3 & T^2/2 \\ T^2/2 & T \end{pmatrix}$, obtained by computing $\mathrm{Cov}(x_t, x_{t+1})$
  $C = [1\;\; 0]$; $R$ depends on the measurement error
Other types of models: Singer acceleration, constant acceleration, piecewise-constant Wiener-process acceleration, coordinated turn, etc.
Adaptation: use multiple KFs with an HMM: Interacting Multiple Model (IMM) filtering.
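A small helper that builds these matrices for the 1-D constant-velocity model (the function name and the scalar measurement variance r are illustrative); it plugs directly into the kalman_step sketch above:

```python
import numpy as np

def constant_velocity_model(T, k, r):
    """A, G, Q, C, R for state x = (p, p_dot) sampled every T seconds,
    white-noise acceleration of intensity k, measurement variance r."""
    A = np.array([[1.0, T],
                  [0.0, 1.0]])            # position integrates velocity
    G = np.eye(2)
    Q = k * np.array([[T**3 / 3, T**2 / 2],
                      [T**2 / 2, T]])     # integrated white-noise acceleration
    C = np.array([[1.0, 0.0]])            # only position is measured
    R = np.array([[r]])
    return A, G, Q, C, R
```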

9. Non-linearity
• Possible in the measurement and/or the dynamics
  Measurement: $y_t = \tan^{-1}\!\big(x_t(2)/x_t(1)\big) + v_t$  (a bearing measurement)
• Incorporate the non-linearity when computing the mean and covariance
[Figure: bearing angle $y_t$ of the state relative to the $x_t(1)$ and $x_t(2)$ axes]

10. Extended Kalman Filter
• Taylor series expansion: expand the nonlinear dynamics and measurement functions around the current estimate
• Linearization: run the standard KF equations with the Jacobians of those functions in place of $A$ and $C$
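As an illustration, an EKF measurement update for the bearing model of slide 9, $y = \tan^{-1}(x_2/x_1) + v$; a sketch with illustrative names, linearizing the measurement at the predicted mean:

```python
import numpy as np

def ekf_bearing_update(mu, Sigma, y, R):
    """EKF measurement update for h(x) = atan2(x2, x1) + v, v ~ N(0, R)."""
    x1, x2 = mu[0], mu[1]
    r2 = x1**2 + x2**2
    C = np.array([[-x2 / r2, x1 / r2]])            # Jacobian of atan2(x2, x1) at mu
    innov = y - np.arctan2(x2, x1)
    innov = (innov + np.pi) % (2 * np.pi) - np.pi  # wrap the angle difference
    S = C @ Sigma @ C.T + R
    K = Sigma @ C.T @ np.linalg.inv(S)
    mu_new = mu + (K * innov).ravel()
    Sigma_new = Sigma - K @ C @ Sigma
    return mu_new, Sigma_new
```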

11. Unscented Kalman Filter
• Problem with the EKF: it needs the Jacobian matrix $A$, and linearization propagates error
• "It is easier to approximate a PDF than it is to approximate an arbitrary nonlinear function." - J.K. Uhlmann
• Select a set of deterministic sigma points $\{s_i, w_i\}_{i=1,\dots,N}$ such that
  (a) $\sum_i w_i = 1$, (b) $\sum_i w_i s_i = \mu_x$, (c) $\sum_i w_i (s_i - \mu_x)(s_i - \mu_x)^T = \Sigma_x$
  Example: points along the covariance contour,
  $s_i = \mu_x + (-1)^i \big[(\tfrac{N}{2}\Sigma_x)^{1/2}\big]_{\lceil i/2 \rceil}$ (the $\lceil i/2 \rceil$-th column of the matrix square root), with $w_i = 1/N$
• Map $s_i \mapsto h(s_i)$
• Compute the "sample mean" $\bar{y} = \sum_i w_i\, h(s_i)$ and "sample covariance" $\sum_i w_i (h(s_i) - \bar{y})(h(s_i) - \bar{y})^T$
• In fact, exact up to the 2nd derivative, because the covariance is matched
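A sketch of that sigma-point construction and the resulting sample statistics (Cholesky is used as one valid matrix square root; the names are illustrative):

```python
import numpy as np

def unscented_transform(mu, Sigma, h):
    """Propagate N(mu, Sigma) through h with the slide's symmetric set:
    s_i = mu +/- columns of ((N/2) * Sigma)^(1/2), weights w_i = 1/N."""
    n = len(mu)
    N = 2 * n
    root = np.linalg.cholesky((N / 2) * Sigma)
    sigma_pts = [mu + root[:, j] for j in range(n)] + \
                [mu - root[:, j] for j in range(n)]
    mapped = np.array([h(s) for s in sigma_pts])
    y_mean = mapped.mean(axis=0)                 # sample mean, w_i = 1/N
    diff = mapped - y_mean
    y_cov = diff.T @ diff / N                    # sample covariance
    return y_mean, y_cov
```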

12. What if the noise is also non-Gaussian?
• For example: colored noise from wrong dynamics, tracking through clutter, deformation, etc.
• 1st- and 2nd-order statistics are no longer sufficient to characterize the posterior distribution.
• Answer: the particle filter, known as the condensation algorithm in computer vision and as sequential Monte Carlo methods in statistics
• Use the Markov-Chain Monte Carlo (MCMC) method in the time update, prediction, and measurement update

13. Particle sets
A particle is a pair of random variables: a state $x$ and its weight $\pi \geq 0$.
A particle set for a PDF $f$ is an algorithm to generate $(x_i, \pi_i)$ such that for any function $g$:
$\lim_{n \to \infty} \sum_i \pi_i\, g(x_i) = E_f(g(x))$  ("convergence by distribution")
[Figure: particles drawn as ellipses on a probability-vs-state plot; $x_i$ = centroid of ellipse, $\pi_i$ = area of ellipse]
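A direct rendering of that definition (illustrative names; the weights are renormalized defensively):

```python
import numpy as np

def particle_expectation(xs, ws, g):
    """Approximate E_f[g(x)] by sum_i w_i g(x_i) over a particle set."""
    ws = np.asarray(ws, dtype=float)
    ws = ws / ws.sum()                     # enforce sum_i w_i = 1
    return sum(w * g(x) for x, w in zip(xs, ws))

# e.g. the posterior mean is particle_expectation(xs, ws, lambda x: x)
```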

14. Operations on particles
• Idea: use particles $\{x_i, \pi_i\}_{i=1,\dots,N}$ to represent $P(x_t \mid y_1,\dots,y_t)$
• Recall:
  • Time update: $P(x_{t+1} \mid y_1,\dots,y_t) = \int P(x_{t+1} \mid x_t)\, P(x_t \mid y_1,\dots,y_t)\, dx_t$  ("convolution")
  • Prediction: $P(y_{t+1}=z \mid y_1,\dots,y_t) = \int P(y_{t+1}=z \mid x_{t+1})\, P(x_{t+1} \mid y_1,\dots,y_t)\, dx_{t+1}$  ("convolution")
  • Measurement: $P(x_{t+1} \mid y_1,\dots,y_{t+1}) \propto P(y_{t+1} \mid x_{t+1})\, P(x_{t+1} \mid y_1,\dots,y_t)$  ("multiplication")
• Assume we know how to evaluate, and generate random samples from, the dynamics and measurement models above.
• How do we "convolve" and "multiply" sets of particles with these functions?

15. Multiplication and convolution of particles
• Multiplication by $q(x)$: keep each $x_i$ and reweight, $\pi_i \mapsto q(x_i)\,\pi_i$
• Convolution with $q(y \mid x)$:
  • Resampling: draw a new set $\{x_i', \pi_i'\}_{i=1,\dots,N}$ from $\{x_i, \pi_i\}_{i=1,\dots,N}$, selecting each $x_i'$ with probability proportional to $\pi_i$ (typically with equal weights $1/N$ afterwards)
  • Propagation: replace each $x_i'$ by a sample drawn from $q(x \mid x_i')$, keeping the weight $\pi_i'$
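A minimal sketch of these two operations with multinomial resampling (all names are illustrative; sample_q draws from the dynamics $q(x' \mid x)$ and lik evaluates $q(x)$):

```python
import numpy as np

rng = np.random.default_rng(0)

def convolve(xs, ws, sample_q):
    """Resample by weight, then propagate each particle through q(x'|x)."""
    ws = np.asarray(ws, dtype=float)
    idx = rng.choice(len(xs), size=len(xs), p=ws / ws.sum())
    new_xs = np.array([sample_q(xs[i]) for i in idx])
    new_ws = np.full(len(xs), 1.0 / len(xs))     # uniform after resampling
    return new_xs, new_ws

def multiply(xs, ws, lik):
    """Keep the particles, scale each weight by q(x_i), renormalize."""
    new_ws = np.asarray(ws, dtype=float) * np.array([lik(x) for x in xs])
    return xs, new_ws / new_ws.sum()
```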

16. Why resampling?
• Without resampling, the weights degenerate: after a few steps most $\pi_i \approx 0$ and a handful of particles carry all the mass; with resampling, the particles concentrate where the posterior has mass.
• There are many resampling schemes, and more keep appearing; better ones leave fewer $x_i$ with the same values.
• How many particles are enough? Monitor the effective sample size, $N_{\mathrm{eff}} = 1 / \sum_i \pi_i^2$.
[Figure: particle sets evolving with and without resampling]
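The effective sample size as code (a standard diagnostic; the threshold in the comment is a common rule of thumb, not from the slides):

```python
import numpy as np

def effective_sample_size(ws):
    """N_eff = 1 / sum_i w_i^2 for normalized weights: N when the weights
    are uniform, 1 when a single particle carries all the mass."""
    ws = np.asarray(ws, dtype=float)
    ws = ws / ws.sum()
    return 1.0 / np.sum(ws ** 2)

# Common practice: resample only when effective_sample_size(ws) < N / 2.
```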

17. What about object features?
• Simplest way: feature vector + point target; maximize
  $P\big(f(y_{t+1}) = f(z_i) \mid f(y_1), f(y_2), \dots, f(y_t)\big) \cdot P(z_i \mid y_1, y_2, \dots, y_t)$
• Problems: too many possible $z_i$ if there is no foreground segmentation; occlusion
• Mean-shift tracking: an iterative hill-climbing algorithm
  • At time $t$: centroid $c_0$, object feature $f$; $w_f(v; t)$ = likelihood that pixel $I_t(v)$ is part of the object
  • At time $t+1$: new centroid of the candidate object,
    $c_1 = \sum_v v\, w_f(v; t+1) \,\big/\, \sum_v w_f(v; t+1)$
  • Move the candidate window to $c_1$ and repeat
[Figure: object at time $t$ and candidate object at time $t+1$]
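A sketch of that hill-climbing loop on a precomputed per-pixel likelihood map $w_f(\cdot\,; t+1)$; the window half-size, tolerance, and function name are illustrative:

```python
import numpy as np

def mean_shift_track(weights, c_start, half=15, iters=20, tol=0.5):
    """Iterate c1 = sum_v v * w(v) / sum_v w(v) over a window around the
    current centroid until it stops moving. weights[row, col] = w_f(v; t+1)."""
    c = np.asarray(c_start, dtype=float)
    H, W = weights.shape
    for _ in range(iters):
        r0 = int(max(c[0] - half, 0)); r1 = int(min(c[0] + half + 1, H))
        k0 = int(max(c[1] - half, 0)); k1 = int(min(c[1] + half + 1, W))
        w = weights[r0:r1, k0:k1]
        if w.sum() == 0:
            break                                  # no object evidence here
        rows, cols = np.mgrid[r0:r1, k0:k1]
        c_new = np.array([(rows * w).sum(), (cols * w).sum()]) / w.sum()
        if np.linalg.norm(c_new - c) < tol:        # converged
            return c_new
        c = c_new
    return c
```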

18. References
Basic Kalman filter:
• M. I. Jordan, An Introduction to Probabilistic Graphical Models, in preparation. (ask me)
• Forsyth and Ponce (2003) Computer Vision: A Modern Approach. Prentice Hall. Chapter 17. (Sapphire)
A little out-dated but encyclopedic on most aspects of tracking:
• Blackman & Popoli (1999) Design and Analysis of Modern Tracking Systems. Artech House Publishers.
• Bar-Shalom & Li (1993) Estimation and Tracking: Principles, Techniques, and Software. Artech House Publishers.
Unscented Kalman filter:
• Julier, S. and J. K. Uhlmann (2004) "Unscented filtering and nonlinear estimation," Proceedings of the IEEE, vol. 92, no. 3, pp. 401-422.
Mean-shift tracking:
• Cheng, Y. (1995) "Mean shift, mode seeking, and clustering," IEEE Trans. PAMI, vol. 17, no. 8, pp. 790-799.
• Comaniciu, D. et al. (2000) "Real-time tracking of non-rigid objects using mean shift," CVPR, vol. 2, pp. 142-149.
More on particle filtering:
• MacCormick (2002) Stochastic Algorithms for Visual Tracking. Springer. (Sapphire)
• Doucet, A. (2001) Sequential Monte Carlo Methods in Practice. Springer. (Sapphire)
• Hue, C. and J.-P. Le Cadre (2002) "Sequential Monte Carlo methods for multiple target tracking and data fusion," IEEE Trans. on Signal Processing, vol. 50, no. 2, pp. 309-325.
• Djuric, P. M. et al. (2003) "Particle filtering," IEEE Signal Processing Magazine, vol. 20, no. 5, pp. 19-38.
• See Spengler's reference (attached)
