
Presentation Transcript


  1. Today • Introduction to MCMC • Particle filters and MCMC • A simple example of particle filters: ellipse tracking

  2. Introduction to MCMC • Markov Chain Monte Carlo • Sampling technique • Non-standard distributions (hard to sample directly) • High-dimensional spaces • Origins in statistical physics in the 1940s • Gained popularity in statistics around the late 1980s

  3. Markov chains* • Series of samples x^(1), x^(2), …, x^(N) such that p(x^(i) | x^(i-1), …, x^(1)) = T(x^(i) | x^(i-1)) • Homogeneous: the transition kernel T is time-invariant • For a discrete state space, T is represented by a transition matrix * C. Andrieu et al., "An Introduction to MCMC for Machine Learning", Mach. Learn., 2003

  4. Markov chains • Evolution of the marginal distribution: p(x^(i)) = Σ_{x^(i-1)} T(x^(i) | x^(i-1)) p(x^(i-1)) • Stationary distribution: a p(x) with p(x) = Σ_{x'} T(x | x') p(x') • The Markov chain T converges to its stationary distribution if it is irreducible and aperiodic

  5. Markov chains • Detailed balance: p(x^(i)) T(x^(i-1) | x^(i)) = p(x^(i-1)) T(x^(i) | x^(i-1)) • A sufficient condition for stationarity of p • Interpretation: pair-wise balance of probability-mass transfer between states x^(i-1) and x^(i)
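
To make the definitions on slides 3-5 concrete, here is a small numerical check (an illustration, not from the slides) with a two-state homogeneous chain: iterating the marginal-evolution equation converges to the stationary distribution, which also satisfies detailed balance.

```python
import numpy as np

# Toy two-state homogeneous Markov chain.
# Rows are conditional distributions: T[i, j] = P(next = j | current = i).
T = np.array([[0.9, 0.1],
              [0.3, 0.7]])

# Evolve the marginal distribution p_i = p_{i-1} T until it stops changing.
p = np.array([1.0, 0.0])
for _ in range(200):
    p = p @ T
print("stationary distribution:", p)                 # ~ [0.75, 0.25]

# Detailed balance: p(x) T(x -> x') == p(x') T(x' -> x) for the two states.
print("detailed balance:", np.isclose(p[0] * T[0, 1], p[1] * T[1, 0]))
```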

  6. Metropolis-Hastings • Target distribution: p(x) • Set up a Markov chain whose stationary distribution is p(x) • Propose x* ~ q(x* | x^(i)) (easy to sample from q) • Accept x^(i+1) = x* with probability A = min{ 1, [p(x*) q(x^(i) | x*)] / [p(x^(i)) q(x* | x^(i))] }, otherwise x^(i+1) = x^(i) • Detailed balance holds, so the resulting chain has the desired stationary distribution
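
A minimal sketch of the Metropolis-Hastings step described on slide 6, assuming a symmetric Gaussian random-walk proposal so the q-ratio cancels in the acceptance probability; the function and parameter names (metropolis_hastings, log_p, step) are illustrative, not from the slides.

```python
import numpy as np

def metropolis_hastings(log_p, x0, n_samples, step=1.0, rng=None):
    """Random-walk Metropolis-Hastings sampler for an unnormalized target p(x).

    The Gaussian proposal is symmetric, so the q-ratio in the acceptance
    probability cancels and only the ratio p(x*) / p(x^(i)) remains.
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    chain = []
    for _ in range(n_samples):
        x_prop = x + step * rng.standard_normal(x.shape)      # propose x* ~ q(. | x)
        if np.log(rng.uniform()) < log_p(x_prop) - log_p(x):  # accept w.p. min(1, p(x*)/p(x))
            x = x_prop
        chain.append(x.copy())                                # otherwise keep x^(i)
    return np.array(chain)

# Example: sample a 1-D standard normal target, starting far from the mode.
chain = metropolis_hastings(lambda x: -0.5 * np.sum(x ** 2), x0=[5.0], n_samples=5000)
```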

  7. Metropolis-Hastings • Initial burn-in period: drop the first few samples • Successive samples are correlated: retain 1 out of every M samples (thinning) • Acceptance rate depends on the proposal • The choice of proposal distribution q is critical
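
Continuing the sketch above (`chain` is the array it returned), the burn-in and thinning steps from slide 7 amount to simple slicing, and the acceptance rate can be estimated from how often the chain actually moved.

```python
import numpy as np

# `chain` is the (n_samples, dim) array returned by metropolis_hastings above.
burn_in, M = 1000, 5
samples = chain[burn_in::M]      # drop the burn-in, then retain 1 out of every M samples

# Fraction of iterations in which the proposal was accepted (the state moved).
accept_rate = np.mean(np.any(np.diff(chain, axis=0) != 0, axis=1))
print(f"acceptance rate ~ {accept_rate:.2f}")
```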

  8. Monte-Carlo simulations* • Using N MCMC samples x^(1), …, x^(N) • Target density estimation: p_N(x) = (1/N) Σ_i δ(x − x^(i)) • Expectation: E[f(x)] ≈ (1/N) Σ_i f(x^(i)) • MAP estimation: the sample x^(i) maximizing p(x^(i)), when p is a posterior * C. Andrieu et al., "An Introduction to MCMC for Machine Learning", Mach. Learn., 2003
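
The three estimators on slide 8 are one-liners once the samples are available. In the sketch below, `samples` is the thinned chain from above and `log_p` denotes the unnormalized log target it was drawn from (both names are assumptions, and the histogram bin count is an arbitrary choice).

```python
import numpy as np

# `samples` is the (N, dim) thinned chain; `log_p` is the unnormalized log target.
density, edges = np.histogramdd(samples, bins=50, density=True)      # target density estimate
expectation = samples.mean(axis=0)                                   # E[x] ~ (1/N) sum_i x^(i)
x_map = samples[np.argmax([log_p(x) for x in samples])]              # MAP: highest-scoring sample
```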

  9. Tracking interacting targets* • Using particle filters to track multiple interacting targets (ants) * Khan et al., "MCMC-Based Particle Filtering for Tracking a Variable Number of Interacting Targets", PAMI, 2005.

  10. Particle filter and MCMC • Joint MRF particle filter • Importance sampling degenerates in high-dimensional spaces: the weights of most particles go to zero • MCMC is used instead to sample particles directly from the posterior distribution

  11. MCMC Joint MRF Particle filter • True (unweighted) samples at each time step • The stationary distribution of the MCMC chain is the joint posterior over all target states • Proposal density for Metropolis-Hastings (MH): select a target at random and sample its new state from the single-target state proposal density

  12. MCMC Joint MRF Particle filter • MCMC-MH iterations are run at every time step to obtain the particles • The "one target at a time" proposal has advantages: the acceptance probability is simplified, only one likelihood evaluation is needed per MH iteration, it is computationally efficient, and it requires fewer samples than SIR
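
Below is a schematic sketch of the MCMC particle-filter step from slides 10-12, not the Khan et al. implementation: the MRF interaction term is omitted, the predictive prior is treated as a user-supplied factorized term, and the single-target move is a symmetric random walk. All function and parameter names (propagate, log_like_j, log_prior_j, step) are assumptions for illustration.

```python
import numpy as np

def mcmc_pf_step(prev_particles, propagate, log_like_j, log_prior_j,
                 n_keep, n_burn, step=1.0, rng=None):
    """One time step of an MCMC-based particle filter (simplified sketch).

    prev_particles : (N, n_targets, dim) unweighted particles from the previous step
    propagate      : applies the motion model to a joint state of shape (n_targets, dim)
    log_like_j     : log-likelihood of target j's state given the current measurements
    log_prior_j    : log of an (approximate) predictive prior for target j
    """
    rng = rng or np.random.default_rng(0)
    N, n_targets, dim = prev_particles.shape

    # Initialize the chain from one randomly chosen, propagated previous particle.
    X = np.array(propagate(prev_particles[rng.integers(N)], rng), dtype=float)

    new_particles = []
    for it in range(n_burn + n_keep):
        j = rng.integers(n_targets)                        # "one target at a time" proposal
        xj_new = X[j] + step * rng.standard_normal(dim)    # symmetric single-target move
        # Only target j's factors appear in the acceptance ratio,
        # so each MH iteration needs a single likelihood evaluation.
        log_a = (log_like_j(j, xj_new) + log_prior_j(j, xj_new)
                 - log_like_j(j, X[j]) - log_prior_j(j, X[j]))
        if np.log(rng.uniform()) < log_a:
            X[j] = xj_new
        if it >= n_burn:
            new_particles.append(X.copy())                 # unweighted ("true") samples
    return np.stack(new_particles)
```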

  13. Particle filter for pupil (ellipse) tracking • Pupil center is a feature for eye-gaze estimation • Track the pupil boundary ellipse [Figure: pupil boundary edge points with outliers; ellipse overlaid on the eye image]

  14. Tracking • Brute force: detect the ellipse in every video frame (RANSAC-based detection is computationally intensive) • Better: detect + track, since the ellipse usually does not change much between adjacent frames • Principle: detect the ellipse in a frame; predict the ellipse in the next frame; refine the prediction using the data from that frame; if the track is lost, re-detect and continue
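
The detect + track principle on slide 14 is essentially a small control loop; the helper names below (detect_ellipse, predict, refine, track_lost) are placeholders, not code from the slides.

```python
def detect_and_track(frames, detect_ellipse, predict, refine, track_lost):
    """Detect + track loop: run the expensive detector only when needed."""
    ellipse = None
    for frame in frames:
        if ellipse is None or track_lost(ellipse, frame):
            ellipse = detect_ellipse(frame)       # e.g. RANSAC detection, run sparingly
        else:
            prediction = predict(ellipse)         # motion model: predict next-frame ellipse
            ellipse = refine(prediction, frame)   # refine using the new frame's edge data
        yield ellipse
```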

  15. Particle filter? • State: Ellipse parameters • Measurements: Edge points • Particle filter • Non-linear dynamics • Non-linear measurements • Edge points are the measured data

  16. Motion model • Simple drift with rotation • State: ellipse center (x0, y0), semi-axes a, b, orientation θ • Each state parameter is perturbed with Gaussian noise from frame to frame • Could include velocity, acceleration, etc.
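
A sketch of the drift motion model on slide 16, with the five-parameter ellipse state; the per-frame noise scales are illustrative assumptions.

```python
import numpy as np

# Ellipse state: center (x0, y0), semi-axes a, b, orientation theta.
# The per-frame noise scales are assumed values (pixels / radians).
STATE_NOISE = np.array([2.0, 2.0, 1.0, 1.0, np.deg2rad(2.0)])

def propagate_ellipse(state, rng):
    """Simple drift with rotation: x_t = x_{t-1} + Gaussian noise.
    Velocity or acceleration terms could be appended to the state."""
    return state + STATE_NOISE * rng.standard_normal(5)
```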

  17. Likelihood [Figure: measured edge points z1…z6 with distances d1…d6 to the predicted ellipse] • Exponential decay along the normal at each point • di: approximated using the focal bisector distance

  18. Focal bisector distance* (FBD) • Reflection property: PF′ is a reflection of PF about the normal at P • Favorable properties: approximates the spatial distance to the ellipse boundary along the normal; no dependence on ellipse size [Figure: foci, focal bisector, and FBD] * P. L. Rosin, "Analyzing error of fit functions for ellipses", BMVC 1996.
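
A sketch of the likelihood from slides 17-18. The focal bisector distance itself is not reproduced here; `ellipse_distance` below is a crude normalized-radius placeholder standing in for the FBD approximation, and the decay `scale` is an assumed parameter.

```python
import numpy as np

def ellipse_distance(state, z):
    """Crude placeholder for the focal bisector distance: map the point into the
    ellipse frame and measure how far its normalized radius is from 1."""
    x0, y0, a, b, theta = state
    c, s = np.cos(theta), np.sin(theta)
    dx, dy = z[0] - x0, z[1] - y0
    u, v = c * dx + s * dy, -s * dx + c * dy       # rotate into the ellipse frame
    r = np.hypot(u / a, v / b)                     # equals 1 exactly on the boundary
    return abs(r - 1.0) * min(a, b)                # rough distance in pixels

def log_likelihood(state, edge_points, scale=2.0):
    """Likelihood of the measured edge points z_i: each contributes exp(-d_i / scale),
    where d_i is its (approximate) distance to the predicted ellipse."""
    d = np.array([ellipse_distance(state, z) for z in edge_points])
    return float(-np.sum(d) / scale)               # log of prod_i exp(-d_i / scale)
```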

  19. Implementation details • Sequential importance re-sampling (SIR)* • Number of particles: 100 • Weights: likelihood • Proposal distribution: mixture of Gaussians • The expected state is the tracked ellipse • Possible to compute a MAP estimate? * Khan et al., "MCMC-Based Particle Filtering for Tracking a Variable Number of Interacting Targets", PAMI, 2005.
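
Putting the pieces together, here is a sketch of one SIR step as described on slide 19, reusing propagate_ellipse and log_likelihood from the sketches above; for simplicity, the mixture-of-Gaussians proposal on the slide is replaced by the motion model itself (an assumption).

```python
import numpy as np

def sir_step(particles, edge_points, rng):
    """One sequential-importance-resampling step for ellipse tracking."""
    particles = np.array([propagate_ellipse(p, rng) for p in particles])   # predict
    log_w = np.array([log_likelihood(p, edge_points) for p in particles])  # weights: likelihood
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    estimate = (w[:, None] * particles).sum(axis=0)            # expected state = tracked ellipse
    idx = rng.choice(len(particles), size=len(particles), p=w)             # resample
    return particles[idx], estimate

# Usage sketch with 100 particles seeded around a detected ellipse
# (`detected_state` and `edge_points` are placeholders):
# rng = np.random.default_rng(0)
# particles = detected_state + STATE_NOISE * rng.standard_normal((100, 5))
# particles, ellipse = sir_step(particles, edge_points, rng)
```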

  20. Initial results [Figure: six consecutive frames; the ellipse is detected in frames 1 and 4 and tracked in frames 2, 3, 5, and 6]

  21. Future? • Incorporate velocity and acceleration into the motion model • Use a domain-specific motion model: smooth pursuit, saccades, or a combination of them • Data association* to reduce the confounding effect of outliers * Forsyth and Ponce, "Computer Vision: A Modern Approach", Chapter 17.

  22. Thank you!
