
Tutorial on Particle Filters

assembled and extended by Longin Jan Latecki

Temple University, [email protected]

using slides from

Keith Copsey, Pattern and Information Processing Group, DERA Malvern;

D. Fox, J. Hightower, L. Liao, D. Schulz, and G. Borriello,

Univ. of Washington, Seattle

Honggang Zhang, Univ. of Maryland, College Park

Miodrag Bolic, University of Ottawa, Canada

Michael Pfeiffer, TU Gratz, Austria


Outline

  • Introduction to particle filters

    • Recursive Bayesian estimation

  • Bayesian importance sampling

    • Sequential importance sampling (SIS)

    • Sampling importance resampling (SIR)

  • Improvements to SIR

    • On-line Markov chain Monte Carlo

  • Basic Particle Filter algorithm

  • Example for robot localization

  • Conclusions


Particle Filters

  • Sequential Monte Carlo methods for on-line learning within a Bayesian framework.

  • Known as

    • Particle filters

    • Sequential sampling-importance resampling (SIR)

    • Bootstrap filters

    • Condensation trackers

    • Interacting particle approximations

    • Survival of the fittest


History

  • First attempts – simulations of growing polymers

    • M. N. Rosenbluth and A.W. Rosenbluth, “Monte Carlo calculation of the average extension of molecular chains,” Journal of Chemical Physics, vol. 23, no. 2, pp. 356–359, 1956.

  • First application in signal processing - 1993

    • N. J. Gordon, D. J. Salmond, and A. F. M. Smith, “Novel approach to nonlinear/non-Gaussian Bayesian state estimation,” IEE Proceedings-F, vol. 140, no. 2, pp. 107–113, 1993.

  • Books

    • A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, 2001.

    • B. Ristic, S. Arulampalam, N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.

  • Tutorials

    • M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters for online nonlinear/non-gaussian Bayesian tracking,” IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, 2002.


Problem Statement

  • Tracking the state of a system as it evolves over time

  • Sequentially arriving (noisy or ambiguous) observations

  • We want to know: the best possible estimate of the hidden variables, given the observations so far


Solution: Sequential Update

  • Storing and processing all incoming measurements is inconvenient and may be impossible

  • Recursive filtering:

    –Predict next state pdf from current estimate

    –Update the prediction using sequentially arriving new measurements

  • Optimal Bayesian solution: recursively calculating the exact posterior density


Particle filtering ideas

  • A particle filter is a technique for implementing a recursive Bayesian filter by Monte Carlo sampling

  • The idea: represent the posterior density by a set of random particles with associated weights.

  • Compute estimates based on these samples and weights

[Figure: the posterior density over the sample space, represented by weighted particles]


Global Localization of Robot with Sonar: http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/animations/global-floor.gif


Tools needed

Recall the “law of total probability” (or marginalization):

p(x) = ∫ p(x, y) dy = ∫ p(x | y) p(y) dy

and “Bayes’ rule”:

p(x | y) = p(y | x) p(x) / p(y)
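A quick numeric sanity check of these two identities, using a made-up two-state example (all probabilities below are illustrative):

```python
import numpy as np

# Made-up discrete example: hidden state x in {0, 1}, observation y in {0, 1}.
p_x = np.array([0.3, 0.7])              # prior p(x)
p_y_given_x = np.array([[0.9, 0.1],     # p(y | x = 0)
                        [0.2, 0.8]])    # p(y | x = 1)

# Law of total probability: p(y) = sum_x p(y | x) p(x)
p_y = p_y_given_x.T @ p_x

# Bayes' rule: p(x | y) = p(y | x) p(x) / p(y)
p_x_given_y = (p_y_given_x * p_x[:, None]).T / p_y[:, None]

print(p_y)            # marginal of y, sums to 1
print(p_x_given_y)    # each row is a posterior over x and sums to 1
```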


Recursive Bayesian estimation (I)

  • Recursive filter:

    • System model: x_k = f_k(x_{k-1}, v_{k-1}), where v_{k-1} is the process noise

    • Measurement model: z_k = h_k(x_k, n_k), where n_k is the measurement noise

    • Information available: the measurements z_{1:k} = {z_1, …, z_k}


Recursive Bayesian estimation (II)

  • Seek: p(x_{k+i} | z_{1:k})

    • i = 0: filtering.

    • i > 0: prediction.

    • i < 0: smoothing.

  • Prediction:

    p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}

    • since the state is Markov: p(x_k | x_{k-1}, z_{1:k-1}) = p(x_k | x_{k-1})



Bayes Filters (second pass)

Estimating the system state from noisy observations.

System state dynamics: p(x_k | x_{k-1})

Observation dynamics: p(z_k | x_k)

We are interested in the belief or posterior density: p(x_k | z_{1:k})

Assumptions (Markov process): the state x_k depends only on x_{k-1}, and the observation z_k depends only on x_k.

Predict: p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}

Update: p(x_k | z_{1:k}) ∝ p(z_k | x_k) p(x_k | z_{1:k-1})
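To make the recursion concrete, here is a minimal sketch of the predict/update steps on a discrete 1-D grid; the random-walk motion kernel, the Gaussian-shaped likelihood, and the measurement values are illustrative assumptions, not part of the original slides:

```python
import numpy as np

n = 100
belief = np.full(n, 1.0 / n)                 # uniform prior p(x_0)

def predict(belief, kernel=np.array([0.1, 0.8, 0.1])):
    """Predict: convolve the belief with the motion model p(x_k | x_{k-1})."""
    return np.convolve(belief, kernel, mode="same")

def update(belief, z, sigma=5.0):
    """Update: multiply by the likelihood p(z_k | x_k) and renormalize."""
    likelihood = np.exp(-0.5 * ((np.arange(n) - z) / sigma) ** 2)
    posterior = likelihood * belief
    return posterior / posterior.sum()

for z in [20, 22, 25]:                       # sequentially arriving measurements
    belief = update(predict(belief), z)

print(np.argmax(belief))                     # MAP estimate of the state
```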


Bayes Filter

How do we use it? What else do we need to know?

Motion Model: p(x_k | x_{k-1})

Perceptual Model: p(z_k | x_k)

Start from: the initial belief (prior) p(x_0)


Example 1

Step 0: initialization

Step 1: updating

Example 1 (continued)

Step 2: predicting

Step 3: updating

Step 4: predicting


Classical approximations

  • Analytical methods:

    • Extended Kalman filter,

    • Gaussian sums… (Alspach et al. 1971)

      • Perform poorly in numerous cases of interest

  • Numerical methods:

    • point masses approximations,

    • splines. (Bucy 1971, de Figueiro 1974…)

      • Very complex to implement, not flexible.


Perfect Monte Carlo simulation

  • Recall that E[g_k(x_{0:k})] = ∫ g_k(x_{0:k}) p(x_{0:k} | z_{1:k}) dx_{0:k}

  • Random samples {x_{0:k}^i, i = 1, …, N} are drawn from the posterior distribution.

  • They represent the posterior distribution by a set of samples or particles.

  • Expectations of the form above are then easy to approximate:

    • E[g_k(x_{0:k})] ≈ (1/N) Σ_{i=1}^N g_k(x_{0:k}^i)


Random samples and the pdf (I)

  • Take p(x)=Gamma(4,1)

  • Generate some random samples

  • Plot histogram and basic approximation to pdf

200 samples


Random samples and the pdf (II)

500 samples

1000 samples


Random samples and the pdf (III)

5000 samples

200000 samples
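The experiment on these three slides is easy to reproduce; a sketch, assuming the figures were produced roughly like this (bin count and plotting range are guesses):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gamma

x = np.linspace(0, 15, 200)
for N in [200, 500, 1000, 5000, 200000]:
    samples = np.random.gamma(shape=4.0, scale=1.0, size=N)
    plt.figure()
    plt.hist(samples, bins=50, density=True, alpha=0.5, label=f"{N} samples")
    plt.plot(x, gamma.pdf(x, a=4.0), label="Gamma(4,1) pdf")
    plt.legend()
    # Monte Carlo estimate of E[x]; the true mean of Gamma(4,1) is 4.
    print(N, samples.mean())
plt.show()
```

As the slides illustrate, the histogram approaches the true pdf as the number of samples grows.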


Importance Sampling

  • Unfortunately, it is often not possible to sample directly from the posterior distribution, but we can use importance sampling.

  • Let p(x) be a pdf from which it is difficult to draw samples.

  • Let x^i ~ q(x), i = 1, …, N, be samples that are easily generated from a proposal pdf q, which is called an importance density.

  • Then an approximation to the density p is given by

    p(x) ≈ Σ_{i=1}^N w^i δ(x − x^i)

where

    w^i ∝ p(x^i) / q(x^i)

are the normalized importance weights (Σ_i w^i = 1).
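A small sketch of the idea, with an illustrative target and proposal (p = Gamma(4,1), q = N(4, 3²); both choices are assumptions made for the example):

```python
import numpy as np
from scipy.stats import gamma, norm

N = 10000
x = norm.rvs(loc=4.0, scale=3.0, size=N)                   # x^i ~ q(x)
w = gamma.pdf(x, a=4.0) / norm.pdf(x, loc=4.0, scale=3.0)  # w^i ∝ p(x^i) / q(x^i)
w /= w.sum()                                               # normalize the weights

print(np.sum(w * x))   # weighted estimate of E_p[x]; the true value is 4
```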


Bayesian Importance Sampling

  • By drawing samples x_{0:k}^i from a known, easy-to-sample proposal distribution q(x_{0:k} | z_{1:k}) we obtain:

    p(x_{0:k} | z_{1:k}) ≈ Σ_{i=1}^N w̃_k^i δ(x_{0:k} − x_{0:k}^i)

where

    w_k^i ∝ p(z_{1:k} | x_{0:k}^i) p(x_{0:k}^i) / q(x_{0:k}^i | z_{1:k})  and  w̃_k^i = w_k^i / Σ_j w_k^j

are normalized weights.


Sequential Importance Sampling (I)

  • Factorizing the proposal distribution:

    q(x_{0:k} | z_{1:k}) = q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1})

  • and remembering that the state evolution is modeled as a Markov process,

  • we obtain a recursive estimate of the importance weights:

    w_k^i ∝ w_{k-1}^i p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k)

  • The full factorization is obtained by recursively applying the identity in the first line.


Sequential Importance Sampling (SIS) Particle Filter

SIS Particle Filter Algorithm

for i = 1:N

    Draw a particle x_k^i ~ q(x_k | x_{k-1}^i, z_k)

    Assign a weight w_k^i = w_{k-1}^i p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k)

end

(k is the index over time and i is the particle index)
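A runnable sketch of the SIS loop for a scalar model, using the prior as the proposal so that the weight update reduces to w_k^i = w_{k-1}^i p(z_k | x_k^i); the linear-Gaussian model and the measurement sequence are illustrative assumptions:

```python
import numpy as np

# Illustrative scalar model:
#   x_k = 0.9 x_{k-1} + v,  v ~ N(0,1)   (system model, also used as proposal)
#   z_k = x_k + n,          n ~ N(0,1)   (measurement model)
N = 1000
x = np.random.randn(N)          # particles x_0^i ~ p(x_0)
w = np.full(N, 1.0 / N)         # uniform initial weights

def sis_step(x, w, z):
    x = 0.9 * x + np.random.randn(N)        # draw x_k^i ~ p(x_k | x_{k-1}^i)
    w = w * np.exp(-0.5 * (z - x) ** 2)     # w_k^i = w_{k-1}^i p(z_k | x_k^i)
    return x, w / w.sum()                   # normalize the weights

for z in [0.5, 1.1, 0.9]:       # made-up measurement sequence
    x, w = sis_step(x, w, z)

print(np.sum(w * x))            # posterior mean estimate
print(1.0 / np.sum(w ** 2))     # effective sample size
```

The last line prints the effective sample size, which shrinks toward 1 as the weights degenerate; this is the degeneracy problem discussed below.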


Derivation of SIS weights (I)

  • The main idea is factorizing:

    q(x_{0:k} | z_{1:k}) = q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1})

and

    p(x_{0:k} | z_{1:k}) ∝ p(z_k | x_k) p(x_k | x_{k-1}) p(x_{0:k-1} | z_{1:k-1})

Our goal is to expand p and q in time k.



Derivation of SIS weights (II)

Expanding the posterior with Bayes’ rule,

    p(x_{0:k} | z_{1:k}) = p(z_k | x_{0:k}, z_{1:k-1}) p(x_k | x_{0:k-1}, z_{1:k-1}) p(x_{0:k-1} | z_{1:k-1}) / p(z_k | z_{1:k-1})

and under the Markov assumptions

    p(x_{0:k} | z_{1:k}) ∝ p(z_k | x_k) p(x_k | x_{k-1}) p(x_{0:k-1} | z_{1:k-1})

Substituting p and the factorized q into w_k^i ∝ p(x_{0:k}^i | z_{1:k}) / q(x_{0:k}^i | z_{1:k}) yields the weight recursion of the previous slide.


SIS Particle Filter Foundation

  • At each time step k,

  • random samples x_k^i are drawn from the proposal distribution q(x_k | x_{k-1}^i, z_k) for i = 1, …, N.

  • They represent the posterior distribution by a set of samples or particles:

    p(x_k | z_{1:k}) ≈ Σ_{i=1}^N w̃_k^i δ(x_k − x_k^i)

  • since the weights are given by

    w_k^i ∝ p(x_{0:k}^i | z_{1:k}) / q(x_{0:k}^i | z_{1:k})

  • and q factorizes as

    q(x_{0:k} | z_{1:k}) = q(x_k | x_{k-1}, z_k) q(x_{0:k-1} | z_{1:k-1})


Sequential Importance Sampling (II)

  • Choice of the proposal distribution:

  • Choose the proposal to minimize the variance of the weights w_k^i (Doucet et al. 1999); the optimal choice is

    q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i, z_k)

  • Although the common choice is the prior distribution:

    q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i)

    We then obtain w_k^i ∝ w_{k-1}^i p(z_k | x_k^i)


Sequential Importance Sampling (III)

  • Illustration of SIS:

  • Degeneracy problems:

    • The variance of the importance ratios increases stochastically over time (Kong et al. 1994; Doucet et al. 1999).

    • In most cases, after a few iterations all but one particle will have negligible weight.


Sequential Importance Sampling (IV)

  • Illustration of degeneracy:


SIS: Why the Variance Increases

  • Suppose we want to sample from the posterior p(x_{0:k} | z_{1:k})

    • and choose a proposal density very close to the posterior density.

      • Then the importance ratio is w = p(x_{0:k} | z_{1:k}) / q(x_{0:k} | z_{1:k}) ≈ 1

      • and its variance Var_q(w) ≈ 0.

  • So we expect the variance to be close to 0 to obtain reasonable estimates;

    • thus a variance increase has a harmful effect on accuracy.


Sampling-Importance Resampling

  • SIS suffers from degeneracy problems, so we don’t want to use it as is!

  • Introduce a selection (resampling) step to eliminate samples with low importance ratios and multiply samples with high importance ratios.

  • Resampling maps the weighted random measure {x_k^i, w̃_k^i} onto the equally weighted random measure {x_k^{i*}, 1/N}

    • by sampling uniformly with replacement from {x_k^i : i = 1, …, N} with probabilities {w̃_k^i}.

  • The scheme generates N_i children for particle i such that Σ_{i=1}^N N_i = N and satisfies E[N_i] = N w̃_k^i.
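A minimal sketch of the uniform sampling-with-replacement (multinomial) scheme just described:

```python
import numpy as np

def multinomial_resample(particles, weights):
    """Resample N particles with replacement, drawing particle i with
    probability w_k^i; returns equally weighted particles."""
    N = len(particles)
    idx = np.random.choice(N, size=N, replace=True, p=weights)
    return particles[idx], np.full(N, 1.0 / N)
```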


Basic SIR Particle Filter - Schematic

[Schematic: initialisation → importance sampling step (incorporating the measurement z_k) → extract estimate → resampling step → back to the importance sampling step]


Basic SIR Particle Filter algorithm (I)

  • Initialisation: k = 0

    • For i = 1, …, N, sample x_0^i ~ p(x_0)

    • and set k = 1.

  • Importance Sampling step

    • For i = 1, …, N, sample x̃_k^i ~ p(x_k | x_{k-1}^i)

    • For i = 1, …, N, compute the importance weights w_k^i = p(z_k | x̃_k^i)

    • Normalise the importance weights: w̃_k^i = w_k^i / Σ_j w_k^j

and set x̃_{0:k}^i = (x_{0:k-1}^i, x̃_k^i).


Basic SIR Particle Filter algorithm (II)

  • Resampling step

    • Resample with replacement N particles x_{0:k}^i, i = 1, …, N,

    • from the set {x̃_{0:k}^i : i = 1, …, N}

    • according to the normalised importance weights w̃_k^i.

  • Set k ← k + 1

    • and proceed to the Importance Sampling step as the next measurement arrives.
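Putting the three steps together, a sketch of the full SIR loop for the same illustrative scalar model used in the SIS sketch above (the model, noise levels, and measurements are assumptions made for the example):

```python
import numpy as np

def sir_filter(measurements, N=1000):
    """Basic SIR: prior proposal, weights w_k^i = p(z_k | x_k^i),
    resampling at every step."""
    x = np.random.randn(N)                    # Initialisation: x_0^i ~ p(x_0)
    estimates = []
    for z in measurements:
        # Importance sampling step (prior proposal):
        x = 0.9 * x + np.random.randn(N)      # x_k^i ~ p(x_k | x_{k-1}^i)
        w = np.exp(-0.5 * (z - x) ** 2)       # w_k^i = p(z_k | x_k^i)
        w /= w.sum()                          # normalise
        estimates.append(np.sum(w * x))       # extract estimate (posterior mean)
        # Resampling step: N draws with replacement according to the weights.
        x = x[np.random.choice(N, size=N, p=w)]
    return estimates

print(sir_filter([0.5, 1.1, 0.9, 1.4]))
```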



Generic SIR Particle Filter algorithm

  • M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking,” IEEE Transactions on Signal Processing, vol. 50, no. 2, 2002.


Improvements to SIR (I)

  • Variety of resampling schemes with varying performance in terms of the variance of the particles:

    • Residual sampling (Liu & Chen, 1998).

    • Systematic sampling (Carpenter et al., 1999); see the sketch after this list.

    • Mixture of SIS and SIR, only resample when necessary (Liu & Chen, 1995; Doucet et al., 1999).

  • Degeneracy may still be a problem:

    • During resampling a sample with high importance weight may be duplicated many times.

    • Samples may eventually collapse to a single point.
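A sketch of systematic resampling (Carpenter et al., 1999), referenced in the list above; it uses a single uniform draw and N evenly spaced pointers, which lowers the variance of the particle counts relative to multinomial resampling:

```python
import numpy as np

def systematic_resample(particles, weights):
    """Systematic resampling: one uniform draw, N evenly spaced pointers."""
    N = len(weights)
    positions = (np.random.uniform() + np.arange(N)) / N   # u, u + 1/N, ...
    cumsum = np.cumsum(weights)
    cumsum[-1] = 1.0                  # guard against floating-point round-off
    idx = np.searchsorted(cumsum, positions)
    return particles[idx]
```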


Improvements to SIR (II)

  • To alleviate numerical degeneracy problems, sample smoothing methods may be adopted.

    • Roughening (Gordon et al., 1993).

      • Adds an independent jitter to the resampled particles; a minimal sketch follows this list.

    • Prior boosting (Gordon et al., 1993).

      • Increase the number of samples from the proposal distribution to M>N,

      • but in the resampling stage only draw N particles.
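A sketch of roughening in one dimension: after resampling, add independent Gaussian jitter with standard deviation K·E·N^{-1/d}, where E is the sample range, d the state dimension, and K a tuning constant (the value K = 0.2 below is an assumption, not a prescription from the slides):

```python
import numpy as np

def roughen(particles, K=0.2):
    """Add independent jitter to resampled particles (1-D case, so d = 1)."""
    N = len(particles)
    E = particles.max() - particles.min()     # sample range
    return particles + K * E * N ** (-1.0) * np.random.randn(N)
```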


Improvements to SIR (III)

  • Local Monte Carlo methods for alleviating degeneracy:

    • Local linearisation - using an EKF (Doucet, 1999; Pitt & Shephard, 1999) or UKF (Doucet et al, 2000) to estimate the importance distribution.

    • Rejection methods (Müller, 1991; Doucet, 1999; Pitt & Shephard, 1999).

    • Auxiliary particle filters (Pitt & Shephard, 1999)

    • Kernel smoothing (Gordon, 1994; Hürzeler & Künsch, 1998; Liu & West, 2000; Musso et al., 2000).

    • MCMC methods (Müller, 1992; Gordon & Whitby, 1995; Berzuini et al., 1997; Gilks & Berzuini, 1998; Andrieu et al., 1999).


Improvements to SIR (IV)

  • Illustration of SIR with sample smoothing:


Ingredients for SMC

  • Importance sampling function:

    • Gordon et al → the prior p(x_k | x_{k-1}^i)

    • Optimal → p(x_k | x_{k-1}^i, z_k)

    • UKF → pdf from the UKF at time k

  • Redistribution (resampling) scheme:

    • Gordon et al → SIR

    • Liu & Chen → Residual

    • Carpenter et al → Systematic

    • Liu & Chen, Doucet et al → Resample when necessary

  • Careful initialisation procedure (for efficiency)


Particle filters

  • Also known as Sequential Monte Carlo methods.

  • Represent the belief by sets of samples, or particles:

    S_k = {⟨x_k^i, w_k^i⟩ : i = 1, …, N}

  • where the w_k^i are nonnegative weights called importance factors.

  • The updating procedure is sequential importance sampling with resampling.


Example 2: Particle Filter

Step 0: initialization. Each particle has the same weight.

Step 1: updating weights. Weights are proportional to p(z|x).


Example 2: Particle Filter (continued)

Step 2: predicting. Predict the new locations of particles.

Step 3: updating weights. Weights are proportional to p(z|x).

Step 4: predicting. Predict the new locations of particles.

Particles are more concentrated in the region where the person is more likely to be.


Compare Particle Filter with Bayes Filter with Known Distribution

[Side-by-side figures: the updating step in Example 1 vs. Example 2, and the predicting step in Example 1 vs. Example 2]
Tracking in 1D: the blue trajectory is the target. The best of 10 particles is in red.



Application Examples

  • Robot localization

  • Robot mapping

  • Visual Tracking

    –e.g. human motion (body parts)

  • Prediction of (financial) time series

    –e.g. mapping gold price to stock price

  • Target recognition from single or multiple images

  • Guidance of missiles

  • Contour grouping

  • Nice video demos:

    http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/


2nd Book Advert

  • Statistical Pattern Recognition

  • Andrew Webb, DERA

  • ISBN 0340741643,

  • Paperback: 1999: £29.99

  • Butterworth Heinemann

  • Contents:

    • Introduction to SPR, Estimation, Density estimation, Linear discriminant analysis, Nonlinear discriminant analysis - neural networks, Nonlinear discriminant analysis - statistical methods, Classification trees, Feature selection and extraction, Clustering, Additional topics, Measures of dissimilarity, Parameter estimation, Linear algebra, Data, Probability theory.


Homework

  • Implement all three particle filter algorithms

    SIS Particle Filter Algorithm

    Basic SIR Particle Filter algorithm

    Generic SIR Particle Filter algorithm

  • and evaluate their performance on a problem of your choice.

  • Groups of two are allowed.

  • Submit a report and ready-to-run Matlab code (with a script and the data).

  • Present a report to the class.

