Tutorial on Particle Filters

assembled and extended by Longin Jan Latecki

Temple University, latecki@temple.edu

using slides from

Keith Copsey, Pattern and Information Processing Group, DERA Malvern;

D. Fox, J. Hightower, L. Liao, D. Schulz, and G. Borriello,

Univ. of Washington, Seattle

Honggang Zhang, Univ. of Maryland, College Park

Miodrag Bolic, University of Ottawa, Canada

Michael Pfeiffer, TU Graz, Austria

Outline

- Introduction to particle filters
- Recursive Bayesian estimation
- Bayesian Importance sampling
- Sequential Importance sampling (SIS)
- Sampling Importance resampling (SIR)
- Improvements to SIR
- On-line Markov chain Monte Carlo
- Basic Particle Filter algorithm
- Example for robot localization
- Conclusions

Particle Filters

- Sequential Monte Carlo methods for on-line learning within a Bayesian framework.
- Known as
- Particle filters
- Sequential sampling-importance resampling (SIR)
- Bootstrap filters
- Condensation trackers
- Interacting particle approximations
- Survival of the fittest

History

- First attempts – simulations of growing polymers
- M. N. Rosenbluth and A.W. Rosenbluth, “Monte Carlo calculation of the average extension of molecular chains,” Journal of Chemical Physics, vol. 23, no. 2, pp. 356–359, 1956.
- First application in signal processing - 1993
- N. J. Gordon, D. J. Salmond, and A. F. M. Smith, “Novel approach to nonlinear/non-Gaussian Bayesian state estimation,” IEE Proceedings-F, vol. 140, no. 2, pp. 107–113, 1993.
- Books
- A. Doucet, N. de Freitas, and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, 2001.
- B. Ristic, S. Arulampalam, N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications, Artech House Publishers, 2004.
- Tutorials
- M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters for online nonlinear/non-gaussian Bayesian tracking,” IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, 2002.

Problem Statement

- Tracking the state of a system as it evolves over time
- Sequentially arriving (noisy or ambiguous) observations
- We want to know: Best possible estimate of the hidden variables

Solution: Sequential Update

- Storing and processing all incoming measurements is inconvenient and may be impossible
- Recursive filtering:

–Predict next state pdf from current estimate

–Update the prediction using sequentially arriving new measurements

- Optimal Bayesian solution: recursively calculating exact posterior density

Particle filtering ideas

- Particle filter is a technique for implementing recursive Bayesian filter by Monte Carlo sampling
- The idea: represent the posterior density by a set of random particles with associated weights.
- Compute estimates based on these samples and weights

(Figure: a posterior density over the sample space, approximated by a set of weighted particles.)
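As a concrete illustration, a small Python sketch of the idea: a set of particles with normalized weights stands in for the posterior, and estimates are weighted sums. The target density and its parameters here are invented purely for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior: N(2, 0.5^2). We pretend we can only evaluate its
# density, and represent it by N weighted particles (self-normalized
# importance sampling with a uniform proposal over the state space).
N = 1000
particles = rng.uniform(-2.0, 6.0, size=N)               # samples over the state space
weights = np.exp(-0.5 * ((particles - 2.0) / 0.5) ** 2)  # unnormalized density values
weights /= weights.sum()                                 # normalize to sum to 1

# Estimates are computed from the samples and weights, e.g. the posterior mean:
mean_est = np.sum(weights * particles)
```

The weighted mean recovers the mean of the represented density (here, close to 2).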

Recursive Bayesian estimation (I)

- Recursive filter for the state-space model:
- System model: x_k = f_k(x_{k-1}, v_{k-1})
- Measurement model: z_k = h_k(x_k, n_k)
- Information available: the measurement history z_{1:k} = {z_1, …, z_k}

Recursive Bayesian estimation (II)

- Seek: p(x_{k+i} | z_{1:k})
- i = 0: filtering.
- i > 0: prediction.
- i < 0: smoothing.
- Prediction: p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}
- since: p(x_k | x_{k-1}, z_{1:k-1}) = p(x_k | x_{k-1}) (Markov state evolution)

Recursive Bayesian estimation (III)

- Update: p(x_k | z_{1:k}) = p(z_k | x_k) p(x_k | z_{1:k-1}) / p(z_k | z_{1:k-1})
- where: p(z_k | z_{1:k-1}) = ∫ p(z_k | x_k) p(x_k | z_{1:k-1}) dx_k is the normalizing constant
- since: p(z_k | x_k, z_{1:k-1}) = p(z_k | x_k) (measurements are conditionally independent given the state)
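On a small discrete state space the prediction and update equations can be carried out exactly. The following Python sketch uses invented transition and observation tables purely for illustration:

```python
import numpy as np

# Exact recursive Bayesian filtering on a toy 3-state space.
# P_trans[i, j] = p(x_k = j | x_{k-1} = i);  P_obs[i, j] = p(z_k = j | x_k = i).
P_trans = np.array([[0.8, 0.2, 0.0],
                    [0.1, 0.8, 0.1],
                    [0.0, 0.2, 0.8]])
P_obs = np.array([[0.9, 0.05, 0.05],
                  [0.05, 0.9, 0.05],
                  [0.05, 0.05, 0.9]])

belief = np.array([1/3, 1/3, 1/3])    # prior p(x_0): uniform
for z in [0, 0, 1]:                   # sequentially arriving measurements
    belief = P_trans.T @ belief       # prediction: Chapman-Kolmogorov sum
    belief = P_obs[:, z] * belief     # update: multiply by likelihood p(z | x)
    belief /= belief.sum()            # normalize
```

After observing 0, 0, 1 the belief shifts toward state 1, exactly as the update equation dictates.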

Observation dynamics

We are interested in the belief, or posterior density, p(x_k | z_{1:k}).

Bayes Filters (second pass): estimating the system state from noisy observations

Classical approximations

- Analytical methods:
- Extended Kalman filter,
- Gaussian sums… (Alspach et al. 1971)
- Perform poorly in numerous cases of interest
- Numerical methods:
- point-mass approximations,
- splines. (Bucy 1971, de Figueiro 1974…)
- Very complex to implement, not flexible.

Perfect Monte Carlo simulation

- Recall that random samples {x_{0:k}^i} are drawn from the posterior distribution p(x_{0:k} | z_{1:k}).
- Represent the posterior distribution using a set of samples or particles.
- Easy to approximate expectations of the form
  E[g(x_{0:k})] = ∫ g(x_{0:k}) p(x_{0:k} | z_{1:k}) dx_{0:k}
- by the sample average
  E[g(x_{0:k})] ≈ (1/N) Σ_{i=1}^N g(x_{0:k}^i).
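A quick Python sketch of this approximation; the density and the test function g are arbitrary choices for illustration:

```python
import numpy as np

# Perfect Monte Carlo: approximate E[g(x)] = ∫ g(x) p(x) dx by the sample
# average (1/N) Σ g(x^i), with the x^i drawn from p itself.
rng = np.random.default_rng(1)
N = 100_000
samples = rng.normal(loc=0.0, scale=1.0, size=N)  # x^i ~ p = N(0, 1)
g = lambda x: x ** 2                              # example test function
estimate = g(samples).mean()                      # ≈ E[x^2] = 1 for N(0, 1)
```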

Random samples and the pdf (I)

- Take p(x)=Gamma(4,1)
- Generate some random samples
- Plot histogram and basic approximation to pdf

(Figure: histogram of 200 samples and the resulting approximation to the pdf.)
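The slide's experiment is easy to reproduce; this Python sketch draws 200 samples from Gamma(4,1) and forms the histogram approximation to the pdf:

```python
import numpy as np

# Draw samples from p(x) = Gamma(shape=4, scale=1) and use a histogram
# as a basic approximation to the pdf, as on the slide.
rng = np.random.default_rng(2)
samples = rng.gamma(shape=4.0, scale=1.0, size=200)         # 200 samples
hist, edges = np.histogram(samples, bins=20, density=True)  # crude pdf estimate
sample_mean = samples.mean()   # near the true mean, shape * scale = 4
```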

Importance Sampling

- Unfortunately it is often not possible to sample directly from the posterior distribution, but we can use importance sampling.
- Let p(x) be a pdf from which it is difficult to draw samples.
- Let x^i ~ q(x), i = 1, …, N, be samples that are easily generated from a proposal pdf q, called an importance density.
- Then an approximation to the density p is given by
  p(x) ≈ Σ_{i=1}^N w^i δ(x − x^i),

where
  w^i ∝ p(x^i) / q(x^i)
are the normalized importance weights.
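A Python sketch of importance sampling under assumed densities; a Gaussian target and a wider Gaussian proposal, both invented for the example:

```python
import numpy as np

# Importance sampling: approximate a target p using samples from an easy
# proposal q, with weights w^i ∝ p(x^i) / q(x^i).
rng = np.random.default_rng(3)
N = 50_000

# Target p = N(3, 1): assume we can evaluate it but not sample it directly.
p = lambda x: np.exp(-0.5 * (x - 3.0) ** 2) / np.sqrt(2 * np.pi)
# Proposal q = N(0, 3^2): easy to sample and wider than the target.
q = lambda x: np.exp(-0.5 * (x / 3.0) ** 2) / (3.0 * np.sqrt(2 * np.pi))

samples = rng.normal(0.0, 3.0, size=N)   # x^i ~ q
w = p(samples) / q(samples)
w /= w.sum()                             # normalized importance weights
mean_est = np.sum(w * samples)           # ≈ 3, the mean of the target
```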

Bayesian Importance Sampling

- By drawing samples from a known easy to sample proposal distribution we obtain:

where

are normalized weights.

Sequential Importance Sampling (I)

- Factorizing the proposal distribution:
  q(x_{0:k} | z_{1:k}) = q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1})
- and remembering that the state evolution is modeled as a Markov process,
- we obtain a recursive estimate of the importance weights:
  w_k^i ∝ w_{k-1}^i · p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k)
- The factorization is obtained by applying the decomposition above recursively over time.

Sequential Importance Sampling (SIS) Particle Filter

SIS Particle Filter Algorithm

for i = 1:N

    Draw a particle x_k^i ~ q(x_k | x_{k-1}^i, z_k)

    Assign a weight w_k^i ∝ w_{k-1}^i · p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k)

end

(k is the index over time and i is the particle index)
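The SIS loop can be sketched in Python for a hypothetical 1-D random-walk model with Gaussian noise (all parameters invented for the example). Using the prior as the proposal, the weight update reduces to multiplication by the likelihood:

```python
import numpy as np

# Hypothetical model: x_k = x_{k-1} + v_k, v_k ~ N(0, q);
#                     z_k = x_k + n_k,     n_k ~ N(0, r).
# With the prior p(x_k | x_{k-1}) as proposal: w_k^i ∝ w_{k-1}^i p(z_k | x_k^i).
rng = np.random.default_rng(4)
N, T, q, r = 500, 20, 0.1, 0.5

true_x, zs = 0.0, []
for _ in range(T):                       # simulate a trajectory and measurements
    true_x += rng.normal(0, np.sqrt(q))
    zs.append(true_x + rng.normal(0, np.sqrt(r)))

particles = rng.normal(0.0, 1.0, size=N)  # x_0^i
weights = np.full(N, 1.0 / N)
for z in zs:
    particles = particles + rng.normal(0, np.sqrt(q), size=N)  # draw from prior
    weights *= np.exp(-0.5 * (z - particles) ** 2 / r)         # times likelihood
    weights /= weights.sum()
    # note: no resampling here, so the weights degenerate over time

estimate = np.sum(weights * particles)    # posterior mean estimate
```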

Derivation of SIS weights (I)

- The main idea is factorizing:
  p(x_{0:k} | z_{1:k}) ∝ p(z_k | x_k) p(x_k | x_{k-1}) p(x_{0:k-1} | z_{1:k-1})

and

  q(x_{0:k} | z_{1:k}) = q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1})

Our goal is to expand p and q in time k.

Derivation of SIS weights (II)

Substituting both factorizations into w_k = p(x_{0:k} | z_{1:k}) / q(x_{0:k} | z_{1:k}) and using the Markov assumptions gives

  w_k^i ∝ w_{k-1}^i · p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{0:k-1}^i, z_{1:k})

SIS Particle Filter Foundation

- At each time step k
- Random samples x_k^i are drawn from the proposal distribution q(x_k | x_{0:k-1}^i, z_{1:k}) for i = 1, …, N
- They represent the posterior distribution using a set of samples or particles
- Since the weights are given by w_k^i ∝ p(x_{0:k}^i | z_{1:k}) / q(x_{0:k}^i | z_{1:k})
- and q factorizes as q(x_{0:k} | z_{1:k}) = q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1})

Sequential Importance Sampling (II)

- Choice of the proposal distribution:
- Choose the proposal to minimize the variance of the importance weights (Doucet et al. 1999); the optimal choice is q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i, z_k).
- Although a common choice is the prior distribution: q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i)

We then obtain w_k^i ∝ w_{k-1}^i p(z_k | x_k^i).

Sequential Importance Sampling (III)

- Illustration of SIS:
- Degeneracy problems:
- variance of importance ratios increases stochastically over time (Kong et al. 1994; Doucet et al. 1999).
- In most cases, after a few iterations all but one particle will have negligible weight

Sequential Importance Sampling (IV)

- Illustration of degeneracy:

SIS – why the variance increase matters

- Suppose we want to sample from the posterior
- and choose a proposal density very close to the posterior density.
- Then the normalized weights are nearly uniform, w_k^i ≈ 1/N,
- and their variance is close to 0.
- So we expect the variance of the weights to be close to 0 to obtain reasonable estimates;
- thus a variance increase has a harmful effect on accuracy.

Sampling-Importance Resampling

- SIS suffers from degeneracy problems, so we don't want to use it as-is!
- Introduce a selection (resampling) step to eliminate samples with low importance ratios and multiply samples with high importance ratios.
- Resampling maps the weighted random measure {x_k^i, w_k^i} onto the equally weighted random measure {x_k^{j*}, 1/N}
- by sampling uniformly with replacement from {x_k^i, i = 1, …, N} with probabilities {w_k^i}.
- The scheme generates N_i children of particle i such that Σ_i N_i = N and satisfies E[N_i] = N w_k^i.
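A minimal Python sketch of the resampling step (multinomial resampling; the particle values and weights are invented for the example):

```python
import numpy as np

# Resampling: map the weighted set {x^i, w^i} to an equally weighted set
# {x^(j), 1/N} by drawing N indices with replacement with probabilities w^i.
# High-weight particles are duplicated; low-weight particles die out.
rng = np.random.default_rng(5)

particles = np.array([0.0, 1.0, 2.0, 3.0])
weights = np.array([0.05, 0.05, 0.1, 0.8])   # the particle at 3.0 dominates

idx = rng.choice(len(particles), size=len(particles), replace=True, p=weights)
resampled = particles[idx]
new_weights = np.full(len(particles), 1.0 / len(particles))  # reset to 1/N
```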

Basic SIR Particle Filter - Schematic

(Schematic: initialisation → importance sampling step on each measurement → extract estimate → resampling step, repeated as measurements arrive.)
Basic SIR Particle Filter algorithm (I)

- Initialisation: k = 0
- For i = 1, …, N, sample x_0^i ~ p(x_0)
- and set k = 1

- Importance Sampling step
- For i = 1, …, N, sample x_k^i ~ p(x_k | x_{k-1}^i)
- For i = 1, …, N, compute the importance weights w_k^i = p(z_k | x_k^i)
- Normalise the importance weights:
  w̃_k^i = w_k^i / Σ_j w_k^j

Basic SIR Particle Filter algorithm (II)

- Resampling step
- Resample with replacement N particles
- from the set {x_k^i, i = 1, …, N}
- according to the normalised importance weights w̃_k^i.
- Set k = k + 1 and
- proceed to the Importance Sampling step as the next measurement arrives.
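Putting the steps together, a bootstrap SIR filter can be sketched in Python for a hypothetical 1-D random-walk model (noise parameters are illustrative, not from the slides):

```python
import numpy as np

# Hypothetical model: x_k = x_{k-1} + v_k, v_k ~ N(0, q);
#                     z_k = x_k + n_k,     n_k ~ N(0, r).
rng = np.random.default_rng(6)

def sir_filter(zs, N=1000, q=0.1, r=0.5):
    particles = rng.normal(0.0, 1.0, size=N)          # initialisation: x_0^i ~ p(x_0)
    estimates = []
    for z in zs:
        # importance sampling step: propagate through the system model (prior proposal)
        particles = particles + rng.normal(0, np.sqrt(q), size=N)
        w = np.exp(-0.5 * (z - particles) ** 2 / r)   # w_k^i = p(z_k | x_k^i)
        w /= w.sum()                                  # normalise
        estimates.append(np.sum(w * particles))       # extract estimate (posterior mean)
        # resampling step: draw N particles with replacement, weights reset to 1/N
        particles = particles[rng.choice(N, size=N, p=w)]
    return estimates

# Usage: track a simulated 1-D trajectory
true_x, zs = 0.0, []
for _ in range(30):
    true_x += rng.normal(0, np.sqrt(0.1))
    zs.append(true_x + rng.normal(0, np.sqrt(0.5)))
estimates = sir_filter(zs)
```

Resampling at every step keeps the particle cloud concentrated near the measurements, at the cost of some sample diversity.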

Generic SIR Particle Filter algorithm

- M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, “A tutorial on particle filters …,” IEEE Trans. on Signal Processing, 50( 2), 2002.

Improvements to SIR (I)

- Variety of resampling schemes with varying performance in terms of the variance of the particles:
- Residual sampling (Liu & Chen, 1998).
- Systematic sampling (Carpenter et al., 1999).
- Mixture of SIS and SIR, only resample when necessary (Liu & Chen, 1995; Doucet et al., 1999).
- Degeneracy may still be a problem:
- During resampling a sample with high importance weight may be duplicated many times.
- Samples may eventually collapse to a single point.
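Two of these improvements can be sketched in Python: systematic resampling, and the effective-sample-size criterion for resampling only when necessary (the threshold of N/2 is a common but assumed choice):

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: one uniform draw spread over N evenly spaced strata,
    which gives lower variance than independent multinomial draws."""
    N = len(weights)
    positions = (rng.random() + np.arange(N)) / N
    cum = np.cumsum(weights)
    cum[-1] = 1.0                     # guard against floating-point round-off
    return np.searchsorted(cum, positions)

def effective_sample_size(weights):
    """N_eff = 1 / Σ (w^i)^2; equals N for uniform weights, 1 for full degeneracy."""
    return 1.0 / np.sum(weights ** 2)

rng = np.random.default_rng(7)
w = np.array([0.7, 0.1, 0.1, 0.05, 0.05])
if effective_sample_size(w) < 0.5 * len(w):   # resample only when necessary
    idx = systematic_resample(w, rng)
```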

Improvements to SIR (II)

- To alleviate numerical degeneracy problems, sample smoothing methods may be adopted.
- Roughening (Gordon et al., 1993).
- Adds an independent jitter to the resampled particles
- Prior boosting (Gordon et al., 1993).
- Increase the number of samples from the proposal distribution to M>N,
- but in the resampling stage only draw N particles.

Improvements to SIR (III)

- Local Monte Carlo methods for alleviating degeneracy:
- Local linearisation - using an EKF (Doucet, 1999; Pitt & Shephard, 1999) or UKF (Doucet et al, 2000) to estimate the importance distribution.
- Rejection methods (Müller, 1991; Doucet, 1999; Pitt & Shephard, 1999).
- Auxiliary particle filters (Pitt & Shephard, 1999)
- Kernel smoothing (Gordon, 1994; Hürzeler & Künsch, 1998; Liu & West, 2000; Musso et al., 2000).
- MCMC methods (Müller, 1992; Gordon & Whitby, 1995; Berzuini et al., 1997; Gilks & Berzuini, 1998; Andrieu et al., 1999).

Improvements to SIR (IV)

- Illustration of SIR with sample smoothing:

Ingredients for SMC

- Importance sampling function:
- Gordon et al – the prior, p(x_k | x_{k-1}^i)
- Optimal – p(x_k | x_{k-1}^i, z_k)
- UKF – proposal pdf from a UKF run at each time step
- Redistribution scheme:
- Gordon et al – SIR
- Liu & Chen – Residual
- Carpenter et al – Systematic
- Liu & Chen, Doucet et al – Resample when necessary
- Careful initialisation procedure (for efficiency)

Particle filters

- Also known as Sequential Monte Carlo methods
- Representing belief by sets of samples or particles: Bel(x_t) ≈ {x_t^i, w_t^i}, i = 1, …, N
- The w_t^i are nonnegative weights called importance factors
- The updating procedure is sequential importance sampling with resampling

Example 2: Particle Filter

Initially, each particle has the same weight.

Step 1: updating weights. Weights are proportional to p(z|x).

Step 2: predicting. Predict the new locations of the particles.

Step 3: updating weights. Weights are proportional to p(z|x).

Step 4: predicting. Predict the new locations of the particles.

Particles are more concentrated in the region where the person is more likely to be.

Compare Particle Filter with Bayes Filter with Known Distribution

(Figures: the updating and predicting steps, shown side by side for Example 1 and Example 2.)

Global Localization of Robot with Sonar
http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/animations/global-floor.gif

Tracking in 1D: the blue trajectory is the target. The best of 10 particles is shown in red.

Application Examples

- Robot localization
- Robot mapping
- Visual Tracking

–e.g. human motion (body parts)

- Prediction of (financial) time series

–e.g. mapping gold price to stock price

- Target recognition from single or multiple images
- Guidance of missiles
- Contour grouping
- Nice video demos:

http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/

2nd Book Advert

- Statistical Pattern Recognition
- Andrew Webb, DERA
- ISBN 0340741643,
- Paperback: 1999: £29.99
- Butterworth Heinemann
- Contents:
- Introduction to SPR, Estimation, Density estimation, Linear discriminant analysis, Nonlinear discriminant analysis - neural networks, Nonlinear discriminant analysis - statistical methods, Classification trees, Feature selection and extraction, Clustering, Additional topics, Measures of dissimilarity, Parameter estimation, Linear algebra, Data, Probability theory.

Homework

- Implement all three particle filter algorithms

SIS Particle Filter Algorithm

Basic SIR Particle Filter algorithm

Generic SIR Particle Filter algorithm

- and evaluate their performance on a problem of your choice.
- Groups of two are allowed.
- Submit a report and ready-to-run Matlab code (with a script and the data).
- Present a report to the class.
