
The Unscented Particle Filter


Presentation Transcript


  1. The Unscented Particle Filter 2000/09/29 이 시은

  2. Introduction • Filtering • estimate the states (parameters or hidden variables) as a set of observations becomes available on-line • To solve it, model the evolution of the system and the noise • Resulting models exhibit non-linearity and non-Gaussian distributions

  3. Extended Kalman Filter • linearizes the measurement and evolution models using a Taylor series expansion • Unscented Kalman Filter • still does not apply to general non-Gaussian distributions • Sequential Monte Carlo methods: particle filters • represent the posterior distribution of the states • any statistical estimate can be computed from it • can deal with non-linear, non-Gaussian distributions

  4. Particle Filter • relies on importance sampling • the key design issue is the proposal distribution • Proposals for particle filters • EKF Gaussian approximation • UKF proposal • can control the rate at which the tails go to zero • yields heavier-tailed distributions

  5. Dynamic State Space Model • a transition equation and a measurement equation • Goal • approximate the posterior and, in particular, one of its marginals, the filtering density, recursively
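
The generic model — a transition equation x_t = f(x_{t-1}, v_t) and a measurement equation y_t = g(x_t, n_t) — can be sketched in code. The scalar model below is an illustrative assumption (a common nonlinear benchmark), not necessarily the one used in the slides:

```python
import random

# Illustrative scalar dynamic state-space model (an assumption):
# transition x_t = f(x_{t-1}, v_t) and measurement y_t = g(x_t, n_t),
# with Gaussian noises v_t and n_t.
def f(x, v):
    return 0.5 * x + 25.0 * x / (1.0 + x * x) + v

def g(x, n):
    return x * x / 20.0 + n

def simulate(T, seed=0):
    rng = random.Random(seed)
    x, xs, ys = 0.1, [], []
    for _ in range(T):
        x = f(x, rng.gauss(0.0, 1.0))          # process noise v_t
        ys.append(g(x, rng.gauss(0.0, 1.0)))   # measurement noise n_t
        xs.append(x)
    return xs, ys
```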

  6. Extended Kalman Filter • MMSE estimator based on a Taylor expansion of the nonlinear f and g around the current state estimate
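
A minimal scalar EKF predict/update cycle, illustrating the Taylor linearization; the scalar form and the caller-supplied derivatives are simplifying assumptions:

```python
# One predict/update cycle of a scalar extended Kalman filter (sketch);
# f, h and their derivatives fprime, hprime are passed in by the caller.
def ekf_step(x, P, y, f, fprime, h, hprime, Q, R):
    # predict: propagate the mean through f and the variance through f'
    x_pred = f(x)
    F = fprime(x)
    P_pred = F * P * F + Q
    # update: first-order Taylor linearization of h around the predicted mean
    H = hprime(x_pred)
    S = H * P_pred * H + R              # innovation variance
    K = P_pred * H / S                  # Kalman gain
    x_new = x_pred + K * (y - h(x_pred))
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new
```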

  7. Unscented Kalman Filter • does not approximate the non-linear process and observation models • uses the true nonlinear models and approximates the distribution of the state random variable instead • based on the unscented transformation
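
The unscented transformation can be sketched for a scalar Gaussian; the restriction to one dimension is a simplification, and the scaling parameters follow the standard formulation:

```python
import math

# Unscented transformation for a scalar Gaussian N(m, P): propagate a small,
# deterministically chosen set of sigma points through the nonlinearity f and
# recover the transformed mean and variance from weighted sums.
def unscented_transform(m, P, f, alpha=1.0, beta=2.0, kappa=0.0):
    n = 1                                        # state dimension (scalar case)
    lam = alpha * alpha * (n + kappa) - n
    spread = math.sqrt((n + lam) * P)
    sigma = [m, m + spread, m - spread]          # 2n + 1 sigma points
    wm = [lam / (n + lam)] + [1.0 / (2.0 * (n + lam))] * 2
    wc = list(wm)
    wc[0] += 1.0 - alpha * alpha + beta          # covariance weight correction
    ys = [f(s) for s in sigma]
    mean = sum(w * y for w, y in zip(wm, ys))
    var = sum(w * (y - mean) ** 2 for w, y in zip(wc, ys))
    return mean, var
```

For an affine f the transformation is exact, which makes a quick sanity check possible.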

  8. Particle Filtering • does not require a Gaussian approximation • many variations, but all based on sequential importance sampling • the weights degenerate with time • hence include a resampling stage

  9. Perfect Monte Carlo Simulation • a set of weighted particles (samples) drawn from the posterior • expectations are approximated by weighted averages over the particles
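
The particle expectation can be sketched as (a minimal illustration; `h` is an arbitrary test function):

```python
# Expectation under a particle approximation: E[h(x)] ≈ Σ_i w̃_i h(x_i),
# where w̃_i are the normalized weights (equal weights in the perfect
# Monte Carlo case, where samples come directly from the posterior).
def expectation(particles, weights, h=lambda x: x):
    Z = sum(weights)
    return sum((w / Z) * h(x) for x, w in zip(particles, weights))
```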

  10. Bayesian Importance Sampling • it is impossible to sample directly from the posterior • instead, sample from an easy-to-sample proposal distribution and correct with importance weights
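
A minimal self-normalized importance sampling sketch, assuming the target density p is known at least up to a constant and the proposal q can be both sampled and evaluated:

```python
import random

# Self-normalized importance sampling: estimate E_p[h(x)] with samples from
# an easy proposal q, reweighting by w(x) = p(x) / q(x). Both densities may
# be unnormalized; the normalizing constants cancel in the final ratio.
def importance_estimate(p, q_sample, q_pdf, h, N, seed=0):
    rng = random.Random(seed)
    xs = [q_sample(rng) for _ in range(N)]
    ws = [p(x) / q_pdf(x) for x in xs]
    Z = sum(ws)
    return sum(w * h(x) for w, x in zip(ws, xs)) / Z
```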

  11. Asymptotic convergence and a central limit theorem hold under the following assumptions • the samples are drawn i.i.d. from the proposal, the support of the proposal includes the support of the posterior, and the importance weights are finite • the expectations of the importance weights and of the weighted estimates exist and are finite

  12. Sequential Importance Sampling • Proposal distribution • assumptions • the state follows a Markov process • the observations are conditionally independent given the states

  13. Since we can sample from the proposal and evaluate the likelihood and the transition probability, we can generate a prior set of samples and iteratively compute the importance weights
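
The iterative weight computation can be written as a one-line recursion (a sketch; all densities are assumed evaluable at the sampled point):

```python
# SIS weight recursion:
#   w_t ∝ w_{t-1} · p(y_t | x_t) · p(x_t | x_{t-1}) / q(x_t | x_{t-1}, y_t).
# When the transition prior is used as the proposal, the last two factors
# cancel and the update reduces to multiplying by the likelihood.
def sis_update(w_prev, likelihood, transition_pdf, proposal_pdf):
    return w_prev * likelihood * transition_pdf / proposal_pdf
```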

  14. Choice of proposal distribution • minimize the variance of the importance weights • popular choices exist; a good proposal moves particles towards regions of high likelihood

  15. Degeneracy of the SIS algorithm • the variance of the importance ratios increases stochastically over time
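
A standard diagnostic for this degeneracy is the effective sample size (an addition for illustration, not stated on the slide):

```python
# Effective sample size: N_eff = 1 / Σ_i w̃_i², with normalized weights w̃_i.
# It equals N for uniform weights and falls toward 1 as the weight variance
# grows, signaling that a resampling step is needed.
def effective_sample_size(weights):
    Z = sum(weights)
    return 1.0 / sum((w / Z) ** 2 for w in weights)
```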

  16. Selection (Resampling) • eliminate samples with low importance ratios and multiply samples with high importance ratios • associate with each particle a number of children

  17. SIR and Multinomial Sampling • maps the weighted Dirac random measure onto an equally weighted random measure • the numbers of children follow a multinomial distribution
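
A multinomial (SIR) resampling step can be sketched as:

```python
import bisect
import itertools
import random

# Multinomial resampling: draw N children i.i.d. from the discrete
# distribution defined by the normalized weights, mapping the weighted
# measure onto an equally weighted one.
def multinomial_resample(particles, weights, rng=None):
    rng = rng or random.Random(0)
    Z = sum(weights)
    cdf = list(itertools.accumulate(w / Z for w in weights))
    out = []
    for _ in particles:
        i = bisect.bisect_left(cdf, rng.random())
        out.append(particles[min(i, len(particles) - 1)])  # guard fp round-off
    return out
```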

  18. Residual resampling • set the number of deterministic copies of each particle from the integer part of N times its normalized weight • perform an SIR procedure to select the remaining samples using the new (residual) weights • add the results to the current set
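
Residual resampling can be sketched as follows (the SIR step on the residuals is implemented here as a simple multinomial draw):

```python
import random

# Residual resampling: each particle i first gets floor(N * w̃_i)
# deterministic copies; the remaining slots are filled by multinomial
# sampling on the residual weights, which reduces resampling variance.
def residual_resample(particles, weights, rng=None):
    rng = rng or random.Random(0)
    N = len(particles)
    Z = sum(weights)
    norm = [w / Z for w in weights]
    counts = [int(N * w) for w in norm]           # deterministic copies
    residual = [N * w - c for w, c in zip(norm, counts)]
    out = [p for p, c in zip(particles, counts) for _ in range(c)]
    R = N - len(out)                              # slots left for the SIR step
    total = sum(residual)
    for _ in range(R):                            # multinomial draw on residuals
        u, acc = rng.random() * total, 0.0
        for p, r in zip(particles, residual):
            acc += r
            if u <= acc:
                out.append(p)
                break
        else:
            out.append(particles[-1])             # guard fp round-off
    return out
```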

  19. Minimum variance sampling • When to sample

  20. Generic Particle Filter • 1. Initialization, t = 0 • 2. For t = 1, 2, … • (a) importance sampling step: for i = 1, …, N, sample from the proposal, evaluate the importance weights, and normalize them • (b) selection (resampling) step • (c) output
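
The steps above can be assembled into a minimal bootstrap particle filter; the scalar model (x_t = 0.5·x_{t-1} + v_t, y_t = x_t + n_t) is an assumption for illustration, and the transition prior is used as the proposal so that the weights reduce to the likelihood:

```python
import math
import random

# Minimal bootstrap particle filter for an assumed scalar model
#   x_t = 0.5 * x_{t-1} + v_t,   y_t = x_t + n_t,   v, n ~ N(0, 1).
def bootstrap_pf(ys, N=500, sigma_v=1.0, sigma_n=1.0, seed=0):
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(N)]   # 1. initialization, t = 0
    estimates = []
    for y in ys:                                   # 2. for t = 1, 2, ...
        # (a) importance sampling: propagate through the transition prior
        xs = [0.5 * x + rng.gauss(0.0, sigma_v) for x in xs]
        # evaluate and normalize the importance weights (Gaussian likelihood)
        ws = [math.exp(-0.5 * ((y - x) / sigma_n) ** 2) for x in xs]
        Z = sum(ws) or 1.0
        ws = [w / Z for w in ws]
        # (c) output: posterior-mean estimate (computed before resampling)
        estimates.append(sum(w * x for w, x in zip(ws, xs)))
        # (b) selection: multinomial resampling back to equal weights
        xs = rng.choices(xs, weights=ws, k=N)
    return estimates
```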

  21. Improving Particle Filters • Monte Carlo (MC) assumption • the Dirac point-mass approximation provides an adequate representation of the posterior • Importance sampling (IS) assumption • samples from the posterior can be obtained by sampling from a suitable proposal and applying importance sampling corrections

  22. MCMC Move Step • introduce MCMC steps whose invariant distribution is the posterior • if the particles are already distributed according to the posterior, applying such a Markov chain transition kernel leaves that distribution unchanged while diversifying the particles

  23. Designing Better Importance Proposals • move samples to regions of high likelihood • prior editing • an ad-hoc acceptance test for proposed particles • local linearization • Taylor series expansion of the likelihood and the transition prior • ex) • improved simulated-annealed sampling algorithm

  24. Rejection Methods • if the likelihood is bounded, one can sample from the optimal importance distribution via rejection sampling

  25. Auxiliary Particle Filters • obtain approximate samples from the optimal importance distribution by introducing an auxiliary variable k • draw samples from the joint distribution

  26. Unscented Particle Filter • uses the UKF to generate the proposal distribution within a particle filter framework

  27. Theoretical Convergence • Theorem 1: if the importance weight is upper bounded and one of the selection schemes above is used, then for all t there exists a constant c_t, independent of N, such that the mean-squared error of the particle estimate of any bounded test function is bounded by c_t/N
