Jose-Luis Blanco, Javier González, Juan-Antonio Fernández-Madrigal
Dpt. of System Engineering and Automation, University of Málaga (Spain)
An Optimal Filtering Algorithm for Non-Parametric Observation Models in Robot Localization
ICRA 2008, May 19-23, Pasadena, CA (USA)


Presentation Transcript


  1. Jose-Luis Blanco, Javier González, Juan-Antonio Fernández-Madrigal. Dpt. of System Engineering and Automation, University of Málaga (Spain). An Optimal Filtering Algorithm for Non-Parametric Observation Models in Robot Localization. May 19-23, Pasadena, CA (USA).

  2. Outline of the talk 1. Introduction 2. The proposed method 3. Experimental results 4. Conclusions

  3. Outline of the talk 1. Introduction 2. The proposed method 3. Experimental results 4. Conclusions

  4. 1. Introduction The addressed problem: Bayesian filtering. p(x): prior belief; p(y|x): observation likelihood; p(x|y): posterior belief. Two choices determine the tools suitable to solve this problem: • The representation of the prior/posterior densities: Gaussian vs. samples. • Assumptions about the form of the likelihood.
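The Bayes update on this slide can be made concrete numerically. This is an illustration-only sketch (the grid, the Gaussian prior, and the sensor model are assumptions, not from the talk) of computing the posterior pointwise as likelihood × prior:

```python
import numpy as np

# One Bayesian filtering update on a 1-D grid:
# posterior p(x|y) ∝ p(y|x) · p(x), evaluated pointwise and renormalized.
x = np.linspace(-5.0, 5.0, 201)                    # discretized state space
prior = np.exp(-0.5 * x**2)                        # p(x): Gaussian-shaped prior
prior /= prior.sum()
likelihood = np.exp(-0.5 * ((x - 1.0) / 0.5)**2)   # p(y|x) for one observation
posterior = likelihood * prior                     # pointwise product (Bayes rule)
posterior /= posterior.sum()                       # normalize to a valid pdf
print(x[np.argmax(posterior)])                     # mode lies between prior mean 0 and observation 1
```

The same pointwise product works for any likelihood we can evaluate, which is the property the talk's non-parametric setting relies on.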

  5. 1. Introduction In this work: Representation of pdfs? → Weighted, random samples (particle filter). Observation likelihood? → Any arbitrary function (we only need to evaluate it pointwise).

  6. 1. Introduction The role of the proposal distribution in particle filters The basic particle filtering algorithm: p(x) : prior belief What happens to each particle?

  7. 1. Introduction The role of the proposal distribution in particle filters The basic particle filtering algorithm: What happens to each particle? Draw new particles from the proposal distribution.

  8. 1. Introduction The role of the proposal distribution in particle filters The basic particle filtering algorithm: • Weights are updated, depending on: • The observation likelihood, and • The proposal distribution. What happens to each particle?

  9. 1. Introduction The role of the proposal distribution in particle filters The basic particle filtering algorithm: • Weights are updated, depending on: • The observation likelihood, and • The proposal distribution. p(y|x) : observation likelihood What happens to each particle?
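The weight update described on the last few slides can be sketched as follows. This is a minimal illustration under assumed 1-D Gaussian motion and sensor models; all function names and parameters here are hypothetical, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def transition_pdf(x, x_prev):
    # p(x | x_prev): assumed unit-variance Gaussian motion model
    return np.exp(-0.5 * (x - x_prev) ** 2) / np.sqrt(2.0 * np.pi)

def likelihood(y, x):
    # p(y | x): assumed unit-variance Gaussian sensor model
    return np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2.0 * np.pi)

def pf_step(particles, weights, y):
    # 1) draw each new particle from the proposal (here the transition model,
    #    i.e. the "standard" proposal), then
    # 2) update weights by w ∝ w_prev · p(y|x) · p(x|x_prev) / q(x|x_prev, y)
    new = particles + rng.normal(0.0, 1.0, size=particles.shape)
    q = transition_pdf(new, particles)        # proposal density at the samples
    w = weights * likelihood(y, new) * transition_pdf(new, particles) / q
    return new, w / w.sum()                   # normalized importance weights

particles = rng.normal(0.0, 1.0, 100)
weights = np.full(100, 1.0 / 100)
particles, weights = pf_step(particles, weights, y=0.5)
```

With the standard proposal the transition terms cancel and the update reduces to weighting by the likelihood alone, which is why a large proposal/posterior mismatch wastes particles.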

  10. 1. Introduction The role of the proposal distribution in particle filters The basic particle filtering algorithm: p(x|y): posterior belief. The goal → to approximate the posterior as well as possible. How much does the choice of the proposal distribution matter?

  11. 1. Introduction The role of the proposal distribution in particle filters The basic particle filtering algorithm: q(·) : proposal distribution p(x|y) : posterior belief How much does the choice of the proposal distribution matter?

  12. 1. Introduction The role of the proposal distribution in particle filters The basic particle filtering algorithm: For a large mismatch between proposal and posterior, the particles represent the density very poorly: q(·) : proposal distribution p(x|y) : posterior belief How much does the choice of the proposal distribution matter?

  13. 1. Introduction The role of the proposal distribution in particle filters The proposal distribution q(·) is the key to the efficiency of a particle filter! It is common to use the transition model as the proposal; we refer to this choice as the standard proposal. It is far from optimal. [Doucet et al. 2000] introduced the optimal proposal.
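The cost of a proposal/posterior mismatch can be quantified with the effective sample size, ESS = 1/Σᵢ wᵢ². The toy target, proposal means, and sample count below are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(1)

def ess(weights):
    # effective sample size: N when all weights are equal, ~1 when one dominates
    w = weights / weights.sum()
    return 1.0 / np.sum(w ** 2)

N = 1000
def target_pdf(x):                       # unnormalized "posterior", centered at 3
    return np.exp(-0.5 * (x - 3.0) ** 2)

results = {}
for prop_mean in (3.0, 0.0):             # matched vs. badly mismatched proposal
    x = rng.normal(prop_mean, 1.0, N)    # samples drawn from the proposal
    w = target_pdf(x) / np.exp(-0.5 * (x - prop_mean) ** 2)  # importance ratios
    results[prop_mean] = ess(w)

print(results)   # the mismatched proposal retains only a tiny effective sample
```

The matched proposal keeps the ESS at N, while the mismatched one collapses to a handful of effective particles, mirroring the "particles represent the density very poorly" picture on slide 12.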

  14. 1. Introduction Relation of our method to other Bayesian filtering approaches:

  15. Outline of the talk 1. Introduction 2. The proposed method 3. Experimental results 4. Conclusions

  16. 2. The proposed method Our method: • A particle filter based on the optimal proposal [Doucet et al. 2000]. • Can deal with non-parameterized observation models, using rejection sampling to approximate the actual densities. • Integrates KLD-sampling [Fox 2003] for a dynamic sample size (optional: it is not fundamental to the approach). • The weights of all the samples are always equal.
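The rejection-sampling idea named above can be sketched as follows, assuming a 1-D state, a Gaussian motion model, and a likelihood that we can only evaluate pointwise but whose upper bound we know. All concrete models here are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)

# Draw from the optimal proposal p(x_t | x_{t-1}, y_t) ∝ p(y_t|x_t)·p(x_t|x_{t-1})
# by rejection sampling: propose from the motion model, accept with probability
# proportional to the likelihood. Only pointwise likelihood evaluations are
# needed, so this works for non-parametric observation models.
def sample_optimal(x_prev, y, likelihood, lik_max, motion_std=1.0):
    while True:
        cand = x_prev + rng.normal(0.0, motion_std)       # draw from p(x|x_prev)
        if rng.uniform() < likelihood(y, cand) / lik_max:  # accept w.p. ∝ p(y|x)
            return cand

# Illustrative likelihood: any function we can evaluate, bounded here by 1.0.
def likelihood(y, x):
    return np.exp(-0.5 * ((y - x) / 0.3) ** 2)

samples = np.array([sample_optimal(0.0, 1.0, likelihood, 1.0)
                    for _ in range(2000)])
print(samples.mean())   # concentrates between the prior (0) and the observation (1)
```

Accepted samples are exact draws from the optimal proposal, so they all carry equal weight, matching the "weights of all the samples are always equal" property on this slide.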

  17. 2. The proposed method The theoretical model for each step of our method is this sequence of operations: Duplication → SIR with optimal proposal → Fixed/Dyn. sample-size resampling

  18. 2. The proposed method The theoretical model for each step of our method is this sequence of operations: Duplication → SIR with optimal proposal → Fixed/Dyn. sample-size resampling

  19. 2. The proposed method The theoretical model for each step of our method is this sequence of operations: Duplication → SIR with optimal proposal → Fixed/Dyn. sample-size resampling

  20. 2. The proposed method The theoretical model for each step of our method is this sequence of operations: Duplication → SIR with optimal proposal → Fixed/Dyn. sample-size resampling

  21. 2. The proposed method Illustrative example of how our method works: particles [1] [2] [3] [4] at time t–1.

  22. 2. The proposed method Illustrative example of how our method works: t t–1 Group [1] [1] [2] [3] [4] Each particle propagates in time probabilistically: this is the reason for the duplication

  23. 2. The proposed method Illustrative example of how our method works: t t–1 Group [2] Group [1] [1] [2] [3] [4] Each particle propagates in time probabilistically: this is the reason for the duplication

  24. 2. The proposed method Illustrative example of how our method works: t t–1 Group [2] Group [1] [1] [2] [3] Group [3] [4] Each particle propagates in time probabilistically: this is the reason for the duplication

  25. 2. The proposed method Illustrative example of how our method works: t t–1 Group [2] Group [1] [1] [2] [3] Group [3] [4] Group [4] Each particle propagates in time probabilistically: this is the reason for the duplication

  26. 2. The proposed method Illustrative example of how our method works: t t–1 Group [2] Group [1] [1] [2] Observation likelihood [3] Group [3] [4] Particles that are too distant do not contribute to the posterior! Group [4] The observation likelihood indicates which particles are really important…

  27. 2. The proposed method Illustrative example of how our method works: t t–1 Group [2] Group [1] [1] [2] Observation likelihood [3] Group [3] [4] Group [4] We can predict which groups will be more important before actually generating the new samples!

  28. 2. The proposed method The optimal proposal distribution is q(xt | xt–1, yt) = p(xt | xt–1, yt), and the importance weights update as wt ∝ wt–1 · p(yt | xt–1). Note that the weight does not depend on the actual value drawn for the particle, only on its parent xt–1.
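Under the optimal proposal, the weight of each parent particle is p(yt | xt–1) = ∫ p(yt|x) p(x|xt–1) dx, which can be approximated by Monte Carlo before any final sample is drawn; this is what lets the method rank the groups first. A sketch with assumed 1-D Gaussian models and made-up parent positions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Approximate p(y_t | x_{t-1}) = ∫ p(y_t|x) p(x|x_{t-1}) dx by averaging the
# likelihood over draws from the motion model — no final particle needed yet.
def group_weight(x_prev, y, likelihood, motion_std=1.0, n=5000):
    cand = x_prev + rng.normal(0.0, motion_std, n)   # draws from p(x|x_prev)
    return likelihood(y, cand).mean()                # MC estimate of p(y|x_prev)

def likelihood(y, x):
    return np.exp(-0.5 * ((y - x) / 0.3) ** 2)       # illustrative sensor model

parents = [0.0, 5.0, 0.3, 7.0]                       # like groups [1]..[4]
w = np.array([group_weight(xp, 0.2, likelihood) for xp in parents])
print(np.round(w / w.sum(), 2))                      # distant groups get ~0 weight
```

This mirrors the 55% / 0% / 45% / 0% split on the next slide: groups far from the observation can be skipped entirely.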

  29. 2. The proposed method Illustrative example of how our method works: t t–1 Group [2] Group [1] [1] [2] Observation likelihood Group [1] → 55% Group [2] → 0% [3] Group [3] → 45% Group [4] → 0% Group [3] [4] Group [4]

  30. 2. The proposed method Illustrative example of how our method works: t t–1 Group [2] Group [1] [1] [2] Observation likelihood [3] Group [3] [4] Group [4] Given the predictions, we draw particles according to the optimal proposal, only for those groups that really contribute to the posterior. A fixed or dynamic number of samples can be generated in this way.

  31. 2. The proposed method Comparison to… basic Sequential Importance Resampling (SIR)

  32. 2. The proposed method Comparison to… basic Sequential Importance Resampling (SIR) t t–1 [1] [2] Observation likelihood [3] [4] 1 particle → 1 particle. Prone to particle depletion!
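For contrast, the SIR resampling step being compared here can be sketched as follows (systematic resampling; the particle values and weights are made up to mirror the 4-group example from the previous slides):

```python
import numpy as np

rng = np.random.default_rng(3)

# SIR resampling: each particle survives in proportion to its weight, so
# zero-weight particles die out (depletion) while heavy ones are duplicated.
def systematic_resample(particles, weights):
    N = len(particles)
    positions = (rng.uniform() + np.arange(N)) / N   # stratified points in [0,1)
    cumsum = np.cumsum(weights / weights.sum())
    idx = np.searchsorted(cumsum, positions)         # map each point to a particle
    return particles[idx]

particles = np.array([0.0, 1.0, 2.0, 3.0])
weights = np.array([0.55, 0.0, 0.45, 0.0])           # particles [2] and [4] die out
resampled = systematic_resample(particles, weights)
print(resampled)                                     # only 0.0 and 2.0 survive
```

Because resampling happens after the particles have already been drawn from a (possibly poor) proposal, diversity is lost — the depletion problem the proposed method avoids by ranking groups first.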

  33. 2. The proposed method Comparison to… Auxiliary Particle Filter (APF) [Pitt & Shephard, 1999]

  34. 2. The proposed method Comparison to… Auxiliary Particle Filter (APF) [Pitt & Shephard, 1999] t t–1 [1] [2] Observation likelihood [3] [4] 1 particle → a variable number of particles. Propagation does not use the optimal proposal!

  35. Outline of the talk 1. Introduction 2. The proposed method 3. Experimental results 3.1. Numerical simulation 3.2. Robot localization 4. Conclusions

  36. 3.1. Results Numerical simulations: a Gaussian model for both the filtered density and the observation model. We compare the closed-form optimal solution (Kalman filter) to: • PF using the “standard” proposal distribution. • Auxiliary PF method [Pitt & Shephard, 1999]. • This work (“optimal” PF). (Fixed sample size for these experiments)

  37. 3.1. Results Results from the numerical simulations, and comparison to a 1D Kalman filter. [Figures: the actual pdf from the Kalman filter vs. the approximated pdf (histogram) from particles (x axis: particles; y axis: weights), and the Kullback-Leibler distance (KLD) for an increasing number of samples.]
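The evaluation metric on this slide — the KLD between the particle histogram and the reference density — can be sketched as below. The sampling distribution, bin count, and sample sizes are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Kullback-Leibler distance between a histogram built from particles and a
# reference Gaussian density, both discretized over the same bins.
def kld_hist_vs_gaussian(samples, mu, sigma, bins=30):
    counts, edges = np.histogram(samples, bins=bins)
    p = counts / counts.sum()                      # empirical mass per bin
    centers = 0.5 * (edges[:-1] + edges[1:])
    q = np.exp(-0.5 * ((centers - mu) / sigma) ** 2)
    q = q / q.sum()                                # reference mass per bin (approx)
    mask = p > 0                                   # skip empty bins (log(0))
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

few = kld_hist_vs_gaussian(rng.normal(0, 1, 100), 0, 1)
many = kld_hist_vs_gaussian(rng.normal(0, 1, 100_000), 0, 1)
print(few, many)   # the KLD shrinks as the number of samples grows
```

This reproduces the qualitative trend in the slide's KLD-vs-sample-count plot: more particles give a histogram closer to the true density.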

  38. 3.1. Results Results from the numerical simulations, and comparison to 1D Kalman filter:

  39. Outline of the talk 1. Introduction 2. The proposed method 3. Experimental results 3.1. Numerical simulation 3.2. Robot localization 4. Conclusions

  40. 3.2. Results Localization with real data: path of the robot, with ground truth from an RBPF with a large number of particles. [Figure: robot path during localization, with Start and End marked; scale: 1 m.]

  41. 3.2. Results Localization with real data: average errors in tracking (the particles are approximately at the right position from the beginning). [Plot: average positioning error (meters, log scale 0.01–10) vs. number of particles (1–100), for the standard-proposal PF and our optimal PF.]

  42. 3.2. Results Localization with real data: ratio of convergence success from global localization. [Plot: convergence success ratio (0.1–1) vs. initial sample size (particles/m², 10¹–10²), for our method and SIR with the “standard” proposal.]

  43. 3.2. Results

  44. Outline of the talk 1. Introduction 2. The proposed method 3. Experimental results 4. Conclusions

  45. Conclusions • A new particle filter algorithm has been introduced. • It can cope with non-parameterized observation likelihoods, and a dynamic number of particles. • Compared to standard SIR, it provides more robust global localization and pose tracking for similar computation times. • It is a generic algorithm: it can be applied to other problems in robotics, computer vision, etc.

  46. Finally… Source code (MRPT C++ libs), datasets, slides and instructions to reproduce the experiments are available online: http://mrpt.sourceforge.net/ → papers → ICRA 08

  47. Jose-Luis Blanco, Javier González, Juan-Antonio Fernández-Madrigal Dpt. of System Engineering and Automation University of Málaga (Spain) An Optimal Filtering Algorithm for Non-Parametric Observation Models in Robot Localization Thanks for your attention!
