
Probabilistic Robotics



Presentation Transcript


  1. Probabilistic Robotics Bayes Filter Implementations Gaussian filters

  2. Bayes Filter Reminder • Prediction (Action): $\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}$ • Correction (Measurement): $bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)$

  3. Univariate and Multivariate Gaussians • Univariate: $p(x) \sim N(\mu, \sigma^2)$, i.e. $p(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$ • Multivariate: $p(\mathbf{x}) \sim N(\boldsymbol{\mu}, \Sigma)$, i.e. $p(\mathbf{x}) = \frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}} \exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^T \Sigma^{-1} (\mathbf{x}-\boldsymbol{\mu})\right)$
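
A minimal NumPy sketch that evaluates the two densities above; the function names are illustrative, not from the slides:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Univariate Gaussian density N(x; mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

def multivariate_gaussian_pdf(x, mu, Sigma):
    """Multivariate Gaussian density N(x; mu, Sigma)."""
    d = len(mu)
    diff = x - mu
    exponent = -0.5 * diff @ np.linalg.solve(Sigma, diff)
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
    return np.exp(exponent) / norm

print(gaussian_pdf(0.0, 0.0, 1.0))                                      # ~0.3989
print(multivariate_gaussian_pdf(np.zeros(2), np.zeros(2), np.eye(2)))   # ~0.1592
```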

  4. Properties of Gaussians • If $X \sim N(\mu, \sigma^2)$ and $Y = aX + b$, then $Y \sim N(a\mu + b,\, a^2\sigma^2)$ • The product of two Gaussian densities $N(\mu_1, \sigma_1^2)$ and $N(\mu_2, \sigma_2^2)$ is (up to normalization) again Gaussian, with mean $\frac{\sigma_2^2}{\sigma_1^2+\sigma_2^2}\mu_1 + \frac{\sigma_1^2}{\sigma_1^2+\sigma_2^2}\mu_2$ and variance $\left(\frac{1}{\sigma_1^2} + \frac{1}{\sigma_2^2}\right)^{-1}$

  5. Multivariate Gaussians • If $X \sim N(\mu, \Sigma)$ and $Y = AX + B$, then $Y \sim N(A\mu + B,\, A\Sigma A^T)$ • We stay in the “Gaussian world” as long as we start with Gaussians and perform only linear transformations.
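
As an illustration of this closure property, the following sketch checks numerically that an affine map $y = Ax + b$ of Gaussian samples has mean $A\mu + b$ and covariance $A\Sigma A^T$; all numerical values here are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Original Gaussian N(mu, Sigma) and an affine map y = A x + b (illustrative values).
mu = np.array([1.0, 2.0])
Sigma = np.array([[1.0, 0.3], [0.3, 0.5]])
A = np.array([[2.0, 0.0], [1.0, 1.0]])
b = np.array([0.5, -1.0])

# Closed-form result: the transformed variable stays Gaussian.
mu_y = A @ mu + b
Sigma_y = A @ Sigma @ A.T

# Monte Carlo check of the same quantities.
x = rng.multivariate_normal(mu, Sigma, size=100_000)
y = x @ A.T + b
print(mu_y, y.mean(axis=0))      # should agree closely
print(Sigma_y, np.cov(y.T))      # should agree closely
```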

  6. Discrete Kalman Filter Estimates the state $x$ of a discrete-time controlled process that is governed by the linear stochastic difference equation $x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t$ with a measurement $z_t = C_t x_t + \delta_t$.

  7. Components of a Kalman Filter • $A_t$: matrix ($n \times n$) that describes how the state evolves from $t-1$ to $t$ without controls or noise. • $B_t$: matrix ($n \times l$) that describes how the control $u_t$ changes the state from $t-1$ to $t$. • $C_t$: matrix ($k \times n$) that describes how to map the state $x_t$ to an observation $z_t$. • $\varepsilon_t$, $\delta_t$: random variables representing the process and measurement noise, assumed independent and normally distributed with covariance $R_t$ and $Q_t$, respectively.
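
The model above can be simulated directly. The sketch below uses an illustrative 1-D constant-velocity system in which only the position is observed; the matrices A, B, C, R, Q are assumptions chosen for the example, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(1)

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])     # control: acceleration
C = np.array([[1.0, 0.0]])              # we observe position only
R = 0.01 * np.eye(2)                    # process noise covariance R_t
Q = np.array([[0.25]])                  # measurement noise covariance Q_t

x = np.array([0.0, 0.0])
for t in range(5):
    u = np.array([1.0])                              # constant acceleration command
    eps = rng.multivariate_normal(np.zeros(2), R)    # process noise eps_t
    x = A @ x + B @ u + eps                          # x_t = A x_{t-1} + B u_t + eps_t
    delta = rng.multivariate_normal(np.zeros(1), Q)  # measurement noise delta_t
    z = C @ x + delta                                # z_t = C x_t + delta_t
    print(t, x, z)
```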

  8. Kalman Filter Updates in 1D [Figure: initial belief, a measurement with associated uncertainty, and the belief after integrating the measurement via the KF]

  9. Kalman Filter Updates in 1D • Correction with Kalman gain $K = \frac{\sigma^2}{\sigma^2 + \sigma_{obs}^2}$: $\mu' = \mu + K(z - \mu)$, $\sigma'^2 = (1 - K)\,\sigma^2$

  10. Kalman Filter Updates in 1D • Motion to the right introduces uncertainty: $\mu' = \mu + u$, $\sigma'^2 = \sigma^2 + \sigma_{motion}^2$

  11. Kalman Filter Updates [Figure: belief after motion to the right, new measurement, and the resulting belief]
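
A scalar sketch of the 1D update cycle shown in slides 8-11; the numerical values are illustrative only:

```python
def predict_1d(mu, var, u, var_motion):
    """Motion (prediction): shift the mean, add motion uncertainty."""
    return mu + u, var + var_motion

def correct_1d(mu, var, z, var_obs):
    """Measurement (correction): weighted average via the Kalman gain."""
    K = var / (var + var_obs)
    return mu + K * (z - mu), (1.0 - K) * var

mu, var = 0.0, 1.0                        # initial belief
mu, var = correct_1d(mu, var, 2.0, 0.5)   # integrate a measurement z = 2.0
mu, var = predict_1d(mu, var, 1.0, 0.3)   # move right by u = 1.0
mu, var = correct_1d(mu, var, 3.2, 0.5)   # new measurement
print(mu, var)
```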

  12. Linear Gaussian Systems: Initialization • Initial belief is normally distributed: $bel(x_0) = N(x_0;\, \mu_0,\, \Sigma_0)$

  13. Linear Gaussian Systems: Dynamics • Dynamics are a linear function of state and control plus additive noise: $x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t$ • State transition probability: $p(x_t \mid u_t, x_{t-1}) = N(x_t;\, A_t x_{t-1} + B_t u_t,\, R_t)$

  14. Linear Gaussian Systems: Dynamics • $\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}$ yields the prediction $\bar{\mu}_t = A_t \mu_{t-1} + B_t u_t$, $\bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^T + R_t$

  15. Linear Gaussian Systems: Observations • Observations are a linear function of the state plus additive noise: $z_t = C_t x_t + \delta_t$ • Measurement probability: $p(z_t \mid x_t) = N(z_t;\, C_t x_t,\, Q_t)$

  16. Linear Gaussian Systems: Observations • $bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)$ yields the correction $\mu_t = \bar{\mu}_t + K_t(z_t - C_t \bar{\mu}_t)$, $\Sigma_t = (I - K_t C_t)\,\bar{\Sigma}_t$ with $K_t = \bar{\Sigma}_t C_t^T (C_t \bar{\Sigma}_t C_t^T + Q_t)^{-1}$

  17. Kalman Filter Algorithm • Algorithm Kalman_filter($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t$): • Prediction: $\bar{\mu}_t = A_t \mu_{t-1} + B_t u_t$, $\bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^T + R_t$ • Correction: $K_t = \bar{\Sigma}_t C_t^T (C_t \bar{\Sigma}_t C_t^T + Q_t)^{-1}$, $\mu_t = \bar{\mu}_t + K_t (z_t - C_t \bar{\mu}_t)$, $\Sigma_t = (I - K_t C_t)\,\bar{\Sigma}_t$ • Return $\mu_t, \Sigma_t$
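
A compact NumPy sketch of one prediction-correction cycle of the algorithm above; the function signature is an assumption made for the example:

```python
import numpy as np

def kalman_filter(mu, Sigma, u, z, A, B, C, R, Q):
    """One prediction-correction cycle of the (linear) Kalman filter."""
    # Prediction
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R
    # Correction
    S = C @ Sigma_bar @ C.T + Q                 # innovation covariance
    K = Sigma_bar @ C.T @ np.linalg.inv(S)      # Kalman gain
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new
```

Iterating this function over successive control/measurement pairs $(u_t, z_t)$ runs the full filter; $R$ and $Q$ are the process and measurement noise covariances from slide 7.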

  18. The Prediction-Correction Cycle: Prediction

  19. The Prediction-Correction Cycle: Correction

  20. The Prediction-Correction Cycle: Prediction and Correction

  21. Kalman Filter Summary • Highly efficient: polynomial in measurement dimensionality $k$ and state dimensionality $n$: $O(k^{2.376} + n^2)$ • Optimal for linear Gaussian systems! • Most robotics systems are nonlinear!

  22. Nonlinear Dynamic Systems • Most realistic robotic problems involve nonlinear functions: $x_t = g(u_t, x_{t-1})$ and $z_t = h(x_t)$

  23. Linearity Assumption Revisited

  24. Non-linear Function

  25. EKF Linearization (1)

  26. EKF Linearization (2)

  27. EKF Linearization (3)

  28. EKF Linearization: First Order Taylor Series Expansion • Prediction: $g(u_t, x_{t-1}) \approx g(u_t, \mu_{t-1}) + G_t\,(x_{t-1} - \mu_{t-1})$ with $G_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}$ • Correction: $h(x_t) \approx h(\bar{\mu}_t) + H_t\,(x_t - \bar{\mu}_t)$ with $H_t = \frac{\partial h(\bar{\mu}_t)}{\partial x_t}$

  29. EKF Algorithm • Extended_Kalman_filter($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t$): • Prediction: $\bar{\mu}_t = g(u_t, \mu_{t-1})$, $\bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^T + R_t$ • Correction: $K_t = \bar{\Sigma}_t H_t^T (H_t \bar{\Sigma}_t H_t^T + Q_t)^{-1}$, $\mu_t = \bar{\mu}_t + K_t (z_t - h(\bar{\mu}_t))$, $\Sigma_t = (I - K_t H_t)\,\bar{\Sigma}_t$ • Return $\mu_t, \Sigma_t$
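
A corresponding EKF sketch, assuming the nonlinear models g, h and their Jacobians G_jac, H_jac are supplied by the caller; the parameter names are illustrative:

```python
import numpy as np

def extended_kalman_filter(mu, Sigma, u, z, g, h, G_jac, H_jac, R, Q):
    """One EKF cycle; g/h are the nonlinear models, G_jac/H_jac their Jacobians."""
    # Prediction (linearize g around the previous mean)
    mu_bar = g(u, mu)
    G = G_jac(u, mu)
    Sigma_bar = G @ Sigma @ G.T + R
    # Correction (linearize h around the predicted mean)
    H = H_jac(mu_bar)
    S = H @ Sigma_bar @ H.T + Q
    K = Sigma_bar @ H.T @ np.linalg.inv(S)
    mu_new = mu_bar + K @ (z - h(mu_bar))
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma_bar
    return mu_new, Sigma_new
```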

  30. EKF Summary • Highly efficient: polynomial in measurement dimensionality $k$ and state dimensionality $n$: $O(k^{2.376} + n^2)$ • Not optimal! • Can diverge if nonlinearities are large! • Works surprisingly well even when all assumptions are violated!

  31. Linearization via Unscented Transform [Figure: EKF vs. UKF linearization]

  32. UKF Sigma-Point Estimate (2) [Figure: EKF vs. UKF]

  33. UKF Sigma-Point Estimate (3) [Figure: EKF vs. UKF]

  34. Unscented Transform • Sigma points (for an $n$-dimensional Gaussian): $\chi^0 = \mu$, $\chi^i = \mu \pm \left(\sqrt{(n+\lambda)\Sigma}\right)_i$, where $\lambda$ is a scaling parameter that determines how far the sigma points are spread from the mean • Weights: $w_m^0 = \frac{\lambda}{n+\lambda}$, $w_c^0 = \frac{\lambda}{n+\lambda} + (1 - \alpha^2 + \beta)$, $w_m^i = w_c^i = \frac{1}{2(n+\lambda)}$ for $i = 1, \ldots, 2n$; if the distribution is an exact Gaussian, $\beta = 2$ is the optimal choice • Pass the sigma points through the nonlinear function: $\mathcal{Y}^i = g(\chi^i)$ • Recover mean and covariance: $\mu' = \sum_i w_m^i\, \mathcal{Y}^i$, $\Sigma' = \sum_i w_c^i\, (\mathcal{Y}^i - \mu')(\mathcal{Y}^i - \mu')^T$
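
A hedged NumPy sketch of the unscented transform as described above; the parameter defaults (alpha, kappa) are illustrative choices, not values prescribed by the slides:

```python
import numpy as np

def unscented_transform(mu, Sigma, g, alpha=1.0, beta=2.0, kappa=1.0):
    """Propagate N(mu, Sigma) through a nonlinear function g via sigma points.
    g must map a 1-D state array to a 1-D output array."""
    n = len(mu)
    lam = alpha**2 * (n + kappa) - n          # scaling parameter lambda
    # Sigma points: the mean plus/minus scaled columns of a matrix square root.
    L = np.linalg.cholesky((n + lam) * Sigma)
    chi = np.vstack([mu, mu + L.T, mu - L.T])  # shape (2n+1, n)
    # Weights for mean and covariance recovery.
    w_m = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    w_c = w_m.copy()
    w_m[0] = lam / (n + lam)
    w_c[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    # Pass sigma points through g and recover mean and covariance.
    Y = np.array([g(x) for x in chi])
    mu_y = w_m @ Y
    diff = Y - mu_y
    Sigma_y = (w_c[:, None] * diff).T @ diff
    return mu_y, Sigma_y
```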

  35. UKF Summary • Highly efficient: same asymptotic complexity as the EKF, though typically a constant factor slower in practical applications • Better linearization than the EKF: accurate in the first two terms of the Taylor expansion (the EKF captures only the first) • Derivative-free: no Jacobians needed • Still not optimal!

  36. Thank You

  37. Localization “Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities.” [Cox ’91] • Given • Map of the environment. • Sequence of sensor measurements. • Wanted • Estimate of the robot’s position. • Problem classes • Position tracking (initial robot pose is known) • Global localization (initial robot pose is unknown) • Kidnapped robot problem (recovery)

  38. Markov Localization Markov Localization: The straightforward application of Bayes filters to the localization problem

  39. EKF_localization($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m$): Prediction: • $G_t$: Jacobian of $g$ w.r.t. the location (pose) • $V_t$: Jacobian of $g$ w.r.t. the control • $M_t$: motion noise covariance matrix derived from the control • Predicted mean: $\bar{\mu}_t = g(u_t, \mu_{t-1})$ • Predicted covariance: $\bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^T + V_t M_t V_t^T$
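
A hedged sketch of this prediction step. The simple Euler-integrated unicycle motion model and the noise parameterization (alphas) are assumptions made for the example, since the slides show the actual motion model only in the figures:

```python
import numpy as np

def ekf_localization_prediction(mu, Sigma, u, dt, alphas):
    """EKF localization prediction with a simple Euler-integrated unicycle model.
    State: [x, y, theta]; control u = [v, omega]; alphas = assumed noise parameters."""
    x, y, theta = mu
    v, omega = u
    # Predicted mean: mu_bar = g(u, mu)
    mu_bar = np.array([x + v * dt * np.cos(theta),
                       y + v * dt * np.sin(theta),
                       theta + omega * dt])
    # G: Jacobian of g w.r.t. the pose
    G = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                  [0.0, 1.0,  v * dt * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    # V: Jacobian of g w.r.t. the control
    V = np.array([[dt * np.cos(theta), 0.0],
                  [dt * np.sin(theta), 0.0],
                  [0.0,                dt]])
    # M: motion noise covariance in control space (illustrative parameterization)
    a1, a2, a3, a4 = alphas
    M = np.diag([a1 * v**2 + a2 * omega**2,
                 a3 * v**2 + a4 * omega**2])
    # Predicted covariance
    Sigma_bar = G @ Sigma @ G.T + V @ M @ V.T
    return mu_bar, Sigma_bar
```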

  40. EKF_localization($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m$): Correction: • Predicted measurement mean: $\hat{z}_t = h(\bar{\mu}_t, m)$ • $H_t$: Jacobian of $h$ w.r.t. the location (pose) • Predicted measurement covariance: $S_t = H_t \bar{\Sigma}_t H_t^T + Q_t$ • Kalman gain: $K_t = \bar{\Sigma}_t H_t^T S_t^{-1}$ • Updated mean: $\mu_t = \bar{\mu}_t + K_t(z_t - \hat{z}_t)$ • Updated covariance: $\Sigma_t = (I - K_t H_t)\,\bar{\Sigma}_t$
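
For concreteness, the sketch below instantiates this correction step for a single range-bearing measurement of a known landmark, a common choice of measurement model $h$ in landmark-based localization; the specific model and function names are assumptions for the example, as the slides show $h$ only in the figures:

```python
import numpy as np

def ekf_localization_correction(mu_bar, Sigma_bar, z, landmark, Q):
    """EKF correction for one range-bearing measurement of a known landmark.
    State: [x, y, theta]; z = [range, bearing]; landmark = [m_x, m_y]."""
    dx = landmark[0] - mu_bar[0]
    dy = landmark[1] - mu_bar[1]
    q = dx**2 + dy**2
    # Predicted measurement mean z_hat = h(mu_bar, m)
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - mu_bar[2]])
    # Jacobian of h w.r.t. the robot pose (location)
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q),  0.0],
                  [ dy / q,          -dx / q,          -1.0]])
    S = H @ Sigma_bar @ H.T + Q                 # predicted measurement covariance
    K = Sigma_bar @ H.T @ np.linalg.inv(S)      # Kalman gain
    innovation = z - z_hat
    innovation[1] = np.arctan2(np.sin(innovation[1]), np.cos(innovation[1]))  # wrap bearing
    mu = mu_bar + K @ innovation                # updated mean
    Sigma = (np.eye(3) - K @ H) @ Sigma_bar     # updated covariance
    return mu, Sigma
```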

  41. EKF Prediction Step

  42. EKF Observation Prediction Step

  43. EKF Correction Step

  44. Estimation Sequence (1)

  45. Estimation Sequence (2)

  46. Comparison to Ground Truth

  47. UKF_localization($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m$): Prediction: • Motion noise covariance $M_t$ and measurement noise covariance $Q_t$ • Augmented state mean $\mu_{t-1}^a$ (state, motion noise, and measurement noise components) • Augmented covariance $\Sigma_{t-1}^a$ (block-diagonal in $\Sigma_{t-1}$, $M_t$, $Q_t$) • Sigma points $\chi_{t-1}^a$ drawn from the augmented Gaussian • Prediction of sigma points: $\bar{\chi}_t^x = g(u_t + \chi_t^u,\, \chi_{t-1}^x)$ • Predicted mean: $\bar{\mu}_t = \sum_i w_m^i\, \bar{\chi}_t^{x,i}$ • Predicted covariance: $\bar{\Sigma}_t = \sum_i w_c^i\, (\bar{\chi}_t^{x,i} - \bar{\mu}_t)(\bar{\chi}_t^{x,i} - \bar{\mu}_t)^T$

  48. UKF_localization($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m$): Correction: • Measurement sigma points: $\bar{\mathcal{Z}}_t = h(\bar{\chi}_t^x) + \chi_t^z$ • Predicted measurement mean: $\hat{z}_t = \sum_i w_m^i\, \bar{\mathcal{Z}}_t^i$ • Predicted measurement covariance: $S_t = \sum_i w_c^i\, (\bar{\mathcal{Z}}_t^i - \hat{z}_t)(\bar{\mathcal{Z}}_t^i - \hat{z}_t)^T$ • Cross-covariance: $\bar{\Sigma}_t^{x,z} = \sum_i w_c^i\, (\bar{\chi}_t^{x,i} - \bar{\mu}_t)(\bar{\mathcal{Z}}_t^i - \hat{z}_t)^T$ • Kalman gain: $K_t = \bar{\Sigma}_t^{x,z} S_t^{-1}$ • Updated mean: $\mu_t = \bar{\mu}_t + K_t(z_t - \hat{z}_t)$ • Updated covariance: $\Sigma_t = \bar{\Sigma}_t - K_t S_t K_t^T$
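
A simplified sigma-point correction sketch in NumPy. It uses a non-augmented form, adding the measurement noise Q to the predicted measurement covariance rather than folding it into an augmented state, which differs slightly from the augmented formulation above; parameter defaults are illustrative:

```python
import numpy as np

def ukf_correction(mu_bar, Sigma_bar, z, h, Q, alpha=1.0, beta=2.0, kappa=1.0):
    """Sigma-point (UKF) correction; h maps a state vector to a 1-D measurement array."""
    n = len(mu_bar)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * Sigma_bar)
    chi = np.vstack([mu_bar, mu_bar + L.T, mu_bar - L.T])        # sigma points
    w_m = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    w_c = w_m.copy()
    w_m[0] = lam / (n + lam)
    w_c[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    Z = np.array([h(x) for x in chi])                            # measurement sigma points
    z_hat = w_m @ Z                                              # predicted measurement mean
    dz = Z - z_hat
    dx = chi - mu_bar
    S = (w_c[:, None] * dz).T @ dz + Q                           # pred. measurement covariance
    Sigma_xz = (w_c[:, None] * dx).T @ dz                        # cross-covariance
    K = Sigma_xz @ np.linalg.inv(S)                              # Kalman gain
    mu = mu_bar + K @ (z - z_hat)                                # updated mean
    Sigma = Sigma_bar - K @ S @ K.T                              # updated covariance
    return mu, Sigma
```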

  49. EKF_localization($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t, m$): Correction (repeated from slide 40)

  50. UKF Prediction Step
