
Probabilistic Robotics



Presentation Transcript


  1. Probabilistic Robotics: Bayes Filter Implementations (Gaussian Filters)

  2. Bayes Filter Reminder
  • Prediction: $\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}$
  • Correction: $bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)$
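
As a concrete illustration, here is a minimal discrete Bayes filter step in Python/NumPy (a sketch under assumed inputs; the grid representation and variable names are not from the slides):

    import numpy as np

    def bayes_filter_step(belief, motion_model, likelihood):
        """One Bayes filter step over a discrete state space.
        belief:       prior p(x_{t-1}), shape (n,)
        motion_model: p(x_t | u_t, x_{t-1}) as an (n, n) matrix for the chosen u_t
        likelihood:   p(z_t | x_t) evaluated at the observed z_t, shape (n,)
        """
        bel_bar = motion_model @ belief    # prediction: integrate out x_{t-1}
        bel = likelihood * bel_bar         # correction: weight by p(z_t | x_t)
        return bel / bel.sum()             # eta = 1 / sum normalizes the posterior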

  3. Gaussians (1D and ND)
  • Univariate: $p(x) = N(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$
  • Multivariate: $p(\mathbf{x}) = N(\mathbf{x}; \mu, \Sigma) = \frac{1}{(2\pi)^{d/2}|\Sigma|^{1/2}}\, e^{-\frac{1}{2}(\mathbf{x}-\mu)^T \Sigma^{-1} (\mathbf{x}-\mu)}$
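
For concreteness, both densities can be evaluated directly in NumPy (an illustrative sketch, not from the slides):

    import numpy as np

    def gauss_1d(x, mu, sigma2):
        # Univariate density N(x; mu, sigma^2)
        return np.exp(-(x - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

    def gauss_nd(x, mu, Sigma):
        # Multivariate density N(x; mu, Sigma); x and mu have shape (d,)
        d = len(mu)
        diff = x - mu
        norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
        return np.exp(-0.5 * diff @ np.linalg.solve(Sigma, diff)) / norm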

  4. Properties of Gaussians
  • Linear transformation: $X \sim N(\mu, \sigma^2) \Rightarrow aX + b \sim N(a\mu + b,\, a^2\sigma^2)$
  • Product (up to normalization): $N(\mu_1, \sigma_1^2) \cdot N(\mu_2, \sigma_2^2) \propto N\!\left(\frac{\sigma_2^2\,\mu_1 + \sigma_1^2\,\mu_2}{\sigma_1^2 + \sigma_2^2},\; \frac{\sigma_1^2 \sigma_2^2}{\sigma_1^2 + \sigma_2^2}\right)$

  5. Multivariate Gaussians
  • $X \sim N(\mu, \Sigma),\ Y = AX + B \;\Rightarrow\; Y \sim N(A\mu + B,\, A\Sigma A^T)$
  • We stay in the "Gaussian world" as long as we start with Gaussians and perform only linear transformations.
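
A quick Monte Carlo check of this closure property (the particular mu, Sigma, A, B below are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    mu = np.array([1.0, 2.0])
    Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
    A = np.array([[1.0, 1.0], [0.0, 2.0]])
    B = np.array([0.5, -1.0])

    X = rng.multivariate_normal(mu, Sigma, size=200_000)
    Y = X @ A.T + B  # Y = A X + B, applied to each sample

    # Empirical moments approach the predicted N(A mu + B, A Sigma A^T)
    print(Y.mean(axis=0), A @ mu + B)
    print(np.cov(Y.T), A @ Sigma @ A.T)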

  6. Discrete Kalman Filter
  Estimates the state $x$ of a discrete-time controlled process that is governed by the linear stochastic difference equation
  $x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t$
  with a measurement
  $z_t = C_t x_t + \delta_t$
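
One simulated step of such a process (the constant-velocity matrices below are an assumed example, not from the slides):

    import numpy as np

    rng = np.random.default_rng(1)
    dt = 0.1
    A = np.array([[1.0, dt], [0.0, 1.0]])  # state transition (position, velocity)
    B = np.array([[0.5 * dt ** 2], [dt]])  # effect of an acceleration control
    C = np.array([[1.0, 0.0]])             # we observe position only
    R = 0.01 * np.eye(2)                   # process noise covariance R_t
    Q = np.array([[0.04]])                 # measurement noise covariance Q_t

    x_prev, u = np.array([0.0, 1.0]), np.array([0.2])
    x = A @ x_prev + B @ u + rng.multivariate_normal(np.zeros(2), R)
    z = C @ x + rng.multivariate_normal(np.zeros(1), Q)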

  7. Components of a Kalman Filter
  • $A_t$: matrix ($n \times n$) that describes how the state evolves from $t-1$ to $t$ without controls or noise.
  • $B_t$: matrix ($n \times l$) that describes how the control $u_t$ changes the state from $t-1$ to $t$.
  • $C_t$: matrix ($k \times n$) that describes how to map the state $x_t$ to an observation $z_t$.
  • $\varepsilon_t$, $\delta_t$: random variables representing the process and measurement noise, assumed to be independent and normally distributed with covariance $R_t$ and $Q_t$, respectively.

  8. Kalman Filter Updates in 1D

  9. Kalman Filter Control Updates

  10. Kalman Filter Measurement Updates

  11. Kalman Filter Updates

  12. Linear Gaussian Systems: Initialization
  • The initial belief is normally distributed: $bel(x_0) = N(x_0;\, \mu_0, \Sigma_0)$

  13. Linear Gaussian Systems: Dynamics
  • Dynamics are a linear function of state and control plus additive noise: $x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t$
  • Hence $p(x_t \mid u_t, x_{t-1}) = N(x_t;\, A_t x_{t-1} + B_t u_t,\, R_t)$

  14. Linear Gaussian Systems: Dynamics (continued)
  • Plugging this motion model into the prediction step yields a Gaussian $\overline{bel}(x_t)$ with
  $\bar{\mu}_t = A_t \mu_{t-1} + B_t u_t$
  $\bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^T + R_t$

  15. Linear Gaussian Systems: Observations
  • Observations are a linear function of the state plus additive noise: $z_t = C_t x_t + \delta_t$
  • Hence $p(z_t \mid x_t) = N(z_t;\, C_t x_t,\, Q_t)$

  16. Linear Gaussian Systems: Observations (continued)
  • Plugging this measurement model into the correction step yields a Gaussian $bel(x_t)$ with
  $\mu_t = \bar{\mu}_t + K_t (z_t - C_t \bar{\mu}_t)$
  $\Sigma_t = (I - K_t C_t)\, \bar{\Sigma}_t$
  where the Kalman gain is $K_t = \bar{\Sigma}_t C_t^T (C_t \bar{\Sigma}_t C_t^T + Q_t)^{-1}$.
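
In the scalar case with $C = 1$, the gain reduces to $k = \bar{\sigma}^2 / (\bar{\sigma}^2 + q)$, so the update interpolates between prediction and measurement. A small worked example (the numbers are invented):

    # Prior N(mu_bar = 2.0, sigma2_bar = 1.0); measurement z = 3.0 with noise q = 0.5
    mu_bar, sigma2_bar, z, q = 2.0, 1.0, 3.0, 0.5
    k = sigma2_bar / (sigma2_bar + q)   # Kalman gain = 2/3
    mu = mu_bar + k * (z - mu_bar)      # 2.667: pulled toward the measurement
    sigma2 = (1 - k) * sigma2_bar       # 0.333: the variance shrinks
    print(mu, sigma2)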

  17. Kalman Filter Algorithm
  Algorithm Kalman_filter($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$):
  • Prediction:
  $\bar{\mu}_t = A_t \mu_{t-1} + B_t u_t$
  $\bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^T + R_t$
  • Correction:
  $K_t = \bar{\Sigma}_t C_t^T (C_t \bar{\Sigma}_t C_t^T + Q_t)^{-1}$
  $\mu_t = \bar{\mu}_t + K_t (z_t - C_t \bar{\mu}_t)$
  $\Sigma_t = (I - K_t C_t)\, \bar{\Sigma}_t$
  • Return $\mu_t$, $\Sigma_t$
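
The same algorithm as a NumPy sketch (assuming the matrices A, B, C, R, Q are time-invariant and passed in explicitly):

    import numpy as np

    def kalman_filter(mu, Sigma, u, z, A, B, C, R, Q):
        """One prediction + correction step, mirroring the slide's algorithm."""
        # Prediction
        mu_bar = A @ mu + B @ u
        Sigma_bar = A @ Sigma @ A.T + R
        # Correction
        S = C @ Sigma_bar @ C.T + Q              # innovation covariance
        K = Sigma_bar @ C.T @ np.linalg.inv(S)   # Kalman gain
        mu_new = mu_bar + K @ (z - C @ mu_bar)
        Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
        return mu_new, Sigma_new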

  18. The Prediction-Correction Cycle (figure: prediction step)

  19. The Prediction-Correction Cycle (figure: correction step)

  20. The Prediction-Correction Cycle (figure: full cycle)

  21. Kalman Filter Summary
  • Highly efficient: polynomial in measurement dimensionality $k$ and state dimensionality $n$: $O(k^{2.376} + n^2)$.
  • Optimal for linear Gaussian systems!
  • Limiting assumptions: observations are linear functions of the state, state transitions are linear, and beliefs are unimodal.
  • Most robotic systems are nonlinear, and beliefs are multimodal!

  22. Extended Kalman Filter (EKF)
  • Most realistic robotic problems involve nonlinear functions.
  • The EKF supports such nonlinear functions; it relaxes the linearity assumption.
  • However, beliefs are then no longer Gaussian.

  23. Linearity Assumption Revisited

  24. Non-linear Function

  25. Linearization in EKF
  Sequence of steps for linearization in the EKF (a Python sketch follows below):
  • Compute the tangent to the function $g()$ at the mean.
  • Take the tangent as the linearized approximation of $g()$.
  • Project $p(x)$ through the linear approximation.
  • Compute the mean and covariance of $y$; this defines the Gaussian approximation of the underlying nonlinear transformation.
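
A sketch of these four steps for a scalar nonlinearity (the choice g(x) = sin(x) and the input Gaussian are illustrative assumptions):

    import numpy as np

    mu, sigma2 = 0.5, 0.2        # input Gaussian p(x) = N(0.5, 0.2)
    g, g_prime = np.sin, np.cos  # nonlinear function and its derivative

    # Steps 1-2: tangent at the mean, g(x) ~ g(mu) + g'(mu) (x - mu)
    slope = g_prime(mu)
    # Steps 3-4: project p(x) through the linear approximation
    mu_y = g(mu)                     # mean of the Gaussian approximation of y
    sigma2_y = slope ** 2 * sigma2   # variance scales with the squared slope
    print(mu_y, sigma2_y)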

  26. EKF Linearization (1)

  27. EKF Linearization (2)

  28. EKF Linearization (3)

  29. Why Linearize?
  • Remember the limiting assumptions of the KF: observations are linear functions of the state, state transitions are linear, and beliefs are unimodal.
  • These assumptions do not hold in practice.
  • Relaxing the linearity assumption, however, makes beliefs non-Gaussian.
  • The EKF therefore computes a Gaussian approximation of the true belief through linearization of the nonlinear functions $g()$ and $h()$.
  • Linearization is achieved through a (first-order) Taylor expansion (Section 3.3.2).

  30. EKF Linearization: First-Order Taylor Series Expansion
  • Prediction: $g(u_t, x_{t-1}) \approx g(u_t, \mu_{t-1}) + G_t\,(x_{t-1} - \mu_{t-1})$, with Jacobian $G_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}$
  • Correction: $h(x_t) \approx h(\bar{\mu}_t) + H_t\,(x_t - \bar{\mu}_t)$, with Jacobian $H_t = \frac{\partial h(\bar{\mu}_t)}{\partial x_t}$
  • Derivation of the EKF: Section 3.3.4.

  31. EKF Algorithm
  Algorithm Extended_Kalman_filter($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$):
  • Prediction:
  $\bar{\mu}_t = g(u_t, \mu_{t-1})$
  $\bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^T + R_t$
  • Correction:
  $K_t = \bar{\Sigma}_t H_t^T (H_t \bar{\Sigma}_t H_t^T + Q_t)^{-1}$
  $\mu_t = \bar{\mu}_t + K_t (z_t - h(\bar{\mu}_t))$
  $\Sigma_t = (I - K_t H_t)\, \bar{\Sigma}_t$
  • Return $\mu_t$, $\Sigma_t$
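
A generic NumPy sketch of one EKF step; the callables g, h, G_fn, H_fn (the models and their Jacobians) are assumed to be supplied by the caller:

    import numpy as np

    def ekf_step(mu, Sigma, u, z, g, h, G_fn, H_fn, R, Q):
        """One EKF step; G_fn/H_fn return the Jacobians of g/h at the given point."""
        # Prediction
        mu_bar = g(u, mu)
        G = G_fn(u, mu)
        Sigma_bar = G @ Sigma @ G.T + R
        # Correction
        H = H_fn(mu_bar)
        K = Sigma_bar @ H.T @ np.linalg.inv(H @ Sigma_bar @ H.T + Q)
        mu_new = mu_bar + K @ (z - h(mu_bar))
        Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma_bar
        return mu_new, Sigma_new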

  32. Localization
  "Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities." [Cox '91]
  • Given: a map of the environment and a sequence of sensor measurements.
  • Wanted: an estimate of the robot's position.
  • Problem classes: position tracking; global localization; the kidnapped robot problem (recovery).

  33. Landmark-based Localization

  34. EKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, $m$): Prediction
  • $G_t$: Jacobian of $g$ w.r.t. the location, $G_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial x_{t-1}}$
  • $V_t$: Jacobian of $g$ w.r.t. the control, $V_t = \frac{\partial g(u_t, \mu_{t-1})}{\partial u_t}$
  • $M_t$: motion noise covariance (in control space)
  • Predicted mean: $\bar{\mu}_t = g(u_t, \mu_{t-1})$
  • Predicted covariance: $\bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^T + V_t M_t V_t^T$

  35. EKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, $m$): Correction
  • Predicted measurement mean: $\hat{z}_t = h(\bar{\mu}_t, m)$
  • $H_t$: Jacobian of $h$ w.r.t. the location
  • Predicted measurement covariance: $S_t = H_t \bar{\Sigma}_t H_t^T + Q_t$
  • Kalman gain: $K_t = \bar{\Sigma}_t H_t^T S_t^{-1}$
  • Updated mean: $\mu_t = \bar{\mu}_t + K_t (z_t - \hat{z}_t)$
  • Updated covariance: $\Sigma_t = (I - K_t H_t)\, \bar{\Sigma}_t$

  36. EKF Prediction Step

  37. EKF Observation Prediction Step

  38. EKF Correction Step

  39. Estimation Sequence (1)

  40. Estimation Sequence (2)

  41. Comparison to Ground Truth

  42. EKF Summary
  • Highly efficient: polynomial in measurement dimensionality $k$ and state dimensionality $n$: $O(k^{2.376} + n^2)$.
  • Not optimal!
  • Can diverge if the nonlinearities are large!
  • Works surprisingly well even when all assumptions are violated!

  43. Unscented Kalman Filter
  • Stochastic linearization through the unscented transform.
  • Extract sigma points from the Gaussian: the mean plus symmetric points along the main axes of the covariance.
  • An $n$-dimensional Gaussian yields $2n+1$ sigma points.
  • Each sigma point carries two weights, one for computing the mean and one for computing the covariance; the weights encode additional knowledge about the underlying distribution.
  • Project the sigma points through $g()$.
  • Compute the mean and covariance of the projected points.

  44. Linearization via Unscented Transform (figure: EKF vs. UKF)

  45. UKF Sigma-Point Estimate (2) (figure: EKF vs. UKF)

  46. UKF Sigma-Point Estimate (3) (figure: EKF vs. UKF)

  47. Unscented Transform
  • Sigma points: $\chi^{[0]} = \mu$ and $\chi^{[\pm i]} = \mu \pm \left(\sqrt{(n+\lambda)\Sigma}\right)_i$ for $i = 1, \dots, n$ (the $i$-th column of the matrix square root).
  • Weights: $w_m^{[0]} = \frac{\lambda}{n+\lambda}$, $w_c^{[0]} = w_m^{[0]} + (1 - \alpha^2 + \beta)$, and $w_m^{[i]} = w_c^{[i]} = \frac{1}{2(n+\lambda)}$ otherwise.
  • Pass the sigma points through the nonlinear function: $y^{[i]} = g(\chi^{[i]})$
  • Recover mean and covariance: $\mu' = \sum_i w_m^{[i]} y^{[i]}$, $\Sigma' = \sum_i w_c^{[i]} (y^{[i]} - \mu')(y^{[i]} - \mu')^T$
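
The unscented transform as a NumPy sketch (standard scaled form with parameters alpha, beta, kappa; a sketch of the technique, not the slides' exact notation):

    import numpy as np

    def unscented_transform(mu, Sigma, g, alpha=0.5, beta=2.0, kappa=0.0):
        n = len(mu)
        lam = alpha ** 2 * (n + kappa) - n
        L = np.linalg.cholesky((n + lam) * Sigma)  # matrix square root
        # 2n+1 sigma points: the mean plus symmetric points along the covariance axes
        chi = np.vstack([mu, mu + L.T, mu - L.T])  # rows of L.T are columns of L
        # Weights: one set for the mean, one for the covariance
        wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
        wc = wm.copy()
        wm[0] = lam / (n + lam)
        wc[0] = wm[0] + (1.0 - alpha ** 2 + beta)
        # Pass the sigma points through the nonlinear function
        y = np.array([g(x) for x in chi])
        mu_y = wm @ y                      # weighted mean of projected points
        diff = y - mu_y
        Sigma_y = (wc * diff.T) @ diff     # weighted covariance
        return mu_y, Sigma_y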

  48. UKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, $m$): Prediction
  Quantities computed in the prediction step:
  • Motion noise
  • Measurement noise
  • Augmented state mean
  • Augmented covariance
  • Sigma points
  • Prediction of sigma points
  • Predicted mean
  • Predicted covariance

  49. UKF_localization($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$, $m$): Correction
  Quantities computed in the correction step:
  • Measurement sigma points
  • Predicted measurement mean
  • Predicted measurement covariance
  • Cross-covariance
  • Kalman gain
  • Updated mean
  • Updated covariance

  50. UKF Prediction Step
