Probabilistic Robotics: Review/SLAM


Presentation Transcript


  1. Advanced Mobile Robotics Probabilistic Robotics: Review/SLAM Dr. Jizhong Xiao Department of Electrical Engineering CUNY City College jxiao@ccny.cuny.edu

  2. Bayes Filter Revisit • Prediction (Action): bel_bar(xt) = ∫ p(xt | ut, xt-1) bel(xt-1) dxt-1 • Correction (Measurement): bel(xt) = η · p(zt | xt) · bel_bar(xt)
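
As an illustration of these two steps, here is a minimal sketch of one Bayes filter update over a discretized (histogram) belief. It assumes a finite list of states and user-supplied functions motion_model(x, u, x_prev) = p(x | u, x_prev) and sensor_model(z, x) = p(z | x); the names are illustrative, not code from the lecture.

import numpy as np

def bayes_filter_step(belief, u, z, states, motion_model, sensor_model):
    # Prediction (action): bel_bar(x) = sum over x_prev of p(x | u, x_prev) * bel(x_prev)
    predicted = np.array([
        sum(motion_model(x, u, x_prev) * belief[j] for j, x_prev in enumerate(states))
        for x in states
    ])
    # Correction (measurement): bel(x) = eta * p(z | x) * bel_bar(x)
    corrected = np.array([sensor_model(z, x) for x in states]) * predicted
    return corrected / corrected.sum()  # eta: normalize so the belief sums to 1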

  3. Probabilistic Robotics Probabilistic Sensor Models Beam-based Model Likelihood Fields Model Feature-based Model

  4. Beam-based Proximity Model • Measurement noise: Gaussian distribution centered at the expected range zexp • Unexpected obstacles: exponential distribution over [0, zexp] (both densities plotted over the interval [0, zmax])

  5. Beam-based Proximity Model • Random measurement: uniform distribution over [0, zmax] • Max range: point-mass distribution at zmax

  6. Resulting Mixture Density • The four components are combined by a weighted average, P(z|x,m) = zhit · Phit + zshort · Pshort + zmax · Pmax + zrand · Prand, and the weights sum to 1 • How can we determine the model parameters? System identification method: maximum likelihood estimator (ML estimator)
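
A sketch of the resulting mixture for a single range reading, assuming the four mixing weights and the noise parameters (sigma_hit, lambda_short) are given; normalization constants of the truncated densities are omitted for brevity, and all names are illustrative.

import math

def beam_model_prob(z, z_exp, z_max, sigma_hit, lambda_short, w_hit, w_short, w_max, w_rand):
    # Measurement noise: Gaussian centered at the expected range z_exp
    p_hit = math.exp(-0.5 * ((z - z_exp) / sigma_hit) ** 2) / (sigma_hit * math.sqrt(2 * math.pi))
    # Unexpected obstacles: exponential density, only before the expected range
    p_short = lambda_short * math.exp(-lambda_short * z) if 0 <= z <= z_exp else 0.0
    # Sensor failures: point mass at the maximum range
    p_max = 1.0 if z >= z_max else 0.0
    # Random measurements: uniform over [0, z_max)
    p_rand = 1.0 / z_max if 0 <= z < z_max else 0.0
    # Weighted average; the weights should sum to 1
    return w_hit * p_hit + w_short * p_short + w_max * p_max + w_rand * p_rand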

  7. Likelihood Fields Model • Project the end points of a sensor scan Zt into the global coordinate space of the map • Probability is a mixture of … • a Gaussian distribution with mean at distance to closest obstacle, • a uniform distribution for random measurements, and • a small uniform distribution for max range measurements. • Again, independence between different components is assumed.
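
A sketch of evaluating a scan under the likelihood fields model, assuming the sensor sits at the robot origin and that dist_field(x, y) returns the precomputed distance from a map point to the closest obstacle (both assumptions and all names are illustrative).

import math

def likelihood_field_prob(scan, pose, dist_field, z_max,
                          sigma_hit=0.2, z_hit=0.8, z_rand=0.2):
    x, y, theta = pose
    q = 1.0
    for r, bearing in scan:              # each beam: measured range and beam angle
        if r >= z_max:                   # max range readings are ignored
            continue
        # Project the beam end point into the global coordinate space of the map
        ex = x + r * math.cos(theta + bearing)
        ey = y + r * math.sin(theta + bearing)
        dist = dist_field(ex, ey)        # distance to the closest obstacle
        p_hit = math.exp(-0.5 * (dist / sigma_hit) ** 2) / (sigma_hit * math.sqrt(2 * math.pi))
        q *= z_hit * p_hit + z_rand / z_max   # Gaussian plus uniform random component
    return q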

  8. Likelihood Fields Model • The field stores, for each map location, the distance to the nearest obstacle • Max range readings are ignored

  9. Example • Example environment and its likelihood field: the darker a location, the less likely it is to perceive an obstacle there • Sensor probability P(z|x,m), with Oi the nearest obstacle point to the endpoint of beam i (O1, O2, O3 and a zmax reading shown)

  10. Feature-Based Measurement Model • A feature vector is abstracted from the measurement: sensors measure range, bearing, and a signature (a numerical value, e.g., an average color), so each feature is fi = (ri, φi, si) • Conditional independence between features is assumed • Feature-based map: m = {m1, m2, …}, where each mj contains a location in global coordinates and a signature • Robot pose: (x, y, θ) • Measurement model: the measured range, bearing, and signature equal their true values corrupted by zero-mean Gaussian error variables with standard deviations σr, σφ, σs
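
A sketch of the likelihood of a single observed feature (range, bearing, signature) given its corresponding landmark, with independent zero-mean Gaussian errors of standard deviations sigma_r, sigma_phi, sigma_s; all names are illustrative.

import math

def gaussian(err, sigma):
    # Zero-mean Gaussian density with standard deviation sigma
    return math.exp(-0.5 * (err / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def feature_likelihood(feature, landmark, pose, sigma_r, sigma_phi, sigma_s):
    r, phi, s = feature                  # measured range, bearing, signature
    mx, my, ms = landmark                # landmark location (global) and signature
    x, y, theta = pose                   # robot pose
    r_hat = math.hypot(mx - x, my - y)                   # expected range
    phi_hat = math.atan2(my - y, mx - x) - theta         # expected bearing
    bearing_err = (phi - phi_hat + math.pi) % (2 * math.pi) - math.pi   # wrap to [-pi, pi)
    # Conditional independence between the three components
    return gaussian(r - r_hat, sigma_r) * gaussian(bearing_err, sigma_phi) * gaussian(s - ms, sigma_s)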

  11. Probabilistic Robotics Probabilistic Motion Models

  12. Odometry Model • Robot moves from x̄t-1 = (x̄, ȳ, θ̄) to x̄t = (x̄′, ȳ′, θ̄′) as reported by the odometer • Odometry information ut = (δrot1, δtrans, δrot2): the relative motion is decomposed into an initial rotation δrot1, followed by a translation δtrans, followed by a final rotation δrot2

  13. Noise Model for Odometry • The measured motion is given by the true motion corrupted with independent, zero-mean noise on each of the three components δrot1, δtrans, δrot2 • How to calculate p(xt | ut, xt-1)?

  14. Calculating the Posterior Given xt, xt-1, and u • Inputs: an initial pose xt-1, a hypothesized final pose xt (the values of interest), and a pair of poses u obtained from odometry (the odometry values) • Algorithm motion_model_odometry(xt, xt-1, u): compute the relative motion (δrot1, δtrans, δrot2) from the odometry values and (δ̂rot1, δ̂trans, δ̂rot2) from the values of interest, evaluate the probabilities p1, p2, p3 of the three error terms, and return p1 · p2 · p3 • prob(a, b) implements an error distribution over a with zero mean and standard deviation b
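
A sketch of motion_model_odometry following this structure; prob(a, var) is taken here as a zero-mean Gaussian density with variance var (the α-weighted sums of squares), and alpha = (α1, α2, α3, α4) are assumed robot-specific noise parameters.

import math

def prob(a, var):
    # Zero-mean Gaussian density of a with variance var
    return math.exp(-0.5 * a * a / var) / math.sqrt(2 * math.pi * var)

def motion_model_odometry(xt, xt_prev, u, alpha):
    (x, y, th), (xp, yp, thp) = xt_prev, xt                 # values of interest
    (xb, yb, thb), (xbp, ybp, thbp) = u                     # pose pair from odometry
    a1, a2, a3, a4 = alpha
    # Relative motion reported by odometry
    d_rot1 = math.atan2(ybp - yb, xbp - xb) - thb
    d_trans = math.hypot(xbp - xb, ybp - yb)
    d_rot2 = thbp - thb - d_rot1
    # Relative motion implied by the hypothesized pose pair
    dh_rot1 = math.atan2(yp - y, xp - x) - th
    dh_trans = math.hypot(xp - x, yp - y)
    dh_rot2 = thp - th - dh_rot1
    # One probability per motion component, then the product
    p1 = prob(d_rot1 - dh_rot1, a1 * dh_rot1**2 + a2 * dh_trans**2)
    p2 = prob(d_trans - dh_trans, a3 * dh_trans**2 + a4 * (dh_rot1**2 + dh_rot2**2))
    p3 = prob(d_rot2 - dh_rot2, a1 * dh_rot2**2 + a2 * dh_trans**2)
    return p1 * p2 * p3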

  15. Application • Repeated application of the motion model for short movements • Typical banana-shaped distributions obtained for the 2D projection of the 3D posterior p(xt | u, xt-1) • The figures show posterior distributions of the robot's pose upon executing the motion command illustrated by the solid line; the darker a location, the more likely it is

  16. Velocity-Based Model • The robot is controlled by a translational velocity v and a rotational velocity ω • It moves along a circle whose rotation radius is determined by the control: r = v/ω

  17. Equation for the Velocity Model • Instantaneous center of curvature (ICC) at (xc, yc) = (x − (v/ω) sinθ, y + (v/ω) cosθ) • Starting from the initial pose (x, y, θ) and keeping constant speed, after a time interval ∆t the ideal robot will be at x′ = xc + (v/ω) sin(θ + ω∆t), y′ = yc − (v/ω) cos(θ + ω∆t), θ′ = θ + ω∆t

  18. Velocity-based Motion Model • xt-1 = (x, y, θ) and xt = (x′, y′, θ′) are the state vectors at time t-1 and t, respectively • The true motion is described by a translational velocity v and a rotational velocity ω • Motion control ut = (v, ω) with additive Gaussian noise • The circular-motion assumption leads to a degeneracy: 2 noise variables (v and ω) cannot span the 3D pose, so the model assumes the robot performs a final rotation γ̂ when it arrives at its final pose

  19. Velocity-based Motion Model • Motion model: v̂ = v + ε(α1v² + α2ω²), ω̂ = ω + ε(α3v² + α4ω²), γ̂ = ε(α5v² + α6ω²), where ε(b) denotes a zero-mean noise term with variance b • α1 to α4 are robot-specific error parameters determining the velocity control noise • α5 and α6 are robot-specific error parameters determining the standard deviation of the additional rotational noise
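
A sampling counterpart of this noise model (a sketch of the standard sample_motion_model_velocity structure under the circular-motion assumption, so it assumes the perturbed ω is nonzero; alpha holds the six error parameters above).

import math, random

def sample_motion_model_velocity(u, pose, dt, alpha):
    v, w = u
    x, y, th = pose
    a1, a2, a3, a4, a5, a6 = alpha
    # Perturb the commanded velocities with zero-mean Gaussian noise
    v_hat = v + random.gauss(0.0, math.sqrt(a1 * v**2 + a2 * w**2))
    w_hat = w + random.gauss(0.0, math.sqrt(a3 * v**2 + a4 * w**2))
    g_hat = random.gauss(0.0, math.sqrt(a5 * v**2 + a6 * w**2))   # final rotation
    # Exact circular motion during dt (assumes w_hat != 0)
    x_new = x - (v_hat / w_hat) * math.sin(th) + (v_hat / w_hat) * math.sin(th + w_hat * dt)
    y_new = y + (v_hat / w_hat) * math.cos(th) - (v_hat / w_hat) * math.cos(th + w_hat * dt)
    th_new = th + w_hat * dt + g_hat * dt
    return (x_new, y_new, th_new)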

  20. Probabilistic Motion Model • How to compute p(xt | ut, xt-1)? Moving with a fixed velocity during ∆t results in a circular trajectory from xt-1 = (x, y, θ) to xt = (x′, y′, θ′) • Center of circle: (x*, y*) = ((x + x′)/2 + μ(y − y′), (y + y′)/2 + μ(x′ − x)) with μ = ½ · ((x − x′)cosθ + (y − y′)sinθ) / ((y − y′)cosθ − (x − x′)sinθ) • Radius of the circle: r* = √((x − x*)² + (y − y*)²) • Change of heading direction: ∆θ = atan2(y′ − y*, x′ − x*) − atan2(y − y*, x − x*), plus the angle of the final in-place rotation γ̂

  21. Posterior Probability for Velocity Model • From the center of the circle, the radius of the circle, and the change of heading direction, recover the velocities that would have produced the motion: v̂ = (∆θ/∆t) · r*, ω̂ = ∆θ/∆t, and the final rotation γ̂ = (θ′ − θ)/∆t − ω̂ • Motion error: verr = v − v̂, ωerr = ω − ω̂, and γerr = γ̂; the posterior is the product of the probabilities of these three error terms under the noise model of slide 19

  22. Examples (velocity based)

  23. Approximation: Map-Consistent Motion Model • p(xt | ut, xt-1, m) ∝ p(xt | ut, xt-1) · p(xt | m), with obstacles grown by the robot radius • p(xt | ut, xt-1) is the map-free estimate of the motion model • p(xt | m) expresses the "consistency" of pose xt with the map; it is = 0 when the robot would be placed in an occupied cell

  24. Summary • We discussed motion models for odometry-based and velocity-based systems • We discussed ways to calculate the posterior probability p(x | x′, u) • Typically the calculations are done in fixed time intervals ∆t • In practice, the parameters of the models have to be learned • We also discussed an extended motion model that takes the map into account

  25. Probabilistic Robotics Localization “Using sensory information to locate the robot in its environment is the most fundamental problem to providing a mobile robot with autonomous capabilities.” [Cox ’91]

  26. Localization, Where am I? • Given • Map of the environment. • Sequence of measurements/motions. • Wanted • Estimate of the robot’s position. • Problem classes • Position tracking (initial robot pose is known) • Global localization (initial robot pose is unknown) • Kidnapped robot problem (recovery)

  27. Markov Localization

  28. Bayes Filter Revisit • Prediction (Action): bel_bar(xt) = ∫ p(xt | ut, xt-1) bel(xt-1) dxt-1 • Correction (Measurement): bel(xt) = η · p(zt | xt) · bel_bar(xt)

  29. EKF Linearization: First Order Taylor Expansion • Prediction: g(ut, xt-1) ≈ g(ut, μt-1) + Gt (xt-1 − μt-1), with Gt = ∂g(ut, μt-1)/∂xt-1 • Correction: h(xt) ≈ h(μ̄t) + Ht (xt − μ̄t), with Ht = ∂h(μ̄t)/∂xt

  30. EKF Algorithm • Extended_Kalman_filter(μt-1, Σt-1, ut, zt): • Prediction: μ̄t = g(ut, μt-1), Σ̄t = Gt Σt-1 Gtᵀ + Rt • Correction: Kt = Σ̄t Htᵀ (Ht Σ̄t Htᵀ + Qt)⁻¹, μt = μ̄t + Kt (zt − h(μ̄t)), Σt = (I − Kt Ht) Σ̄t • Return μt, Σt
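
A generic numpy sketch of this algorithm, assuming the motion function g, measurement function h, their Jacobians G and H, and the noise covariances R and Q are supplied by the caller (all names illustrative).

import numpy as np

def extended_kalman_filter(mu, Sigma, u, z, g, G, h, H, R, Q):
    # Prediction: propagate mean through g and covariance through its Jacobian
    mu_bar = g(u, mu)
    G_t = G(u, mu)
    Sigma_bar = G_t @ Sigma @ G_t.T + R
    # Correction: Kalman gain, mean update, covariance update
    H_t = H(mu_bar)
    S = H_t @ Sigma_bar @ H_t.T + Q
    K = Sigma_bar @ H_t.T @ np.linalg.inv(S)
    mu_new = mu_bar + K @ (z - h(mu_bar))
    Sigma_new = (np.eye(len(mu)) - K @ H_t) @ Sigma_bar
    return mu_new, Sigma_new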

  31. EKF_localization(μt-1, Σt-1, ut, zt, m): Prediction: • Gt = ∂g(ut, μt-1)/∂xt-1 (Jacobian of g w.r.t. location) • Vt = ∂g(ut, μt-1)/∂ut (Jacobian of g w.r.t. control) • Mt (motion noise covariance matrix from the control) • μ̄t = g(ut, μt-1) (predicted mean) • Σ̄t = Gt Σt-1 Gtᵀ + Vt Mt Vtᵀ (predicted covariance)

  32. Velocity-based Motion Model • xt-1 = (x, y, θ) and xt = (x′, y′, θ′) are the state vectors at time t-1 and t, respectively • The true motion is described by a translational velocity v and a rotational velocity ω • Motion control ut = (v, ω) with additive Gaussian noise

  33. Velocity-based Motion Model • Motion model xt = g(ut, xt-1): x′ = x − (v/ω) sinθ + (v/ω) sin(θ + ω∆t), y′ = y + (v/ω) cosθ − (v/ω) cos(θ + ω∆t), θ′ = θ + ω∆t

  34. Velocity-based Motion Model • Jacobian of g w.r.t. location: Gt = ∂g(ut, μt-1)/∂xt-1 collects the derivatives of g along the x′, y′, θ′ dimensions w.r.t. x, y, θ, evaluated at ut and μt-1

  35. Velocity-based Motion Model • Jacobian of g w.r.t. control: Vt = ∂g(ut, μt-1)/∂ut maps the motion noise from control space to the motion noise in state space; it contains the derivatives of g w.r.t. the motion parameters (v, ω), evaluated at ut and μt-1

  36. EKF_localization(μt-1, Σt-1, ut, zt, m): Correction: • ẑt = h(μ̄t, m) (predicted measurement mean) • Ht = ∂h(μ̄t, m)/∂xt (Jacobian of h w.r.t. location) • St = Ht Σ̄t Htᵀ + Qt (predicted measurement covariance) • Kt = Σ̄t Htᵀ St⁻¹ (Kalman gain) • μt = μ̄t + Kt (zt − ẑt) (updated mean) • Σt = (I − Kt Ht) Σ̄t (updated covariance)
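
A sketch of one such correction for a single known range/bearing landmark, assuming a predicted pose mean mu_bar = (x, y, θ), predicted covariance Sigma_bar, and measurement noise Q = diag(σr², σφ²); the Jacobian follows the standard range/bearing form.

import numpy as np

def ekf_landmark_correction(mu_bar, Sigma_bar, z, landmark, Q):
    x, y, th = mu_bar
    mx, my = landmark
    q = (mx - x) ** 2 + (my - y) ** 2
    z_hat = np.array([np.sqrt(q), np.arctan2(my - y, mx - x) - th])   # predicted measurement
    # Jacobian of h w.r.t. the robot pose (x, y, theta)
    H = np.array([[-(mx - x) / np.sqrt(q), -(my - y) / np.sqrt(q), 0.0],
                  [ (my - y) / q,          -(mx - x) / q,         -1.0]])
    S = H @ Sigma_bar @ H.T + Q                      # predicted measurement covariance
    K = Sigma_bar @ H.T @ np.linalg.inv(S)           # Kalman gain
    innovation = z - z_hat
    innovation[1] = (innovation[1] + np.pi) % (2 * np.pi) - np.pi     # wrap bearing error
    mu = mu_bar + K @ innovation                     # updated mean
    Sigma = (np.eye(3) - K @ H) @ Sigma_bar          # updated covariance
    return mu, Sigma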

  37. Feature-Based Measurement Model • For the landmark mj = (mj,x, mj,y, mj,s) that corresponds to measurement zti, the predicted measurement is ẑti = (√q, atan2(mj,y − ȳ, mj,x − x̄) − θ̄, mj,s) with q = (mj,x − x̄)² + (mj,y − ȳ)² • Ht is the Jacobian of h w.r.t. location, evaluated at the predicted mean

  38. EKF Localization with known correspondences

  39. EKF Localization with unknown correspondences • The correspondence of each observed feature is determined with a maximum likelihood estimator: assign the feature to the landmark that maximizes its measurement likelihood under the predicted belief
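
A sketch of the maximum likelihood correspondence choice, where predict_measurement and measurement_likelihood are assumed helper functions built from the predicted belief and the measurement model (all names are illustrative).

def ml_correspondence(features, landmarks, mu_bar, Sigma_bar,
                      predict_measurement, measurement_likelihood):
    # Return, for each observed feature, the index of the most likely landmark
    correspondences = []
    for z in features:
        likelihoods = [measurement_likelihood(z, predict_measurement(mu_bar, m), Sigma_bar, m)
                       for m in landmarks]
        correspondences.append(max(range(len(landmarks)), key=lambda j: likelihoods[j]))
    return correspondences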

  40. EKF Prediction Step • The initial estimate is represented by the ellipse centered at μt-1 • Moving on a circular arc of 90 cm & turning 45 degrees to the left, the predicted position is centered at μ̄t • The panels show: small noise in translation & rotation, high rotational noise, high translational noise, high noise in both translation & rotation

  41. EKF Measurement Prediction Step • True robot (white circle) & the observation (bold circle) • Innovations (white arrows): differences between observed & predicted measurements

  42. EKF Correction Step • Measurement prediction and resulting correction • The correction updates the mean estimate & reduces the position uncertainty ellipses

  43. Estimation Sequence (1) • EKF localization with an accurate landmark detection sensor • Dashed lines: estimated and corrected robot trajectories • Solid line: true robot motion • Uncertainty ellipses: before (light gray) & after (dark gray) incorporating landmark detection

  44. Estimation Sequence (2) EKF localization with a less accurate landmark detection sensor Uncertainty ellipses: before (light gray) & after (dark gray) incorporating landmark detection

  45. Comparison to Ground Truth

  46. UKF Localization • Given • Map of the environment. • Sequence of measurements/motions. • Wanted • Estimate of the robot’s position. • UKF localization

  47. Unscented Transform • Sigma points: for an n-dimensional Gaussian N(μ, Σ), χ0 = μ and χi = μ ± (√((n + λ)Σ))i for i = 1, …, 2n • Weights: wm0 = λ/(n + λ), wc0 = λ/(n + λ) + (1 − α² + β), wmi = wci = 1/(2(n + λ)) for i = 1, …, 2n • λ is a scaling parameter that determines how far the sigma points are spread from the mean; if the distribution is an exact Gaussian, β = 2 is the optimal choice • Pass the sigma points through the nonlinear function: Yi = g(χi) • Recover mean and covariance: μ′ = Σi wmi Yi, Σ′ = Σi wci (Yi − μ′)(Yi − μ′)ᵀ
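
A numpy sketch of the transform, with λ computed from the common scaling parameters α, β, κ as λ = α²(n + κ) − n; the default parameter values are illustrative.

import numpy as np

def unscented_transform(mu, Sigma, g, alpha=1e-3, beta=2.0, kappa=0.0):
    n = len(mu)
    lam = alpha ** 2 * (n + kappa) - n
    # Sigma points: the mean, plus/minus the columns of the scaled matrix square root
    sqrt_S = np.linalg.cholesky((n + lam) * Sigma)
    chi = np.column_stack(
        [mu]
        + [mu + sqrt_S[:, i] for i in range(n)]
        + [mu - sqrt_S[:, i] for i in range(n)]
    )
    # Weights for the mean and covariance estimates
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha ** 2 + beta)
    # Pass the sigma points through the nonlinear function
    Y = np.column_stack([g(chi[:, i]) for i in range(2 * n + 1)])
    # Recover the mean and covariance of the transformed distribution
    mu_prime = Y @ wm
    diff = Y - mu_prime[:, None]
    return mu_prime, (wc * diff) @ diff.T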

  48. UKF_localization(μt-1, Σt-1, ut, zt, m): Prediction: • Motion noise Mt and measurement noise Qt • Augmented state mean and augmented covariance • Sigma points • Prediction of the sigma points through the motion model • Predicted mean μ̄t and predicted covariance Σ̄t

  49. UKF_localization(μt-1, Σt-1, ut, zt, m): Correction: • The predicted robot locations are used to generate the measurement sigma points • Predicted measurement mean ẑt • Predicted measurement covariance St • Cross-covariance between state and measurement • Kalman gain Kt • Updated mean μt and updated covariance Σt

  50. UKF Prediction Step • Moving on a circular arc of 90 cm & turning 45 degrees to the left, the predicted position is centered at μ̄t • Panels: high rotational noise, high translational noise, high noise in both translation & rotation
