
Sensor-Based Mapping and Sensor Fusion





  1. Sensor-Based Mapping and Sensor Fusion, by Rashmi Patel

  2. Overview
• Bayes Rule
• Occupancy Grids [Alberto Elfes, 1987]
• Vector Field Histograms [J. Borenstein, 1991]
• Sensor Fusion [David Conner, 2000]

  3. Bayes Rule
• Posterior (conditional) probability: the probability assigned to an event given some evidence
• Conditional probability example, flipping coins:
• P(H) = 0.5, P(H | H) = 1
• P(HH) = 0.25, P(HH | first flip H) = 0.5

  4. Bayes Rule, continued
• Bayes Rule: P(A|B) = P(B|A) * P(A) / P(B)
• The useful thing about Bayes Rule is that it lets you "turn around" conditional probabilities
• Example: P(Cancer) = 0.1, P(Smoker) = 0.5, P(S|C) = 0.8. What is P(C|S)?
• P(C|S) = P(S|C) * P(C) / P(S) = (0.8 * 0.1) / 0.5 = 0.16
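As a quick check on that arithmetic, here is a minimal Python sketch of the smoker/cancer example (the function name is just illustrative):

```python
# Minimal sketch of the slide's smoker/cancer example using Bayes Rule.
# All numbers come from the slide: P(C) = 0.1, P(S) = 0.5, P(S|C) = 0.8.

def bayes_posterior(prior_a: float, p_b: float, likelihood_b_given_a: float) -> float:
    """P(A|B) = P(B|A) * P(A) / P(B)."""
    return likelihood_b_given_a * prior_a / p_b

p_cancer = 0.1                # P(C)
p_smoker = 0.5                # P(S)
p_smoker_given_cancer = 0.8   # P(S|C)

p_cancer_given_smoker = bayes_posterior(p_cancer, p_smoker, p_smoker_given_cancer)
print(p_cancer_given_smoker)  # 0.8 * 0.1 / 0.5 = 0.16
```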

  5. Occupancy Grids [Elfes]
• In the mid 80's, Elfes started implementing cheap ultrasonic transducers on an autonomous robot
• Because of the intrinsic limitations of any sonar, it is important to compose a coherent world model using information gained from multiple readings

  6. Occupancy Grids Defined
• The grid stores the probability that cell Ci = cell(x, y) is occupied: O(Ci) = P[s(Ci) = OCC]
• Phases of creating a grid:
• Collect readings, generating O(Ci)
• Update the occupancy grid, creating a map
• Match and combine maps from multiple locations
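As a minimal sketch of what such a grid might look like as a data structure, assuming a NumPy array of per-cell occupancy probabilities initialized to an "unknown" prior of 0.5 (the grid size and prior are illustrative, not taken from the slides):

```python
import numpy as np

class OccupancyGrid:
    """Occupancy grid storing O(Ci) = P[s(Ci) = OCC] for each cell (x, y)."""

    def __init__(self, width: int, height: int, prior: float = 0.5):
        # 0.5 marks cells we know nothing about yet (illustrative assumption).
        self.p_occ = np.full((height, width), prior)

    def occupancy(self, x: int, y: int) -> float:
        """Return O(Ci) for the cell at (x, y)."""
        return self.p_occ[y, x]
```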

  7. Occupancy Grids: Sonar Pattern
• 24 transducers in a ring, spaced 15 degrees apart
• Sonars can detect from 0.9 to 35 ft
• Accuracy is ±0.1 ft
• Main sensitivity in a 30° cone
• -3 dB sensitivity at 15° from the middle (1/2 response)
[Figure: sonar beam pattern, 22.5 deg]

  8. Occupancy Grids: Sonar Model
• Probability profile: a Gaussian p.d.f. is used, but that is variable
• p(r | z, θ) = 1 / (2π σr σθ) * exp[ -(r - z)^2 / (2 σr^2) - θ^2 / (2 σθ^2) ]
• where r is the sensor reading, z is the actual distance, and θ is the angle from the acoustic axis
[Figure: range measurement profile showing the ranging error around distance R, the "somewhere occupied" region, and the "probably empty" region from Rmin outward]
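A small Python sketch of this Gaussian sonar model; the standard deviations sigma_r (range error) and sigma_theta (angular error) are illustrative assumptions, not values from the slides:

```python
import numpy as np

def sonar_model(r, z, theta, sigma_r=0.1, sigma_theta=np.deg2rad(7.5)):
    """Probability density of reading r given true range z and off-axis angle theta.

    sigma_r and sigma_theta are assumed values; the real ones depend on the transducer.
    """
    norm = 1.0 / (2.0 * np.pi * sigma_r * sigma_theta)
    return norm * np.exp(-(r - z) ** 2 / (2 * sigma_r ** 2)
                         - theta ** 2 / (2 * sigma_theta ** 2))

# Example: density of reading 2.0 ft when the true range is 2.1 ft, on-axis.
print(sonar_model(2.0, 2.1, 0.0))
```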

  9. Occupancy Grids: Discrete Sonar Model Example
[Figure: discrete sonar model example]

  10. Occupancy Grids: Notation
• Definitions:
• Ci is a cell in the occupancy grid
• s(Ci) is the state of cell Ci (i.e., the value of that cell)
• OCC means OCCUPIED, whose value is 1

  11. Occupancy Grids: Bayes Rule
• Applying Bayes Rule to a single cell s(Ci) with sensor reading r:
• P[s(Ci) = OCC | r] = P[r | s(Ci) = OCC] * P[s(Ci) = OCC] / Σ p[r | s(Ci)] * P[s(Ci)]
• where p(r) = Σ p[r | s(Ci)] * P[s(Ci)], summed over the cells that intercept the sensor model
• Then apply this to all the cells, creating a local map for each sensor

  12. Occupancy Grids: Bayes Rule Implemented
• P[s(Ci) = OCC | r] = P[r | s(Ci) = OCC] * P[s(Ci) = OCC] / Σ p[r | s(Ci)] * P[s(Ci)]   (likelihood * prior in the numerator, normalized by the sum)
• P[s(Ci) = OCC | r] is the probability that a cell is occupied given a sensor reading r
• P[r | s(Ci) = OCC] is the probability that the sensor reading is r given the state of cell Ci (this value is found using the sensor model)
• P[s(Ci) = OCC] is the probability that the value of cell Ci is 1, i.e. that s(Ci) = OCC (this value is taken from the occupancy grid)
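A minimal sketch of this update for the cells along one beam, assuming the common per-cell form in which the normalizer p(r) is evaluated over the two possible states (occupied and empty) of each cell rather than reproducing every detail of Elfes' derivation. Because the grid value itself serves as the prior, repeating this call for successive readings also gives the sequential update of slides 14-15:

```python
import numpy as np

def update_cells(p_occ, lik_occ, lik_emp):
    """Per-cell Bayes update for one sonar reading r.

    p_occ:   current P[s(Ci) = OCC] values read from the grid.
    lik_occ: p[r | s(Ci) = OCC] from the sensor model (per cell or scalar).
    lik_emp: p[r | s(Ci) = EMPTY] from the sensor model (per cell or scalar).
    """
    p_occ = np.asarray(p_occ, dtype=float)
    numerator = lik_occ * p_occ                            # likelihood * prior
    p_r = lik_occ * p_occ + lik_emp * (1.0 - p_occ)        # normalizer p(r)
    return numerator / p_r                                 # P[s(Ci) = OCC | r]

# Example: three cells along the beam with prior 0.5; the sensor model
# favors "occupied" only for the last cell (the values are made up).
print(update_cells([0.5, 0.5, 0.5],
                   np.array([0.1, 0.1, 0.9]),
                   np.array([0.9, 0.9, 0.1])))
```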

  13. Occupancy Grids: Implementation Example
• Let the red oval be the "somewhere occupied" region
• The yellow blocks are the cells in the sonar sector
• The black lines are the boundaries of that sonar sector
• P(r) is the sum over all of those yellow blocks, using the sonar model to compute the probability
[Figure: occupied range of one sonar reading overlaid on the grid]

  14. Occupancy Grids: Multiple Sonars
• Combining readings from multiple sonars:
• The grid is updated sequentially for t sensors: {r}t = {r1, …, rt}
• To update for a new sensor reading rt+1:
• P[s(Ci) = OCC | rt+1] = P[rt+1 | s(Ci) = OCC] * P[s(Ci) = OCC | {r}t] / Σ p[rt+1 | s(Ci)] * P[s(Ci) | {r}t]

  15. Occupancy Grids: Equations
• P[s(Ci) = OCC | rt+1] = P[rt+1 | s(Ci) = OCC] * P[s(Ci) = OCC | {r}t] / Σ p[rt+1 | s(Ci)] * P[s(Ci) | {r}t]
• P[s(Ci) = OCC | rt+1] is the probability that a cell is occupied given the new sensor reading rt+1
• P[rt+1 | s(Ci) = OCC] is the probability that the sensor reading is rt+1 given the state of cell Ci (this value is found using the sensor model)
• P[s(Ci) = OCC | {r}t] is the probability that the value of cell Ci is 1, i.e. that s(Ci) = OCC (this value is taken from the occupancy grid)

  16. Occupancy Grids: Matching Multiple Maps
• Each new map must be integrated with existing maps from past sensor readings
• The maps are integrated by finding the rotation and translation transform that gives the maps the best correlation in their overlapping areas

  17. Occupancy Grids: Matching Maps Example
• Example 1: a simple translation of maps, with the center of the robot at (2, 2)
• Map 1 and Map 2 are translated into a common frame and combined into a new map
• After translating, cells are combined with P(cell3) = P(cell1) + P(cell2) - P(cell1) * P(cell2)
[Figure: Map 1, Map 2, and the combined map 1&2]
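The combination rule on this slide is the probabilistic OR of the two cell values; a small sketch, assuming the two maps have already been translated into a common frame (the example values are made up):

```python
import numpy as np

def combine_maps(map1: np.ndarray, map2: np.ndarray) -> np.ndarray:
    """Element-wise P(cell3) = P(cell1) + P(cell2) - P(cell1) * P(cell2)."""
    return map1 + map2 - map1 * map2

# Example: two small, already-aligned occupancy maps.
m1 = np.array([[0.0, 0.2, 0.0],
               [0.0, 0.8, 0.0],
               [0.0, 0.2, 0.0]])
m2 = np.array([[0.0, 0.0, 0.0],
               [0.0, 0.5, 0.9],
               [0.0, 0.0, 0.0]])
print(combine_maps(m1, m2))
```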

  18. Occupancy Grids vs. Certainty Grids
• Occupancy grids and certainty grids are basically the same in the method used to:
• Collect readings to generate the probability occupied (for certainty grids, the probability empty)
• Create a grid from different sonars
• Match maps to register readings from other locations
• Differences arise from the fact that occupancy grids use conditional probability to determine the probability occupied, while certainty grids use simpler math models

  19. Occupancy Grids vs. Certainty Grids
• Both have a p.d.f. for the sonar model
• However, the major difference is in finding the probability that a cell is occupied:
• First Pempty is computed for a cell
• Then Poccupied is computed using Pocc = 1 - Pemp
• Then Pocc is normalized over the sonar beam and combined with the value of that cell from other sonar readings, Pocc(sonar reading r)

  20. Vector Field Histograms [Borenstein]
• The VFH allows fast, continuous control of a mobile vehicle
• Tested on CARMEL using 24 ultrasonic sensors placed in a ring around the robot
• The scan times range from 100 to 500 ms depending on the level of safety wanted

  21. Vector Field Histograms: Notation
• The VFH uses a two-dimensional Cartesian histogram grid similar to certainty grids [Elfes]
• Definitions:
• CVmax = 15
• CVmin = 0
• d is the distance returned by the sonar
• Increment value is 3
• Decrement value is -1
• VCP is the vehicle center point
• Obstacle vector: a vector pointing from a cell to the VCP

  22. Vector Field Histograms: Histogram Grid
• The histogram grid is incremented differently from the certainty grid
• The only cell incremented in the grid is the cell that is distance d away and lying on the acoustic axis of the sonar
• Similarly, only the cells that are on the acoustic axis and less than distance d away are decremented
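A rough sketch of this update rule, stepping along the acoustic axis of one sonar reading. The increment (+3), decrement (-1), CVmax = 15, and CVmin = 0 come from the notation slide; the cell size, pose handling, and step size are simplifying assumptions:

```python
import numpy as np

def update_histogram_grid(grid, robot_xy, beam_angle, d, cell_size=0.1,
                          cv_max=15, cv_min=0, inc=3, dec=1):
    """Increment the cell at distance d on the acoustic axis, decrement closer cells."""
    steps = int(d / cell_size)
    for k in range(steps + 1):
        # Walk outward from the robot along the beam's acoustic axis.
        x = robot_xy[0] + k * cell_size * np.cos(beam_angle)
        y = robot_xy[1] + k * cell_size * np.sin(beam_angle)
        i, j = int(round(y / cell_size)), int(round(x / cell_size))
        if not (0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]):
            break
        if k == steps:
            # Cell at distance d: evidence of an obstacle.
            grid[i, j] = min(cv_max, grid[i, j] + inc)
        else:
            # Cells closer than d: evidence of free space.
            grid[i, j] = max(cv_min, grid[i, j] - dec)
    return grid
```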

  23. Vector Field Histograms: Polar Histogram
• Next the 2-D histogram grid is converted into a 1-D grid called the polar histogram
• The polar histogram, H, has n angular sectors, each of width α

  24. Vector Field Histograms: H Mapping
• In order to generate H, we must map every cell in the histogram grid into H

  25. Vector Field Histograms: Discrete H Grid
• Now that the obstacle vectors for every cell have been computed, we have to find the magnitude of each sector in H (see the sketch below)
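A sketch of one way to build H from the histogram grid: each non-zero cell contributes an obstacle vector whose direction is the cell's bearing relative to the VCP and whose magnitude decreases with distance, and the magnitudes falling into each angular sector are summed. The magnitude form c^2 * (a - b*d) mirrors the POD expression quoted later in the deck; the constants a, b and the 5-degree sector width are assumptions:

```python
import numpy as np

def polar_histogram(grid, vcp_xy, cell_size=0.1, alpha_deg=5.0, a=1.0, b=0.25):
    """Map every histogram-grid cell into the 1-D polar histogram H."""
    n_sectors = int(360 / alpha_deg)
    H = np.zeros(n_sectors)
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            c = grid[i, j]                          # certainty value of the cell
            if c == 0:
                continue
            dx = j * cell_size - vcp_xy[0]
            dy = i * cell_size - vcp_xy[1]
            d = np.hypot(dx, dy)                    # distance from cell to VCP
            beta = np.degrees(np.arctan2(dy, dx)) % 360.0   # bearing of the cell
            m = (c ** 2) * max(a - b * d, 0.0)      # obstacle vector magnitude
            H[int(beta / alpha_deg) % n_sectors] += m
    return H
```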

  26. Vector Field Histograms: Threshold
• Once the polar obstacle densities have been computed, H can be thresholded to determine where the objects are located so that they can be avoided
• The choice of this threshold is important: too high a threshold and you may come too close to an object; too low and you may lose some valid paths
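Thresholding then reduces to picking out the sectors whose polar obstacle density falls below the limit; a tiny sketch (the threshold value is an assumption):

```python
import numpy as np

def free_sectors(H, threshold=2.0):
    """Return the indices of polar-histogram sectors considered passable."""
    return np.flatnonzero(np.asarray(H) < threshold)
```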

  27. Sensor Fusion [D. Conner]
• David C. Conner, PhD student
• Presentation on his thesis and the following paper: "Multiple camera, laser rangefinder, and encoder data fusion for navigation of a differentially steered 3-wheeled autonomous vehicle"

  28. Sensor Fusion: Navigator
• Navigator is a 3-wheeled, differentially driven vehicle
• The 2 front wheels are driven and the third, rear wheel is a caster
• 2 separate computer systems: a PC handles sensor fusion and a PLC handles motor control
• 180-degree laser rangefinder with 0.5° resolution
• 2 color CCD cameras

  29. Cameras and Frame Grabbers
• Because the camera is not parallel to the ground, the image must be transformed to correctly represent the ground
• The correction is done using the Intel Image Processing Library (IPL)

  30. Cameras and Frame Grabbers
• Since there are two cameras, the two images must be combined
• The images are transformed into vehicle coordinates and combined using the IPL functions

  31. Image Conversion
• Once a picture is captured:
• It is converted to grayscale
• It is blurred using a Gaussian convolution mask
[Figure: example of the grayscale and blurred image]

  32. Image Conversion, continued
• Then the image is thresholded to limit the amount of data in it
• The threshold value is chosen to be above the norm of the intensities in the grayscale histogram
• The resulting image is then pixelated so it can be stored in a grid
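A sketch of this conversion pipeline using OpenCV in place of the Intel IPL that the paper used; the kernel size, threshold value, and output grid size are illustrative assumptions:

```python
import cv2
import numpy as np

def image_to_grid(bgr_image, thresh=180, grid_shape=(32, 32)):
    """Grayscale -> Gaussian blur -> threshold -> coarse ("pixelated") grid."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    _, binary = cv2.threshold(blurred, thresh, 255, cv2.THRESH_BINARY)
    # Downsample so each grid cell summarizes a block of pixels.
    grid = cv2.resize(binary, grid_shape[::-1], interpolation=cv2.INTER_AREA)
    return (grid > 0).astype(np.uint8)
```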

  33. Laser Rangefinder
• The SICK LMS-200 laser rangefinder returns 361 data points over a 180-degree arc with 0.5-degree resolution
• Values above a certain range are ignored

  34. Vector Field Histograms
• VFHs are useful because they let us easily combine the camera data and the laser rangefinder data to determine the most accessible regions
• Several types of polar obstacle density (POD) functions can be used (linear, quadratic, exponential)
• POD = KC(a - b*d)

  35. POD Values: Laser Rangefinder
• The POD values for the laser are determined by:
• Using the linear function shown above to transform the laser data into POD values
• Then, for every two degrees, the maximum of the POD values in that arc is chosen as the final value
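A sketch of that procedure, assuming the 361 half-degree readings are grouped into 2-degree arcs. The constants a, b, K and the maximum-range cutoff are assumptions, and the certainty factor C is taken as 1 for laser returns:

```python
import numpy as np

def laser_to_pod(ranges, a=8.0, b=1.0, k=1.0, max_range=8.0):
    """Turn 361 laser readings into one POD value per 2-degree arc."""
    ranges = np.asarray(ranges, dtype=float)
    # Returns beyond the cutoff are clamped so they contribute zero POD.
    ranges = np.minimum(ranges, max_range)
    pod = k * np.maximum(a - b * ranges, 0.0)     # linear POD = K(a - b*d), C = 1
    # Drop the final reading so 360 half-degree values form 90 arcs of 4 readings.
    arcs = pod[:360].reshape(90, 4)
    return arcs.max(axis=1)                        # max POD within each 2-degree arc
```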

  36. POD Values: Images
• POD values for the image are pre-calculated and stored in a grid at startup
• The pre-calculated values are multiplied by the pixelated image, also stored in a grid (the overlapping cells are multiplied)
• For every 2-degree arc, the cell with the highest POD value is chosen as the value of that arc
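A sketch of the image POD step under the same assumptions; the mapping from grid cells to 2-degree arcs (cell_sector) is a hypothetical precomputed lookup derived from the camera geometry:

```python
import numpy as np

def image_pod(pod_grid, pixel_grid, cell_sector, n_sectors=90):
    """Combine the pre-calculated POD grid with the pixelated image, max per arc.

    pod_grid:    pre-calculated POD value for each grid cell (computed at startup).
    pixel_grid:  binary pixelated image aligned with pod_grid.
    cell_sector: integer array giving each cell's 2-degree arc index (assumed precomputed).
    """
    weighted = pod_grid * pixel_grid               # overlapping cells multiplied
    pod = np.zeros(n_sectors)
    for s in range(n_sectors):
        in_sector = weighted[cell_sector == s]
        if in_sector.size:
            pod[s] = in_sector.max()               # highest POD in that arc
    return pod
```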

  37. Combining VFHs
• The two VFHs are then combined by taking the maximum POD for each sector
• The maximum POD is chosen because it represents the closest object
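Combining the laser-based and image-based histograms is then a sector-wise maximum; a one-line sketch:

```python
import numpy as np

def combine_vfh(pod_laser, pod_image):
    """Per-sector maximum POD: the larger value corresponds to the closer obstacle."""
    return np.maximum(np.asarray(pod_laser), np.asarray(pod_image))
```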

  38. Bibliography
• Elfes, A. "Occupancy Grids: A Stochastic Spatial Representation for Active Robot Perception," July 1990.
• Elfes, A. "Sonar-Based Real-World Mapping and Navigation," June 1987.
• Borenstein, J. "The Vector Field Histogram - Fast Obstacle Avoidance for Mobile Robots," June 1991.
• Conner, D. "Multiple camera, laser rangefinder, and encoder data fusion for navigation of a differentially steered 3-wheeled autonomous vehicle."
