
Stereo Matching



Presentation Transcript


  1. Stereo Matching: Segment-based Belief Propagation (title image: Iolanthe II racing in Waitemata Harbour)

  2. Segmentation • #1 on the Middlebury rankings • Klaus, Sormann and Karner, “Segment-based Stereo Matching Using Belief Propagation and a Self-Adapting Dissimilarity Measure”, ICPR (3) 2006: 15-18 • Basic idea: depth (disparity) changes should occur at region boundaries in the image, so segment the image and match the resulting patches

  3. Algorithm • Input: a pair of rectified images • Extract homogeneous regions in the reference image • Apply local window-based matching • Extract a set of disparity planes • Approximate the optimal disparity plane assignment • Output: disparity map (a sketch of the whole pipeline follows below)
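The flow above can be summarised in code. Below is a minimal Python sketch of the pipeline; the helper functions (segment_reference, local_matching, extract_planes, assign_planes) are hypothetical placeholders for the steps detailed on the later slides, not functions from the paper or any library.

```python
import numpy as np

def segment_based_stereo(left, right, max_disp):
    """Sketch of the overall pipeline; every helper called here is a
    hypothetical stand-in for a step described on the later slides."""
    segments = segment_reference(left)                       # step 1: mean-shift segmentation
    cost, reliable = local_matching(left, right, max_disp)   # step 2: window-based matching
    planes = extract_planes(segments, cost, reliable)        # step 3: fit disparity planes
    labels = assign_planes(segments, planes, cost)           # steps 4-5: refinement + BP assignment
    disparity = np.zeros(left.shape[:2], dtype=np.float32)
    for seg_id, pixels in segments.items():                  # paint each segment with its plane
        c1, c2, c3 = planes[labels[seg_id]]
        for (y, x) in pixels:
            disparity[y, x] = c1 * x + c2 * y + c3           # d = c1*x + c2*y + c3
    return disparity
```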

  4. Step 1 – Segmentation • Decompose the reference image into regions of homogeneous colour / grey-scale • Assume disparity values vary smoothly within these regions • Note: the regions may be sloping planes, so the fronto-planar assumption is not required! • Mean-shift colour segmentation is used • Gradient ascent search for maxima of a density function over a high-dimensional feature space • Feature space: spatial coordinates + associated attributes (including edge information)
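As a rough, hedged stand-in for this step, the sketch below uses OpenCV's mean-shift filtering followed by connected-component labelling of the smoothed colours. It operates only on spatial + colour features (not the edge-augmented feature space the slide mentions), so it is an approximation rather than the paper's exact segmentation; the radius defaults are made up.

```python
import cv2
import numpy as np
from skimage.measure import label

def mean_shift_segments(image, spatial_radius=7, colour_radius=10):
    """Approximate step 1: mean-shift filtering of an 8-bit BGR image,
    then connected-component labelling of the smoothed colours."""
    smoothed = cv2.pyrMeanShiftFiltering(image, spatial_radius, colour_radius)
    # Collapse each smoothed colour to an integer id, then label connected
    # regions whose pixels share the same id.
    flat = smoothed.reshape(-1, 3)
    _, colour_ids = np.unique(flat, axis=0, return_inverse=True)
    colour_map = colour_ids.reshape(image.shape[:2])
    segments = label(colour_map + 1, connectivity=1)   # one integer label per region
    return segments
```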

  5. Step 2 – Local matching • Local matching in the pixel domain • Window-based correlation (SSD or SAD) • Gradient-based or non-parametric matching algorithms are also possible and better tolerate gain and offset changes • KSK use a self-adapting combination score • C(x,y,d) = (1 − ω)·C_SAD(x,y,d) + ω·C_GRAD(x,y,d) • where • C_SAD(x,y,d) = Σ_(i,j)∈N |I_L(i,j) − I_R(i+d,j)| (left–right difference in intensity) • C_GRAD(x,y,d) = Σ_(i,j)∈N_x |∇_x I_L(i,j) − ∇_x I_R(i+d,j)| + Σ_(i,j)∈N_y |∇_y I_L(i,j) − ∇_y I_R(i+d,j)| (left–right differences in horizontal and vertical gradient)
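A sketch of this combined score for a single disparity hypothesis, assuming float grayscale images; the gradient windows are simplified to 3 × 3 (the paper uses 3 × 2 and 2 × 3), and ω is taken as a given parameter here rather than self-adapted.

```python
import numpy as np
from scipy.ndimage import uniform_filter, convolve

def combined_cost(left, right, d, omega=0.5):
    """C(x,y,d) = (1 - omega)*C_SAD + omega*C_GRAD for one disparity d.
    left/right: float grayscale arrays.  Gradient windows are simplified
    to 3x3; omega is assumed given."""
    shifted = np.roll(right, -d, axis=1)      # align I_R(x+d, y) with I_L(x, y); wraps at the border
    c_sad = uniform_filter(np.abs(left - shifted), size=3) * 9      # sum of |.| over a 3x3 window

    kx = np.array([[-0.5, 0.0, 0.5]])         # central-difference kernels
    ky = kx.T
    gx = np.abs(convolve(left, kx) - convolve(shifted, kx))         # horizontal-gradient difference
    gy = np.abs(convolve(left, ky) - convolve(shifted, ky))         # vertical-gradient difference
    c_grad = (uniform_filter(gx, size=3) + uniform_filter(gy, size=3)) * 9

    return (1.0 - omega) * c_sad + omega * c_grad
```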

  6. Step 2 – Local matching • KSK use a self-adapting combination score • C(x,y,d) = (1 − ω)·C_SAD(x,y,d) + ω·C_GRAD(x,y,d) • where • C_SAD(x,y,d) = Σ_(i,j)∈N |I_L(i,j) − I_R(i+d,j)| (left–right difference in intensity) • C_GRAD(x,y,d) = Σ_(i,j)∈N_x |∇_x I_L(i,j) − ∇_x I_R(i+d,j)| + Σ_(i,j)∈N_y |∇_y I_L(i,j) − ∇_y I_R(i+d,j)| (left–right differences in horizontal and vertical gradient) • N(x,y) is a 3 × 3 window • N_x(x,y) is a 3 × 2 window; N_y(x,y) is a 2 × 3 window • Weighting ω is chosen by winner-take-all (pick the disparity with the lowest cost) so as to maximise the number of reliable correspondences, i.e. matches that survive the left–right crosscheck
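Stacking this cost over all candidate disparities gives a cost volume from which the winner-take-all disparity can be read off, as in the short sketch below; it reuses the combined_cost() sketch above, which is my own helper and not a function from the paper.

```python
import numpy as np

def wta_disparity(left, right, max_disp, omega=0.5):
    """Winner-take-all: for every pixel, keep the disparity with the
    lowest combined cost."""
    volume = np.stack([combined_cost(left, right, d, omega)
                       for d in range(max_disp + 1)], axis=2)   # (H, W, D) cost volume
    return np.argmin(volume, axis=2)                            # lowest-cost disparity per pixel
```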

  7. Step 2 – Local matching • Weighting ω is chosen by winner-take-all (pick the disparity with the lowest cost) so as to maximise the number of reliable correspondences, i.e. matches that survive the left–right crosscheck • A normalized dissimilarity measure is used • Reliable correspondences are used to estimate the signal-to-noise ratio (SNR) • Because of the normalization, a fixed truncation threshold can be set just above the noise level • Result: a robust matching score
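A correspondence is deemed reliable when it survives the left–right crosscheck, which can be written as below; the sign convention follows the slide's I_R(i+d, j) form, and the tolerance and border clipping are my assumptions.

```python
import numpy as np

def lr_crosscheck(disp_left, disp_right, tol=1):
    """Mark a left-reference disparity as reliable only if matching back
    from the right image lands (within tol) on the starting pixel."""
    h, w = disp_left.shape
    xs = np.arange(w)[None, :].repeat(h, axis=0)
    target = np.clip(xs + disp_left, 0, w - 1)         # column the left pixel maps to
    back = disp_right[np.arange(h)[:, None], target]   # disparity found at that column
    return np.abs(disp_left - back) <= tol             # True = reliable correspondence
```

The weighting ω can then be chosen by sweeping a few candidate values and keeping the one that maximises the number of True entries in this mask.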

  8. Step 3 – Disparity plane estimation • Allow sloping planes: abandon the fronto-planar surface assumption! • Disparity plane estimation • Specify the plane with c1, c2, c3: d = c1·x + c2·y + c3 • Large number of possible planes, so use only reliable disparities • Decompose the fitting problem: fit the horizontal and vertical slant separately • Find ∂d/∂x (∂d/∂y) for each reliable pixel • The distribution of derivatives ∂d/∂x (∂d/∂y) is convolved with a Gaussian kernel to determine the mean ∂d/∂x (∂d/∂y) • The disparity at the centre of a segment (the offset term) is then determined from the distribution of centre disparities over the reliable points, as before

  9. Step 3 – Disparity plane estimation • Allow sloping planes: abandon the fronto-planar surface assumption! • Estimate disparity planes • Specify a plane with c1, c2, c3: d = c1·x + c2·y + c3 • Large number of possible planes, so use only reliable disparities • Decompose the fitting problem: fit the horizontal and vertical slant separately • Find ∂d/∂x (∂d/∂y) for each reliable pixel • The distribution of derivatives ∂d/∂x (∂d/∂y) is convolved with a Gaussian kernel to determine the mean ∂d/∂x (∂d/∂y) • The disparity at the centre of a segment (the offset term) is then determined from the distribution of centre disparities over the reliable points, as before (see the fitting sketch below)
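A minimal sketch of the decomposed plane fit, assuming the derivatives are taken as finite differences between neighbouring reliable pixels and the "convolve the distribution with a Gaussian kernel" step is realised as a Gaussian-smoothed histogram whose peak is taken as the robust mean; the bin count and sigma are made-up defaults.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def robust_mean(values, bins=64, sigma=2.0):
    """Peak of a Gaussian-smoothed histogram of `values`."""
    hist, edges = np.histogram(values, bins=bins)
    smoothed = gaussian_filter1d(hist.astype(float), sigma)
    k = np.argmax(smoothed)
    return 0.5 * (edges[k] + edges[k + 1])

def fit_plane(xs, ys, disps):
    """Fit d = c1*x + c2*y + c3 to the reliable disparities of one segment,
    estimating the two slants and the centre offset separately."""
    # Horizontal slant from finite differences along each scan line.
    order = np.lexsort((xs, ys))
    x_s, y_s, d_s = xs[order], ys[order], disps[order]
    same_row = (np.diff(y_s) == 0) & (np.diff(x_s) > 0)
    dx = np.diff(d_s)[same_row] / np.diff(x_s)[same_row]
    c1 = robust_mean(dx) if dx.size else 0.0
    # Vertical slant: same idea with the roles of x and y swapped.
    order = np.lexsort((ys, xs))
    x_s, y_s, d_s = xs[order], ys[order], disps[order]
    same_col = (np.diff(x_s) == 0) & (np.diff(y_s) > 0)
    dy = np.diff(d_s)[same_col] / np.diff(y_s)[same_col]
    c2 = robust_mean(dy) if dy.size else 0.0
    # Offset from the distribution of disparities re-centred by the slants.
    c3 = robust_mean(disps - c1 * xs - c2 * ys)
    return c1, c2, c3
```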

  10. Step 4 – Disparity plane refinement • Increase the accuracy of the disparity planes • Group regions that belong to the same disparity plane • Calculate a matching cost for the plane P fitted to segment S: • C_SEG(S,P) = Σ_(x,y)∈S C(x,y,d), where C(x,y,d) is the mismatch cost for the disparity d that plane P predicts at (x,y), summed over every pixel in segment S • The disparity plane with the minimum matching cost is assigned to each segment • Segments assigned to the same disparity plane are grouped • Repeat for all grouped segments
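The per-segment plane cost on this slide can be computed directly from the pixel-wise cost volume; the sketch below assumes a cost_volume array of C(x, y, d) values sampled at integer disparities, so the plane's predicted disparity is rounded and clamped to that range.

```python
def segment_cost(pixels, plane, cost_volume):
    """C_SEG(S, P): sum the per-pixel mismatch cost at the disparity the
    plane predicts for each pixel of the segment.
    pixels: iterable of (y, x); plane: (c1, c2, c3); cost_volume: (H, W, D)."""
    c1, c2, c3 = plane
    max_d = cost_volume.shape[2] - 1
    total = 0.0
    for (y, x) in pixels:
        d = int(round(c1 * x + c2 * y + c3))
        d = min(max(d, 0), max_d)          # clamp to the sampled disparity range
        total += cost_volume[y, x, d]
    return total
```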

  11. Step 5 – Disparity plane assignment • Search for the optimal segment → disparity plane assignment • Formulate it as an energy minimization problem • Find a labelling f which maps each segment s ∈ R to a plane f(s) ∈ D • The ‘energy’ of labelling f is • E(f) = E_data(f) + E_smooth(f) • where • E_data(f) = Σ_s∈R C_SEG(s, f(s)) • E_smooth(f) = Σ_(s_i,s_j)∈S_N, f(s_i) ≠ f(s_j) λ_disc(s_i,s_j) • S_N is the set of adjacent segment pairs • λ_disc(s_i,s_j) is the discontinuity penalty, which includes the common border length and colour similarity
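The energy being minimised can be evaluated as below; labels, seg_costs, neighbours, and disc_penalty are hypothetical containers chosen for this sketch, not data structures from the paper.

```python
def labelling_energy(labels, seg_costs, neighbours, disc_penalty):
    """E(f) = E_data + E_smooth for one segment -> plane labelling.
    labels[s]: plane index assigned to segment s; seg_costs[s][p]: C_SEG(s, p);
    neighbours: list of adjacent segment pairs (si, sj);
    disc_penalty[(si, sj)]: precomputed lambda_disc value."""
    e_data = sum(seg_costs[s][labels[s]] for s in labels)
    e_smooth = sum(disc_penalty[(si, sj)]
                   for (si, sj) in neighbours
                   if labels[si] != labels[sj])
    return e_data + e_smooth
```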

  12. Step 5 • Use loopy belief propagation to find the optimal labelling with minimum energy • Results • Matching is state-of-the-art: top ‘average’ rank on the Middlebury set • The sloping planes of ‘Venus’ are well handled • Still has problems with edges! (see the results in fig. 2 of the paper) • Pixelization not handled? • Computation cost? Not mentioned in the paper! • The algorithm contains many repeated steps; an analysis of the benefit of each one would be useful
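For the optimisation itself, here is a compact min-sum loopy belief propagation sketch over the segment adjacency graph with a Potts-style smoothness term. The slides do not spell out the message schedule or update details, so this is only one plausible realisation under the assumptions stated in the docstring.

```python
import numpy as np

def loopy_bp_assignment(seg_costs, neighbours, disc_penalty, iters=30):
    """Min-sum loopy BP on the segment adjacency graph.
    seg_costs: (S, P) array of C_SEG values; neighbours: list of undirected
    segment pairs; disc_penalty[(si, sj)]: lambda_disc penalty for assigning
    the two segments different planes."""
    S, P = seg_costs.shape
    edges = list(neighbours) + [(j, i) for (i, j) in neighbours]   # both directions
    msgs = {e: np.zeros(P) for e in edges}
    pen = {**disc_penalty, **{(j, i): v for (i, j), v in disc_penalty.items()}}

    for _ in range(iters):
        new_msgs = {}
        for (i, j) in edges:
            # Belief of segment i excluding the message coming back from j.
            incoming = sum(msgs[(k, l)] for (k, l) in edges if l == i and k != j)
            b = seg_costs[i] + incoming
            # Potts pairwise term: same plane costs 0, different planes cost lambda_disc.
            m = np.minimum(b.min() + pen[(i, j)], b)
            new_msgs[(i, j)] = m - m.min()            # normalise for numerical stability
        msgs = new_msgs

    labels = np.empty(S, dtype=int)
    for s in range(S):
        belief = seg_costs[s] + sum(msgs[(k, l)] for (k, l) in edges if l == s)
        labels[s] = int(np.argmin(belief))            # minimum-energy plane per segment
    return labels
```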
