
Presentation Transcript


  1. Stereo Matching with Color-Weighted Correlation, Hierarchical Belief Propagation, and Occlusion Handling Qingxiong Yang, Student Member, IEEE, Liang Wang, Student Member, IEEE, Ruigang Yang, Member, IEEE, Henrik Stewénius, Member, IEEE, and David Nistér, Member, IEEE IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, VOL. 31, NO. 3, MARCH 2009

  2. Outline • Introduction • System Overview • Methods • Initialization • Pixel Classification • Iterative Refinement • Fast-Converging Belief Propagation • Depth Enhancement • Experiments • Conclusion

  3. Introduction

  4. Introduction • Stereo is one of the most extensively researched topics in computer vision. • Energy minimization framework: • Graph Cut • Belief Propagation (BP)

  5. Objective (Contribution) • To formulate a stereo model with careful handling of: • Disparity • Discontinuity • Occlusion • Differs from the normal framework in the final stages of the algorithm • Outperforms all other algorithms on average

  6. System Overview

  7. 1) Initialization

  8. 1) Initialization

  9. 1) Initialization

  10. 2) Pixel Classification

  11. 3) Iterative Refinement

  12. Initialization(Block 1)

  13. Initialization • Input: • Left Image IL • Right Image IR • Output: • Initial Left Disparity Map DL(0) • Initial Right Disparity Map DR • Initial Data Term ED(0) • (Block 1 diagram labels: CL, CR, ED(0), DR, DL(0))

  14. Initialization • Color-weighted Correlation • To build the correlation volume • Makes the match scores less sensitive to occlusion boundaries • By using the fact that occlusion boundaries most often cause color discontinuities as well

  15. Correlation Volume • Color difference Δxy between pixels x and y (in the same image); Ic: intensity of color channel c • The weight of pixel x in the support window of y combines the color difference (constant 10 on the slide) and the spatial difference (constant 21 on the slide)
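As a rough illustration of this color-weighted support window, the sketch below computes adaptive support weights in the spirit of Yoon and Kweon [27]. Interpreting 10 and 21 as the color and spatial constants, and taking the color difference as the largest per-channel absolute difference, are assumptions; the function name support_weight is illustrative only.

```python
import numpy as np

def support_weight(img, cx, cy, radius, gamma_c=10.0, gamma_p=21.0):
    """Adaptive support weights around the center pixel (cy, cx).
    Assumes the window lies fully inside the image."""
    ys = np.arange(cy - radius, cy + radius + 1)
    xs = np.arange(cx - radius, cx + radius + 1)
    patch = img[np.ix_(ys, xs)].astype(np.float64)        # (win, win, 3)
    center = img[cy, cx].astype(np.float64)
    # Color difference: largest per-channel absolute difference (assumption).
    delta_c = np.abs(patch - center).max(axis=2)
    # Spatial difference: Euclidean distance from the window center.
    dy, dx = np.meshgrid(ys - cy, xs - cx, indexing="ij")
    delta_g = np.sqrt(dy ** 2 + dx ** 2)
    # Larger color/spatial differences -> smaller support weight.
    return np.exp(-(delta_c / gamma_c + delta_g / gamma_p))
```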

  16. Correlation Volume • The Correlation Volume [27]: the weighted sum of pixel dissimilarities over the support window, normalized by the weights • Wx: support window around x • d(yL, yR): pixel dissimilarity [1] • xL, yL: pixels in left image IL • xR, yR: corresponding pixels in right image IR • dx: disparity value of pixel xL in IL • xR = xL − dx, yR = yL − dx • Winner-take-all disparity: dx = arg min over d of the correlation cost CL(xL, d)
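A minimal sketch of one entry of the color-weighted correlation volume, assuming the window fits inside both images. The summed absolute color difference stands in for the Birchfield-Tomasi sampling-insensitive dissimilarity [1] used by the paper, and correlation_cost / weight_fn are illustrative names (weight_fn could be the support_weight sketch above).

```python
import numpy as np

def correlation_cost(left, right, x, y, d, radius, weight_fn):
    """Color-weighted matching cost C_L(x, d) for one pixel and one disparity."""
    wl = weight_fn(left, x, y, radius)           # weights in the left window
    wr = weight_fn(right, x - d, y, radius)      # weights in the right window
    ys = slice(y - radius, y + radius + 1)
    xl = slice(x - radius, x + radius + 1)
    xr = slice(x - d - radius, x - d + radius + 1)
    # Simplified dissimilarity: summed absolute color difference.
    diss = np.abs(left[ys, xl].astype(float) - right[ys, xr].astype(float)).sum(axis=2)
    w = wl * wr
    return (w * diss).sum() / w.sum()

# Winner-take-all initial disparity for the pixel:
#   d_x = argmin over d of correlation_cost(left, right, x, y, d, radius, weight_fn)
```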

  17. Correlation Volume • (Images: bad-pixel map and disparity map) [1] S. Birchfield and C. Tomasi, “A Pixel Dissimilarity Measure That Is Insensitive to Image Sampling,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 20, pp. 401-406, 1998. [27] K.-J. Yoon and I.-S. Kweon, “Adaptive Support-Weight Approach for Correspondence Search,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, pp. 650-656, 2006.

  18. Initialization • Initial Data Term • Total energy = Data Term + Smooth Term • Computed from the correlation volume • Given an iteration index i = 0 here because it will be iteratively refined

  19. Initial Data Term • Initial Data Term: • ηbp: twice the average of the correlation volume, to exclude outliers • (Equation image on the slide; its annotations are the correlation volume, the average correlation volume × 2, and a constant 0.2)
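A small sketch of how this truncation might look. Treating the initial data term as the correlation cost clipped at ηbp = 2 × mean is an assumption, since the full expression on the slide (including the 0.2 factor) is not recoverable from the transcript.

```python
import numpy as np

def initial_data_term(cost_volume):
    """cost_volume: (H, W, D) correlation volume. Returns ED(0), sketched as
    the cost truncated at eta_bp = twice the average cost (assumption)."""
    eta_bp = 2.0 * cost_volume.mean()
    return np.minimum(cost_volume, eta_bp)
```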

  20. Initialization • Hierarchical Belief Propagation • Employed with the data term and the reference image • Resulting in the initial left and right disparity maps DL(0) and DR

  21. Pixel Classification(Block 2)

  22. Pixel Classification • (Block 2 diagram: inputs and outputs)

  23. Pixel Classification • Mutual Consistency Check • Requires that the disparity values from the left and right disparity maps are consistent, i.e., DL(xL) = DR(xL − DL(xL)) • Not Pass: occluded pixel • Pass: unoccluded pixel => further classified by the correlation confidence measure
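The left-right check can be sketched as below, assuming integer disparity maps of equal size; the tolerance parameter tol is illustrative.

```python
import numpy as np

def mutual_consistency(d_left, d_right, tol=0):
    """True where the left and right disparity maps agree (unoccluded),
    False where they do not (occluded)."""
    H, W = d_left.shape
    xs = np.arange(W)[None, :]
    ys = np.arange(H)[:, None]
    # Column of the matching pixel in the right image, clamped to the border.
    xr = np.clip(xs - d_left.astype(int), 0, W - 1)
    return np.abs(d_left - d_right[ys, xr]) <= tol
```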

  24. Pixel Classification • Correlation Confidence • Based on how distinctive the highest peak in a pixel's correlation profile is • Compares the cost for the best disparity value (dx = arg min over d of CL(x, d)) with the cost for the second-best disparity value • If the confidence > αs (= 0.04): stable • Else: unstable
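A sketch of the stable/unstable split over the correlation profile. The relative gap between the best and second-smallest costs is used here as the confidence, which is an assumption (the slide's exact formula is not recoverable, and the paper may exclude disparities adjacent to the best one); αs = 0.04 follows the slide. In practice this would be applied only to pixels that passed the mutual consistency check.

```python
import numpy as np

def classify_stability(cost_volume, alpha_s=0.04):
    """cost_volume: (H, W, D). Returns True for stable pixels."""
    part = np.partition(cost_volume, (0, 1), axis=2)
    best, second = part[..., 0], part[..., 1]
    # Confidence: relative gap between best and second-best cost (assumption).
    confidence = (second - best) / np.maximum(second, 1e-9)
    return confidence > alpha_s
```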

  25. Iterative Refinement(Block 3)

  26. Input • Goal: to propagate information from the stable pixels to the unstable and the occluded pixels • (Iteration diagram)

  27. Iterative Refinement • Color Segmentation • Color segments in IL are extracted by Mean Shift [6] [6] D. Comaniciu and P. Meer, “Mean Shift: A Robust Approach Toward Feature Space Analysis,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, pp. 603-619, 2002.
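For illustration only, the segmentation step could be approximated with scikit-learn's MeanShift on joint position-color features; the paper uses the mean-shift segmentation of [6], and the bandwidth and spatial scaling below are arbitrary choices, not the paper's settings.

```python
import numpy as np
from sklearn.cluster import MeanShift

def segment_colors(img, spatial_scale=1.0, bandwidth=8.0):
    """Rough stand-in for mean-shift color segmentation [6].
    img: (H, W, 3). Returns an (H, W) integer label map."""
    H, W = img.shape[:2]
    ys, xs = np.mgrid[0:H, 0:W]
    feats = np.column_stack([
        xs.ravel() * spatial_scale,               # spatial coordinates
        ys.ravel() * spatial_scale,
        img.reshape(-1, 3).astype(np.float64),    # color coordinates
    ])
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit(feats).labels_
    return labels.reshape(H, W)
```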

  28. Iterative Refinement • Plane Fitting • Using the disparity values for the stable pixels in each color segment • Disparity values are taken from the current hypothesis for the left disparity map DL(i) (initially DL(0)) • The plane-fitted depth map is used as a regularization for the new disparity estimation.

  29. Iterative Refinement • Plane Fitting • Using RANSAC [10] • Iterates until the plane parameters converge [10] M.A. Fischler and R.C. Bolles, “Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography,” Comm. ACM, vol. 24, pp. 381-395, 1981.
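A minimal RANSAC [10] sketch for fitting a disparity plane d = a·x + b·y + c to the stable pixels of one segment. The iteration count and inlier tolerance are illustrative, and the single least-squares refit at the end stands in for the paper's iterate-until-convergence loop; at least three stable pixels are assumed.

```python
import numpy as np

def ransac_plane(xs, ys, ds, iters=200, inlier_tol=1.0, rng=None):
    """xs, ys, ds: 1-D arrays of stable-pixel coordinates and disparities.
    Returns plane parameters (a, b, c) with d ~= a*x + b*y + c."""
    rng = np.random.default_rng() if rng is None else rng
    pts = np.column_stack([xs, ys, np.ones_like(xs, dtype=float)])
    best_inliers, best_count = None, -1
    for _ in range(iters):
        idx = rng.choice(len(ds), size=3, replace=False)
        try:
            plane = np.linalg.solve(pts[idx], ds[idx])   # exact fit to 3 samples
        except np.linalg.LinAlgError:
            continue                                     # degenerate sample
        inliers = np.abs(pts @ plane - ds) < inlier_tol
        if inliers.sum() > best_count:
            best_inliers, best_count = inliers, inliers.sum()
    # Least-squares refit on the inliers of the best sample.
    plane, *_ = np.linalg.lstsq(pts[best_inliers], ds[best_inliers], rcond=None)
    return plane
```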

  30. Iterative Refinement • Plane fitting output: Dpf(i) • The ratio of stable pixels in each segment is computed • If Ratio > ηS (= 0.7): • Stable pixels: keep DL(i) • Unstable and occluded pixels: take Dpf(i) • If Ratio ≤ ηS: • All pixels: take Dpf(i)
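Putting this per-segment rule into code, assuming per-segment plane parameters (e.g. from the RANSAC sketch above), a boolean stable mask, and a segment label image; all names here are illustrative.

```python
import numpy as np

def apply_plane_fit(d_current, seg_labels, stable, planes, eta_s=0.7):
    """d_current: DL(i); seg_labels: (H, W) segment ids; stable: (H, W) bool;
    planes: dict mapping segment id -> (a, b, c). Returns Dpf(i)."""
    H, W = d_current.shape
    ys, xs = np.mgrid[0:H, 0:W]
    d_pf = d_current.astype(float).copy()
    for seg, (a, b, c) in planes.items():
        mask = seg_labels == seg
        if stable[mask].mean() > eta_s:
            mask = mask & ~stable        # replace only unstable / occluded pixels
        d_pf[mask] = a * xs[mask] + b * ys[mask] + c
    return d_pf
```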

  31. Iteration

  32. Iterative Refinement • Absolute difference between: • DL(i+1): new disparity map • Dpf(i): plane-fitted disparity map • Data Term: (equation image; the constants 2.0, 0.5, and 0.05 appear on the slide)

  33. Belief Propagation • The core energy minimization of our algorithm is carried out via the hierarchical BP algorithm. • Total Energy for Pixel X • Data Term • Smooth Term

  34. Max-Product Belief Propagation • Max-Product BP [25]: • Message vector passed from pixel X to one of its neighbors Y • Data Term • Jump Cost [25] Y. Weiss and W. Freeman, “On the Optimality of Solutions of the Max-Product Belief Propagation Algorithm in Arbitrary Graphs,” IEEE Trans. Information Theory, vol. 2, pp. 732-735, 2001.

  35. Max-Product Belief Propagation • (Diagram: pixel X passes a message to its neighbor Y)

  36. Max-Product Belief Propagation • Jump Cost: a function of the disparity difference of pixel X and its neighbor Y • dx: disparity of pixel X • d: disparity of pixel Y (X's neighbor) • αbp: the number of disparity levels / 8 • ρs: 1 − (normalized average color difference) • ρbp: the rate of increase in the cost

  37. Max-Product Belief Propagation • Total energy for pixel X: • Data Term • Smooth Term • Finally, the label d that minimizes the total energy for each pixel is selected.
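The max-product update can be sketched in its min-sum form on a 4-connected grid as below: data_cost plays the role of the data term and the smooth matrix plays the role of the jump cost. The jump cost here is a plain truncated linear term with αbp = D/8 as on slide 36; the per-edge color weight ρs and the exact value of ρbp are left out (a full implementation would fold ρs into the smoothness of each neighbor pair), and border handling via np.roll is a wrap-around simplification. Function names are illustrative.

```python
import numpy as np

def jump_cost_matrix(num_disp, rho_bp=1.0):
    """Truncated linear jump cost s(dx, d); alpha_bp = num_disp / 8 (slide 36)."""
    alpha_bp = num_disp / 8.0
    d = np.arange(num_disp)
    return np.minimum(rho_bp * np.abs(d[:, None] - d[None, :]), alpha_bp)

def bp_sweep(data_cost, messages, smooth):
    """One synchronous min-sum sweep.
    data_cost: (H, W, D); messages: (4, H, W, D), the message each pixel has
    received from its left/right/up/down neighbor; smooth: (D, D)."""
    new_msg = np.empty_like(messages)
    total_in = messages.sum(axis=0)
    # (slot filled at the receiver, slot excluded at the sender, roll axis, shift)
    sends = [(0, 1, 1, +1),   # send right -> stored by the receiver as "from left"
             (1, 0, 1, -1),   # send left
             (2, 3, 0, +1),   # send down  -> stored as "from above"
             (3, 2, 0, -1)]   # send up
    for fill, excl, axis, shift in sends:
        h = data_cost + total_in - messages[excl]        # cost over the sender label
        out = (h[..., :, None] + smooth).min(axis=2)     # minimize over the sender label
        new_msg[fill] = np.roll(out, shift, axis=axis)   # deliver to the neighbor
    return new_msg

def select_disparities(data_cost, messages):
    """Slide 37: pick the label d minimizing data term + incoming messages."""
    return (data_cost + messages.sum(axis=0)).argmin(axis=2)
```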

  38. Hierarchical Belief Propagation • The standard loopy BP algorithm is too slow. • Hierarchical BP [9] runs much faster while maintaining comparable accuracy. • Works in a coarse-to-fine manner [9] P.F. Felzenszwalb and D.P. Huttenlocher, “Efficient Belief Propagation for Early Vision,” Proc. IEEE CS Conf. Computer Vision and Pattern Recognition, vol. 1, pp. 261-268, 2004.

  39. Hierarchical Belief Propagation • (Diagram: pyramid levels, from coarser Level 1 down to finer Level 0)
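A small sketch of the coarse-to-fine machinery of [9]: coarser data terms are formed by summing 2×2 blocks, and each fine-level pixel inherits its coarse parent's messages as initialization. The function names are illustrative, and odd image sizes are handled simplistically by cropping.

```python
import numpy as np

def downsample_cost(data_cost, levels):
    """Build coarser data terms: each coarse pixel's cost is the sum of the
    costs of its 2x2 children. data_cost: (H, W, D)."""
    pyramid = [data_cost]
    for _ in range(levels - 1):
        c = pyramid[-1]
        h, w = (c.shape[0] // 2) * 2, (c.shape[1] // 2) * 2
        c = c[:h, :w]
        pyramid.append(c[0::2, 0::2] + c[1::2, 0::2] + c[0::2, 1::2] + c[1::2, 1::2])
    return pyramid

def upsample_messages(coarse_msgs):
    """Initialize the finer level: each fine pixel copies its coarse parent's
    messages (crop to the finer level's shape at the caller if needed)."""
    return np.repeat(np.repeat(coarse_msgs, 2, axis=1), 2, axis=2)
```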

  40. Fast-Converging Belief Propagation • A large number of iterations is required to guarantee convergence in a standard BP algorithm. • Fast-Converging BP effectively removes the redundant computation. • Only the pixels that have not yet converged (value bigger than ηZ = 0.1) are updated
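The selective update can be sketched as a per-pixel activity mask. Measuring convergence by the mean absolute message change is an assumption (the slide only says "value bigger than ηZ"); ηZ = 0.1 follows the slide.

```python
import numpy as np

def active_pixels(prev_msgs, new_msgs, eta_z=0.1):
    """prev_msgs, new_msgs: (4, H, W, D) message arrays from two consecutive
    sweeps. Returns an (H, W) mask of pixels that still need updating."""
    change = np.abs(new_msgs - prev_msgs).mean(axis=(0, 3))
    return change > eta_z
```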

  41. Fast-Converging Belief Propagation

  42. Depth Enhancement

  43. Depth Enhancement • To reduce the discontinuities caused by the quantization, a sub-pixel estimation algorithm is proposed. • Cost Function:

  44. Depth Enhancement • The depth with the minimum of the cost function: • d: the discrete depth with the minimal cost • d+: d + 1 • d−: d − 1 • Replace each value with the average of those values that are within one disparity over a 9 × 9 window
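A sketch of the sub-pixel step, assuming the minimum of the cost function on slide 43 is found by the standard parabola fit through (d−, d, d+); the closed form below is that standard fit, not necessarily the paper's exact expression, and the subsequent 9 × 9 averaging described above is omitted here.

```python
import numpy as np

def subpixel_disparity(cost_volume, d_int):
    """cost_volume: (H, W, D); d_int: (H, W) integer disparities.
    Sub-pixel refinement via a parabola through C(d-1), C(d), C(d+1):
        d_sub = d + (C(d-) - C(d+)) / (2 * (C(d+) + C(d-) - 2 * C(d)))"""
    H, W = d_int.shape
    D = cost_volume.shape[2]
    d = np.clip(d_int.astype(int), 1, D - 2)           # keep d-1 and d+1 in range
    ys, xs = np.mgrid[0:H, 0:W]
    c0 = cost_volume[ys, xs, d]
    cm = cost_volume[ys, xs, d - 1]
    cp = cost_volume[ys, xs, d + 1]
    denom = cp + cm - 2.0 * c0
    offset = np.where(np.abs(denom) > 1e-9, (cm - cp) / (2.0 * denom), 0.0)
    return d + offset
```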

  45. Experiments

  46. Experiments Parameter Settings Used Throughout:

  47. Experiments Parameter Settings Used Throughout:

  48. Experiments Results on the Middlebury Data Sets with Error Threshold 1 (error %) • nonocc: the subset of the nonoccluded pixels • disc: the subset of the pixels near the occluded areas • all: the subset of the pixels being either nonoccluded or half-occluded
