
776 Computer Vision



Presentation Transcript


  1. 776 Computer Vision Jan-Michael Frahm, Enrique Dunn Spring 2012

  2. From Previous Lecture • Homographies • Fundamental matrix • Normalized 8-point Algorithm • Essential Matrix

  3. Plane Homography for Calibrated Cameras • In the calibrated case: two cameras P = K[I | 0] and P’ = K’[R | t], and a plane π = (nᵀ, d)ᵀ • The induced homography is x’ = Hx with H = K’(R − t nᵀ/d)K⁻¹ • For the plane at infinity: H = K’RK⁻¹
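The plane-induced homography above, as a minimal numpy sketch (the function name and the normalization to H[2,2] = 1 are illustrative choices, not from the slides):

```python
import numpy as np

def plane_homography(K1, K2, R, t, n, d):
    """Homography induced by the plane (n^T, d)^T between the cameras
    P = K1[I | 0] and P' = K2[R | t]:  H = K2 (R - t n^T / d) K1^{-1}."""
    H = K2 @ (R - np.outer(t, n) / d) @ np.linalg.inv(K1)
    return H / H[2, 2]                      # scale so that H[2, 2] = 1

# As d -> infinity the plane term vanishes and H -> K2 R K1^{-1},
# the homography of the plane at infinity from the slide.
```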

  4. The Fundamental Matrix F (Figure: cameras P0 and P1 observe a 3D point M at image points m0 and m1; e1 is the epipole in the second image.) • The epipolar line of m0 passes through e1 and Hm0, so l1 = [e1]ₓHm0 and F = [e]ₓH is the Fundamental Matrix

  5. The eight-point algorithm • Minimize Σᵢ (x’ᵢᵀF xᵢ)² under the constraint F₃₃ = 1 • x = (u, v, 1)ᵀ, x’ = (u’, v’, 1)ᵀ
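A sketch of the normalized 8-point algorithm recapped above. It follows the standard Hartley formulation (solve for F with ‖f‖ = 1 via SVD rather than the F₃₃ = 1 parameterization on the slide, then rescale at the end); the function names are illustrative:

```python
import numpy as np

def normalize(pts):
    """Translate points to zero mean and scale so the average distance from the
    origin is sqrt(2) (Hartley normalization); returns Nx3 points and the 3x3 T."""
    mean = pts.mean(axis=0)
    scale = np.sqrt(2) / np.mean(np.linalg.norm(pts - mean, axis=1))
    T = np.array([[scale, 0, -scale * mean[0]],
                  [0, scale, -scale * mean[1]],
                  [0, 0, 1]])
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    return (T @ pts_h.T).T, T

def eight_point(x1, x2):
    """Normalized eight-point algorithm: x1, x2 are Nx2 arrays (N >= 8) of
    corresponding pixels; returns F such that x2_h^T F x1_h = 0."""
    n1, T1 = normalize(x1)
    n2, T2 = normalize(x2)
    # Each correspondence contributes one row of the linear system A f = 0.
    A = np.column_stack([n2[:, 0] * n1[:, 0], n2[:, 0] * n1[:, 1], n2[:, 0],
                         n2[:, 1] * n1[:, 0], n2[:, 1] * n1[:, 1], n2[:, 1],
                         n1[:, 0], n1[:, 1], np.ones(len(x1))])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce rank 2 by zeroing the smallest singular value.
    U, S, Vt = np.linalg.svd(F)
    F = U @ np.diag([S[0], S[1], 0]) @ Vt
    F = T2.T @ F @ T1                       # undo the normalization
    return F / F[2, 2]                      # fix the F33 = 1 gauge from the slide
```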

  6. Epipolar constraint: Calibrated case (Figure: 3D point X projects to x and x’ in the two views.) • The vectors x, t, and Rx’ are coplanar: x · [t × (Rx’)] = 0, i.e. xᵀEx’ = 0 with E = [t]ₓR the Essential Matrix (Longuet-Higgins, 1981) slide: S. Lazebnik

  7. Epipolar constraint: Calibrated case • The vectors x, t, and Rx’ are coplanar, so xᵀEx’ = 0 with E = [t]ₓR • E is rank two, so det(E) = 0: a cubic constraint on its entries slide: S. Lazebnik
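A small numeric check of the calibrated epipolar constraint (the pose and the 3D point below are made up purely for illustration):

```python
import numpy as np

def skew(t):
    """Cross-product matrix: skew(t) @ v == np.cross(t, v)."""
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0]])

# Made-up relative pose: small rotation about y, translation along x.
theta = 0.1
R = np.array([[np.cos(theta), 0, np.sin(theta)],
              [0, 1, 0],
              [-np.sin(theta), 0, np.cos(theta)]])
t = np.array([0.5, 0.0, 0.0])
E = skew(t) @ R                      # essential matrix E = [t]_x R

X2 = np.array([1.0, 2.0, 5.0])       # 3D point in the second camera's frame
x_prime = X2 / X2[2]                 # normalized image point in camera 2
X1 = R @ X2 + t                      # same point in the first camera's frame
x = X1 / X1[2]                       # normalized image point in camera 1
print(x @ E @ x_prime)               # ~0: x, t, and Rx' are coplanar
print(np.linalg.det(E))              # ~0: the cubic constraint det(E) = 0
```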

  8. Today: Binocular stereo • Given a calibrated binocular stereo pair, fuse it to produce a depth image Where does the depth information come from?

  9. Binocular stereo • Given a calibrated binocular stereo pair, fuse it to produce a depth image • Humans can do it Stereograms: Invented by Sir Charles Wheatstone, 1838

  10. Binocular stereo • Given a calibrated binocular stereo pair, fuse it to produce a depth image • Humans can do it Autostereograms: www.magiceye.com


  12. Real-time stereo • Used for robot navigation (and other tasks) • Software-based real-time stereo techniques • Nomad robot searches for meteorites in Antarctica • http://www.frc.ri.cmu.edu/projects/meteorobot/index.html slide: R. Szeliski

  13. Stereo image pair slide: R. Szeliski

  14. Anaglyphs • Public Library, Stereoscopic Looking Room, Chicago, by Phillips, 1923 • http://www.rainbowsymphony.com/freestuff.html (Wikipedia for images) slide: R. Szeliski

  15. Stereo: epipolar geometry (Figure: viewing rays, the epipolar plane, and the epipolar lines of the two views.) • Match features along epipolar lines slide: R. Szeliski

  16. Simplest Case: Parallel images • Image planes of cameras are parallel to each other and to the baseline • Camera centers are at same height • Focal lengths are the same slide: S. Lazebnik

  17. Simplest Case: Parallel images • Image planes of cameras are parallel to each other and to the baseline • Camera centers are at same height • Focal lengths are the same • Then, epipolar lines fall along the horizontal scan lines of the images slide: S. Lazebnik

  18. Essential matrix for parallel images • Epipolar constraint: xᵀEx’ = 0 with E = [t]ₓR • R = I, t = (T, 0, 0), so E = [t]ₓ = [[0, 0, 0], [0, 0, −T], [0, T, 0]]

  19. Essential matrix for parallel images • With R = I and t = (T, 0, 0): xᵀEx’ = (u, v, 1)·[t]ₓ·(u’, v’, 1)ᵀ = Tv’ − Tv = 0 • The y-coordinates of corresponding points are the same!
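A quick numeric check of the parallel-camera case (the baseline and pixel coordinates are arbitrary example values):

```python
import numpy as np

T = 0.5                                   # baseline along x
E = np.array([[0, 0, 0],
              [0, 0, -T],
              [0, T, 0]])                 # E = [t]_x with R = I, t = (T, 0, 0)

x  = np.array([120.0, 45.0, 1.0])         # (u, v, 1) in the left image
xp = np.array([ 97.0, 45.0, 1.0])         # same v: constraint satisfied
print(x @ E @ xp)                         # 0.0
xp_bad = np.array([97.0, 60.0, 1.0])      # different v: constraint violated
print(x @ E @ xp_bad)                     # T * (60 - 45) = 7.5
```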

  20. Depth from disparity (Figure: point X at depth z seen by cameras at O and O’ with focal length f and baseline B, at image positions x and x’.) • disparity = x − x’ = B·f / z • Disparity is inversely proportional to depth!
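The disparity-to-depth relation above as a tiny sketch (the focal length and baseline values are arbitrary examples):

```python
import numpy as np

def depth_from_disparity(disparity, f, B):
    """z = f * B / d for a rectified pair with focal length f (pixels)
    and baseline B (metres); zero disparity maps to infinite depth."""
    d = np.asarray(disparity, dtype=float)
    with np.errstate(divide="ignore"):
        return np.where(d > 0, f * B / d, np.inf)

# e.g. f = 700 px, B = 0.12 m, d = 35 px  ->  z = 2.4 m
print(depth_from_disparity(35, f=700, B=0.12))
```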

  21. Depth Sampling • Depth sampling for integer pixel disparity • Quadratic precision loss with depth: a disparity step Δd corresponds to a depth step Δz ≈ z²·Δd / (f·B)
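The quadratic loss is easy to see by listing the depths that integer disparities can represent (again with arbitrary f and B):

```python
import numpy as np

f, B = 700.0, 0.12                      # assumed focal length (px) and baseline (m)
d = np.arange(1, 6)[::-1]               # integer disparities 5, 4, 3, 2, 1
z = f * B / d                           # depths sampled by those disparities
print(z)                                # [16.8  21.  28.  42.  84.]
print(np.diff(z))                       # gaps 4.2, 7., 14., 42. -- grow ~ z^2
```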

  22. Depth Sampling • Depth sampling for a wider baseline: the same integer disparities sample depth more finely

  23. Depth Sampling • Depth sampling is in O(resolution⁶)

  24. Stereo: epipolar geometry • for two images (or images with collinear camera centers), can find epipolar lines • epipolar lines are the projection of the pencil of planes passing through the centers • Rectification: warping the input images (perspective transformation) so that epipolar lines are horizontal slide: R. Szeliski

  25. Rectification • Project each image onto the same plane, which is parallel to the baseline • Resample lines (and shear/stretch) to place lines in correspondence, and minimize distortion • [Loop and Zhang, CVPR’99] slide: R. Szeliski
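For reference, a hedged sketch of calibrated rectification with OpenCV. Note that this uses cv2.stereoRectify (Bouguet’s calibrated method), not the Loop–Zhang algorithm cited on the slide, and all calibration values and file names below are placeholders:

```python
import cv2
import numpy as np

# Assumed inputs: a calibrated pair (intrinsics K1, K2, distortion dist1, dist2,
# relative pose R, T) and the two images; the values here are placeholders.
left  = cv2.imread("left.png")
right = cv2.imread("right.png")
h, w = left.shape[:2]
K1 = K2 = np.array([[700.0, 0, w / 2], [0, 700.0, h / 2], [0, 0, 1]])
dist1 = dist2 = np.zeros(5)
R = np.eye(3)
T = np.array([0.12, 0.0, 0.0])            # 12 cm baseline along x

R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(
    K1, dist1, K2, dist2, (w, h), R, T, alpha=0)

# Per-pixel lookup maps; after remapping, epipolar lines are horizontal scanlines.
map1x, map1y = cv2.initUndistortRectifyMap(K1, dist1, R1, P1, (w, h), cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K2, dist2, R2, P2, (w, h), cv2.CV_32FC1)
left_rect  = cv2.remap(left,  map1x, map1y, cv2.INTER_LINEAR)
right_rect = cv2.remap(right, map2x, map2y, cv2.INTER_LINEAR)
```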

  26. Rectification BAD! slide: R. Szeliski

  27. Rectification GOOD! slide: R. Szeliski

  28. Problem: Rectification for forward moving cameras • Required image can become very large (infinitely large) when the epipole is in the image • Alternative rectifications are available that work along the epipolar lines directly in the images • Pollefeys et al. 1999, “A simple and efficient rectification method for general motion”, ICCV

  29. Your basic stereo algorithm • For each epipolar line • For each pixel in the left image • compare with every pixel on same epipolar line in right image • pick pixel with minimum match cost • Improvement: match windows • This should look familiar... slide: R. Szeliski
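A brute-force sketch of this loop for a rectified grayscale pair, using an SSD window cost (the window size and disparity range are arbitrary choices, nothing here is tuned):

```python
import numpy as np

def block_matching(left, right, max_disp=64, w=5):
    """Brute-force window-based stereo on a rectified grayscale pair:
    for each left-image pixel, pick the disparity with the smallest SSD
    over a (2w+1) x (2w+1) window along the same scanline."""
    H, W = left.shape
    disp = np.zeros((H, W), dtype=np.int32)
    left = left.astype(np.float64)
    right = right.astype(np.float64)
    for y in range(w, H - w):
        for x in range(w + max_disp, W - w):
            ref = left[y - w:y + w + 1, x - w:x + w + 1]
            costs = [np.sum((ref - right[y - w:y + w + 1,
                                         x - d - w:x - d + w + 1]) ** 2)
                     for d in range(max_disp)]
            disp[y, x] = int(np.argmin(costs))
    return disp
```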

  30. Finding correspondences • apply feature matching criterion (e.g., correlation or Lucas-Kanade) at all pixels simultaneously • search only over epipolar lines (many fewer candidate positions) slide: R. Szeliski

  31. Correspondence search • Slide a window along the right scanline and compare contents of that window with the reference window in the left image • Matching cost: SSD or normalized correlation (Figure: matching cost plotted against disparity along the scanline.) slide: S. Lazebnik

  32. Correspondence search (Figure: SSD matching cost along the scanline.) slide: S. Lazebnik

  33. Correspondence search (Figure: normalized correlation matching cost along the scanline.) slide: S. Lazebnik
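The two matching costs named above, as plain window-to-window functions (a sketch; the windows are assumed to be equal-sized grayscale patches, e.g. those extracted in the block-matching loop earlier):

```python
import numpy as np

def ssd(a, b):
    """Sum of squared differences: lower is better."""
    return np.sum((a.astype(float) - b.astype(float)) ** 2)

def ncc(a, b):
    """Normalized cross-correlation in [-1, 1]: higher is better, and
    invariant to affine brightness changes a -> alpha * a + beta."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return np.sum(a * b) / denom if denom > 0 else 0.0
```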

  34. Neighborhood size • Smaller neighborhood: more details • Larger neighborhood: fewer isolated mistakes (Figure: disparity maps for window sizes w = 3 and w = 20.) slide: R. Szeliski

  35. Matching criteria • Raw pixel values (correlation) • Band-pass filtered images [Jones & Malik 92] • “Corner” like features [Zhang, …] • Edges [many people…] • Gradients [Seitz 89; Scharstein 94] • Rank statistics [Zabih & Woodfill 94] • Intervals [Birchfield and Tomasi 96] • Overview of matching metrics and their performance: • H. Hirschmüller and D. Scharstein, “Evaluation of Stereo Matching Costs on Images with Radiometric Differences”, PAMI 2008 slide: R. Szeliski

  36. Adaptive Weighting • Boundary preserving • More costly

  37. Failures of correspondence search • Occlusions, repetition • Textureless surfaces • Non-Lambertian surfaces, specularities slide: S. Lazebnik

  38. Stereo: certainty modeling • Compute certainty map from correlations (Figure: input image, depth map, and certainty map.) slide: R. Szeliski
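One common heuristic for such a certainty map (an illustrative choice, not necessarily the one used on the slide) is how peaked the matching-cost curve is, e.g. the margin between the best and second-best cost at each pixel:

```python
import numpy as np

def certainty_from_costs(costs):
    """costs: matching costs over all candidate disparities for one pixel
    (lower = better). Returns a value in [0, 1]; a single sharp minimum
    gives high certainty, a flat or ambiguous cost curve gives low."""
    c = np.sort(np.asarray(costs, dtype=float))
    best, second = c[0], c[1]
    return 1.0 - best / second if second > 0 else 0.0

print(certainty_from_costs([5.0, 40.0, 40.5, 40.8]))   # peaked: ~0.88
print(certainty_from_costs([39.0, 40.0, 40.5, 40.8]))  # flat:   ~0.03
```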

  39. Results with window search Data Window-based matching Ground truth slide: S. Lazebnik

  40. Better methods exist... Ground truth Graph cuts • Y. Boykov, O. Veksler, and R. Zabih, Fast Approximate Energy Minimization via Graph Cuts, PAMI 2001 • For the latest and greatest: http://www.middlebury.edu/stereo/ slide: S. Lazebnik

  41. How can we improve window-based matching? • The similarity constraint is local (each reference window is matched independently) • Need to enforce non-local correspondence constraints slide: S. Lazebnik

  42. Non-local constraints • Uniqueness • For any point in one image, there should be at most one matching point in the other image slide: S. Lazebnik

  43. Non-local constraints • Uniqueness • For any point in one image, there should be at most one matching point in the other image • Ordering • Corresponding points should be in the same order in both views slide: S. Lazebnik

  44. Non-local constraints • Uniqueness • For any point in one image, there should be at most one matching point in the other image • Ordering • Corresponding points should be in the same order in both views Ordering constraint doesn’t hold slide: S. Lazebnik

  45. Non-local constraints • Uniqueness • For any point in one image, there should be at most one matching point in the other image • Ordering • Corresponding points should be in the same order in both views • Smoothness • We expect disparity values to change slowly (for the most part) slide: S. Lazebnik
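The uniqueness constraint above is commonly enforced with a left-right consistency check: compute disparities in both directions and keep only mutually consistent matches. A sketch (the tolerance and the −1 “invalid” marker are arbitrary conventions):

```python
import numpy as np

def left_right_consistency(disp_left, disp_right, tol=1):
    """Enforce uniqueness: keep a left-image disparity only if the pixel it
    maps to in the right image maps back (within tol pixels). Inconsistent
    pixels (occlusions, mismatches) are marked -1."""
    H, W = disp_left.shape
    out = disp_left.copy()
    xs = np.arange(W)
    for y in range(H):
        xr = (xs - disp_left[y]).astype(int)     # where each left pixel lands
        valid = (xr >= 0) & (xr < W)
        back = np.full(W, -10**6)
        back[valid] = disp_right[y, xr[valid]]
        out[y, ~valid | (np.abs(disp_left[y] - back) > tol)] = -1
    return out
```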

  46. Scanline stereo Left image Right image • Try to coherently match pixels on the entire scanline • Different scanlines are still optimized independently slide: S. Lazebnik

  47. “Shortest paths” for scan-line stereo (Figure: grid of left-scanline vs. right-scanline positions; a path from s to t passes through nodes such as p and q, choosing at each step a correspondence, a left occlusion, or a right occlusion.) • Can be implemented with dynamic programming • Ohta & Kanade ’85, Cox et al. ‘96 Slide credit: Y. Boykov
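A compact sketch of scanline dynamic programming in this spirit: the three moves are match, left occlusion, and right occlusion, with a squared intensity difference as the match cost and a constant occlusion penalty (both assumed choices, not taken from the cited papers):

```python
import numpy as np

def scanline_dp(left_row, right_row, occlusion_cost=20.0):
    """Dynamic-programming stereo on one pair of scanlines: cost[i, j] is the
    best cost of matching the first i left pixels against the first j right
    pixels; each step is a match, a left occlusion, or a right occlusion."""
    L, R = len(left_row), len(right_row)
    cost = np.zeros((L + 1, R + 1))
    cost[:, 0] = np.arange(L + 1) * occlusion_cost
    cost[0, :] = np.arange(R + 1) * occlusion_cost
    move = np.zeros((L + 1, R + 1), dtype=np.int8)   # 0=match, 1=left occ, 2=right occ
    for i in range(1, L + 1):
        for j in range(1, R + 1):
            match = cost[i-1, j-1] + (float(left_row[i-1]) - float(right_row[j-1])) ** 2
            occ_l = cost[i-1, j] + occlusion_cost    # left pixel i-1 unmatched
            occ_r = cost[i, j-1] + occlusion_cost    # right pixel j-1 unmatched
            best = min(match, occ_l, occ_r)
            cost[i, j] = best
            move[i, j] = (match, occ_l, occ_r).index(best)
    # Backtrack to read off disparities (-1 marks occluded left pixels).
    disp = np.full(L, -1, dtype=int)
    i, j = L, R
    while i > 0 and j > 0:
        if move[i, j] == 0:
            disp[i-1] = (i-1) - (j-1)
            i, j = i - 1, j - 1
        elif move[i, j] == 1:
            i -= 1
        else:
            j -= 1
    return disp
```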

  48. Coherent stereo on 2D grid • Scanline stereo generates streaking artifacts • Can’t use dynamic programming to find spatially coherent disparities/ correspondences on a 2D grid slide: S. Lazebnik

  49. Stereo matching as energy minimization • E(D) = Σᵢ (W₁(i) − W₂(i + D(i)))² + μ Σ_{neighbors i, j} ρ(D(i) − D(j)) • The first sum is the data term (window W₁ around pixel i in image I₁ compared with window W₂ at the disparity-shifted position in I₂); the second is the smoothness term over neighboring pixels of the disparity map D • Energy functions of this form can be minimized using graph cuts • Y. Boykov, O. Veksler, and R. Zabih, Fast Approximate Energy Minimization via Graph Cuts, PAMI 2001 slide: S. Lazebnik
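A sketch that merely evaluates an energy of this form for a candidate disparity map (pixel-wise data term and a truncated-linear ρ; μ and the truncation threshold are assumed constants). A graph-cut / α-expansion solver would search over D for the minimizer:

```python
import numpy as np

def stereo_energy(I1, I2, D, mu=10.0, trunc=2.0):
    """Evaluate E(D) = sum_i (I1(i) - I2(i + D(i)))^2
                     + mu * sum_{neighbors} min(|D(i) - D(j)|, trunc)
    for grayscale images I1, I2 and an integer disparity map D."""
    H, W = D.shape
    ys, xs = np.mgrid[0:H, 0:W]
    xs2 = np.clip(xs + D.astype(int), 0, W - 1)      # disparity-shifted columns
    data = np.sum((I1.astype(float) - I2[ys, xs2].astype(float)) ** 2)
    smooth = (np.sum(np.minimum(np.abs(np.diff(D, axis=0)), trunc)) +
              np.sum(np.minimum(np.abs(np.diff(D, axis=1)), trunc)))
    return data + mu * smooth
```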

  50. Active stereo with structured light (Figure: a camera and a projector.) • Project “structured” light patterns onto the object • Simplifies the correspondence problem • Allows us to use only one camera L. Zhang, B. Curless, and S. M. Seitz. Rapid Shape Acquisition Using Color Structured Light and Multi-pass Dynamic Programming. 3DPVT 2002 slide: S. Lazebnik
