
Structured light and active ranging techniques Class 11

Presentation Transcript


  1. Structured light and active ranging techniques (Class 11)

  2. Last Tuesday: stereo • per-pixel optimization • per-scanline optimization • full-image optimization

  3. original image pair • planar rectification • polar rectification

  4. Plane-sweep multi-view matching • Simple algorithm for multiple cameras • No rectification necessary, but also no gain • Doesn’t deal with occlusions • Collins’96; Roy and Cox’98 (GC); Yang et al.’02/’03 (GPU)
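
Below is a minimal plane-sweep sketch in Python (numpy/scipy), assuming calibrated grayscale views expressed in the reference camera's frame; the function name, the SSD cost, and the winner-take-all depth selection are illustrative, not the specific formulations of Collins or Roy and Cox.

```python
# Plane-sweep multi-view matching sketch (illustrative only).
# Each non-reference camera is (image, K, R, t) with x ~ K (R X + t), X in the reference frame.
import numpy as np
from scipy.ndimage import map_coordinates

def plane_sweep_depth(ref_img, ref_K, others, depths):
    """Return the per-pixel depth (from `depths`, a 1-D numpy array) with the lowest summed SSD."""
    h, w = ref_img.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    rays = np.linalg.inv(ref_K) @ np.stack([u.ravel(), v.ravel(), np.ones(h * w)])
    cost = np.zeros((len(depths), h, w))
    for di, d in enumerate(depths):
        X = rays * d                                   # 3D points on the plane z = d
        for img, K, R, t in others:
            x = K @ (R @ X + t[:, None])               # project into the other view
            px, py = x[0] / x[2], x[1] / x[2]
            warped = map_coordinates(img.astype(float), [py, px],
                                     order=1, cval=np.nan)
            diff = (warped.reshape(h, w) - ref_img) ** 2
            cost[di] += np.nan_to_num(diff, nan=1e6)   # penalize out-of-view pixels
    return depths[np.argmin(cost, axis=0)]
```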

  5. Today’s class • unstructured light • structured light • time-of-flight (some slides from Szymon Rusinkiewicz, Brian Curless)

  6. A Taxonomy

  7. A taxonomy

  8. Unstructured light • project texture onto the scene to disambiguate stereo matching

  9. Space-time stereo Davis, Ramamoorthi, Rusinkiewicz, CVPR’03

  10. Space-time stereo Davis, Ramamoorthi, Rusinkiewicz, CVPR’03

  11. Space-time stereo Zhang, Curless and Seitz, CVPR’03

  12. Space-time stereo Zhang, Curless and Seitz, CVPR’03 • results
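
Space-time stereo extends the usual spatial matching window over time, so illumination that varies from frame to frame adds texture that disambiguates matches. A hedged sketch of such a cost on rectified frame stacks (plain SSD, bounds checks omitted; not the exact formulation of Davis et al. or Zhang et al.):

```python
# Space-time SSD sketch: a (2*win+1)^2 x T matching block instead of a 2-D window.
# left, right: rectified grayscale frame stacks of shape (T, H, W).
import numpy as np

def spacetime_ssd(left, right, x, y, d, win=2):
    """Cost of disparity d at pixel (x, y), summed over space and time."""
    L = left[:, y - win:y + win + 1, x - win:x + win + 1]
    R = right[:, y - win:y + win + 1, x - d - win:x - d + win + 1]
    return float(np.sum((L.astype(float) - R.astype(float)) ** 2))
```

The disparity at each pixel is the d minimizing this cost; with temporally varying (even unstructured) illumination, otherwise textureless surfaces become matchable.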

  13. Triangulation

  14. Triangulation: Moving the Camera and Illumination • Moving them independently leads to problems with focus and resolution • Most scanners mount the camera and light source rigidly and move them as a unit

  15. Triangulation: Moving the Camera and Illumination

  16. Triangulation: Moving the Camera and Illumination (Rioux et al. 87)

  17. Triangulation: Extending to 3D • Possibility #1: add another mirror (flying spot) • Possibility #2: project a stripe, not a dot (figure: laser, camera, and object geometry)
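
For the stripe case, depth follows from intersecting each camera ray through a detected stripe pixel with the calibrated light plane. A minimal sketch, assuming a known camera matrix K and a light plane n·X = d in camera coordinates (names are illustrative):

```python
# Laser-stripe triangulation sketch: camera-ray / light-plane intersection.
import numpy as np

def triangulate_stripe(pixels, K, plane_n, plane_d):
    """pixels: (N, 2) stripe detections -> (N, 3) points in the camera frame."""
    uv1 = np.column_stack([pixels, np.ones(len(pixels))])
    rays = (np.linalg.inv(K) @ uv1.T).T        # one viewing ray per pixel (z = 1)
    s = plane_d / (rays @ plane_n)             # scale at which each ray hits the plane
    return rays * s[:, None]
```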

  18. Triangulation Scanner Issues • Accuracy proportional to working volume (typically ~1000:1) • Scales down to small working volumes (e.g. 5 cm working volume, 50 µm accuracy) • Does not scale up (baseline becomes too large…) • Two-line-of-sight problem (shadowing from either the camera or the laser) • Triangulation angle: non-uniform resolution if too small, shadowing if too big (useful range: 15–30°)

  19. Triangulation Scanner Issues • Material properties (dark, specular) • Subsurface scattering • Laser speckle • Edge curl • Texture embossing

  20. Space-time analysis Curless ‘95

  21. Space-time analysis Curless ‘95

  22. Projector as camera

  23. Multi-Stripe Triangulation • To go faster, project multiple stripes • But which stripe is which? • Answer #1: assume surface continuity, e.g. Eyetronics’ ShapeCam

  24. Real-time system Koninckx and Van Gool

  25. Multi-Stripe Triangulation • To go faster, project multiple stripes • But which stripe is which? • Answer #2: colored stripes (or dots)

  26. Multi-Stripe Triangulation • To go faster, project multiple stripes • But which stripe is which? • Answer #3: time-coded stripes

  27. Time-Coded Light Patterns • Assign each stripe a unique illumination code over time [Posdamer 82] (figure: code patterns laid out over time × space)
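
To make the idea concrete, here is a hedged sketch of a time-coded pattern set and its per-pixel decoding. Posdamer used plain binary codes; the Gray-code variant below is the common refinement (adjacent stripes differ in one bit), and the function names are illustrative:

```python
# Time-coded stripes sketch: 2^B stripe columns, each with a unique B-bit code
# projected as B patterns over time (1-D column patterns; tile vertically to project).
import numpy as np

def gray_code_patterns(num_bits, width):
    cols = np.arange(width) * (2 ** num_bits) // width    # stripe index per column
    gray = cols ^ (cols >> 1)                             # binary -> Gray code
    return [((gray >> b) & 1).astype(np.uint8) * 255 for b in range(num_bits)]

def decode_pixel(bits_over_time):
    """bits_over_time: observed 0/1 bit for each pattern -> stripe index."""
    gray = 0
    for b in reversed(range(len(bits_over_time))):        # reassemble the Gray code
        gray = (gray << 1) | int(bits_over_time[b])
    binary = 0
    while gray:                                           # Gray -> binary index
        binary ^= gray
        gray >>= 1
    return binary
```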

  28. An idea for a project? Bouguet and Perona, ICCV’98

  29. Pulsed Time of Flight • Basic idea: send out pulse of light (usually laser), time how long it takes to return

  30. Pulsed Time of Flight • Advantages: • Large working volume (up to ~100 m) • Disadvantages: • Not-so-great accuracy (at best ~5 mm) • Requires timing precision of ~30 picoseconds • Accuracy does not scale with working volume • Often used for scanning buildings, rooms, archeological sites, etc.
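
The numbers above follow directly from range = c·Δt/2; a small worked sketch (the values are from the slide, the snippet itself is just arithmetic):

```python
# Pulsed time-of-flight arithmetic: range = c * round_trip_time / 2.
C = 299_792_458.0                 # speed of light, m/s

def range_from_time(dt_seconds):
    return C * dt_seconds / 2.0

# Timing precision needed for ~5 mm range accuracy (the round trip doubles the path):
print(2 * 0.005 / C)              # ~3.3e-11 s, i.e. roughly 30 picoseconds
print(range_from_time(667e-9))    # a 667 ns round trip puts the surface ~100 m away
```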

  31. Depth cameras • 2D array of time-of-flight sensors, e.g. Canesta’s CMOS 3D sensor • jitter is too big on a single measurement, but averages out over many (10,000 measurements → ~100× improvement, since noise drops as √N)

  32. Depth cameras • 3DV’s Z-cam: superfast shutter + standard CCD • cut the light off while the pulse is coming back, then I ~ Z • but I ~ albedo too (normalize with an unshuttered reference view)
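
A simplified model of the shuttered-CCD idea, purely to illustrate why the ratio cancels albedo; this is not 3DV's actual calibration, and the linear gating model below is an assumption:

```python
# Z-cam-style shuttered ToF sketch (simplified, hypothetical gating model).
# A rectangular light pulse is emitted; the shutter cuts off more of the returning
# pulse the farther the surface is, so shuttered/unshuttered intensity encodes depth
# while dividing out albedo.
import numpy as np

C = 299_792_458.0   # speed of light, m/s

def depth_from_shutter(shuttered, unshuttered, pulse_len_s, z_near):
    ratio = np.clip(shuttered / np.maximum(unshuttered, 1e-6), 0.0, 1.0)
    gate_span = C * pulse_len_s / 2.0          # depth range covered by the gate
    return z_near + (1.0 - ratio) * gate_span
```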

  33. AM Modulation Time of Flight • Modulate a laser at frequency ν_m; it returns with a phase shift Δφ • Note the ambiguity in the measured phase! Range ambiguity of λ_m/2 = c/(2ν_m)
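
In code, the standard AM-ToF relation, range = (Δφ/2π)·λ_m/2 (a sketch of the textbook formula, not any particular sensor's pipeline):

```python
# AM-modulated ToF: range from the phase shift of the returning modulation envelope.
import math

C = 299_792_458.0   # speed of light, m/s

def range_from_phase(delta_phi_rad, mod_freq_hz):
    wavelength = C / mod_freq_hz                                 # modulation wavelength
    return (delta_phi_rad / (2.0 * math.pi)) * wavelength / 2.0

# The range is only known modulo wavelength/2, e.g. ~15 m for a 10 MHz modulation:
print(C / 10e6 / 2.0)
```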

  34. AM Modulation Time of Flight • Accuracy / working volume tradeoff (e.g., noise ~ 1/500 of working volume) • In practice, often used for room-sized environments (cheaper and more accurate than pulsed time of flight)

  35. Shadow Moiré

  36. Depth from focus/defocus (Nayar ’95) • Nov. 8: don’t miss the Distinguished Lecture!

  37. Next class: structure from motion
