ROBOT VISION
Lesson 1a: Structured Light 3D Reconstruction
Matthias Rüther, Christian Reinbacher
Structured Light Methods
  • Goal: Robust 3D Reconstruction through triangulation
  • Project an artificial pattern onto the object
  • Pattern alleviates the correspondence problem
  • Variants:
    • Laser Pattern (point, line)
    • Structured projector pattern (several lines, pattern sequence)
    • Random projector pattern
Structured Light Range Finder

1. Sender (projects a light plane)

2. Receiver (CCD camera)

[Figure: sensor image and measurement geometry, with the Z (depth) and X directions indicated]
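
The triangulation itself is simple: each lit pixel defines a viewing ray, and the 3D point is the intersection of that ray with the known laser plane. A minimal NumPy sketch of this step, assuming a calibrated pinhole camera and the laser plane expressed in camera coordinates (function and parameter names are illustrative, not from the slides):

```python
import numpy as np

def triangulate_stripe(pixels, K, plane_n, plane_d):
    """Intersect camera viewing rays with the laser plane.

    pixels  : (N, 2) array of (u, v) stripe coordinates in the image
    K       : (3, 3) camera intrinsic matrix
    plane_n : (3,) unit normal of the laser plane in camera coordinates
    plane_d : plane offset, so points X on the plane satisfy n.X + d = 0
    Returns (N, 3) points in camera coordinates.
    """
    # Back-project pixels to viewing rays: r = K^-1 [u, v, 1]^T
    uv1 = np.column_stack([pixels, np.ones(len(pixels))])
    rays = uv1 @ np.linalg.inv(K).T

    # Ray X = t * r meets the plane when n.(t r) + d = 0  ->  t = -d / (n.r)
    t = -plane_d / (rays @ plane_n)
    return rays * t[:, None]
```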


1 plane -> 1 object profile

  • To get a 3D profile:
    • Move the object
    • Scanning Unit for projected plane
    • Move the Sensor

Object motion by conveyor belt:

=> synchronization: measure the distance traveled along the conveyor

=> y-accuracy is determined by the distance measurement

Scanning units (e.g. a rotating mirror) are rare: accurate measurement of the mirror motion is hard, and a small inaccuracy there leads to a large inaccuracy in the reconstructed geometry.

Move the sensor: e.g. railway inspection, where the sensor is mounted in a wagon and coupled to a speed measurement.
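
As a sketch of the conveyor case: each captured profile is an (x, z) slice, and the measured conveyor distance supplies its y coordinate, so the profiles can simply be stacked into a point cloud (illustrative code, assuming the profiles are already triangulated):

```python
import numpy as np

def stack_profiles(profiles, conveyor_positions):
    """Assemble a 3D point cloud from per-frame laser profiles.

    profiles           : list of (N_i, 2) arrays of (x, z) points per frame
    conveyor_positions : one measured conveyor distance per frame; it sets
                         the y coordinate, so y-accuracy is limited by the
                         distance measurement, as noted above.
    """
    cloud = []
    for profile, y in zip(profiles, conveyor_positions):
        xz = np.asarray(profile, dtype=float)
        y_col = np.full((len(xz), 1), y)
        cloud.append(np.column_stack([xz[:, 0:1], y_col, xz[:, 1:2]]))
    return np.vstack(cloud)
```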

Commercially Available

Person Scanners

Cultural Heritage

Rapid Prototyping

Problems of Laser Profile
  • Occlusions:

Object points need to be visible from both the laser and the camera viewpoint

  • Sharpness and Contrast:

Both camera and laser need to be in focus

  • Speckle noise:

Laser light always shows “speckle noise”, caused by interference of the coherent light.

-> where is the center of the stripe?
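
A common way to answer this is to estimate the stripe position per image row with an intensity-weighted centroid (or a Gaussian fit), which averages out part of the speckle-induced jitter. A minimal sketch, assuming a single bright, roughly vertical stripe and an illustrative intensity threshold:

```python
import numpy as np

def stripe_center_per_row(img, min_intensity=30.0):
    """Sub-pixel stripe position per image row via intensity-weighted centroid.

    img : (H, W) grayscale image with one bright, roughly vertical laser stripe.
    Returns an (H,) array of x positions (NaN where the stripe is too weak).
    """
    h, w = img.shape
    x = np.arange(w, dtype=float)
    weights = img.astype(float).copy()
    weights[weights < min_intensity] = 0.0       # suppress background and weak speckle
    mass = weights.sum(axis=1)
    centers = np.full(h, np.nan)
    valid = mass > 0
    centers[valid] = (weights[valid] * x).sum(axis=1) / mass[valid]
    return centers
```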

Multiple Sheets of Light

Project multiple Laser planes simultaneously to reduce measurement time.

Problem:

Separation of the stripes in the image (see the sketch after this slide)

Application:

Smoothness check of flat surfaces
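
Under the simplifying assumption that every stripe stays visible and keeps its left-to-right order (no occlusions, no crossings), the separation reduces to finding and ordering the intensity peaks in each image row. The hypothetical sketch below does exactly that and gives up when the peak count does not match, which is precisely where the separation problem bites:

```python
import numpy as np

def label_stripes_in_row(row, n_stripes, min_intensity=30.0):
    """Assign the peaks in one image row to the n projected laser planes.

    Assumes every stripe is visible and the stripes keep their left-to-right
    order; otherwise the assignment is ambiguous.
    Returns a list of (stripe_index, x_position) pairs, or None on failure.
    """
    row = np.asarray(row, dtype=float)
    # Local maxima above the intensity threshold
    peaks = [x for x in range(1, len(row) - 1)
             if row[x] >= min_intensity and row[x] >= row[x - 1] and row[x] > row[x + 1]]
    if len(peaks) != n_stripes:
        return None                      # row cannot be labeled unambiguously
    return list(enumerate(peaks))
```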


[Figures: pattern projection, the projected light stripes on the object, and the resulting range image]

  • Camera: IMAG CCD, resolution: 750 x 590, f: 16 mm
  • Projector: liquid crystal display (LCD-640), f: 200 mm, distance to object plane: 120 cm
Projector

  • Lamp
  • Lens system
  • LCD shutter (generates the pattern structure)
  • Line projector (e.g. LCD-640)
  • Focusing lens (e.g. 150 mm)

Depth Decoding

Project a temporal sequence of n binary masks. At each pixel, the temporal sequence of intensities (I1, …, In) gives a binary number which denotes the corresponding projector column (a decoding sketch follows below).

Project → Acquire → Decode → Triangulate
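
The decode step can be sketched in a few lines: the n binarized camera images are read as the bits of the column code at every pixel. Illustrative NumPy code, assuming the masks are already thresholded and ordered most significant bit first:

```python
import numpy as np

def decode_projector_column(masks):
    """Decode the projector column index at every pixel.

    masks : (n, H, W) boolean array, one binarized camera image per projected
            mask, ordered from most to least significant bit.
    Returns an (H, W) integer image of column codes.  A plain binary code
    gives 2**n distinguishable columns; Gray codes are often used instead to
    limit the effect of single-bit decoding errors.
    """
    masks = np.asarray(masks, dtype=np.int64)
    n = masks.shape[0]
    weights = 2 ** np.arange(n - 1, -1, -1)      # MSB first
    return np.tensordot(weights, masks, axes=1)
```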

Coded Light + Phase Shift

Binary code is limited to pixel accuracy (or less).

Increase the accuracy to sub-pixel by projecting a sine wave after the code and measuring the phase shift between the projected and the captured pattern. Decode the phase from four samples of the sine period, shifted by π/2.
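
With the four shifted patterns modeled as I_k = A + B·cos(φ + k·π/2), the differences I3 − I1 and I0 − I2 isolate 2B·sin(φ) and 2B·cos(φ), so the wrapped phase follows from an arctangent. A minimal per-pixel sketch (names are illustrative):

```python
import numpy as np

def decode_phase(i0, i1, i2, i3):
    """Per-pixel phase from four sine patterns shifted by pi/2.

    Assuming I_k = A + B*cos(phi + k*pi/2), k = 0..3:
        I3 - I1 = 2B*sin(phi),   I0 - I2 = 2B*cos(phi)
    so the wrapped phase in [-pi, pi) follows from atan2.  The phase refines
    the coded-light column to sub-pixel accuracy; unwrapping uses the coarse
    binary code from the previous step.
    """
    return np.arctan2(i3.astype(float) - i1, i0.astype(float) - i2)
```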

Coded Light + Phase Shift

[Figure: projected binary code and sine phase (0 to 2π) plotted over the image column x]
Other Coding Methods Possible

Joaquim Salvi, “Pattern codification strategies in structured light systems”

The Kinect Working Principle
  • Triangulation-based depth sensor
  • Static pattern projection
  • Heavy exploitation of redundancy
  • Extremely robust/conservative depth maps
The Sensor System

  • IR laser: 830 nm, 60 mW (class 3B without optics, class 1 with optics), no amplitude modulation
  • Diffractive Optical Element (DOE)
  • IR lens: F ~6 mm, FOV ~55°, with IR bandpass filter
  • IR camera: CMOS, rolling shutter, 1.3 MP, 1/2", 10 bit
  • RGB lens: F ~2.9 mm, FOV ~65°
  • RGB camera: CMOS, rolling shutter, 1.3 MP, 1/4", 10 bit
  • Peltier element for temperature stabilization
  • Stereo processor
  • Microphone array
  • Accelerometer
  • Tilt axis

The Sensor System
  • Baseline Tx ~75 mm
  • Working range (DOF): 0.5 m – 8 m
  • FOV ~55°
  • Output resolution: 640x480 (at most)
  • Internal maximum: 1280x1024


The Projection Pattern

The IR laser and the diffractive optical element create an interference pattern.

The pattern is static and identical for all Kinect units.
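
Since the pattern is static and known, the depth computation can be sketched as matching small windows of the live IR image against a stored reference image of the same pattern taken at a known depth; the horizontal shift (disparity) is then converted to depth by triangulation over the ~75 mm baseline. A brute-force, purely illustrative sketch (window size, search range and names are assumptions, not the actual Kinect pipeline):

```python
import numpy as np

def match_block(ir_img, ref_img, x, y, block=9, max_disp=64):
    """Disparity of one block against the stored reference pattern.

    Brute-force sum-of-absolute-differences search along the baseline
    direction; illustrative of the principle only, not the dedicated
    hardware matching used inside the sensor.
    """
    r = block // 2
    patch = ir_img[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    best_d, best_cost = 0, np.inf
    for d in range(max_disp):
        if x - d - r < 0:                    # search window left the image
            break
        cand = ref_img[y - r:y + r + 1, x - d - r:x - d + r + 1].astype(float)
        cost = np.abs(patch - cand).sum()
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

# For an ordinary stereo pair, depth ~ f * b / disparity (b ~ 75 mm here);
# the Kinect measures disparity relative to the stored reference pattern,
# so the conversion is relative to that reference plane's depth.
```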

ad