
Cameras and Projections


Presentation Transcript


  1. Dan Witzner Hansen Cameras and Projections

  2. Outline • Previously??? • Projections • Pinhole cameras • Perspective projection • Camera matrix • Camera calibration matrix • Orthographic projection

  3. Projection and perspective effects

  4. Camera obscura "Reinerus Gemma-Frisius, observed an eclipse of the sun at Louvain on January 24, 1544, and later he used this illustration of the event in his book De Radio Astronomica et Geometrica, 1545. It is thought to be the first published illustration of a camera obscura..." Hammond, John H., The Camera Obscura, A Chronicle http://www.acmi.net.au/AIC/CAMERA_OBSCURA.html

  5. Pinhole camera • The pinhole camera is a simple model that approximates the imaging process: perspective projection. • If we treat the pinhole as a point, only one ray from any given scene point can enter the camera. Fig. from Forsyth and Ponce

  6. Camera obscura • Jetty at Margate, England, 1898 • An attraction in the late 19th century (around the 1870s) http://brightbytes.com/cosite/collection2.html Adapted from R. Duraiswami

  7. Camera obscura at home http://room-camera-obscura.blogspot.com/ Sketch from http://www.funsci.com/fun3_en/sky/sky.htm

  8. Digital cameras • Film → sensor array • Often an array of charge-coupled devices (CCDs) • Each CCD is a light-sensitive diode that converts photons (light energy) to electrons • Pipeline: camera optics → CCD array → frame grabber → computer

  9. Color sensing in digital cameras • Bayer grid: each sensor element measures only one of R, G, B • Estimate the missing components from neighboring values (demosaicing) Source: Steve Seitz
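
A minimal sketch of the interpolation step for the green channel, assuming a NumPy array that holds an RGGB Bayer mosaic (the function name and the layout choice are illustrative, not from the slide):

    import numpy as np

    def demosaic_green(bayer):
        """Bilinear estimate of the full green channel from an RGGB Bayer mosaic.
        bayer: 2D float array; green samples sit where (row + col) is odd."""
        h, w = bayer.shape
        rows, cols = np.mgrid[0:h, 0:w]
        green_mask = (rows + cols) % 2 == 1            # G locations in an RGGB layout
        green = np.zeros_like(bayer)
        green[green_mask] = bayer[green_mask]          # keep the measured green values
        # At R/B locations, average the four green neighbors (edge-replicated at the border).
        padded = np.pad(bayer, 1, mode="edge")
        neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                     padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        green[~green_mask] = neighbors[~green_mask]
        return green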

  10. Pinhole size / aperture • How does the size of the aperture (larger vs. smaller) affect the image we’d get?

  11. Adding a lens • A lens focuses light onto the film • There is a specific distance at which objects are “in focus” • Other points project to a “circle of confusion” in the image • Changing the shape of the lens changes this distance

  12. Lenses • A lens focuses parallel rays onto a single focal point • focal point at a distance f beyond the plane of the lens • f is a function of the shape and index of refraction of the lens • Aperture of diameter D restricts the range of rays • aperture may be on either side of the lens • Lenses are typically spherical (easier to produce) F focal point optical center (Center Of Projection)

  13. Thin lenses • Thin lens equation: 1/d_o + 1/d_i = 1/f, where d_o is the object distance, d_i the image distance and f the focal length • Any object point satisfying this equation is in focus • What is the shape of the focus region? • How can we change the focus region? • Thin lens applet: http://www.phy.ntnu.edu.tw/java/Lens/lens_e.html (by Fu-Kwun Hwang)
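
A small worked example with the thin-lens equation (the 50 mm lens and the 2 m object distance are purely illustrative):

    def image_distance(f, d_o):
        """Solve the thin-lens equation 1/d_o + 1/d_i = 1/f for the image distance d_i."""
        return 1.0 / (1.0 / f - 1.0 / d_o)

    f, d_o = 0.050, 2.0              # focal length and object distance in metres
    d_i = image_distance(f, d_o)     # ~0.0513 m: the sensor sits ~51.3 mm behind the lens
    m = d_i / d_o                    # magnification, ~0.026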

  14. Focus and depth of field Image credit: cambridgeincolour.com

  15. Focus and depth of field • Depth of field: distance between image planes where blur is tolerable • Thin lens: scene points at distinct depths come into focus at different image planes (real camera lens systems have greater depth of field) “circles of confusion” Shapiro and Stockman

  16. Depth of field • Changing the aperture size affects depth of field • A smaller aperture increases the range in which the object is approximately in focus f / 5.6 f / 32 Flower images from Wikipedia http://en.wikipedia.org/wiki/Depth_of_field

  17. Depth from focus • Images taken from the same point of view with different camera parameters yield 3D shape / depth estimates [figs from H. Jin and P. Favaro, 2002]

  18. Issues with digital cameras • Noise • big difference between consumer vs. SLR-style cameras • low light is where you most notice noise • Compression • creates artifacts except in uncompressed formats (tiff, raw) • Color • color fringing artifacts from Bayer patterns • Blooming • charge overflowing into neighboring pixels • In-camera processing • oversharpening can produce halos • Interlaced vs. progressive scan video • even/odd rows from different exposures • Are more megapixels better? • requires higher quality lens • noise issues • Stabilization • compensate for camera shake (mechanical vs. electronic) • More info online, e.g., • http://electronics.howstuffworks.com/digital-camera.htm • http://www.dpreview.com/

  19. Modeling Projections

  20. Perspective effects

  21. Perspective and art • Use of correct perspective projection is indicated in 1st-century B.C. frescoes • The skill resurfaces in the Renaissance: artists develop systematic methods to determine perspective projection (around 1480-1515) Raphael; Dürer, 1525

  22. Modeling projection

  23. Perspective effects • Far away objects appear smaller Forsyth and Ponce

  24. Perspective effects • Parallel lines in the scene intersect in the image • They converge in the image on the horizon line

  25. Pinhole camera model
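
Written out, the standard pinhole model (camera center at the origin, image plane at distance f along the optical axis) maps a scene point given in camera coordinates to the image as

    (X, Y, Z) \;\mapsto\; \left(\frac{fX}{Z},\; \frac{fY}{Z}\right)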

  26. Field of view • Angular measure of portion of 3D space seen by the camera • Depends on focal length Images from http://en.wikipedia.org/wiki/Angle_of_view
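
The dependence on focal length can be made explicit: for a sensor of width d (the symbols d and θ are introduced here for illustration), the horizontal field of view of a pinhole camera is

    \theta = 2\arctan\!\left(\frac{d}{2f}\right)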

  27. Pinhole camera model
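
In homogeneous coordinates the same mapping becomes linear and can be written with a 3×4 matrix:

    \begin{pmatrix} fX \\ fY \\ Z \end{pmatrix}
    =
    \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
    \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix},
    \qquad
    \mathbf{x} \simeq \operatorname{diag}(f, f, 1)\,[\,\mathbf{I}\mid\mathbf{0}\,]\,\mathbf{X}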

  28. Principal point offset • principal point • calibration matrix (see below)
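
With the principal point at (p_x, p_y) rather than at the image origin, the offset is absorbed into the calibration matrix; in the usual notation

    K = \begin{bmatrix} f & 0 & p_x \\ 0 & f & p_y \\ 0 & 0 & 1 \end{bmatrix},
    \qquad
    \mathbf{x} \simeq K\,[\,\mathbf{I}\mid\mathbf{0}\,]\,\mathbf{X}_{\mathrm{cam}}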

  29. Camera rotation and translation
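
When the scene point is given in world coordinates, the change of frame is usually written with the camera center C and rotation R, giving the full camera matrix

    \mathbf{X}_{\mathrm{cam}} = R\,(\mathbf{X}_{\mathrm{world}} - \mathbf{C}),
    \qquad
    P = K\,R\,[\,\mathbf{I}\mid -\mathbf{C}\,] = K\,[\,R\mid \mathbf{t}\,],
    \quad \mathbf{t} = -R\,\mathbf{C}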

  30. CCD Camera

  31. When is skew non-zero? • For CCD/CMOS sensors, always s = 0 • For an image of an image (non-coinciding principal axes), s ≠ 0 is possible • Resulting camera: see below
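
For a CCD camera with m_x and m_y pixels per unit length and skew s, the calibration matrix takes the general form below, with α_x = f·m_x and α_y = f·m_y (and s = 0 for CCD/CMOS sensors); the resulting camera is then P = K [R | t] with

    K = \begin{bmatrix} \alpha_x & s & x_0 \\ 0 & \alpha_y & y_0 \\ 0 & 0 & 1 \end{bmatrix}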

  32. Projection equation • The projection matrix models the cumulative effect of all parameters • Useful to decompose it into a series of operations: intrinsics, projection ([I | 0]), rotation, and translation (see the sketch below) Camera parameters • A camera is described by several parameters • Translation T of the optical center from the origin of world coordinates • Rotation R of the image plane • Focal length f, principal point (x’c, y’c), pixel size (sx, sy) • T and R are called “extrinsics”; f, the principal point and the pixel size are “intrinsics” • The definitions of these parameters are not completely standardized • especially the intrinsics, which vary from one book to another
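
A minimal NumPy sketch of assembling P = K [R | t] from intrinsics and extrinsics and projecting one world point; every numerical value here is illustrative, not taken from the slides:

    import numpy as np

    # Intrinsics: focal length in pixels, principal point, zero skew.
    f_px, cx, cy = 800.0, 320.0, 240.0
    K = np.array([[f_px, 0.0, cx],
                  [0.0, f_px, cy],
                  [0.0, 0.0, 1.0]])

    # Extrinsics: rotation R and translation t taking world coordinates to camera coordinates.
    R = np.eye(3)
    t = np.array([0.0, 0.0, 5.0])

    P = K @ np.hstack([R, t.reshape(3, 1)])    # 3x4 projection matrix

    X_world = np.array([1.0, 0.5, 2.0, 1.0])   # homogeneous world point
    x = P @ X_world                            # homogeneous image point
    u, v = x[:2] / x[2]                        # pixel coordinates after the perspective divide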

  33. Projection properties • Many-to-one: all points along the same ray map to the same point in the image • Points → points • Lines → lines (collinearity preserved) • Distances and angles are not preserved • Degenerate cases: – A line through the focal point projects to a point – A plane through the focal point projects to a line – A plane perpendicular to the image plane projects to part of the image

  34. More about camera calibration later in the course. We will see what information can be gathered from images using knowledge of planes and calibrated cameras.

  35. The projective plane • Why do we need homogeneous coordinates? • To represent points at infinity, homographies, perspective projection, multi-view relationships • What is the geometric intuition? • A point in the image is a ray in projective space • Each point (x, y) on the image plane is represented by a ray (sx, sy, s) through the origin (0, 0, 0) • All points on the ray are equivalent: (x, y, 1) ≅ (sx, sy, s)

  36. Projective lines • What does a line in the image correspond to in projective space? • A line is a plane of rays through the origin: all rays (x, y, z) satisfying ax + by + cz = 0 • A line is also represented as a homogeneous 3-vector l

  37. Point and line duality • A line l is a homogeneous 3-vector • It is ⊥ to every point (ray) p on the line: l · p = 0 • What is the line l spanned by rays p1 and p2? • l is ⊥ to p1 and p2 ⇒ l = p1 × p2 (l is the plane normal) • What is the intersection of two lines l1 and l2? • p is ⊥ to l1 and l2 ⇒ p = l1 × l2 • Points and lines are dual in projective space • Given any formula, we can switch the meanings of points and lines to get another formula
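
A quick NumPy illustration of the two dual operations (the particular points and lines are illustrative):

    import numpy as np

    p1 = np.array([0.0, 0.0, 1.0])   # image point (0, 0) in homogeneous coordinates
    p2 = np.array([1.0, 1.0, 1.0])   # image point (1, 1)

    l = np.cross(p1, p2)             # line through both points: l = p1 x p2
    assert np.isclose(l @ p1, 0) and np.isclose(l @ p2, 0)   # l is perpendicular to both rays

    l2 = np.array([1.0, 0.0, -1.0])  # the line x = 1
    p = np.cross(l, l2)              # intersection of the two lines: p = l x l2
    u, v = p[:2] / p[2]              # back to inhomogeneous coordinates: (1.0, 1.0)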

  38. Ideal points and lines • Ideal point (“point at infinity”): p ≅ (x, y, 0), parallel to the image plane • It has infinite image coordinates • Ideal line: l ≅ (a, b, 0), parallel to the image plane • It corresponds to a line in the image (finite coordinates) that goes through the image origin (principal point)

  39. Interpreting the camera matrix: column vectors • p1, p2, p3 are the vanishing points of the world X, Y and Z axes • p4 is the image of the world coordinate origin

  40. Interpreting the camera matrix: row vectors • P3 is the principal plane: it contains the camera center and is parallel to the image plane • P1 and P2 are the axis planes, formed by the camera center together with the image y and x axes, respectively

  41. Moving the camera center • motion parallax • epipolar line

  42. Weak perspective • Approximation: treat magnification as constant • Assumes scene depth << average distance to camera
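
With Z_0 denoting the (roughly constant) average scene depth, all points share one magnification m and the projection becomes linear:

    x = \frac{f}{Z_0}\,X, \qquad y = \frac{f}{Z_0}\,Y, \qquad m = \frac{f}{Z_0}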

  43. Orthographic projection • Camera at a constant distance from the scene • World points are projected along rays parallel to the optical axis
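
In the orthographic limit the depth coordinate is simply dropped; one common homogeneous form is

    (X, Y, Z) \mapsto (X, Y),
    \qquad
    \begin{pmatrix} x \\ y \\ 1 \end{pmatrix} \simeq
    \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
    \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix}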

  44. Orthographic (“telecentric”) lenses Navitar telecentric zoom lens http://www.lhup.edu/~dsimanek/3d/telecent.htm

  45. Correcting radial distortion from Helmut Dersch

  46. Distortion • Radial distortion of the image • Caused by imperfect lenses • Deviations are most noticeable for rays that pass through the edge of the lens • Cases: no distortion, pincushion, barrel

  47. Modeling distortion • To model lens distortion, use the following sequence of operations instead of the standard projection matrix multiplication: project to “normalized” image coordinates, apply radial distortion, apply the focal length, translate to the image center
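
A minimal sketch of that sequence, assuming a two-coefficient polynomial radial model with parameters k1 and k2 (the model choice, function name and numbers are illustrative):

    import numpy as np

    def project_with_distortion(X_cam, f_px, cx, cy, k1, k2):
        """Project a 3D point in camera coordinates with radial distortion applied.
        Steps: normalized coordinates -> radial distortion -> focal length -> image center."""
        x, y = X_cam[0] / X_cam[2], X_cam[1] / X_cam[2]   # normalized image coordinates
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2              # polynomial radial distortion factor
        xd, yd = scale * x, scale * y
        u = f_px * xd + cx                                # apply focal length and translate
        v = f_px * yd + cy                                # to the image center
        return u, v

    # Illustrative call with mild barrel distortion.
    u, v = project_with_distortion(np.array([0.3, -0.2, 2.0]), 800.0, 320.0, 240.0, -0.1, 0.01)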

  48. Other camera types • Plenoptic camera • Time-of-flight • Omnidirectional image

  49. Summary • Image formation is affected by geometry, photometry, and optics. • Projection equations express how world points are mapped to the 2D image. • Homogeneous coordinates allow a linear system for the projection equations. • Lenses make the pinhole model practical. • Parameters (focal length, aperture, lens diameter, …) affect the image obtained.

  50. What does camera calibration give? • Calculate distances to objects of known size • Calculate angles • Rotations between two views of a plane (see later) • Perpendicular vectors • Calibration can be done automatically with multiple cameras
