
Seeing 3D from 2D Images



  1. Seeing 3D from 2D Images

  2. How to make a 2D image appear as 3D! • Output is typically 2D Images • Yet we want to show a 3D world! • How can we do this? • We can include ‘cues’ in the image that give our brain 3D information about the scene • These cues are visual depth cues

  3. Visual Depth Cues • Cues about the 3rd dimension – total of 10 • Monoscopic Depth Cues (single 2D image) [6] • Stereoscopic Depth Cues (two 2D images) [1] • Motion Depth Cues (series of 2D images) [1] • Physiological Depth Cues (body cues) [2] • Hold a finger up

  4. Monoscopic Depth Cues • Interposition • An occluding object is closer • Shading • Shape and shadows • Size • The larger object is closer • Linear Perspective • Parallel lines converge at a single point • Height in the visual field • The higher the object is (vertically), the further away it is • Surface Texture Gradient • More detail for closer objects • Atmospheric effects • Further-away objects are blurrier and dimmer • Images from http://ccrs.nrcan.gc.ca/resource/tutor/stereo/chap2/chapter2_5_e.php

  5. Monoscopic Depth Cues • Interposition • An object that occludes another is closer • Shading • Shape information; shadows are included here • Size • Usually, the larger object is closer • Linear Perspective • Parallel lines converge at a single point • Surface Texture Gradient • More detail for closer objects • Height in the visual field • The higher the object is (vertically), the further away it is • Atmospheric effects • Further-away objects are blurrier • Brightness • Further-away objects are dimmer

  6. Stereoscopic Display Issues • Stereopsis • Stereoscopic Display Technology • Computing Stereoscopic Images • Stereoscopic Display and HTDs. • Works for objects < 5m. Why?

  7. Stereopsis The result of the two slightly different views of the world that our laterally-displaced eyes receive.

  8. Retinal Disparity If both eyes are fixated on a point, f1, in space: • The image of f1 is focused at corresponding points in the center of the fovea of each eye. • A second point, f2, would be imaged at points in each eye that may be at different distances from the fovea. • This difference in distance is the retinal disparity.

  9. Retinal Disparity • If an object is farther than the fixation point, the retinal disparity will be: • A positive value • Uncrossed disparity • The eyes must uncross to fixate the farther object. • If an object is closer than the fixation point, the retinal disparity will be: • A negative value • Crossed disparity • The eyes must cross to fixate the closer object. • An object located at the fixation point, or whose image falls on corresponding points in the two retinae, has: • Zero disparity (in focus) • Question: What does this mean for rendering systems? • [Diagram: fixation point f1 and a second point f2 imaged by the left and right eyes; retinal disparity = d1 + d2, the sum of the offsets of f2's image from the fovea in each eye]

  10. Convergence Angles • [Diagram: the two eyes, separated by the interocular distance i, viewing fixation point f1 at distance D1 and a second point f2 at distance D2, with convergence angles α at f1 and β at f2] • The difference between the two convergence angles equals the retinal disparity: α − β = retinal disparity (the d1 + d2 of the previous slide).
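
A minimal numeric sketch of this relation, assuming a symmetric geometry with both points straight ahead of the viewer; the interocular distance and the two depths below are illustrative values, not taken from the slides:

```python
import math

def convergence_angle(i, D):
    """Convergence angle (radians) for a point straight ahead at distance D,
    given interocular distance i (symmetric viewing geometry assumed)."""
    return 2.0 * math.atan((i / 2.0) / D)

i = 0.065   # interocular distance in metres (typical ~6.5 cm)
D1 = 1.0    # fixation point f1, 1 m away
D2 = 2.0    # second point f2, 2 m away

alpha = convergence_angle(i, D1)   # convergence angle at the fixation point f1
beta = convergence_angle(i, D2)    # convergence angle at f2

# Retinal disparity of f2 relative to the fixation point is the difference of
# the two convergence angles; its sign matches slide 9's convention
# (positive/uncrossed for a farther object, negative/crossed for a closer one).
disparity = alpha - beta
print(f"alpha = {math.degrees(alpha):.3f} deg, beta = {math.degrees(beta):.3f} deg")
kind = "uncrossed (farther than fixation)" if disparity > 0 else "crossed (closer than fixation)"
print(f"disparity = {math.degrees(disparity):.3f} deg ({kind})")
```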

  11. Miscellaneous Eye Facts • Stereoacuity - the smallest depth that can be detected based on retinal disparity. • Visual Direction - Perceived spatial location of an object relative to an observer.

  12. Horopters • Map out what points would appear at the same retinal disparity. • Horopter - the locus of points in space that fall on corresponding points in the two retinae when the two eyes binocularly fixate on a given point in space (zero disparity). • Points on the horopter appear at the same depth as the fixation point (stereopsis cannot be used to distinguish their depths). • What is the shape of a horopter? The Vieth-Mueller circle. • [Diagram: fixation point f1, a second point f2, retinal offsets d1 and d2, and the Vieth-Mueller circle]

  13. Stereoscopic Display • Stereoscopic images are easy to do badly, hard to do well, and impossible to do correctly.

  14. Stereoscopic Displays • Stereoscopic display systems present each eye with a slightly different view of a scene. • Time-parallel – the two images are shown at the same time • Time-multiplexed – the two images are shown one right after another

  15. Time-Parallel Stereoscopic Display • Two screens: each eye sees a different screen, and an optical system directs the correct view to each eye (HMD stereo) • Single screen: two different images are projected, polarized at right angles to each other, and the user wears polarized glasses

  16. Passive Polarized Projection • Linear Polarization • Ghosting increases when you tilt head • Reduces brightness of image by about ½ • Potential Problems with Multiple Screens • Circular Polarization • Reduces ghosting • Reduces brightness • Reduces crispness

  17. Problem with Linear Polarization • With linear polarization, the separation of the left and right eye images is dependent on the orientation of the glasses with respect to the projected image. • The floor image cannot be aligned with both the side screens and the front screens at the same time.

  18. Time Multiplexed Display • Left and right-eye views of an image are computed • Alternately displayed on the screen • A shuttering system occludes the right eye when the left-eye image is being displayed
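
A hedged sketch of the alternation this describes; render_view and present are hypothetical stand-ins for a real renderer, and in practice the shutter glasses are synchronized to the display's vertical refresh by emitter hardware rather than by application code:

```python
# Frame-sequential (time-multiplexed) stereo: left and right images alternate,
# and the shutter occludes the eye that should not see the current frame.

def render_view(eye):
    # Placeholder: draw the scene from the given eye's viewpoint.
    print(f"rendering {eye}-eye image")

def present():
    # Placeholder: swap buffers; the shutter blocks the other eye
    # while this image is on screen.
    print("frame presented")

for frame in range(3):                  # a few frames, for illustration
    render_view("left"); present()      # left image shown, right eye occluded
    render_view("right"); present()     # right image shown, left eye occluded
```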

  19. Stereographics Shutter Glasses

  20. Screen Parallax • [Diagram: a point P projected onto the display screen through the left-eye and right-eye positions; one object is shown with positive parallax and one with negative parallax] • Pleft – point P's projected screen location as seen by the left eye • Pright – point P's projected screen location as seen by the right eye • Screen parallax – the distance between Pleft and Pright

  21. Screen Parallax (cont.) • p = i(D − d)/D • where p is the amount of screen parallax for a point, f1, when projected onto a plane a distance d from the plane containing the two eyepoints • i is the interocular distance between the eyepoints • D is the distance from f1 to the nearest point on the plane containing the two eyepoints • d is the distance from the eyepoint to the nearest point on the screen
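
A minimal numeric check of this formula; the interocular distance and screen distance below are assumed values, and the sign of p distinguishes positive parallax (point farther than the screen) from negative parallax (point nearer than the screen):

```python
def screen_parallax(i, D, d):
    """Screen parallax p = i * (D - d) / D for a point at distance D from the
    plane of the eyes, projected onto a screen at distance d from that plane."""
    return i * (D - d) / D

i = 0.065   # interocular distance in metres (assumed)
d = 1.0     # distance from the eyes to the screen, in metres (assumed)

for D in (3.0, 1.0, 0.5):   # point behind, at, and in front of the screen
    p = screen_parallax(i, D, d)
    kind = ("positive parallax (behind the screen)" if p > 0
            else "negative parallax (in front of the screen)" if p < 0
            else "zero parallax (at the screen)")
    print(f"D = {D} m -> p = {p * 1000:.1f} mm, {kind}")
```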

  22. How to create correct left- and right-eye views • What do you need to specify for most rendering engines? • Eyepoint • Look-at Point • Field-of-View or location of Projection Plane • View Up Direction

  23. Basic Perspective Projection Set-Up from Viewing Parameters • The projection plane is orthogonal to one of the major axes (usually Z). That axis is along the vector defined by the eyepoint and the look-at point. • [Diagram: X, Y, Z axes with the projection plane orthogonal to the viewing axis]
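
A minimal sketch of this setup, building a look-at view matrix from the eyepoint, look-at point, and view-up direction of the previous slide, plus a symmetric perspective projection from a field of view; the right-handed, OpenGL-style conventions and the concrete numbers are assumptions:

```python
import numpy as np

def look_at(eye, target, up):
    """View matrix whose -Z axis points from the eyepoint toward the look-at
    point, so the projection plane is orthogonal to that viewing axis
    (right-handed, OpenGL-style convention assumed)."""
    f = target - eye; f = f / np.linalg.norm(f)        # forward (viewing) axis
    r = np.cross(f, up); r = r / np.linalg.norm(r)     # right axis
    u = np.cross(r, f)                                 # true up axis
    view = np.identity(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ eye                  # move the eye to the origin
    return view

def perspective(fov_y_deg, aspect, near, far):
    """Symmetric perspective projection from a vertical field of view."""
    t = np.tan(np.radians(fov_y_deg) / 2.0)
    proj = np.zeros((4, 4))
    proj[0, 0] = 1.0 / (aspect * t)
    proj[1, 1] = 1.0 / t
    proj[2, 2] = -(far + near) / (far - near)
    proj[2, 3] = -2.0 * far * near / (far - near)
    proj[3, 2] = -1.0
    return proj

eye = np.array([0.0, 0.0, 2.0])       # eyepoint
target = np.array([0.0, 0.0, 0.0])    # look-at point
up = np.array([0.0, 1.0, 0.0])        # view up direction
V = look_at(eye, target, up)
P = perspective(fov_y_deg=60.0, aspect=16 / 9, near=0.1, far=100.0)
print(P @ V)                          # combined projection * view matrix
```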

  24. What doesn’t work • Each view has a different projection plane • Each view will be presented (usually) on the same plane

  25. What Does Work • [Diagram: both eye views, separated by the interocular distance i, share a single projection plane]

  26. Setting Up Projection Geometry • No: aiming both eye locations at a single shared look-at point (converging view directions) • Yes: giving each eye location its own look-at point, so the two view directions are parallel
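
Slides 24-26 require both eye views to share one projection plane, with each eye's view direction kept parallel. The standard way to realize this (not spelled out on the slides) is an off-axis, asymmetric frustum per eye; below is a hedged sketch in glFrustum-style terms, with the screen size, viewing distance, and interocular distance as assumed values:

```python
def asymmetric_frustum(eye_offset, screen_w, screen_h, screen_dist, near, far):
    """glFrustum-style bounds (left, right, bottom, top, near, far) for one eye
    of an off-axis stereo pair.  eye_offset is the lateral shift of this eye
    from the screen's centre axis (e.g. +i/2 for the right eye, -i/2 for the
    left); the projection plane (the screen) is shared by both eyes."""
    scale = near / screen_dist                   # map screen edges to the near plane
    left = (-screen_w / 2.0 - eye_offset) * scale
    right = (screen_w / 2.0 - eye_offset) * scale
    bottom = (-screen_h / 2.0) * scale
    top = (screen_h / 2.0) * scale
    return left, right, bottom, top, near, far

i = 0.065                      # interocular distance in metres (assumed)
screen_w, screen_h = 4.0, 3.0  # screen size in metres (assumed)
screen_dist = 2.5              # viewer-to-screen distance in metres (assumed)

for name, offset in (("left", -i / 2.0), ("right", +i / 2.0)):
    bounds = asymmetric_frustum(offset, screen_w, screen_h, screen_dist, 0.1, 100.0)
    print(name, [round(b, 4) for b in bounds])
    # The matching view transform translates the eye by `offset` along the
    # screen's right axis; both eyes keep the same, parallel view direction.
```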

  27. Visual Angle Subtended Screen parallax is measured in terms of visual angle. This is a screen independent measure. Studies have shown that the maximum angle that a non-trained person can usually fuse into a 3D image is about 1.6 degrees. This is about 1/2 the maximum amount of retinal disparity you would get for a real scene.
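
A quick check of whether a given screen parallax stays under the roughly 1.6 degree fusion limit quoted here; the viewing distance and parallax values are assumptions:

```python
import math

FUSION_LIMIT_DEG = 1.6     # approximate limit for untrained viewers (this slide)

def parallax_visual_angle(p, view_dist):
    """Visual angle in degrees subtended at the eye by a screen parallax p
    viewed from a distance view_dist (both in the same units)."""
    return math.degrees(2.0 * math.atan((p / 2.0) / view_dist))

view_dist = 1.0            # viewer-to-screen distance in metres (assumed)
for p_mm in (5.0, 20.0, 40.0):
    angle = parallax_visual_angle(p_mm / 1000.0, view_dist)
    verdict = "fusable" if angle <= FUSION_LIMIT_DEG else "likely too large to fuse"
    print(f"parallax {p_mm:.0f} mm -> {angle:.2f} deg ({verdict})")
```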

  28. Accommodation / Convergence • [Diagram: eyes viewing a stereoscopic point through the display screen; the accommodation distance (to the screen) and the convergence distance (to the virtual point) differ]

  29. Position Dependence (without head-tracking)

  30. Interocular Dependence • [Diagram: true eyes vs. modeled eyes, the projection plane, and the perceived point vs. the modeled point]

  31. Obvious Things to Do • Head tracking • Measure User’s Interocular Distance

  32. Another Problem • Many people can not fuse stereoscopic images if you compute the images with proper eye separation! • Rule of Thumb: Compute with about ½ the real eye separation. • Works fine with HMDs but causes image stability problems with HTDs (why?)

  33. Two View Points with Head-Tracking • [Diagram: true eyes vs. modeled eyes, the projection plane, and the perceived points vs. the modeled point]

  34. Ghosting • Affected by the amount of light transmitted by the LC shutter in its off state. • Phosphor persistence • Vertical screen position of the image.

  35. Time-parallel stereoscopic images • Image quality may also be affected by: • Right- and left-eye images that do not match in color, size, or vertical alignment • Distortion caused by the optical system • Resolution • HMD interocular settings • A computational model that does not match the viewing geometry

  36. Motion Depth Cues • Parallax created by relative motion between the head and the object being viewed. • Objects nearer to the eye move a greater distance across the image. • (Play the Pulfrich video without sunglasses)

  37. Physiological Depth Cues • Accommodation – the focusing adjustment made by the eye to change the shape of the lens (effective up to about 3 m). • Convergence – the movement of the eyes to bring an object onto the same location on the retina of each eye.

  38. Summary • Monoscopic – Interposition is strongest. • Stereopsis is very strong. • Relative Motion is also very strong (or stronger). • Physiological is weakest (we don’t even use them in VR!) • Add as needed • ex. shadows and cartoons

  39. Pulfrich Effect • Neat trick • Lower illumination levels require additional processing time in the visual system (the eye's effective response time differs based on the amount of light) • What if we darken one image and brighten the other? • http://dogfeathers.com/java/pulfrich.html • www.cise.ufl.edu/~lok/multimedia/videos/pulfrich.avi
