
Graphics II 91.547 Image Based Rendering


Presentation Transcript


  1. Graphics II 91.547 Image Based Rendering, Session 11

  2. A Rendering Taxonomy

  3. The Plenoptic Function “… the pencil of rays visible from any point in space, at any time, and over any range of wavelengths” Given a set of discrete samples (complete or incomplete) from the plenoptic function, the goal of image-based rendering is to generate a continuous representation of that function.
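The plenoptic function is seven-dimensional: P(x, y, z, θ, φ, λ, t), a radiance for every viewpoint, ray direction, wavelength, and time. A minimal sketch of a discrete sample and the crudest possible "continuous representation", nearest-neighbor lookup over the sample set (the same strategy the Movie Map on the next slide applies to whole images). All names here are illustrative, not from the lecture:

```python
from dataclasses import dataclass
import math

@dataclass(frozen=True)
class PlenopticSample:
    """One discrete sample of the 7-D plenoptic function
    P(x, y, z, theta, phi, wavelength, time) -> radiance."""
    x: float; y: float; z: float  # viewpoint
    theta: float; phi: float      # ray direction (azimuth, elevation)
    wavelength: float             # e.g. nanometres
    time: float                   # e.g. seconds
    radiance: float               # the sampled value

def nearest_sample(samples, query):
    """Crudest reconstruction: return the stored sample whose 7-D
    parameter vector is closest (Euclidean) to the query tuple."""
    def dist(s):
        return math.dist(
            (s.x, s.y, s.z, s.theta, s.phi, s.wavelength, s.time), query)
    return min(samples, key=dist)
```

Real image-based rendering methods replace this lookup with interpolation and warping, which is what the rest of the session develops.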

  4. Movie Map (Lippman 1980) [Diagram: find nearest sample in movie storage → output image]

  5. Taxonomy of “Virtual Camera” Movement (Chen et al. 1995)
  • Camera rotation
    • Camera fixed at a particular location
    • Three rotational degrees of freedom: pitch (up and down), yaw (about the vertical axis), roll (about the camera axis)
  • Object rotation
    • Camera always pointing at the center of the object
    • Viewpoint constrained to move over the surface of a sphere
    • Three angular degrees of freedom
  • Camera movement
    • Viewpoint unconstrained
    • Viewing direction unconstrained

  6. Environment Maps: Map Geometries (sphere, cylinder, cube)

  7. QuickTime VR™ (Chen 1995) Panorama: 2500 × 768 pixels. 2500 × 768 = 1.9 M pixels × 3 B/pixel ≈ 5.8 MB; at 10:1 compression, ≈ 500 KB/panorama
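The storage arithmetic on this slide can be checked directly. The numbers are the slide's; the function name and default parameters are illustrative:

```python
def panorama_storage(width_px, height_px, bytes_per_pixel=3, compression=10):
    """Storage for one cylindrical panorama: raw size and size after
    the slide's assumed 10:1 compression, both in bytes."""
    raw = width_px * height_px * bytes_per_pixel
    return raw, raw / compression

raw, compressed = panorama_storage(2500, 768)
# raw = 5,760,000 bytes (~5.8 MB); compressed = 576,000 bytes (~0.5 MB)
```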

  8. Image Distortion from Cylindrical Environment Map [Figure: projection plane, pre-warped projection onto the plane, cylindrical environment map]
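A sketch of the geometry behind the pre-warp, assuming a pinhole camera with focal length f and a cylindrical map of radius f sharing the camera's vertical axis (these assumptions, and the function name, are illustrative rather than from the lecture). Straight horizontal rows of the planar projection warp into curves on the cylinder, which is why a correcting warp is needed:

```python
import math

def plane_to_cylinder(x, y, f):
    """Map a pixel (x, y) on a planar projection (principal point at the
    origin, focal length f) to cylindrical-map coordinates (theta, h):
    the azimuth where the viewing ray meets the cylinder, and the height
    at which it meets it."""
    theta = math.atan2(x, f)          # azimuth of the ray through (x, y)
    h = y * f / math.hypot(x, f)      # height where the ray hits radius f
    return theta, h
```

Note that h shrinks as |x| grows, which is exactly the curvature visible in the pre-warped image on the slide.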

  9. QuickTime VR: Image Warping for Correct Perspective View

  10. QuickTime VR: Panoramic Display Process [Diagram: compressed tiles on CD-ROM or hard disk → compressed-tile cache in main memory → decompress visible tiles → warp visible region in offscreen buffer → display window]

  11. QuickTime VR: Accomplishing (Limited) Camera Motion

  12. Accomplishing Camera Motion: Greene & Kass (1993 Apple Tech. Doc.)
  • Regular 3-D lattice of cubic environment maps
  • Each environment map is a z-buffered rendering from a discrete viewpoint
  • Image from a new viewpoint is generated by re-sampling the environment maps
  • Re-sampling involves rendering the pixels in the environment maps as 3-D polygons from the new viewpoint
  • Rendering time proportional to the environment-map resolution but independent of scene complexity
  • Not suitable for real-time walkthrough performance on typical desktop computers (especially in 1993!)
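The geometric core of that re-sampling step can be sketched as follows: each z-buffered environment-map pixel (a view-ray direction plus a depth from the old viewpoint) is lifted to a world-space point, which is then viewed from the new eye position. This is only a sketch under those assumptions; the method described in the slide goes further and renders each lifted pixel as a small 3-D polygon:

```python
def reproject_envmap_pixel(pixel_dir, depth, old_eye, new_eye):
    """Lift a z-buffered environment-map pixel to world space, then
    express it relative to a new viewpoint.

    pixel_dir: unit view-ray direction of the pixel from old_eye
    depth:     z-buffer distance along that ray
    Returns (world_point, offset_from_new_eye)."""
    world = tuple(e + depth * d for e, d in zip(old_eye, pixel_dir))
    offset = tuple(w - e for w, e in zip(world, new_eye))
    return world, offset
```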

  13. Alternative Approach: Work Entirely in Image Space
  • A sequence of images from closely spaced viewpoints is highly coherent
  • Depends upon the ability to establish a pixel-by-pixel correspondence between adjacent images
    • Can be computed if range data and camera parameters are known (true for rendered images)
    • For natural images, there are several techniques, including manual user intervention
  • Pairwise correspondence between two images can be stored as a pair of morph maps
    • Bi-directional maps are required because of possible many-to-one and one-to-many pixel correspondences
  • Can be represented by a graph data structure whose nodes are images and whose arcs are bi-directional morph maps
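Because correspondences may be many-to-one and one-to-many, the two directions of a morph map carry different information and must both be stored. One possible representation (illustrative, not the lecture's data structure) keeps each direction as a mapping from a pixel to the list of pixels it corresponds to, and derives one direction from the other:

```python
def invert_morph_map(forward):
    """Build the backward morph map from the forward one.

    forward: dict mapping a source pixel (x, y) to the list of
    destination pixels it corresponds to in the other image.
    Returns the backward dict, destination pixel -> list of sources."""
    backward = {}
    for src, dests in forward.items():
        for dst in dests:
            backward.setdefault(dst, []).append(src)
    return backward
```

A one-to-many entry in the forward map becomes several many-to-one entries in the backward map, which is exactly why a single map cannot serve both directions.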

  14. N-Dimensional Graph Data Structure [Figure: graph of image nodes connected by bi-directional morph-map arcs]

  15. Simple View Interpolation [Figure: reference images 1 and 2 with corresponding pixels linked by morph maps]

  16. Image Overlap or Image Folding [Figure: points P1 and P2 from the reference view map to the same pixel of the interpolated view]
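Folding can be resolved during the forward warp with a z-buffer: when several source pixels land on one destination pixel, the nearest survives. A minimal sketch under that assumption (the representation of samples is illustrative); holes are simply left unfilled here:

```python
def forward_warp(samples, t):
    """Warp samples (position, morph-map offset, depth, color) to the
    intermediate view at parameter t in [0, 1], linearly interpolating
    each pixel position along its offset.  Overlaps ("folding") are
    resolved with a z-buffer: the smallest depth wins."""
    out = {}  # integer destination pixel -> (depth, color)
    for (x, y), (dx, dy), depth, color in samples:
        q = (round(x + t * dx), round(y + t * dy))
        if q not in out or depth < out[q][0]:
            out[q] = (depth, color)
    return out
```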

  17. Image Holes or Image Stretching [Figure: points P1 and P2 from the reference view map to pixels stretched apart in the interpolated view, leaving a hole]

  18. Example of Hole Region [Figure: renderings from viewpoint 1 and viewpoint 2]

  19. Example of Hole Region: Minimizing by Closely Spaced Viewpoints [Figure: viewpoint 1 and viewpoint 2]

  20. Source Image Viewed from Camera Moved to the Right [Figure: reference views 1 and 2]

  21. Offset Vectors for Camera Motion [Figure: morph map]

  22. Locus of Morph Map for Motion Parallel to Image Plane and Floor

  23. Distortion of Intermediate Images with Linear Warp [Figure: linear path of one feature]

  24. Morphing Parallel Views [Figure: interpolated image between two reference images]

  25. View Interpolation: The Algorithm [Figure]

  26. Example 1 of calculated intermediate images [Figure: reference images 1 and 2 with intermediate views]

  27. Example 2 of calculated intermediate images [Figure: reference image 1, interpolated image, reference image 2]

  28. Multiple-Center-of-Projection Images (Rademacher & Bishop 1998)
  • Information from a set of viewpoints stored in a single image
  • Features:
    • Greater connectivity information compared with collections of standard images
    • Greater flexibility in the acquisition of image-based datasets, e.g. sampling different portions of the scene at different resolutions

  29. Multiple-Center-of-Projection Images: Definition
  • A multiple-center-of-projection image consists of a two-dimensional image and a parameterized set of cameras meeting the following conditions:
    • The cameras must lie on either a continuous curve or a continuous surface
    • Each pixel is acquired by a single camera
    • Viewing rays vary continuously across neighboring pixels
    • Two neighboring pixels must correspond either to the same camera or to neighboring cameras
    • Each pixel contains range information
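One way to hold the data this definition calls for, sketched with one camera per image column (as in the strip-camera captures that follow); the class and field names are illustrative, chosen to match the per-column camera model of the reprojection slide:

```python
from dataclasses import dataclass

@dataclass
class ColumnCamera:
    """Camera parameters stored for one column of an MCOP image."""
    C: tuple  # center of projection
    o: tuple  # vector from C to the image-plane origin
    u: tuple  # horizontal axis of the viewing plane
    v: tuple  # vertical axis of the viewing plane

# An MCOP image then couples one camera per column (cameras along a
# continuous capture path) with per-pixel color and per-pixel range
# (disparity), e.g.:
#   mcop = {"cameras": [...],            # one ColumnCamera per column
#           "color": color_rows,         # color[row][col]
#           "disparity": disparity_rows} # disparity[row][col]
```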

  30. MCOP Image

  31. Strip Camera used for Capture of Real MCOP Images

  32. Camera Path in Capturing MCOP Image of Castle

  33. Image Plane for Camera Motion

  34. Resulting 1000 x 500 MCOP Image

  35. Reprojection. Camera model, stored per column: the center of projection C; the vector o from C to the image-plane origin; the horizontal axis u of the viewing plane; the vertical axis v of the viewing plane. Disparity = distance from C to the image plane divided by distance from C to the pixel’s world-space point. Reprojection formula: [shown on slide]
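The reprojection formula itself did not survive as text, so the following is a hedged reconstruction from the per-column model above (verify against the Rademacher & Bishop paper before relying on it): the vector o + x·u + y·v runs from C through pixel (x, y) on the image plane, and dividing it by the pixel's disparity pushes the point out to its world-space depth, P = C + (o + x·u + y·v) / disparity.

```python
def reproject_pixel(C, o, u, v, x, y, disparity):
    """Hedged reconstruction of MCOP reprojection for pixel (x, y) of a
    column with camera (C, o, u, v):
        P = C + (o + x*u + y*v) / disparity
    where o + x*u + y*v is the vector from C to the pixel on the image
    plane and disparity encodes the ratio of image-plane distance to
    world-space distance."""
    d = tuple(oc + x * uc + y * vc for oc, uc, vc in zip(o, u, v))
    return tuple(c + dc / disparity for c, dc in zip(C, d))
```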

  36. View of Castle Reconstructed from MCOP Image

  37. Another View of Castle Reconstructed from MCOP Image

  38. Lumigraphs
  • Lumigraph = a representation of the light resulting from a scene
  • Limited data representation of the plenoptic function
  • Generated from multiple images and camera “poses”
  • Rendering: Image = Lumigraph + Camera Model
  • Special case of the 4-D light field (Levoy & Hanrahan)

  39. What is a Lumigraph? For every point on a surrounding surface and every direction, the color/intensity of the ray leaving in that direction. Assumption: we are outside a convex hull containing the objects.
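The standard 4-D parameterization for light fields and Lumigraphs identifies a ray by where it crosses two parallel planes: (s, t) on one plane and (u, v) on the other. A sketch of that mapping, with the plane depths as illustrative defaults:

```python
def ray_to_lumigraph_coords(origin, direction, z_st=0.0, z_uv=1.0):
    """Two-plane parameterization: intersect a ray with the (s, t)
    plane at z = z_st and the (u, v) plane at z = z_uv; the four
    intersection coordinates index the 4-D sample table.
    Assumes direction[2] != 0 (the ray is not parallel to the planes)."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    t1 = (z_st - oz) / dz
    t2 = (z_uv - oz) / dz
    s, t = ox + t1 * dx, oy + t1 * dy
    u, v = ox + t2 * dx, oy + t2 * dy
    return s, t, u, v
```

Discretizing (s, t, u, v) and storing a color per cell is precisely the "approximating the Lumigraph with discrete samples" step on slide 42.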

  40. Parameterization of the Lumigraph Images from Steven Gortler, SIGGRAPH 1999

  41. Building the Lumigraph

  42. Approximating the Lumigraph with Discrete Samples

  43. Views of a Light Field (Lumigraph). Levoy & Hanrahan, “Light Field Rendering,” Computer Graphics (Proc. SIGGRAPH)
