
Parameterized Environment Maps



Presentation Transcript


  1. Parameterized Environment Maps Ziyad Hakura, Stanford University John Snyder, Microsoft Research Jed Lengyel, Microsoft Research

  2. Static Environment Maps (EMs) • Generated using standard techniques: • Photograph a physical sphere in an environment • Render six faces of a cube from object center

  3. Ray-Traced vs. Static EM Self-reflections are missing

  4. Parameterized Environment Maps (PEM)

  5. 3-Step Process • 1) Preprocess: Ray-trace images at each viewpoint • 2) Preprocess: Infer environment maps (EMs) • 3) Run-time: Blend between 2 nearest EMs

  6. Environment Map Geometry

  7. Why Parameterized Environment Maps? • Captures view-dependent shading in environment • Accounts for geometric error due to approximation of environment with simple geometry

  8. How to Parameterize the Space? • Experimental setup • 1D view space • 1˚ separation between views • 100 sampled viewpoints • In general, author specifies parameters • Space can be 1D, 2D or more • Viewpoint, light changes, object motions
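For a 1D view space like the one described above, picking the two nearest sampled viewpoints and a blend weight reduces to simple interval arithmetic. The sketch below is a hypothetical helper (the function name and clamping behavior are assumptions, not the authors' code):

```python
import numpy as np

def nearest_views(theta, spacing=1.0, n_views=100):
    """Find the two sampled viewpoints bracketing view angle `theta`
    (in degrees) and the linear blend weight toward the second one.
    Assumes samples at 0, spacing, 2*spacing, ... as in the 1D setup."""
    t = theta / spacing                        # position in sample units
    i0 = int(np.clip(np.floor(t), 0, n_views - 2))
    i1 = i0 + 1
    w = float(np.clip(t - i0, 0.0, 1.0))       # 0 -> all EM i0, 1 -> all EM i1
    return i0, i1, w
```

At run time the renderer would fetch the EMs at viewpoints `i0` and `i1` and blend them with weight `w`.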

  9. Ray-Traced vs. PEM Closely match local reflections like self-reflections

  10. Movement Away from Viewpoint Samples (ray-traced vs. PEM)

  11. Previous Work • Reflections on Planar Surfaces [Diefenbach96] • Reflections on Curved Surfaces [Ofek98] • Image-Based Rendering Methods • Light Field, Lumigraph, Surface Light Field, LDIs • Decoupling of Geometry and Illumination • Cabral99, Heidrich99 • Parameterized Texture Maps [Hakura00]

  12. Surface Light Fields [Miller98, Wood00] • Surface light field: dense sampling over surface points of low-resolution lumispheres • PEM: sparse sampling over viewpoints of high-resolution EMs

  13. Parameterized Texture Maps [Hakura00] (parameterized by light and view) Captures realistic pre-rendered shading effects

  14. Comparison with Parameterized Texture Maps • Parameterized Texture Maps [Hakura00] • Static texture coordinates • Pasted-on look away from sampled views • Parameterized Environment Maps • Bounce rays off, intersect simple geometry • Layered maps for local and distant environment • Better quality away from sampled views

  15. EM Representations • EM Geometry • How reflected environment is approximated • Examples: • Sphere at infinity • Finite cubes, spheres, and ellipsoids • EM Mapping • How geometry is represented in a 2D map • Examples: • Gazing ball (OpenGL) mapping • Cubic mapping
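The "gazing ball" mapping above is the standard OpenGL sphere map: a unit reflection vector in eye space is flattened into 2D map coordinates. A minimal sketch of that mapping (the function name is an assumption):

```python
import numpy as np

def sphere_map_uv(r):
    """OpenGL 'gazing ball' (sphere map) mapping: unit eye-space
    reflection vector r = (rx, ry, rz) -> 2D texture coordinates.
    The map center (0.5, 0.5) corresponds to r = (0, 0, 1)."""
    rx, ry, rz = r
    m = 2.0 * np.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5
```

Cubic mapping instead picks one of six cube faces by the reflection vector's dominant axis and indexes that face with the remaining two components.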

  16. Layered EMs • Segment environment into local and distant maps • Allows different EM geometries in each layer • Supports parallax between layers

  17. Segmented, Ray-Traced Images (distant, local color, local alpha, and Fresnel layers) EMs are inferred for each layer separately

  18. Distant Layer Ray directly reaches distant environment

  19. Distant Layer Ray bounces more times off reflector

  20. Distant Layer Ray propagated through reflector

  21. Local Layer (local color, local alpha)

  22. Fresnel Layer Fresnel modulation is generated at run-time

  23. EM Inference Solve A x = b, where A holds the hardware filter coefficients, x the unknown EM texels, and b the ray-traced image (EM texture, rendered by the hardware, produces the screen image)
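The inference step can be sketched as a least-squares solve of A x = b. The matrix values below are a toy stand-in for measured hardware filter coefficients, not data from the paper:

```python
import numpy as np

# Toy instance of the inference problem A x = b:
# rows of A = screen pixels, columns = EM texels; A holds the
# hardware's filter coefficients, b the ray-traced image, and
# x the EM texels to infer.
A = np.array([[0.75, 0.25, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.25, 0.75]])
b = np.array([0.5, 0.5, 0.5])

# Least-squares solve; for an over-determined A this minimizes
# the screen-space error between re-rendered EM and ray tracing.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Because each row of this toy A sums to 1 (a normalized filter), a constant image b is reproduced exactly by a constant EM.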

  24. Inferred EMs per Viewpoint (distant, local color, local alpha)

  25. Run-Time • “Over” blending mode to composite local/distant layers • Fresnel modulation, F, generated on-the-fly per vertex • Blend between neighboring viewpoint EMs • Teapot object requires 5 texture map accesses: • 2 EMs (local/distant layers) at each of • 2 viewpoints (for smooth interpolation) and • 1 1D Fresnel map (for better polynomial interpolation)
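One plausible shape for the per-pixel run-time combination described above: blend each layer between the two nearest viewpoints, composite local "over" distant, then apply Fresnel modulation. This sketch assumes premultiplied local color and uses Schlick's approximation for the Fresnel term; the authors' exact modulation and 1D map lookup may differ:

```python
def schlick_fresnel(cos_theta, f0=0.04):
    """Schlick's approximation to Fresnel reflectance. f0 is the
    reflectance at normal incidence (0.04 is a glass-like guess)."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def composite(local0, alpha0, distant0,
              local1, alpha1, distant1, w, cos_theta):
    """Blend the two nearest viewpoints' layers with weight w, composite
    local 'over' distant (premultiplied alpha), apply Fresnel."""
    local   = (1 - w) * local0   + w * local1    # blended local color
    alpha   = (1 - w) * alpha0   + w * alpha1    # blended local alpha
    distant = (1 - w) * distant0 + w * distant1  # blended distant color
    env = local + (1.0 - alpha) * distant        # 'over' composite
    return schlick_fresnel(cos_theta) * env      # Fresnel-modulated
```

On hardware, the blends and the over operator map onto multitexture combiners; the Fresnel factor comes from the 1D map indexed per vertex.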

  26. Video Results • Experimental setup • 1D view space • 1˚ separation between views • 100 sampled viewpoints

  27. Layered PEM vs. Infinite Sphere PEM

  28. Real-time Demo

  29. Summary • Parameterized Environment Maps • Layered • Parameterized by viewpoint • Inferred to match ray-traced imagery • Accounts for environment’s • Geometry • View-dependent shading • Mirror-like, local reflections • Hardware-accelerated display

  30. Future Work • Placement/partitioning of multiple environment shells • Automatic selection of EM geometry • Incomplete imaging of environment “off the manifold” • Refractive objects • Glossy surfaces

  31. Questions

  32. Timing Results
                        On the Manifold   Off the Manifold
      #geometry passes  2                 3
      texgen time       35ms              35ms
      frame time        45ms              57ms
      FPS               22                17.5

  33. Texel Impulse Response • To measure the hardware impulse response, render with a single texel set to 1 (texture, rendered by the hardware, produces the screen image).

  34. Single Texel Response

  35. Model for Single Texel (one column of A per texel, one row per screen pixel)
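The single-texel model above amounts to measuring A one column at a time: each texel impulse, pushed through the hardware's (linear) filtering, yields that texel's screen footprint. A sketch with a stand-in linear filter in place of real hardware readback (`build_filter_matrix` and `M` are hypothetical names):

```python
import numpy as np

def build_filter_matrix(render, n_texels):
    """Recover the hardware filter matrix A column by column: set a
    single EM texel to 1, render it, and read back the screen pixels.
    `render` maps a flat texel vector to a flat screen-pixel vector."""
    cols = []
    for j in range(n_texels):
        impulse = np.zeros(n_texels)
        impulse[j] = 1.0                  # single-texel impulse
        cols.append(render(impulse))      # that texel's screen footprint
    return np.column_stack(cols)          # one column per texel

# Stand-in for the graphics hardware: any fixed linear resampling filter.
M = np.array([[0.9, 0.1],
              [0.1, 0.9],
              [0.5, 0.5]])
A = build_filter_matrix(lambda texels: M @ texels, n_texels=2)
```

This works because texture filtering (including MIPMAP selection at a fixed view) is linear in the texel values, so impulses fully determine the map.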

  36. Model for MIPMAPs

  37. Conclusion • PEMs provide: • faithful approximation to ray-traced images at pre-rendered viewpoint samples • plausible movement away from those samples • using real-time graphics hardware

  38. PEM vs. Static EM
