
Filtering Environment Illumination in Mixed Reality

This review discusses axis-aligned filtering for soft shadows and adaptive image-space filtering for accurately filtering noisy Monte Carlo images in mixed reality. It also covers factored axis-aligned filtering for rendering multiple distribution effects.


Presentation Transcript


  1. Filtering Environment Illumination in Mixed Reality • Wang Lin, Graduate School of Ocean Systems Engineering, 2016.06.02

  2. Review • Axis-Aligned Filtering for Soft Shadows • Based on explicit occlusion calculation by ray tracing • Followed by adaptive image-space filtering

  3. Two Papers • Section 1 Topic: accurate filtering of noisy MC images using Fourier analysis • Title: Filtering Environment Illumination for Interactive Physically-Based Rendering in Mixed Reality (Eurographics 2015) Authors: Soham Uday Mehta, Kihwan Kim, Dawid Pajak • Section 2 Topic: factored axis-aligned filtering • Title: Factored Axis-Aligned Filtering for Rendering Multiple Distribution Effects (SIGGRAPH 2014) Authors: Soham Uday Mehta, Brandon Wang

  4. Background • What's Mixed Reality? • A combination of Virtual Reality (VR) and Augmented Reality (AR) • Allows us to see the real world (AR) while also seeing believable virtual objects (VR) • More flexibility: overlaying virtual objects onto the real scene gives an immersive visual experience Challenges • Stable camera tracking that provides proper placement of the virtual objects. • Plausible rendering and post-processing of the mixed scene.

  5. Approaches • How do we tackle these challenges? • A dense simultaneous localization and mapping (SLAM) algorithm estimates the 6-DOF camera pose and the scene geometry in the form of per-pixel positions and normals (solves the 1st challenge). • Options for rendering (the 2nd challenge): rasterization with a noisy real-world mesh obtained directly from the depth camera; MC ray-tracing a fixed, pre-defined real-world mesh; or the two-mode path-tracing method (this paper), which uses the denoised real-world vertex map obtained from SLAM.

  6. Background • MC ray-tracing • Photo-realistic rendering with environment illumination • Takes a long time to render a single image: a large number of rays is needed to produce a noise-free image • On smooth surfaces, environment illumination changes slowly Fast rendering: 1. Exploit this smoothness by appropriately filtering a sparsely sampled MC result. 2. Extend the axis-aligned filtering algorithm to filter environment illumination adaptively in screen space.

  7. Differential rendering

  8. Differential rendering Diffuse Real and Virtual Objects

  9. Differential rendering • Glossy Virtual Objects • Spatial texture • Outgoing radiance depends on the view angle
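
  For context, the differential-rendering composite adds the change caused by the virtual objects to the captured camera frame. Below is a minimal sketch of that composite under assumed inputs; the array and function names are illustrative, not taken from the paper's implementation.

      # Differential-rendering composite (sketch; names are illustrative).
      import numpy as np

      def differential_composite(camera_rgb, render_full, render_real_only, real_mask):
          # camera_rgb:       captured RGB frame (H x W x 3)
          # render_full:      rendering of real + virtual objects
          # render_real_only: rendering of the real objects only
          # real_mask:        per-pixel fraction covered by real objects (H x W)
          m = real_mask[..., None]                      # broadcast over color channels
          # Real pixels: add the change caused by the virtual objects to the camera
          # image, so virtual shadows and interreflections show up on real geometry.
          real_part = camera_rgb + (render_full - render_real_only)
          # Virtual pixels: use the rendered result directly.
          return m * real_part + (1.0 - m) * render_full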

  10. Overview of the MR system • Input: RGB-D sequence • SLAM recovers the scene structure (geometry) from the RGB and depth input • Sampling and Filtering (with per-pixel filter sizes) compute the shading • Compositing produces the output image

  11. Two-mode Sampling Algorithm • Input: camera position, per-pixel real-world positions and normals, virtual object geometry • Output: per-pixel outgoing illumination • Step 1: Compute 4 primary samples per pixel (spp) for anti-aliasing; determine whether a real or a virtual object is visible at the current sample; update the mask M (fraction of the pixel covered by real objects). • Step 2: Compute 4 secondary samples for each of the 4 primary samples, for both direct and indirect illumination: importance-sample the environment map for direct illumination; use a cosine hemisphere for diffuse surfaces (real and virtual) and the Phong lobe for glossy surfaces.
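
  A pseudocode-style sketch of the two steps above; every helper (trace_primary, sample_envmap, sample_brdf, shade) is a placeholder rather than a function from the paper's code.

      # Two-mode sampling for one pixel (sketch; all helpers are placeholders).
      def sample_pixel(pixel, scene, envmap, n_primary=4, n_secondary=4):
          color, mask_real = 0.0, 0.0
          for _ in range(n_primary):                # step 1: 4 primary samples (anti-aliasing)
              hit = trace_primary(pixel, scene)     # a real or a virtual surface is visible here
              mask_real += hit.is_real / n_primary  # mask M: fraction covered by real objects
              for _ in range(n_secondary):          # step 2: 4 secondary samples per primary sample
                  w_d = sample_envmap(envmap, hit)  # direct: importance-sample the environment map
                  color += shade(hit, w_d, scene, mode="direct")
                  w_i = sample_brdf(hit)            # indirect: cosine hemisphere (diffuse) or Phong lobe (glossy)
                  color += shade(hit, w_i, scene, mode="indirect")
          return color / (n_primary * n_secondary), mask_real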

  12. Fourier analysis for Environment Lighting • Why do we need filtering? The sparsely sampled results are noisy; for accurate rendering, the axis-aligned filter provides a simple spatial bandwidth for the shading. • How do we do it? Fourier analysis of direct illumination from an environment map (occluders and visibility); perform a 2D analysis of shading in position and angle space; show that the shading is band-limited by the BRDF in the angular dimension. • Steps: diffuse without visibility, diffuse with visibility, glossy BRDF, extension to 3D, indirect illumination.

  13. Diffuse without visibility • The incoming direction θ at a point x corresponds to the direction θ + κx at the origin (κ is the receiver curvature), so the Fourier spectrum of the shading light field is concentrated on a line. [Equations not preserved in the transcript.]
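
  The missing relations can plausibly be written as follows (notation assumed, not copied from the paper): with receiver curvature κ, the incoming light field and the shading are

      L_i(x, \theta) = L_{env}(\theta + \kappa x), \qquad
      h(x) = \int L_i(x, \theta) \, \bar{f}(\theta) \, d\theta ,

  and, up to constants, the 2D Fourier transform of the light field is

      \hat{L}_i(\Omega_x, \Omega_\theta) \propto \hat{L}_{env}(\Omega_\theta) \, \delta(\Omega_x - \kappa \, \Omega_\theta),

  so the spectrum lies on the line \Omega_x = \kappa \Omega_\theta, and the shading is band-limited in the angular dimension by the BRDF term \bar{f}.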

  14. Diffuse with visibility • With an occluder, the shape of the spectrum is no longer a simple line; the analysis considers an infinite occluder at a single depth z and a receiver with curvature κ. [Figure not preserved in the transcript.]

  15. Glossy BRDF • [Figures: flatland shading geometry and the resulting shading spectrum.]

  16. Extension to 3D • A purely virtual, untextured scene under an outdoor environment map

  17. Practical Filtering • How can the bandwidth be used to filter noise? • Run a CUDA kernel to compute per-pixel screen-space curvature • Apply a 2D separable filter • Apply temporal filtering across the sequence to eliminate residual noise • Assumes illumination does not change between consecutive frames
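
  A minimal sketch of such a per-pixel separable screen-space filter, assuming a noisy shading buffer and a per-pixel filter width (in pixels) already derived from the bandwidth analysis; in practice the filter would also reject samples across large depth or normal differences.

      # Separable screen-space filter with per-pixel width (sketch).
      import numpy as np

      def filter_1d(img, sigma, axis, radius=8):
          # img: H x W x 3 noisy shading; sigma: H x W per-pixel Gaussian width in pixels.
          out = np.zeros_like(img)
          weight = np.zeros_like(sigma)
          for d in range(-radius, radius + 1):
              shifted = np.roll(img, d, axis=axis)  # np.roll wraps at borders; a real filter would clamp
              w = np.exp(-0.5 * (d / np.maximum(sigma, 1e-3)) ** 2)
              out += w[..., None] * shifted
              weight += w
          return out / weight[..., None]

      def filter_shading(noisy, sigma):
          # Two 1D passes (x, then y) approximate the 2D filter.
          return filter_1d(filter_1d(noisy, sigma, axis=1), sigma, axis=0)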

  18. Results

  19. Topic 2: Factored Axis-Aligned Filtering for Rendering Multiple Distribution Effects

  20. Rendering Distribution Effects • Soft Shadows (secondary) • Depth of Field (primary) • Indirect Illumination (secondary)

  21. Rendering Distribution Effects • Integrate radiance over the lens and the light source. [Figure: area light, lens, primary rays, direct light rays.]

  22. Rendering Distribution Effects • Integrate radiance over the lens and the light source. • Our goal: use fewer rays where the radiance doesn't vary much, and filter the noise using frequency analysis. [Figure: area light, lens, primary rays, direct light rays.]

  23. Contributions • Combine 6D frequency analysis for primary and secondary effects • Pixel-lens-light for direct illumination • Pixel-lens-angle for indirect illumination • Factor texture and irradiance to pre-filter the irradiance • Two-level adaptive sampling strategy • DOF + soft shadows + indirect illumination in 5 seconds

  24. Algorithm • Pipeline: Sample 1 (radiance); select per-pixel filter widths and sampling rates; Sample 2 (irradiance); filter the texture and the irradiance separately; output.
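
  Read as a pipeline, the block diagram above might look like the following sketch; every function name here is a placeholder.

      # Two-stage factored rendering loop (sketch; all helpers are placeholders).
      def render_frame(scene, camera):
          pass1 = trace_sparse(scene, camera)                  # stage 1: a few radiance samples per pixel
          widths = compute_filter_widths(pass1)                # per-pixel filter sizes from the frequency bounds
          rates = compute_sampling_rates(widths)               # adaptive sampling rates for the second pass
          pass2 = trace_adaptive(scene, camera, rates)         # stage 2: irradiance and texture samples
          irradiance = filter_image(pass2.irradiance, widths)  # filter the noisy irradiance
          texture = filter_image(pass2.texture, widths)        # texture is filtered for the lens (defocus) term
          return texture * irradiance                          # recombine the factored terms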

  25. Theory: Defocus Blur • The light field L(x, u) is parametrized by the pixel coordinate x and the lens coordinate u. [Figure: flatland lens geometry with focal plane, focal length f, and surface depth z.] *Assuming a diffuse surface

  26. Theory: Defocus Blur • A surface away from the focal plane produces a circle of confusion; compared with the indirect light field, this is again a double-wedge spectrum. [Figure: the same lens geometry with the circle of confusion marked.] *Assuming a diffuse surface
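
  With the usual flatland parametrization (pixel coordinate x, lens coordinate u, and a signed circle of confusion c(z) for a diffuse surface with texture k at depth z), the lost relation can plausibly be written as

      L(x, u) = k(x + c(z) \, u), \qquad
      \hat{L}(\Omega_x, \Omega_u) \propto \hat{k}(\Omega_x) \, \delta(\Omega_u - c(z) \, \Omega_x),

  so each depth contributes a line of slope c(z) in the Fourier domain, and a range of depths between the minimum and maximum circle of confusion fills a double wedge.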

  27. Axis-Aligned Filter • The pixel color is obtained by integrating over the lens (lens aperture function), which corresponds to band-limiting in the Fourier domain. The spectrum is bounded by the maximum and minimum circles of confusion (COC), and the spatial bandwidth of the axis-aligned filter follows from the minimum COC and the aperture band-limit. [Equation not preserved in the transcript.] *Gaussian lens aperture

  28. Defocus Blur + Area Light • Integrate over the lens and the light. The integrand separates into the light intensity, the lens aperture function, the texture k(x, u), and the illumination term (V · f)(x, u, y), where V is the visibility, f is the BRDF, u is the lens coordinate, and y is the light coordinate. [Figure: flatland geometry with focal plane and area light.]

  29. Radiance Factoring

  30. Radiance Factoring • The form factor changes slowly with y, so extract it out of the y integral. • Approximation: replace the integral of a product by the product of integrals.

  31. Radiance Factoring • Approximation: replace the integral of a product by the product of integrals, similar to ambient occlusion. • Used selectively, ONLY when the predicted error is low.
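
  A schematic version of the approximation (notation simplified: y is the light coordinate, V the visibility, and k the slowly varying form-factor/texture term):

      \int_Y k(y) \, V(y) \, dy \;\approx\; \frac{1}{|Y|} \left( \int_Y k(y) \, dy \right) \left( \int_Y V(y) \, dy \right),

  i.e. the integral of the product is replaced by the normalized product of integrals, with the visibility factor playing the role of an ambient-occlusion term.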

  32. 3D Frequency Analysis of Visibility • The visibility term is constant along planes, so its spectrum is concentrated on lines in 3D Fourier space with known slopes. [Figure: flatland geometry with texture k(x, u) and illumination term (V · f)(x, u, y).]

  33. Indirect Illumination • The incident indirect radiance is analogous to visibility; it is filtered by both the BRDF and the lens aperture. [Figure: the same flatland geometry.]

  34. Sampling Rate with Factoring • The number of primary rays depends only on lens effects. [Figure: lens, primary rays, direct light.]

  35. Sampling Rate with Factoring • The number of primary rays depends only on lens effects. • How do we choose the secondary sampling rate? [Figure: area light, lens, primary rays, direct light.]

  36. Sampling Rate with Factoring • Filter sizes and sampling rates are derived per effect: defocus (lens), direct (area light), and indirect (hemisphere). [Equations not preserved in the transcript.]

  37. Final Results

  38. Q/A

  39. Questions • For the differential rendering method, which one of the following was not included? • Estimating the pixel color considering only real objects. • Considering both real and virtual objects. • Handling glossy real objects. • Adding the difference to the raw camera image. • Which kind of effect cannot be filtered with the axis-aligned image-space filtering method? • Motion blur. • Soft shadows. • Indirect illumination. • Defocus blur.
