  1. Real-Time Rendering TEXTURING Lecture 02 Marina Gavrilova

  2. Brief Outline • Basic Concept • Texture Wrapping • Magnification filter • Minification filter • Multi-Texturing • Dynamic Texture • Texture mapping methods

  3. Concept of Texture • Texture: variation of surface diffuse reflectance, color, glossiness and other reflective properties • The heart of today’s computer graphics • Texture pixel = texel • A polygon is filled using texel values • A texture coordinate associates a screen pixel with a texel

  4. Texturing a surface

  5. Texture mapping

  6. Texture mapping in computer games • Color texture • Environment mapping • Alpha mapping • Bump mapping • Multi-texturing

  7. Basic Texturing concept • Modify the value used in the lighting equation • Basic Texture = per pixel object color • Steps: • Assign UV texture coordinates to vertices of a polygon (i.e. triangle) • Interpolate texture UV coordinate for each fill pixel during rasterization of the triangle • Lookup texel value from texture using UV coord. • Use texel value in lighting equation
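
A minimal sketch of these four steps (hypothetical names and a tiny hard-coded 2x2 texture with nearest-texel lookup, not the lecture's actual code):

```cpp
#include <array>
#include <cstdio>

struct Texel { float r, g, b; };

// A tiny hard-coded 2x2 checker texture, row-major (illustrative data).
const std::array<Texel, 4> kTexture = {{ {1,1,1}, {0,0,0}, {0,0,0}, {1,1,1} }};
const int kTexSize = 2;

// Step 3: look up the nearest texel for an interpolated (u,v) in [0,1].
Texel Lookup(float u, float v) {
    int x = static_cast<int>(u * (kTexSize - 1) + 0.5f);
    int y = static_cast<int>(v * (kTexSize - 1) + 0.5f);
    return kTexture[y * kTexSize + x];
}

int main() {
    // Step 1: UV coordinates assigned to the three triangle vertices.
    float uv[3][2] = {{0, 0}, {1, 0}, {0, 1}};
    // Step 2: interpolate the UVs for one fill pixel (barycentric weights).
    float w0 = 0.2f, w1 = 0.5f, w2 = 0.3f;
    float u = w0 * uv[0][0] + w1 * uv[1][0] + w2 * uv[2][0];
    float v = w0 * uv[0][1] + w1 * uv[1][1] + w2 * uv[2][1];
    // Steps 3-4: fetch the texel and use it as the diffuse color in lighting.
    Texel t = Lookup(u, v);
    float nDotL = 0.8f;  // placeholder N.L lighting term
    std::printf("shaded pixel = (%.2f, %.2f, %.2f)\n",
                t.r * nDotL, t.g * nDotL, t.b * nDotL);
}
```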

  8. Texture Method • The texture pipeline is used on rendering platforms to exploit hardware acceleration • (u,v) in [0,1] is scaled to a texel coordinate • Texture resolution is a power of 2 (e.g. 256 x 256 texels)
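
A small illustration of the scaling step (values and names are mine): because the resolution is a power of two, the tiling wrap reduces to a bitwise AND, which is one reason hardware favors power-of-two textures.

```cpp
#include <cstdio>

int main() {
    const int size = 256;            // power-of-two texture resolution
    float u = 1.30f, v = 0.75f;      // u is outside [0,1] on purpose
    // Scale to texel coordinates; the AND with (size - 1) wraps the result.
    int x = static_cast<int>(u * size) & (size - 1);
    int y = static_cast<int>(v * size) & (size - 1);
    std::printf("texel = (%d, %d)\n", x, y);   // prints (76, 192)
}
```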

  9. Texture Pipeline • Compute the object-space location • Use a projector function to find (u,v) • Use a corresponder function to find the texel • Apply the value transform function • Modify the illumination equation value

  10. Texture Rendering

  11. Texture modes • When the UV value goes beyond the range [0,1] • e.g. uv from (-1,-1) to (2,2) (see below) • Allow simple, small textures to cover large, complex objects • Modes: Tile, Mirror, Clamp, Border
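
A sketch of how the first three modes remap an out-of-range coordinate (illustrative only; real APIs expose these as sampler addressing states, and Border is handled at lookup time by returning a constant border color):

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

float Tile(float u)  { return u - std::floor(u); }                    // repeat
float Clamp(float u) { return u < 0.0f ? 0.0f : (u > 1.0f ? 1.0f : u); }
float Mirror(float u) {
    float t = std::fabs(u - 2.0f * std::floor(u * 0.5f));             // period 2
    return (t <= 1.0f) ? t : 2.0f - t;                                // fold back
}

int main() {
    for (float u : {-0.25f, 1.75f}) {
        std::printf("u = %5.2f  tile = %.2f  mirror = %.2f  clamp = %.2f\n",
                    u, Tile(u), Mirror(u), Clamp(u));
    }
}
```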

  12. Three-dimensional texture coordinates (u,v,w) • 2D texture wrapping can be difficult for complex objects • 3D texture mapping uses a (u,v,w) coordinate as the texture address • The result is a uniform texture distribution over the surface

  13. Texture magnification • When 1 texel = n pixels (n > 1) • A visual pixelation effect becomes prominent • Solution: apply a magnification filter (compare: no filter vs. bilinear)

  14. Bilinear Filtering • Four-texel linear interpolation (2x2) • Fast but low quality
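
A minimal single-channel bilinear sketch (helper names are mine; the fractional offsets are assumed to be already computed from the sample position):

```cpp
#include <cstdio>

// Linear interpolation helper.
float Lerp(float a, float b, float t) { return a + (b - a) * t; }

// Bilinearly filter the four texels around a sample point: fx, fy are the
// fractional offsets inside the 2x2 cell; interpolate across x, then y.
float Bilinear(float t00, float t10, float t01, float t11, float fx, float fy) {
    float top    = Lerp(t00, t10, fx);
    float bottom = Lerp(t01, t11, fx);
    return Lerp(top, bottom, fy);
}

int main() {
    // Texels 0 and 1 in a checker arrangement, sampled at the cell center.
    std::printf("filtered = %.2f\n", Bilinear(0, 1, 1, 0, 0.5f, 0.5f));  // 0.50
}
```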

  15. Bicubic • Not yet used in real-time graphics • Up to 4x4 texel interpolation • Good quality • Slow

  16. Minification filter • When n texels = 1 pixel • Aliasing distortion is present • An annoying artifact during animation • Solution: MIP mapping, min-filters

  17. Min filters • Nearest • MIP • Summed area

  18. MIP Mapping (Basic Concept) • Keep multiple copies of a texture at reduced resolutions (factors of 2) • e.g. (256x256), (128x128), (64x64), …, (1x1) • Determine the LOD from the UV difference (spacing) between neighboring pixels
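
A hedged sketch of the LOD selection idea, using the usual log2-of-footprint rule on the per-pixel UV spacing (real hardware uses screen-space derivatives and a more refined formula):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// du/dv: change in texel coordinates from one pixel to the next, already
// scaled by the base texture size. numLevels = log2(size) + 1.
int SelectMipLevel(float du, float dv, int numLevels) {
    float footprint = std::max(std::fabs(du), std::fabs(dv));
    float lod = std::log2(std::max(footprint, 1.0f));  // 1 texel/pixel -> level 0
    int level = static_cast<int>(lod + 0.5f);
    return std::min(level, numLevels - 1);
}

int main() {
    // A distant surface where UVs jump ~8 texels per pixel on a 256^2 texture:
    std::printf("mip level = %d\n", SelectMipLevel(8.0f, 3.0f, 9));  // prints 3
}
```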

  19. Anisotropic Filtering (comparison: trilinear MIP mapping vs. anisotropic filtering)

  20. Texture caching and compression • Complex scenes require large amounts of texture • Texture memory is limited • Texture management: load only smaller LODs for distant objects • Use clip-maps: load a small segment of a high-LOD texture • S3TC (6:1 compression) is the DirectX standard • Use two stored colors to build a 4-color ramp • Represent 16 pixels using those 4 colors • 2 bits per pixel plus two 16-bit colors gives an average of 4 bits per pixel
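
A back-of-the-envelope check of those numbers (a sketch of the DXT1-style block budget, not a real encoder):

```cpp
#include <cstdio>

int main() {
    const int pixelsPerBlock = 4 * 4;              // 16 pixels per block
    const int endpointBits   = 2 * 16;             // two 16-bit endpoint colors
    const int indexBits      = pixelsPerBlock * 2; // 2-bit ramp index per pixel
    const int blockBits      = endpointBits + indexBits;           // 64 bits
    const float bitsPerPixel = float(blockBits) / pixelsPerBlock;  // 4 bpp
    const float ratio        = 24.0f / bitsPerPixel;               // vs. RGB8
    std::printf("%d bits/block, %.1f bits/pixel, %.0f:1 vs. 24-bit RGB\n",
                blockBits, bitsPerPixel, ratio);
}
```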

  21. Multi-pass rendering • Render the same geometry several times • Integrate different photometric components • e.g. Quake III has 10 rendering passes

  22. Multi-texturing • Combine two or more textures • Can be performed in one rendering pass • The pixel value is computed by combining values from different texture stages
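
A small sketch of one pixel combined from several texture stages, in the spirit of the fixed-function modulate/add stage operations (single-channel values and stage roles are illustrative):

```cpp
#include <cstdio>

int main() {
    float baseColor = 0.8f;  // stage 0: diffuse/color texture sample
    float lightMap  = 0.5f;  // stage 1: precomputed light map sample
    float glowMap   = 0.1f;  // stage 2: emissive/glow texture sample

    // Stage 1 modulates the base color, stage 2 adds on top, all in one pass.
    float result = baseColor * lightMap + glowMap;
    std::printf("combined pixel value = %.2f\n", result);  // 0.50
}
```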

  23. Multi-texture & Pixel-shader

  24. Texture Effects A variety of realistic 3D effects can be created by manipulating the textures of 3D geometry

  25. Texture Animation • Dynamic animated texture • Store the combined static frames as one image • Use different UVs to animate the frames (e.g. UV = (0.25, 0.25), UV = (0.5, 0.5), frame 5 in the figure)
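
A hedged sketch of picking the UV offset of a frame inside a single 4x4 sprite sheet (the grid layout and function name are assumptions; frame 5 lands at (0.25, 0.25), matching the offset quoted above):

```cpp
#include <cstdio>

// Return the UV offset of frame `frame` in a grid of `cols` x `rows` tiles.
void FrameUV(int frame, int cols, int rows, float* u, float* v) {
    *u = (frame % cols) / float(cols);
    *v = (frame / cols) / float(rows);
}

int main() {
    float u, v;
    FrameUV(5, 4, 4, &u, &v);  // frame 5 of a 4x4 sheet
    std::printf("frame 5 starts at UV = (%.2f, %.2f)\n", u, v);  // (0.25, 0.25)
}
```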

  26. Transparency • Alpha mapping: combine a color texture with an alpha mask

  27. Billboarding • Create complex objects using textures and simple geometry • Used extensively in particle rendering systems • Use camera-facing quads: the quad normal equals the camera direction • Used frequently in games • High performance
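
A minimal sketch of orienting one billboard quad toward the camera: build right/up axes from the view direction and expand the particle position into corners (scalar vector math, names are mine):

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 Sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 Cross(Vec3 a, Vec3 b) {
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
Vec3 Normalize(Vec3 v) {
    float l = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return {v.x / l, v.y / l, v.z / l};
}

int main() {
    Vec3 camera   = {0, 0, 5};
    Vec3 particle = {1, 0, 0};
    Vec3 worldUp  = {0, 1, 0};

    // The quad normal points back at the camera; right/up span the quad.
    Vec3 normal = Normalize(Sub(camera, particle));
    Vec3 right  = Normalize(Cross(worldUp, normal));
    Vec3 up     = Cross(normal, right);

    float half = 0.5f;  // half the billboard size
    Vec3 corner = {particle.x + half * (right.x + up.x),
                   particle.y + half * (right.y + up.y),
                   particle.z + half * (right.z + up.z)};
    std::printf("one corner: (%.2f, %.2f, %.2f)\n", corner.x, corner.y, corner.z);
}
```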

  28. Multi Textured Gloss Mapping

  29. Reflection/Environment mapping • Blinn & Newell’s method
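
A hedged sketch of the latitude/longitude lookup behind Blinn & Newell style environment mapping: the normalized reflection vector is converted to spherical (u,v) that index the environment texture (the exact axis and range conventions here are mine):

```cpp
#include <cmath>
#include <cstdio>

const float kPi = 3.14159265358979f;

// Convert a normalized reflection vector into lat-long (u,v) in [0,1].
void LatLongUV(float rx, float ry, float rz, float* u, float* v) {
    *u = (std::atan2(rz, rx) + kPi) / (2.0f * kPi);  // longitude
    *v = std::acos(ry) / kPi;                        // latitude, 0 at the +y pole
}

int main() {
    // A reflection vector pointing mostly "up" samples near the v = 0 pole.
    float len = std::sqrt(0.1f * 0.1f + 0.99f * 0.99f);
    float u, v;
    LatLongUV(0.1f / len, 0.99f / len, 0.0f, &u, &v);
    std::printf("environment map UV = (%.2f, %.2f)\n", u, v);
}
```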

  30. Other environment mapping Cube Mapping

  31. Bump Mapping • Per-pixel normals • Use cube maps for the normals

  32. Bump mapping • Use a cube map and a multi-texture pass • Or use a high-level shading language (HLSL) on NVIDIA hardware
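
A minimal per-pixel sketch of the idea: unpack a tangent-space normal from a normal-map texel and use it in the Lambert term instead of the interpolated vertex normal (the texel and light values are assumptions; renormalization is omitted for brevity):

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Normal-map texel stored as RGB in [0,255]; 128 maps to roughly 0.
    int r = 140, g = 120, b = 250;
    float nx = r / 127.5f - 1.0f;
    float ny = g / 127.5f - 1.0f;
    float nz = b / 127.5f - 1.0f;

    // Tangent-space light direction (already normalized for this example).
    float lx = 0.0f, ly = 0.0f, lz = 1.0f;

    // Per-pixel Lambert term using the perturbed normal.
    float nDotL = std::max(0.0f, nx * lx + ny * ly + nz * lz);
    std::printf("per-pixel diffuse term = %.2f\n", nDotL);
}
```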

  33. Other texture techniques • Detail textures (e.g. flight simulators) • Procedural textures • Anti-aliasing • Motion blur • Realistic shading • Volumetric textures • Image processing using texture passes • Exposure control (dynamic range)

  34. End of Lecture 02 Questions?
