
Chapter 8


Presentation Transcript


  1. Chapter 8: Discrete Techniques
  Buffers, images, textures, etc.
  University of North Carolina – Chapel Hill COMP 136

  2. Announcements
  • Code snippet changes
  • Assignment 2 due today!

  3. Buffers
  • Per-pixel data is stored in buffers on the graphics card
  • Color Buffer (front and back)
  • Depth Buffer
  • Accumulation Buffer (more bits per pixel than the Color Buffer)
  • Stencil Buffer
  • Others (overlay planes, auxiliary buffers, color indices)

  4. Image Formats vs. Buffers
  • JPEG, GIF, TIFF, etc. all have a file format
  • Can include compression
  • Can include auxiliary data (ppi, for instance)
  • Buffers (as we see them) are just a long series of floats, ints, or bytes:
  • unsigned char img[256][256][3];
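The buffer view of an image can be sketched as a bare byte array; `set_pixel` is an illustrative helper (not an OpenGL call) showing how the raw `[row][col][channel]` layout is addressed:

```c
#include <assert.h>

/* A bare 256x256 RGB image buffer: no header, no compression,
   just bytes in row-major order. */
#define W 256
#define H 256
static unsigned char img[H][W][3];

/* Set one pixel; the buffer is addressed [row][col][channel]. */
static void set_pixel(int x, int y,
                      unsigned char r, unsigned char g, unsigned char b)
{
    img[y][x][0] = r;
    img[y][x][1] = g;
    img[y][x][2] = b;
}
```

Unlike a JPEG or TIFF file, there is no metadata here: the code must know the width, height, and channel count out of band.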

  5. Writing to Buffers I
  • Send a primitive (triangle, line) to the graphics card
  • The OpenGL pipeline interfaces with the buffers, finalizing their values
  • What if we want to write directly to a buffer?
  • glRasterPos3f(x, y, z)
  • glRasterPos2i(x, y)
  • glBitmap(width, height, x0, y0, xi, yi, bitmap)
  • glDrawPixels(width, height, format, type, image)
  • glReadPixels(x, y, width, height, format, type, image)
  • glCopyPixels(x, y, width, height, buffer)

  6. Writing to Buffers II
  • The writing mode alters how such operations affect the buffer
  • Most common: overwrite and XOR
  • XOR is used for swapping
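A minimal sketch of why the XOR writing mode is handy: XORing the same pen value into a pixel twice restores the original, so a cursor or rubber-band line can be drawn and erased without saving the pixels underneath (`xor_write` is an illustrative name, not a GL function):

```c
#include <assert.h>

/* XOR writing mode on one channel of one pixel: dst ^= src.
   Applying the same src twice is the identity, which is why XOR
   was used for cheap draw/undraw and buffer "swap" tricks. */
static unsigned char xor_write(unsigned char dst, unsigned char src)
{
    return (unsigned char)(dst ^ src);
}
```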

  7. Complexity Problems
  Switching gears now.
  • The real world exhibits high amounts of complexity
  • How could we model each of these?

  8. Buffer Mapping Techniques
  • Once we have buffers, we can use them to affect the standard pipeline
  • Texture maps: paint the image onto triangles/quads
  • Bump maps: alter the normal on a triangle/quad
  • Reflection/environment maps: quick and dirty (i.e. inaccurate) reflections

  9. Texture Mapping I
  • Easiest case: repeating

  10. Texture Mapping II
  • Specialized textures

  11. Texture Mapping III
  • Problem I: we have an image (stored as a buffer in our code)
  • How do we map that image onto a triangle? (HINT: done after rasterization)
  • From image to world space?
  • From world space to image? This is the one generally done

  12. Texture Mapping IV
  • Sampling issues: mipmapping creates multiple resolutions of an image
  • Hand it a 256x256 image and it creates 128x128, 64x64, 32x32, 16x16, 8x8, 4x4, 2x2, and 1x1 versions
  • These are sampled depending on the polygon's on-screen size
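The mip chain described above can be sketched with a small helper (`mip_levels` is an illustrative name): each level halves the resolution until 1x1 is reached, so a 256x256 texture yields nine levels.

```c
#include <assert.h>

/* Number of mipmap levels for a square power-of-two texture:
   halve the size until 1x1. 256 -> 256,128,64,32,16,8,4,2,1. */
static int mip_levels(int size)
{
    int levels = 1;
    while (size > 1) {
        size /= 2;
        levels++;
    }
    return levels;
}
```

The full chain adds only about one third more storage than the base image, since each level is a quarter the size of the one before.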

  13. Texture Mapping V
  • There is a whole field of research in sampling and reconstruction
  • Beyond the scope of this class, but very instructive

  14. Assigning Texture Coordinates I
  • We have a square texture; how do we map it to triangles?
  • We want a relatively even sharing of texels on each triangle

  15. Assigning Texture Coordinates II
  • What effect do we want?

  16. Assigning Texture Coordinates III
  • The book proposes the following systems:

  17. Assigning Texture Coordinates IV

  18. Creating Textures I
  • Sometimes they are created by hand
  • Most repeating textures are done this way
  • The robot above was too (as were most video games)!
  • Automatic generation is an active research area
  • Data gathering from the real world

  19. Creating Textures II
  • Identical copy-paste textures are too regular
  • Take an example image and extend it with plausible results
  Courtesy of Vivek Kwatra

  20. Creating Textures III

  21. Creating Textures IV
  • 3D scanners can now obtain position information from real scenes
  • How do we color them?

  22. OpenGL Functions
  • glTexImage2D
  • glEnable(GL_TEXTURE_2D)
  • glTexParameter
  • glBindTexture
  • gluBuild2DMipmaps
  • glTexEnv
  • glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST)
  • glTexCoord2f(s, t)

  23. OpenGL Demo?
  • Will I do a demo?

  24. Environment Maps I
  • Reflection/refraction of light can be "approximated"
  • Draw the 6 sides of a box from the center of the box
  • When rendering a clear/shiny surface, texture it with coordinates into the box faces, computed via this reflection approximation
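The lookup direction into the environment box comes from the standard reflection formula R = I - 2(N.I)N, with the incident view vector I and surface normal N both normalized. A minimal sketch (illustrative function name):

```c
#include <assert.h>

/* Reflect incident direction i about unit normal n:
   r = i - 2*(n.i)*n. Used to pick which face/texel of the
   environment box a shiny surface point should sample. */
static void reflect3(const double i[3], const double n[3], double r[3])
{
    double d = i[0]*n[0] + i[1]*n[1] + i[2]*n[2];
    for (int k = 0; k < 3; k++)
        r[k] = i[k] - 2.0 * d * n[k];
}
```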

  25. Environment Maps II
  See Color Plates 29 and 30 in the 4th edition of the textbook for a sample environment map.

  26. Bump Maps I
  • If we use Phong shading (calculating lighting at each fragment based on the interpolated normal), we can bump map
  • A bump map perturbs the standard normal by some amount at each fragment

  27. Bump Maps II
  These are basic teapots and planes

  28. Compositing/Multiple Passes
  • Some advanced OpenGL techniques are "multi-pass"
  • This means you render the whole scene two (or more) times before showing the result to the viewer
  • OpenGL can also "render to texture": the viewer never sees the result of such a pass, but later passes can read it

  29. Alpha Blending I
  • glColor4f(R, G, B, A)
  • We've always set A to 1
  • glEnable(GL_BLEND);
  • glBlendFunc(…)
  • You can draw to the frame buffer without overwriting!

  30. Alpha Blending II
  • Problems:
  • Describe the z-buffer
  • What happens to the z-buffer when you draw a transparent object?
  • Color(x, y) = (1 - A)C0 + A*C1, where C0 is the original color and C1 is the new color
  • What happens when multiple alpha values hit the same pixel?
  • Is it commutative?
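The commutativity question can be answered directly by evaluating the slide's blend equation on one channel. This sketch draws two half-transparent fragments over a white background in both orders (the `blend` helper is just the equation above, not a GL call):

```c
#include <assert.h>

/* One channel of the blend from the slide:
   result = (1 - a)*c0 + a*c1, where c0 is the color already in
   the buffer and c1 is the incoming fragment's color. */
static double blend(double c0, double c1, double a)
{
    return (1.0 - a) * c0 + a * c1;
}
```

Drawing a light fragment then a dark one gives a different result than the reverse order, so blending is not commutative and transparent geometry must be sorted.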

  31. Alpha Blending III
  • So, order matters
  • Which is better: front-to-back or back-to-front?

  32. Depth Cueing and Fog
  • OpenGL "fog" or "haze" gives the appearance of objects disappearing in the distance
  • See p. 431 in the text for how
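As a taste of the mechanism: in OpenGL's GL_LINEAR fog mode the fragment is mixed with the fog color using a factor f = (end - z)/(end - start), clamped to [0, 1], where z is the eye-space distance. A minimal sketch of that factor:

```c
#include <assert.h>

/* GL_LINEAR fog factor: 1 at the fog start distance (no fog),
   falling to 0 at the end distance (fully fogged). The fragment
   color then becomes f*C_fragment + (1-f)*C_fog. */
static double fog_factor(double z, double start, double end)
{
    double f = (end - z) / (end - start);
    if (f < 0.0) f = 0.0;
    if (f > 1.0) f = 1.0;
    return f;
}
```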

  33. Shadow Volumes I (*not in the text)
  [Diagram: light source, viewer, Object1–Object3, a shadow volume, and a ground plane]

  34. Shadow Volumes II
  • Track the silhouette edges of the objects (as viewed from the light)
  • Render all objects to the depth buffer, using ambient light only for color
  • Render all shadow-volume faces to the Stencil Buffer: increment at each pixel for front-facing shadow-volume faces, decrement for back-facing ones
  • Don't write to the depth buffer, but do test against it
  • Render all objects again with diffuse/specular color, but only where the pixel's stencil value is ZERO
  • Check against the depth buffer; no need to write!
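The stencil logic above boils down to a counter per pixel. This simplified sketch (hypothetical helper, ignoring depth testing) tallies the front/back shadow-volume faces crossed in front of a pixel's visible surface; the pixel is lit only when the count returns to zero:

```c
#include <assert.h>

/* Sum the stencil increments (+1 for each front-facing
   shadow-volume face, -1 for each back-facing one) in front of
   the visible surface at one pixel. Nonzero => in shadow. */
static int stencil_count(const int front_face[], int n)
{
    int count = 0;
    for (int i = 0; i < n; i++)
        count += front_face[i] ? 1 : -1;
    return count;
}
```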

  35. Accumulation Buffer I
  • The color buffers (front and back) have only 8 bits for each of R, G, and B
  • When doing multiple passes, you may want better precision than that
  • The Accumulation Buffer has more; how much depends on your machine

  36. Accumulation Buffer II
  • Motion blur: see the code on page 432 of the text (4th ed.)
  • Essentially, you draw the scene multiple times, each at 1/n color brightness
  • The standard buffer would lose a lot of the color quality
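Why the standard buffer loses quality: at 1/n brightness, each pass quantizes to integers before summing, so dim colors can vanish entirely. A sketch of that failure mode (illustrative helper name; a real accumulation buffer avoids it by summing at higher precision):

```c
#include <assert.h>

/* Simulate n motion-blur passes, each drawn at 1/n brightness,
   summed in an 8-bit-style integer buffer. The per-pass integer
   truncation is where dark detail is lost. */
static int accumulate_8bit(int color, int n)
{
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += color / n;   /* truncates each pass */
    return sum;
}
```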

  37. Accumulation Buffer III
  • Image processing
  • Convolution kernels: a k x k array that is multiplied and summed across the image
  • Blur kernel:
      1/16 * [1 2 1]
             [2 4 2]
             [1 2 1]
  • Sharpen kernel:
      [ 0 -1  0]
      [-1  5 -1]
      [ 0 -1  0]
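The "multiplied and summed" step at one pixel looks like this sketch (grayscale, one output pixel, edges ignored for brevity; `convolve3x3` is an illustrative name):

```c
#include <assert.h>

/* Apply a 3x3 convolution kernel at one pixel: multiply each
   image sample in the 3x3 neighborhood by the matching kernel
   entry and sum. */
static double convolve3x3(const double nbhd[3][3], const double k[3][3])
{
    double sum = 0.0;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            sum += nbhd[i][j] * k[i][j];
    return sum;
}
```

A blur kernel's entries sum to 1, so a flat region passes through unchanged; the sharpen kernel's entries also sum to 1 but amplify differences from the neighborhood.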

  38. Accumulation Buffer IV
  • Depth of field: the blue plane is the focal plane
  • Things on the blue plane will always render to the same pixels
  • Those off the plane blur by varying amounts (depending on distance from the plane)

  39. Antialiasing
  • Aliasing, a.k.a. "jaggies"
  • Antialiasing: smooth the jaggies
  • Multiple renders from slightly different gluLookAt values into the Accumulation Buffer
  • BUT OpenGL does it via alpha; see p. 430 in the text for sample code

  40. Conclusions I
  • We've just completed what we will cover of OpenGL
  • There are many other techniques we haven't covered
  • OpenGL is always growing with "extensions"
  • Full OpenGL description (excluding extensions): OpenGL Programming Guide: The Official Guide to Learning OpenGL, Version 1.4, Fourth Edition

  41. Conclusions II
  • For real-time graphics, DirectX is another option
  • With what we've learned in this class, you could learn DX on your own
  • DirectX 9 SDK free download: http://msdn.microsoft.com/directx/sdk/
  • Sample code and documentation come with it
