
CS 445 / 645 Introduction to Computer Graphics


Presentation Transcript


  1. CS 445 / 645 Introduction to Computer Graphics • Lecture 20: Antialiasing

  2. Environment Mapping • Used to model an object that reflects surrounding textures to the eye • Polished sphere reflects walls and ceiling textures • Cyborg in Terminator 2 reflects flaming destruction • Texture is a distorted fish-eye view of the environment • Spherical texture mapping creates texture coordinates that correctly index into this texture map
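A minimal sketch of how sphere-map texture coordinates can be derived from an eye-space reflection vector, following the classic formula behind OpenGL's GL_SPHERE_MAP texture generation; the helper names and example vectors are illustrative, not from the slides.

```python
import math

def reflect(incident, normal):
    """Reflect the (unit) incident view vector about the (unit) surface normal."""
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))

def sphere_map_coords(r):
    """Map an eye-space reflection vector to (s, t) coordinates in a sphere
    (fish-eye) environment map, per the classic sphere-map formula."""
    rx, ry, rz = r
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5

# Example: a surface facing straight at the viewer indexes the map's center.
s, t = sphere_map_coords(reflect((0.0, 0.0, -1.0), (0.0, 0.0, 1.0)))
print(s, t)   # -> 0.5 0.5
```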

  3. Materials from NVidia • http://developer.nvidia.com/object/Cube_Mapping_Paper.html

  4. Sphere Mapping

  5. Blinn/Newell Latitude Mapping

  6. Cube Mapping

  7. Multitexturing • Pipelining of multiple texture applications to one polygon • The result of each texture unit is passed to the next texture unit, which adds its own effect • More bookkeeping is required to pull this off
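A small sketch of that pipelining idea: each texture unit combines its texel with the result of the previous unit. The combine modes and function names here (modulate, clamped add) are illustrative stand-ins for fixed-function texture environment modes, not code from the lecture.

```python
def modulate(prev, texel):
    """A MODULATE-style texture unit: multiply the incoming color by its texel."""
    return tuple(p * t for p, t in zip(prev, texel))

def add_clamped(prev, texel):
    """An ADD-style texture unit: sum incoming color and texel, clamped to [0, 1]."""
    return tuple(min(1.0, p + t) for p, t in zip(prev, texel))

def apply_texture_units(fragment_color, stages):
    """Run one fragment through a chain of texture units: the output of each
    unit becomes the input of the next, which adds its own effect."""
    color = fragment_color
    for combine, texel in stages:
        color = combine(color, texel)
    return color

# Base color modulated by a decal texel, then brightened by a light-map texel.
result = apply_texture_units((1.0, 1.0, 1.0),
                             [(modulate, (0.8, 0.2, 0.2)),
                              (add_clamped, (0.1, 0.1, 0.1))])
```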

  8. Antialiasing

  9. What is a pixel? • A pixel is not… • A box • A disk • A teeny tiny little light • A pixel is a point • It has no dimension • It occupies no area • It cannot be seen • It can have a coordinate • A pixel is more than a point: it is a sample

  10. Samples • Most things in the real world are continuous • Everything in a computer is discrete • The process of mapping a continuous function to a discrete one is called sampling • The process of mapping a continuous variable to a discrete one is called quantization • Rendering an image requires sampling and quantization
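A tiny sketch of sampling versus quantization as the slide defines them; the function names and the 8-sample / 4-level example are arbitrary illustration choices.

```python
def sample(f, x0, x1, n):
    """Sampling: evaluate the continuous function f at n evenly spaced points in [x0, x1]."""
    return [f(x0 + (x1 - x0) * i / (n - 1)) for i in range(n)]

def quantize(values, levels):
    """Quantization: snap each continuous sample to the nearest of `levels` discrete values."""
    return [round(v * (levels - 1)) / (levels - 1) for v in values]

# A continuous intensity ramp sampled at 8 points, then quantized to 4 gray levels.
samples = sample(lambda x: x, 0.0, 1.0, 8)
pixels = quantize(samples, 4)   # -> 0.0, 0.0, 0.333..., 0.333..., 0.666..., ...
```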

  11. Samples

  12. Samples

  13. Line Segments • We tried to sample a line segment so it would map to a 2D raster display • We quantized the pixel values to 0 or 1 • We saw stair steps, or jaggies

  14. Line Segments • Instead, quantize to many shades • But what sampling algorithm is used?

  15. Area Sampling • Shade pixels according to the area covered by the thickened line • This is unweighted area sampling • A rough approximation can be formulated by dividing each pixel into a finer grid of subpixels
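A sketch of that subpixel-grid approximation: estimate the covered area of a pixel by testing a finer lattice of points inside it. The 4x4 sub-grid and the half-plane coverage test are illustrative assumptions.

```python
def coverage(inside, px, py, grid=4):
    """Approximate the fraction of pixel (px, py) covered by a primitive.
    `inside(x, y)` tests whether a point lies inside the thickened line;
    the pixel is divided into a grid x grid lattice of sub-sample points."""
    hits = 0
    for j in range(grid):
        for i in range(grid):
            x = px + (i + 0.5) / grid
            y = py + (j + 0.5) / grid
            hits += inside(x, y)
    return hits / (grid * grid)

# Example: a half-plane x <= 3.5 cuts pixel (3, 0) exactly in half.
frac = coverage(lambda x, y: x <= 3.5, 3, 0)   # -> 0.5
```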

  16. Unweighted Area Sampling • Primitive cannot affect intensity of pixel if it does not intersect the pixel • Equal areas cause equal intensity, regardless of distance from pixel center to area

  17. Weighted Area Sampling • Unweighted sampling colors two pixels identically when the primitive cuts the same area through the two pixels • Intuitively, pixel cut through the center should be more heavily weighted than one cut along corner

  18. Weighted Area Sampling • Weighting function W(x, y) specifies the contribution of a primitive passing through the point (x, y), relative to the pixel center • [Figure: intensity profile W(x, y) plotted against x]
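A sketch of one possible weighting function and its use: a cone filter whose weight falls off linearly with distance from the pixel center. The cone shape, radius, and sub-sample layout are illustrative assumptions; the slides do not prescribe a particular W.

```python
import math

def cone_weight(dx, dy, radius=1.0):
    """W(x, y): contribution weight at offset (dx, dy) from the pixel center,
    falling off linearly to zero at `radius` (a cone-shaped filter)."""
    return max(0.0, 1.0 - math.hypot(dx, dy) / radius)

def weighted_pixel_intensity(sub_samples, radius=1.0):
    """Weighted area sampling: each sub-sample is (dx, dy, intensity), so
    coverage near the pixel center counts more than coverage near a corner."""
    weights = [cone_weight(dx, dy, radius) for dx, dy, _ in sub_samples]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * v for w, (_, _, v) in zip(weights, sub_samples)) / total

# Half the sub-samples lit: a cut through the center outweighs a corner cut.
through_center = weighted_pixel_intensity([(0.0, 0.0, 1.0), (0.45, 0.45, 0.0)])
through_corner = weighted_pixel_intensity([(0.0, 0.0, 0.0), (0.45, 0.45, 1.0)])
# through_center > through_corner
```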

  19. Images • An image is a 2D function I(x, y) that specifies intensity for each point (x, y)

  20. Sampling and Image • Our goal is to convert the continuous image to a discrete set of samples • The graphics system’s display hardware will attempt to reconvert the samples into a continuous image: reconstruction

  21. Point Sampling an Image • Simplest sampling is on a grid • Sample depends solely on value at grid points

  22. Point Sampling • Multiply the sample grid by the image intensity to obtain a discrete set of points, or samples • [Figure: sampling geometry]

  23. Sampling Errors • Some objects missed entirely, others poorly sampled

  24. Fixing Sampling Errors • Supersampling • Take more than one sample for each pixel and combine them • How many samples are enough? • How do we know no features are lost? • [Figures: 150x15, 200x20, 300x30, and 400x40 renderings, each resolved to 100x10]

  25. Unweighted Area Sampling • Average supersampled points • All points are weighted equally
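A sketch of that unweighted (box-filter) resolve: every supersample in a block counts equally toward its display pixel. The block-averaging layout is an assumption for illustration.

```python
def downsample(image, factor):
    """Supersampling resolve: average each factor x factor block of sub-pixel
    samples into one display pixel (an unweighted box filter)."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [image[y + j][x + i] for j in range(factor) for i in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# Example: a 4x4 supersampled image resolved to 2x2, with 2x2 samples per pixel.
hi_res = [[1, 1, 0, 0],
          [1, 1, 0, 0],
          [0, 0, 1, 1],
          [0, 0, 1, 1]]
lo_res = downsample(hi_res, 2)   # -> [[1.0, 0.0], [0.0, 1.0]]
```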

  26. Weighted Area Sampling • Points in pixel are weighted differently • Flickering occurs as object moves across display • Overlapping regions eliminate flicker

  27. Signal Theory • Convert spatial signal to frequency domain • [Figure: intensity vs. pixel position across a scanline; example from Foley, van Dam, Feiner, and Hughes]

  28. Signal Theory • Represent spatial signal as sum of sine waves (varying frequency and phase shift) • Very commonly used to represent a sound “spectrum”

  29. Fourier Analysis • Convert spatial domain to frequency domain • Let f(x) indicate the intensity at a location in space, x (the pixel value) • F(u) is a complex number encoding the amplitude and phase shift of frequency u in the signal • i = sqrt(-1) … the imaginary part is frequently not plotted • In this case the signal is f(x)
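For reference, the standard continuous 1-D Fourier transform the slide is describing (the transcript omits the formula itself):

```latex
% F(u): amplitude and phase of spatial frequency u present in the signal f(x).
F(u) = \int_{-\infty}^{\infty} f(x)\, e^{-2\pi i u x}\, dx, \qquad i = \sqrt{-1}
```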

  30. Fourier Transform • Examples of spatial and frequency domains

  31. Nyquist Sampling Theorem • The ideal samples of a continuous function contain all the information in the original function if and only if the continuous function is sampled at a frequency greater than twice the highest frequency in the function

  32. Nyquist Rate • The lower bound on the sampling rate equals twice the highest frequency component in the image’s spectrum • This lower bound is the Nyquist Rate
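Written as an inequality, with f_s the sampling frequency and f_max the highest frequency component in the image's spectrum:

```latex
% Nyquist rate: sample at more than twice the highest frequency present.
f_s > 2 f_{\max}
```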

  33. Band-limited Signals • A function is band-limited if it contains no components of frequency higher than some bound x • Band-limited implies the original function can be rebuilt without any sinusoids of frequency greater than x • This facilitates reconstruction

  34. Flaws with Nyquist Rate • Samples may not align with peaks

  35. Flaws with Nyquist Rate • When sampling below the Nyquist Rate, the resulting signal looks like a lower-frequency one • With no knowledge of band-limits, the samples could have been derived from a signal of higher frequency
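A small sketch that demonstrates this numerically: a sine sampled below its Nyquist rate yields exactly the samples of a lower-frequency sine. The 7 Hz / 8 Hz figures are arbitrary illustration values.

```python
import math

def sample_sine(freq, rate, n=8):
    """Sample sin(2*pi*freq*t) at `rate` samples per second, for n samples."""
    return [math.sin(2 * math.pi * freq * k / rate) for k in range(n)]

# A 7 Hz sine sampled at only 8 Hz (below its 14 Hz Nyquist rate) produces the
# same samples as a 1 Hz sine, mirrored: the result "looks like" a lower frequency.
high = sample_sine(7.0, 8.0)
low = sample_sine(1.0, 8.0)
print(all(abs(h + l) < 1e-9 for h, l in zip(high, low)))   # True (mirrored alias)
```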

  36. Low-pass Filtering • We know we are limited in the resolution of our screen • We want the screen (sampling grid) to have twice the resolution of the signal (image) we want to display • How can we reduce the high-frequencies of the image? • Low-pass filter • Band-limits the image

  37. Low-pass Filtering • In frequency domain • If signal is F(u) • Just chop off parts of F(u) in high frequencies using a second function, G(u) • G(u) = 1 when −k <= u <= k, and 0 elsewhere • This is called the pulse function
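The same pulse (box) function written out, using the slide's cutoff symbol k:

```latex
% Ideal low-pass filter in the frequency domain: keep |u| <= k, discard the rest.
G(u) =
\begin{cases}
1, & -k \le u \le k \\
0, & \text{otherwise}
\end{cases}
```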

  38. Low-pass Filtering • In spatial domain • Multiplying two Fourier transforms in the frequency domain corresponds exactly to performing an operation called convolution in the spatial domain • f(x) * g(x) = h(x), the convolution of f with g • The value of h at x is the integral of the product of f with the filter g centered at x • The pulse (frequency domain) corresponds to the sinc (spatial domain)

  39. Low-pass Filtering • In spatial domain • Sinc: sinc(x) = sin(πx) / (πx) • Note this isn't a perfect way to eliminate high frequencies • “ringing” occurs
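A sketch of filtering/reconstruction with a truncated sinc kernel. The support window and the weight renormalization are practical assumptions, and the truncation is exactly what produces the ringing mentioned above.

```python
import math

def sinc(x):
    """Normalized sinc: sin(pi*x) / (pi*x), the spatial-domain twin of the pulse."""
    return 1.0 if x == 0.0 else math.sin(math.pi * x) / (math.pi * x)

def sinc_filter(samples, x, support=4):
    """Filter/reconstruct at position x by convolving nearby samples with a
    truncated sinc; weights are renormalized so they sum to one."""
    lo = max(0, int(x) - support)
    hi = min(len(samples) - 1, int(x) + support)
    ks = range(lo, hi + 1)
    weights = [sinc(x - k) for k in ks]
    total = sum(weights)
    return sum(w * samples[k] for w, k in zip(weights, ks)) / total

# At an existing sample position the filter reproduces that sample exactly.
data = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0]
print(sinc_filter(data, 2.0))   # -> 1.0
```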

  40. Sinc Filter • Slide filter along spatial domain and compute new pixel value that results from convolution

  41. Bilinear Filter • Sometimes called a tent filter • Easy to compute • just linearly interpolate between samples • Finite extent and no negative values • Still has artifacts
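A sketch of the 1-D tent (linear) filter for comparison; the sample array and lookup position are illustrative.

```python
def tent_filter(samples, x):
    """Tent (bilinear) reconstruction: linearly interpolate between the two
    samples that bracket position x. Cheap, finite extent, never negative."""
    i = int(x)
    if i >= len(samples) - 1:
        return samples[-1]
    t = x - i
    return (1.0 - t) * samples[i] + t * samples[i + 1]

# Example: halfway between samples 2 and 3.
value = tent_filter([0.0, 0.2, 0.8, 1.0], 2.5)   # -> 0.9
```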

  42. Sampling Pipeline

  43. How is this done today? Full Screen Antialiasing • Nvidia GeForce2 • OpenGL: render image 400% larger and supersample • Direct3D: render image 400% - 1600% larger • Nvidia GeForce3 • Multisampling but with fancy overlaps • Don't render at higher resolution • Use one image, but combine values of neighboring pixels • Beware of recognizable combination artifacts • Human perception of patterns is too good

  44. GeForce3 • Multisampling • After each pixel is rendered, write pixel value to two different places in frame buffer

  45. GeForce3 - Multisampling • After rendering two copies of entire frame • Shift pixels of Sample #2 left and up by ½ pixel • Imagine laying Sample #2 (red) over Sample #1 (black)

  46. GeForce3 - Multisampling • Resolve the two samples into one image by computing the average between each pixel from Sample 1 (black) and the four pixels from Sample 2 (red) that are 1/sqrt(2) pixels away
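A sketch of that resolve step on two full-resolution sample buffers. The equal weighting of the Sample 1 pixel and its four diagonal Sample 2 neighbours, and the clamping at the frame border, are assumptions made for illustration; the slides do not give exact weights.

```python
def resolve_multisample(sample1, sample2):
    """Resolve two copies of the frame into one image. sample2 is conceptually
    shifted half a pixel up and to the left, so each output pixel averages its
    sample1 value with the four sample2 pixels surrounding it diagonally
    (each 1/sqrt(2) pixels away). All five values are weighted equally here."""
    h, w = len(sample1), len(sample1[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            taps = [sample1[y][x]]
            for dy in (0, 1):
                for dx in (0, 1):
                    # Clamp at the frame border so edge pixels stay in range.
                    taps.append(sample2[min(y + dy, h - 1)][min(x + dx, w - 1)])
            out[y][x] = sum(taps) / len(taps)
    return out
```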

  47. GeForce3 - Multisampling • [Images: no antialiasing vs. multisampling]

  48. GeForce3 - Multisampling • [Images: 4x supersampling vs. multisampling]

  49. ATI SmoothVision • ATI SmoothVision • Programmer selects sampling pattern • Some supersampling thrown in
