Week 7 - Wednesday CS361
Last time • What did we talk about last time? • Specular shading • Aliasing and antialiasing
Transparency • Partially transparent objects significantly increase the difficulty of rendering a scene • We will talk about really difficult effects like frosted glass or light bending later • Just rendering transparent objects at all is a huge pain because the Z-buffer stores only one depth per pixel, so it cannot composite multiple overlapping surfaces correctly • Workarounds: • Screen door transparency • Sorting • Depth peeling
Screen door transparency • We render an object with a checkerboard pattern of holes in it, leaving whatever is beneath the object showing through • Problems: • It really only works for 50% transparent objects • Only one overlapping transparent object really works • But it is simple and inexpensive
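As a minimal sketch (not course code), the checkerboard pattern is just a per-pixel parity test: the transparent fragment is drawn on half the pixels and skipped on the other half.

```python
def screen_door_visible(x, y):
    """Checkerboard screen-door mask (hypothetical helper): draw the
    transparent fragment only where this returns True, letting the
    object beneath show through on the remaining pixels."""
    return (x + y) % 2 == 0

# A row of 4 pixels alternates drawn / skipped
row = [screen_door_visible(x, 0) for x in range(4)]
```

Because the mask is fixed, two overlapping transparent objects would fight over the same half of the pixels, which is exactly the limitation noted above.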
Over operator • Most transparency methods use the over operator, which combines two colors using the alpha of the one you're putting on top • co = αscs + (1 - αs)cd • cs is the new (source) color • cd is the old (destination) color • co is the resulting (over) color • αs is the opacity (alpha) of the source object
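The over operator applied per channel can be sketched as follows (a simple illustration, not a particular API):

```python
def over(src_rgb, src_alpha, dst_rgb):
    """co = alpha_s * cs + (1 - alpha_s) * cd, applied to each channel."""
    return tuple(src_alpha * s + (1.0 - src_alpha) * d
                 for s, d in zip(src_rgb, dst_rgb))

# 60% opaque red composited over a white background gives a pink
result = over((1.0, 0.0, 0.0), 0.6, (1.0, 1.0, 1.0))  # → (1.0, 0.4, 0.4)
```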
Sorting • The over operator is order dependent • To render correctly we can do the following: • Render all the opaque objects • Sort the centroids of the transparent objects in distance from the viewer • Render the transparent objects in back to front order • To make sure that you don't draw on top of an opaque object, you test against the Z-buffer but don't update it
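The sorting step above can be sketched like this (the object representation with a `centroid` key is illustrative, not from the course):

```python
import math

def render_order(transparent_objects, eye):
    """Sort transparent objects farthest-first by centroid distance
    from the viewer, so the over operator composites back to front."""
    return sorted(transparent_objects,
                  key=lambda obj: math.dist(eye, obj["centroid"]),
                  reverse=True)

eye = (0.0, 0.0, 0.0)
objs = [{"name": "near", "centroid": (0.0, 0.0, 1.0)},
        {"name": "far",  "centroid": (0.0, 0.0, 5.0)}]
order = [o["name"] for o in render_order(objs, eye)]  # → ['far', 'near']
```

Sorting by centroid is only an approximation, which is why the interpenetration problems on the next slide arise.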
Problems with sorting • It is not always possible to sort polygons • They can interpenetrate • Hacks: • At the very least, use a Z-buffer test but not replacement • Turning off culling can help • Or render transparent polygons twice: once for each face
Depth peeling • It is possible to use two depth buffers to render transparency correctly • First render all the opaque objects updating the first depth buffer • Make second depth buffer maximally close • On the second (and future) rendering passes, render those fragments that are closer than the z values in the first depth buffer but further than the value in the second depth buffer • Update the second depth buffer • Repeat the process until no pixels are updated
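The peeling loop for a single pixel can be modeled as below (a simplified sketch assuming fragments are (z, color) pairs with smaller z meaning closer to the viewer):

```python
def depth_peel_pixel(fragments, opaque_z):
    """Peel transparent layers front to back for one pixel.
    opaque_z plays the role of the first depth buffer (nearest opaque
    surface); peel_z is the second, starting maximally close."""
    layers = []
    peel_z = 0.0
    while True:
        # fragments closer than the opaque surface but farther than the last peel
        candidates = [f for f in fragments if peel_z < f[0] < opaque_z]
        if not candidates:
            break  # nothing updated this pass: stop
        nearest = min(candidates, key=lambda f: f[0])
        layers.append(nearest)
        peel_z = nearest[0]  # advance the second depth buffer
    return layers

frags = [(0.5, "green"), (0.2, "red"), (0.8, "blue"), (0.95, "hidden")]
layers = depth_peel_pixel(frags, opaque_z=0.9)  # peels red, green, blue in order
```

The fragment at z = 0.95 is never peeled because it lies behind the opaque surface, matching the first-pass Z-buffer test.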
Other alpha effects • Alpha values can be used for antialiasing, by lowering the opacity of edges that partially cover pixels • Additive blending is an alternative to the over operator • co = αscs + cd • This is only useful for effects like glows, where the new color never makes the original darker • Unlike transparency, it can be applied in any order
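The order independence of additive blending can be checked directly (a sketch; the clamp to 1.0 models frame-buffer saturation, which the formula above omits):

```python
def additive(src_rgb, src_alpha, dst_rgb):
    """Additive blend: co = alpha_s * cs + cd per channel, clamped to 1.0."""
    return tuple(min(1.0, src_alpha * s + d) for s, d in zip(src_rgb, dst_rgb))

base = (0.1, 0.1, 0.1)
# Two glows applied in either order produce the same result
a = additive((1.0, 0.5, 0.0), 0.3, additive((0.0, 0.5, 1.0), 0.3, base))
b = additive((0.0, 0.5, 1.0), 0.3, additive((1.0, 0.5, 0.0), 0.3, base))
```

Note the same experiment with the over operator would give different results for `a` and `b`, since over is order dependent.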
Gamma • I don't want to go deeply into gamma • The trouble is that real light has a wide range of color values that we need to store in some limited range (such as 0 – 255) • Then, we have to display these values, moving back from the limited range to the "real world" range
Gamma correction • Physical computations should be performed in the linear (real) space • To convert that linear space into nonlinear frame buffer space, we have to raise values by a power, typically 0.45 for PCs and 0.55 for Macs • Each component of physical color (0.3, 0.5, 0.6) is raised to 0.45 giving (0.582, 0.732, 0.794) then scaled to the 0-255 range, giving (148, 187, 203)
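The slide's numeric example can be reproduced directly (a sketch assuming a pure power-law encoding; real sRGB additionally uses a small linear segment near black):

```python
def gamma_encode(linear_rgb, exponent=0.45):
    """Raise each linear component to the encoding exponent,
    then scale to the 0-255 frame-buffer range."""
    return tuple(round((c ** exponent) * 255) for c in linear_rgb)

encoded = gamma_encode((0.3, 0.5, 0.6))  # → (148, 187, 203), as on the slide
```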
Gamma errors • Usually, gamma correction is taken care of for you • If you are writing something where you need to do computations in the "real life" color space (such as a raytracer), you may have to worry about it • Calculations in the wrong space can have visually unrealistic effects
Texturing • We've got polygons, but they are all one color • At most, we could have different colors at each vertex • We want to "paint" a picture on the polygon • Because the surface is supposed to be colorful • To appear as if there is greater complexity than there is (a texture of bricks rather than a complex geometry of bricks) • To apply other effects to the surface such as changes in material or normal
Texture pipeline • We never get tired of pipelines • Go from object space to parameter space • Go from parameter space to texture space • Get the texture value • Transform the texture value
Projector function • The projector function goes from the model space (a 3D location on a surface) to a 2D (u,v) coordinate on a texture • Usually, this is based on a map from the model to the texture, made by an artist • Tools exist to help artists "unwrap" the model • Different kinds of mapping make this easier • In other scenarios, a mapping could be determined at run time
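One run-time mapping of the kind described above is a spherical projector; here is a hypothetical sketch (the function and its conventions are illustrative, not the course's):

```python
import math

def spherical_projector(p):
    """Map a point on the unit sphere to (u, v) in [0, 1] x [0, 1]
    using its longitude and latitude. Assumes p is already normalized."""
    x, y, z = p
    u = math.atan2(y, x) / (2.0 * math.pi) + 0.5     # longitude
    v = math.acos(max(-1.0, min(1.0, z))) / math.pi  # latitude
    return u, v

uv = spherical_projector((1.0, 0.0, 0.0))  # equator, facing +x → (0.5, 0.5)
```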
Corresponder function • From (u,v) coordinates we have to find a corresponding texture pixel (or texel) • Often this just maps directly from (u,v) in [0,1] to a pixel in the full (width, height) range • But matrix transformations can be applied • Also, values outside of [0,1] can be given, with different choices of interpretation
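A corresponder with two of the common interpretations of out-of-range values can be sketched as follows (mode names here are illustrative; graphics APIs have their own spellings, e.g. repeat/wrap and clamp):

```python
def corresponder(u, v, width, height, mode="repeat"):
    """Map (u, v) to a texel index, interpreting values outside [0, 1]
    according to the chosen mode."""
    if mode == "repeat":   # tile the texture
        u, v = u % 1.0, v % 1.0
    elif mode == "clamp":  # stretch the border texel
        u, v = min(max(u, 0.0), 1.0), min(max(v, 0.0), 1.0)
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return x, y

texel = corresponder(1.25, -0.25, 256, 256)  # repeat wraps to (64, 192)
```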
Texture values • Usually the texture value is just an RGB triple (or an RGBα value) • But, it could be procedurally generated • It could be a bump mapping or other surface data • It might need some transformation after retrieval
Next time… • Image texturing techniques • Procedural texturing
Reminders • Keep working on Project 2 • Due this Friday, March 1 • Keep reading Chapter 6