
# CS361 - PowerPoint PPT Presentation

Week 2 - Friday. CS361. Last time: graphics rendering pipeline, Geometry Stage. Questions? Project 1. Assignment 1. Backface culling: an important Geometry Stage optimization that was not properly described last time.


## PowerPoint Slideshow about 'CS361' - bunny

## Presentation Transcript

### CS361

• What did we talk about last time?

• Graphics rendering pipeline

• Geometry Stage

### Backface culling

• I did not properly describe an important optimization done in the Geometry Stage: backface culling

• Backface culling removes all polygons that face away from the viewer

• A single dot product between the polygon's normal and the view direction is all that is needed

• This step is done in hardware in XNA and OpenGL

• You just have to turn it on

• Beware: If you screw up your normals, polygons could vanish
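
As a sketch of the idea (function and variable names are mine, not from the lecture), the test boils down to one cross product to get the polygon normal and one dot product against the view direction. In OpenGL the hardware version is enabled with `glEnable(GL_CULL_FACE)`; XNA exposes the same switch through its rasterizer state.

```python
# Sketch of backface culling via a dot product (illustrative names).
# A triangle is culled when its outward normal points away from the
# viewer, i.e. when dot(normal, direction-to-triangle) >= 0.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def is_backfacing(v0, v1, v2, camera_pos):
    """True if triangle (v0, v1, v2), wound counterclockwise,
    faces away from the camera at camera_pos."""
    e1 = tuple(b - a for a, b in zip(v0, v1))
    e2 = tuple(b - a for a, b in zip(v0, v2))
    normal = cross(e1, e2)                        # outward normal (CCW winding)
    to_tri = tuple(p - c for c, p in zip(camera_pos, v0))
    return dot(normal, to_tri) >= 0               # facing away: cull it

# Camera at the origin; a CCW triangle in the z = -5 plane faces
# the camera, so it is NOT culled.
front = is_backfacing((0, 0, -5), (1, 0, -5), (0, 1, -5), (0, 0, 0))
```

Note the winding convention matters: this is exactly why flipped normals (or reversed vertex order) make polygons vanish, as warned above.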

• For API design, practical top-down problem solving, hardware design, and efficiency, rendering is described as a pipeline

• This pipeline contains three conceptual stages: the Application Stage, the Geometry Stage, and the Rasterizer Stage

### Rasterizer Stage

• The goal of the Rasterizer Stage is to take all the transformed geometric data and set colors for all the pixels in the screen space

• Doing so is called:

• Rasterization

• Scan Conversion

• Note that the word pixel is actually short for "picture element"

• As you should expect, the Rasterizer Stage is also divided into a pipeline of several functional stages:

### Triangle Setup

• Data for each triangle is computed

• This could include normals

• This is boring anyway because fixed-operation (non-customizable) hardware does all the work

### Triangle Traversal

• Each pixel whose center is overlapped by a triangle must have a fragment generated for the part of the triangle that overlaps the pixel

• The properties of this fragment are created by interpolating data from the vertices

• Again, boring, fixed-operation hardware does this
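
As a minimal 2D sketch of what that fixed-operation hardware does (names are illustrative; perspective-correct interpolation is omitted), coverage and interpolation both fall out of barycentric weights computed with edge functions:

```python
# Sketch of triangle traversal at one pixel center: compute barycentric
# weights and interpolate a per-vertex attribute. Illustrative only.

def edge(ax, ay, bx, by, px, py):
    """Signed-area edge function: positive when (px, py) is to the
    left of the directed edge (a -> b)."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def interpolate(tri, attrs, px, py):
    """Interpolated attribute at (px, py), or None when the pixel
    center is outside the triangle (no fragment generated)."""
    (x0, y0), (x1, y1), (x2, y2) = tri
    area = edge(x0, y0, x1, y1, x2, y2)           # twice the triangle area
    w0 = edge(x1, y1, x2, y2, px, py) / area
    w1 = edge(x2, y2, x0, y0, px, py) / area
    w2 = edge(x0, y0, x1, y1, px, py) / area
    if w0 < 0 or w1 < 0 or w2 < 0:
        return None                               # pixel center not covered
    a0, a1, a2 = attrs
    return w0 * a0 + w1 * a1 + w2 * a2

# A pixel at the centroid gets the average of the three vertex values.
tri = [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)]
val = interpolate(tri, (0.0, 3.0, 6.0), 1.0, 1.0)
```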

### Pixel Shading

• This is where the magic happens

• Given the data from the other stages, per-pixel shading (coloring) happens here

• This stage is programmable, allowing for many different shading effects to be applied

• Perhaps the most important effect is texturing or texture mapping

• Texturing is gluing a (usually) 2D image onto a polygon

• To do so, we map texture coordinates onto polygon coordinates

• Pixels in a texture are called texels

• This is fully supported in hardware

• Multiple textures can be applied in some cases
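
A toy illustration of the texture-coordinate mapping, assuming nearest-neighbor sampling and a hypothetical 2x2 checkerboard texture (real hardware adds filtering, wrap modes, and mipmaps):

```python
# Sketch of nearest-neighbor texture sampling: map (u, v) texture
# coordinates in [0, 1] to a texel. Illustrative names and data.

def sample_nearest(texture, u, v):
    """texture is a list of rows of texel colors; (u, v) in [0, 1]."""
    h = len(texture)
    w = len(texture[0])
    x = min(int(u * w), w - 1)   # clamp so u = 1.0 stays in range
    y = min(int(v * h), h - 1)
    return texture[y][x]

checker = [["black", "white"],
           ["white", "black"]]
color = sample_nearest(checker, 0.9, 0.1)   # upper-right texel
```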

### Merging

• The final screen data containing the colors for each pixel is stored in the color buffer

• The merging stage is responsible for merging the colors from each of the fragments from the pixel shading stage into a final color for a pixel

• Deeply linked with merging is visibility: The final color of the pixel should be the one corresponding to a visible polygon (and not one behind it)

### Z-buffer

• To deal with the question of visibility, most modern systems use a Z-buffer or depth buffer

• The Z-buffer keeps track of the z-values for each pixel on the screen

• As a fragment is rendered, its color is put into the color buffer only if its z value is closer than the current value in the z-buffer (which is then updated)

• This is called a depth test
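
The depth test for a single pixel can be sketched as below, assuming smaller z means closer (APIs let you choose the comparison direction; all names here are hypothetical):

```python
# Sketch of a z-buffer depth test for one pixel. A fragment's color is
# written only if its z is closer than what the z-buffer already holds.

def depth_test(color_buffer, z_buffer, x, y, frag_z, frag_color):
    """Write the fragment only if it is closer than the stored depth."""
    if frag_z < z_buffer[y][x]:
        z_buffer[y][x] = frag_z          # update the depth buffer
        color_buffer[y][x] = frag_color  # and the color buffer
        return True
    return False                         # fragment fails the depth test

color = [["sky"]]
depth = [[float("inf")]]                 # cleared to "infinitely far"
depth_test(color, depth, 0, 0, 5.0, "far wall")    # passes
depth_test(color, depth, 0, 0, 2.0, "near cube")   # passes, overwrites
depth_test(color, depth, 0, 0, 3.0, "hidden")      # fails: behind the cube
```

This is also why render order (usually) doesn't matter: whatever order the three fragments arrive in, the nearest one wins.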

### Pros and cons of the Z-buffer

• Pros

• Polygons can usually be rendered in any order

• Universal hardware support is available

• Cons

• Partially transparent objects must be rendered in back-to-front order (painter's algorithm)

• Completely transparent values can mess up the z-buffer unless they are checked

• Z-fighting can occur when two polygons have the same (or nearly the same) z values
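
The back-to-front ordering that transparency needs can be sketched as a sort on a representative depth per polygon (hypothetical data; sorting by one depth value can still fail for interpenetrating polygons):

```python
# Sketch of painter's-algorithm ordering for transparent polygons:
# draw the farthest first so nearer surfaces blend over it. Data is
# illustrative, not from the lecture.

polygons = [
    {"name": "glass pane", "depth": 2.0},
    {"name": "tinted window", "depth": 8.0},
    {"name": "curtain", "depth": 5.0},
]

# Sort by decreasing depth: farthest polygon is rendered first.
draw_order = sorted(polygons, key=lambda p: p["depth"], reverse=True)
names = [p["name"] for p in draw_order]
```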

• A stencil buffer can be used to record a rendered polygon

• This stores the part of the screen covered by the polygon and can be used for special effects

• Frame buffer is a general term for the set of all buffers

• Different images can be rendered to an accumulation buffer and then averaged together to achieve special effects like blurring or antialiasing

• A back buffer allows us to render off screen to avoid popping and tearing
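
Accumulation-buffer averaging can be sketched as an average over several slightly jittered renders, here shrunk to a one-pixel grayscale "image" per frame (illustrative values only):

```python
# Sketch of accumulation-buffer averaging: render several jittered
# frames and average them, softening edges (a simple antialiasing).

def accumulate(images):
    """Average a list of equally sized 2D grayscale images."""
    n = len(images)
    h, w = len(images[0]), len(images[0][0])
    return [[sum(img[y][x] for img in images) / n for x in range(w)]
            for y in range(h)]

# Three jittered renders of one edge pixel: coverage varies per frame.
frames = [[[0.0]], [[1.0]], [[1.0]]]
result = accumulate(frames)   # the averaged, softened edge value
```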

• This pipeline is focused on interactive graphics

• Micropolygon pipelines are usually used for film production

• Predictive rendering applications usually use ray tracing renderers

• The old model was the fixed-function pipeline which gave little control over the application of shading functions

• The book focuses on programmable GPUs which allow all kinds of tricks to be done in hardware

• GPU architecture