
Modern Real-Time Rendering


Presentation Transcript


  1. Modern Real-Time Rendering The Z-buffer algorithm and geometric primitives

  2. Vector Graphics • Drawing with lines and curves only – no surfaces • Used today in PostScript, text, and some output devices

  3. Raster Graphics • Creating images for a rectangular array of pixels -- virtually all modern displays • Usually RGB colorspace used: say 8 bits for each of red, green, blue • Hardware-accelerated graphics emphasized populating the raster with sensible values as quickly as possible

  4. The Synthetic World • Rendering: converting a description of a scene into an image of the scene • Typically, scene descriptions are geometry • Explicit geometry: list of points (vertices) about which some information is known

  5. Geometric Primitives • Different ways of assembling vertices • isolated vertices (points) • sequences of points (lines, piecewise linear curves) • triangles • collections of triangles

  6. Assembling Triangles • Triangle fan • Triangle Strip • fewer than 3 vertices per triangle • saves memory, bus usage
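The vertex savings from strips can be illustrated with a short sketch (Python here rather than the deck's C#; the function name and alternating-winding convention are illustrative, not from the slides). A strip of n vertices yields n - 2 triangles, so after the first triangle each additional triangle costs only one new vertex:

```python
# Sketch: expanding a triangle strip into individual triangles.
# A strip of n vertices yields n - 2 triangles.

def strip_to_triangles(vertices):
    """Convert a triangle strip (list of vertices) into triangles.

    Winding order alternates in a raw strip, so every other triangle
    is flipped here to keep all triangles consistently wound.
    """
    triangles = []
    for i in range(len(vertices) - 2):
        a, b, c = vertices[i], vertices[i + 1], vertices[i + 2]
        if i % 2 == 1:          # flip odd triangles for consistent winding
            a, b = b, a
        triangles.append((a, b, c))
    return triangles

strip = [0, 1, 2, 3, 4]         # vertex indices
print(strip_to_triangles(strip))
# → [(0, 1, 2), (2, 1, 3), (2, 3, 4)]
```

Five vertices produce three triangles; a plain triangle list would need nine vertices, which is the memory and bus saving the slide refers to.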

  7. Higher-order Surfaces

  8. Point Surfaces

  9. Constructive Solid Geometry

  10. Implicit Surfaces

  11. Triangle Meshes • Hardware support for real-time rendering • Rasterization • which pixels are needed to show the object? • Visibility • which object can be seen?

  12. Z-buffer: Basic Idea • Project objects onto screen • [diagram: eye position and screen plane]

  13. Z-buffer: Basic Idea • Project every vertex onto screen • Pixels receive appropriate colors • [diagram: eye position and screen plane]

  14. "King Vertex" • The vertex is the fundamental primitive in modern real-time rendering • All information stored in vertex • position • color • texture coordinates • surface normal (direction perpendicular to surface) • possibly other attributes, in custom vertex format
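The idea that a vertex is just a bundle of named attributes can be sketched as a small record type (a Python dataclass standing in for a real vertex format; the field names mirror the attributes the slide lists, and the layout is illustrative, not any particular API's):

```python
# Sketch: a vertex bundles all per-vertex attributes into one record.
from dataclasses import dataclass

@dataclass
class Vertex:
    position: tuple                            # (x, y, z)
    color: tuple = (1.0, 1.0, 1.0, 1.0)        # RGBA
    uv: tuple = (0.0, 0.0)                     # texture coordinates
    normal: tuple = (0.0, 0.0, 1.0)            # surface normal

# Attributes not supplied fall back to defaults, the way a custom
# vertex format only stores the fields it declares.
v = Vertex(position=(1.0, 3.0, -1.0), color=(0.5, 1.0, 0.8, 1.0))
```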

  15. Hidden Surface Removal • Fundamental rendering problem: • We have a collection of geometry • We need to know what is visible (closest to the eye) at a given point on the screen • Don’t draw things that are behind other things • Historically, this devolved into sorting

  16. How to Draw • "Painter's Algorithm" • Sort your objects in order of decreasing distance to the eye • Paint the most distant ones first, the closest ones last • Paint over the images of the distant objects with the closer objects that are in front of them
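The painter's algorithm above can be sketched over a one-dimensional "screen" (Python, with a made-up object representation of depth plus pixel span; "painting" just records which depth each pixel ends up showing):

```python
# Painter's algorithm sketch: sort far to near, then paint in that
# order so nearer objects overwrite the images of farther ones.

def painters_algorithm(objects, width):
    """objects: list of (depth, (start, end)) pixel spans."""
    framebuffer = [None] * width
    # Sort by decreasing distance: most distant painted first.
    for depth, (start, end) in sorted(objects, key=lambda o: -o[0]):
        for x in range(start, end):
            framebuffer[x] = depth   # closer objects painted last, so they win
    return framebuffer

# A near span (depth 1) overlapping a far span (depth 5):
print(painters_algorithm([(1, (2, 6)), (5, (0, 4))], width=8))
# → [5, 5, 1, 1, 1, 1, None, None]
```

The sort is the "enormous burden" the next slide mentions: it runs every frame, and it fails outright for mutually overlapping geometry.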

  17. The Z-buffer • Sorting is an enormous burden • The Z-buffer uses dedicated memory to free us from that problem (mostly) • Depth buffer: stores z value at every pixel • Depth test: only draw a fragment if it is closer than the last drawn fragment • Now, objects can be drawn in any order

  18. Z-buffer: Basic Idea • With each pixel, store a depth (Z) value • A specialized buffer is available for this • Initialize z-buffer values to infinity • For each fragment: draw it if and only if its depth is lower than the stored value (hence it is closer) • Update the depth buffer • A brute-force solution to visibility
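The steps above can be sketched directly (Python over a one-dimensional screen; the fragment tuples are an illustrative stand-in for real rasterizer output):

```python
# Brute-force z-buffer sketch: every pixel stores the depth of the
# closest fragment drawn so far; a fragment is kept only if it passes
# the depth test.

import math

def draw_fragments(fragments, width):
    color_buffer = [None] * width
    depth_buffer = [math.inf] * width     # initialize depths to infinity
    for x, z, color in fragments:         # fragments may arrive in ANY order
        if z < depth_buffer[x]:           # depth test
            depth_buffer[x] = z           # update the depth buffer
            color_buffer[x] = color
    return color_buffer, depth_buffer

frags = [(0, 5.0, "far"), (1, 5.0, "far"),
         (1, 2.0, "near"), (0, 9.0, "behind")]
colors, depths = draw_fragments(frags, width=3)
print(colors)    # → ['far', 'near', None]
```

Note the "behind" fragment at pixel 0 is submitted last but discarded, which is exactly why draw order stops mattering.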

  19. Complications • Lack of resolution in the depth buffer results in “z-fighting” between close values • Transparent objects need to be drawn last, and multiple transparent objects still demand sorting • In practice there are few transparent objects • Or maybe we don’t care whether transparent objects behind other transparent objects are shown accurately
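Z-fighting can be demonstrated numerically (a sketch, assuming a 16-bit fixed-point depth buffer with linear quantization over [near, far]; real buffers quantize post-projection depth, but the failure mode is the same):

```python
# Sketch of z-fighting: two surfaces at slightly different depths can
# quantize to the same stored depth value, so the depth test can no
# longer tell them apart and the visible winner becomes order-dependent.

def quantize_depth(z, near, far, bits=16):
    t = (z - near) / (far - near)          # normalize depth to [0, 1]
    return round(t * ((1 << bits) - 1))    # fixed-point stored value

near, far = 0.1, 1000.0
z1, z2 = 500.000, 500.004                  # two nearly coplanar surfaces
print(quantize_depth(z1, near, far) == quantize_depth(z2, near, far))
# → True: the buffer cannot distinguish them, so they "fight"
```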

  20. Vertex Shading • Lighting calculations done on each vertex to determine its color • A custom vertex shader executes, or a standard one: BasicEffect in XNA • Historically, final colors were computed by the vertex shader, then interpolated across pixels • "Three-term lighting model" • Gouraud shading

  21. Transformations • Critical task of the vertex shader: compute the final position of every vertex • Each vertex in the geometry receives the appropriate transformation • Same transformation on each vertex • Modeling transform: moving, orienting, and scaling the objects to create the scene • Viewing transform: change of coordinate system from world coordinates into canonical, camera-relative coordinates
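"Same transformation on each vertex" can be made concrete with a tiny 2-D modeling transform (Python sketch with plain tuples instead of a matrix library; the rotate-then-translate choice is illustrative):

```python
# Sketch: the vertex shader applies one fixed transform uniformly to
# every vertex of the geometry. Here: rotate by `angle`, then translate.

import math

def make_model_transform(angle, tx, ty):
    """Return a function that rotates by `angle` then translates by (tx, ty)."""
    c, s = math.cos(angle), math.sin(angle)
    def transform(v):
        x, y = v
        return (c * x - s * y + tx, s * x + c * y + ty)
    return transform

transform = make_model_transform(math.pi / 2, 10.0, 0.0)
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
moved = [transform(v) for v in square]     # the same transform on each vertex
```

The per-vertex loop is what the hardware parallelizes: every vertex is independent, so all can be transformed at once.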

  22. Interpolation • Values from vertices interpolated to find values of fragment • Color interpolated (RGBA) • Texture coordinates interpolated • Texture lookup produces per-pixel color • Custom pixel shader executes at this step, potentially taking additional values from vertices and computing final color
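The interpolation step can be sketched with barycentric weights (Python; the weight values in the example are illustrative). Color is shown, but texture coordinates or any other vertex attribute blend the same way:

```python
# Sketch: interpolating per-vertex values to a fragment using
# barycentric weights (w0, w1, w2), which sum to 1 inside the triangle.

def interpolate(weights, values):
    """values: one tuple per vertex, e.g. RGB colors."""
    return tuple(sum(w * v[i] for w, v in zip(weights, values))
                 for i in range(len(values[0])))

# A triangle with pure red, green, and blue vertices:
colors = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
centroid = interpolate((1 / 3, 1 / 3, 1 / 3), colors)
print(centroid)   # a fragment at the centroid blends all three equally
```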

  23. Rasterization • Primitives converted into pixels in the raster (grid) • Fragment values combined into pixel values (might have multiple fragments per pixel) • Depth test applied here

  24. Real-time rendering • Virtually all modern real-time computer graphics done with z-buffering • Hardware executes operations in parallel to accelerate image synthesis • Strict limitations on what can be done • “why do all video games look the same?” • Changing because of access to shaders

  25. But what does it all mean? • Now we will look at how to make use of some of this information in practice • Future lectures: • Writing custom vertex and pixel shaders • Applying and combining transformations • Using transformations to control the camera • moving the camera around in a scene, like in an FPS • For now: • putting geometry in the world • rendering with a static camera and the BasicEffect shader

  26. The Zbuffer and XNA • XNA runs the Z-buffer algorithm for you • enable depth testing, and nearer objects will be drawn in front of further objects • Fixed bit depth of Z-buffer is an issue • floating point Z means less resolution at larger distances • Reminder: pixel shader can modify depth values • can obtain interesting special effects by adjusting depth

  27. Abstracted Clouds • Fein and McGuire, NPAR 2006 • Partial silhouettes by adjusting depth values

  28. Setting up Geometry • To render geometry, execute the following steps: • create your vertices and set their properties • create a vertex declaration for the graphics device • create and configure an Effect • establish your camera parameters • in Draw, use the Effect to draw your vertices

  29. Vertex Types • Various built-in vertex types provided • different combinations of information stored • position • color • texture coordinates • surface normal • VertexPositionColor • VertexPositionColorTexture • VertexPositionNormalTexture (**) • VertexPositionTexture

  30. Vertex Array • Probably you will want to make an array containing your vertex data

    VertexPositionColor[] mydata = new VertexPositionColor[6];
    ...
    mydata[0] = new VertexPositionColor(
        new Vector3(1, 3, -1), Color.Aquamarine);

  31. Vertex Declaration • The graphics device has to be informed what kind of data it will receive • Done through a VertexDeclaration object

    VertexDeclaration vd = new VertexDeclaration(
        graphics.GraphicsDevice, VertexPositionColor.VertexElements);
    ...
    graphics.GraphicsDevice.VertexDeclaration = vd;

  32. The BasicEffect • Built-in vertex shader • The BasicEffect can do lighting • cleverly designed with a 3-light rig • key light: the main light (often overhead) • fill light: somewhat dimmer, reduces harsh shadows • back light: behind the object, illuminates silhouettes • Or, you can disable lighting and just use the raw color • "the BasicEffect is not so basic" • Critical job of any vertex shader: computing the final position (next slide)

  33. Final Position Calculation • Calculation of screen position from vertex position is done with matrix multiplication • we'll look at this in some detail in a later lecture • Done with three matrices: • world matrix: computes true world coordinates • just set to identity for now • view matrix: transforms world coordinates to "canonical" coordinates relative to the camera • projection matrix: transforms 3D canonical coordinates to 2D screen coordinates
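The three-matrix chain can be sketched numerically (Python, column-vector convention where position' = Projection · View · World · position; XNA uses the row-vector form World * View * Projection, which is the transpose of the same idea, and the view matrix below is an illustrative camera pulled back 5 units):

```python
# Sketch: composing world and view matrices, then transforming a
# homogeneous vertex position through the composite.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(M, v):
    return [sum(M[i][k] * v[k] for k in range(4)) for i in range(4)]

IDENTITY = [[1 if i == j else 0 for j in range(4)] for i in range(4)]

world = IDENTITY                       # "just set to identity for now"
view = [[1, 0, 0, 0], [0, 1, 0, 0],    # camera translated back 5 units
        [0, 0, 1, -5], [0, 0, 0, 1]]
wv = matmul(view, world)
print(apply(wv, [0, 0, 0, 1]))         # → [0, 0, -5, 1]
```

The origin of the model lands 5 units in front of the camera, which is the whole point of the view matrix: re-expressing world positions in camera-relative coordinates.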

  34. Creating the View • Viewing transformation matrix

    Matrix view;
    ...
    Matrix.CreateLookAt(eyepos, lookat, up, out view);

  35. Creating the View • Viewing transformation matrix

    Matrix view;
    ...
    Matrix.CreateLookAt(eyepos, lookat, up, out view);

  • eyepos: position of the camera • lookat: position looked at • up: the "up" direction • view: output, the view matrix

  36. Creating the Projection • Projection transformation matrix

    Matrix projection;
    ...
    Matrix.CreatePerspectiveFieldOfView(
        fov, aspect, near, far, out projection);

  37. Creating the Projection • Projection transformation matrix

    Matrix projection;
    ...
    Matrix.CreatePerspectiveFieldOfView(
        fov, aspect, near, far, out projection);

  • fov: field of view (radians) • aspect: aspect ratio • near: near clipping plane • far: far clipping plane • projection: output, the projection matrix

  38. Projection Information • Need to define a frustum (truncated pyramid) • Different ways of describing • always need near & far distances • In any API: function to get projection matrix given frustum description
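Building the projection matrix from a frustum description can be sketched as follows (Python, using the common right-handed OpenGL-style layout; XNA's CreatePerspectiveFieldOfView embodies the same idea with its own sign and range conventions):

```python
# Sketch: a perspective projection matrix from a frustum description
# (vertical field of view, aspect ratio, near and far plane distances).

import math

def perspective(fov_y, aspect, near, far):
    f = 1.0 / math.tan(fov_y / 2.0)    # cotangent of half the fov
    return [
        [f / aspect, 0, 0, 0],
        [0, f, 0, 0],
        [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0, 0, -1, 0],                 # -1 here produces the perspective divide
    ]

P = perspective(math.pi / 2, 16 / 9, 0.1, 100.0)
# With a 90-degree vertical fov, f is (essentially) 1, so P[1][1] ≈ 1.
```

The near and far distances appear only in the third row, which is why every API's frustum description "always need[s] near & far distances": they set how depth is mapped into the buffer.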

  39. Matrices to BasicEffect

    effect = new BasicEffect(graphics.GraphicsDevice, null);
    ...
    effect.World = world;
    effect.View = view;
    effect.Projection = projection;

  40. Using BasicEffect • An Effect contains one or more Techniques • A Technique contains one or more Passes

    effect.Begin();
    foreach (EffectPass pass in effect.CurrentTechnique.Passes)
    {
        pass.Begin();
        ... // drawing geometry here
        pass.End();
    }
    effect.End();

  41. Rendering your Geometry • Various ways to specify the geometry to draw • Arguably the simplest:

    graphics.GraphicsDevice.DrawUserPrimitives(
        PrimitiveType.TriangleStrip, mydata, start, numprimitives);

  42. Rendering your Geometry • Various ways to specify the geometry to draw • Arguably the simplest:

    graphics.GraphicsDevice.DrawUserPrimitives(
        PrimitiveType.TriangleStrip, mydata, start, numprimitives);

  • mydata: the vertex array • start: first element • numprimitives: number of primitives

  43. The Final Image

  44. Recap • Z-buffer: algorithm for real-time rendering • vertices projected onto screen • vertices contain data: position, color, ... • intermediate fragments interpolated • depth test used to render fragments in front • Lot of setup needed in XNA to render • Vertex data and VertexDeclaration • "Effects" to transform and light vertices • transforms achieved through matrices • Drawing syntax

  45. Looking Forward • Custom shaders • Texture for added visual complexity • Closer look at transforms • mathematics of transforms • homogeneous coordinates • composite transforms • modeling transforms, camera control
