
OpenGL Vertex Arrays


  1. OpenGL Vertex Arrays
OpenGL vertex arrays store vertex properties such as coordinates, normal vectors, color values, and texture coordinates. These properties can be specified indirectly by their positions in the arrays instead of calling glVertex*(), glNormal*(), glColor*(), and glTexCoord*() for each vertex.

Procedure for using vertex arrays:
• Enable the vertex arrays.
• Specify data for the arrays.
• Dereference the arrays and render primitives.

  2. Enable Vertex Arrays
void glEnableClientState(GLenum array) — enable a vertex array.
void glDisableClientState(GLenum array) — disable a vertex array.

When multitexturing is used, glEnableClientState() and glDisableClientState() affect only the current client texture unit, which is selected with:

void glClientActiveTexture(GLenum texUnit) — select the current client texture unit; texUnit is the texture unit identifier.

  3. Specify Data for the Arrays
void glVertexPointer(GLint size, GLenum type, GLsizei stride, const GLvoid *pointer) — specify the array of vertex coordinates.
• size: number of coordinates per vertex; must be 2, 3, or 4.
• type: data type of the vertex coordinates: GL_SHORT, GL_INT, GL_FLOAT, or GL_DOUBLE.
• stride: byte offset between consecutive vertices; if stride is 0, the vertices are assumed to be tightly packed in the array.
• pointer: pointer to the array that stores the vertex coordinates.

The companion functions follow the same pattern:
void glColorPointer(GLint size, GLenum type, GLsizei stride, const GLvoid *pointer)
void glNormalPointer(GLenum type, GLsizei stride, const GLvoid *pointer)
void glTexCoordPointer(GLint size, GLenum type, GLsizei stride, const GLvoid *pointer)
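The stride argument is the most error-prone part of these calls. As a sketch, assuming a hypothetical interleaved layout like the one used later in this deck (3 position floats followed by 2 texture-coordinate floats per vertex), the stride for every attribute pointer is the size of one whole vertex record:

```c
#include <stddef.h>

/* Hypothetical interleaved layout: 3 position floats followed by
   2 texture-coordinate floats per vertex. */
struct Vertex {
    float pos[3];
    float tex[2];
};

/* Stride to pass to glVertexPointer()/glTexCoordPointer() for an
   interleaved array: the size of one whole vertex record, not the
   size of one attribute. */
size_t vertex_stride(void) { return sizeof(struct Vertex); }

/* Byte offset of the texture coordinates inside each record. */
size_t tex_offset(void) { return offsetof(struct Vertex, tex); }
```

With tightly packed separate arrays (one array per attribute), stride 0 is the simpler choice.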

  4. When multitexturing is used, glTexCoordPointer() affects only the current client texture unit, which is selected with glClientActiveTexture().

  5. Dereference the Arrays and Render Primitives
Dereference a single array element:
void glArrayElement(GLint i) — specify vertex properties for the ith vertex from all enabled arrays. This command is usually called between glBegin() and glEnd(). It is equivalent to:

if (color array is enabled) glColor*v(cpointer + i * cstride);
if (normal array is enabled) glNormal*v(npointer + i * nstride);
if (texture coordinate array is enabled) glTexCoord*v(tpointer + i * tstride);
if (vertex array is enabled) glVertex*v(vpointer + i * vstride);

  6. Dereference a List of Array Elements
void glDrawElements(GLenum mode, GLsizei count, GLenum type, const GLvoid *indices) — draw a sequence of primitives using array elements referenced by an index array.
• mode: primitive type; same as the argument to glBegin().
• count: number of elements to render.
• type: data type of the index array: GL_UNSIGNED_BYTE, GL_UNSIGNED_SHORT, or GL_UNSIGNED_INT.
• indices: pointer to the index array.

Equivalent to:
glBegin(mode);
for (i = 0; i < count; ++i)
    glArrayElement(indices[i]);
glEnd();
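The equivalence above is just an indexed gather. A minimal CPU-side sketch of that gather, using a hypothetical helper restricted to a single three-float attribute, makes the indirection explicit:

```c
#include <stddef.h>

/* Sketch of the gather glDrawElements() performs: copy the elements
   referenced by the index array out of the attribute array, in index
   order. Hypothetical helper; handles one float[3] attribute only. */
void gather_elements(const float (*verts)[3],      /* source vertex array */
                     const unsigned int *indices,  /* index array */
                     size_t count,                 /* number of indices */
                     float (*out)[3])              /* gathered output */
{
    for (size_t i = 0; i < count; ++i) {
        out[i][0] = verts[indices[i]][0];
        out[i][1] = verts[indices[i]][1];
        out[i][2] = verts[indices[i]][2];
    }
}
```

The point of the real call is that this gather happens inside the driver or GPU, so shared vertices are stored once and referenced many times.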

  7. Dereference a Sequence of Array Elements
void glDrawArrays(GLenum mode, GLint first, GLsizei count) — draw a sequence of primitives using consecutive array elements from first to first + count − 1.
• mode: primitive type; same as the argument to glBegin().
• first: starting element index.
• count: number of elements to render.

Equivalent to:
glBegin(mode);
for (i = 0; i < count; ++i)
    glArrayElement(first + i);
glEnd();

  8. Example (figure: a grid of 12 vertices in the xy plane, three rows of four, indexed 0–3 in the bottom row, 4–7 in the middle row, and 8–11 in the top row).

  9.
struct VERTEX {
    float x[3];   // position
    float t[2];   // texture coordinates
};

static VERTEX v[12] = {
    { 0.0, 0.0, 0.0, 0.0, 0.0}, { 5.0, 0.0, 0.0, 1.0, 0.0},
    {10.0, 0.0, 0.0, 2.0, 0.0}, {15.0, 0.0, 0.0, 3.0, 0.0},
    { 0.0, 4.0, 0.0, 0.0, 0.5}, { 5.0, 4.0, 0.0, 1.0, 0.5},
    {10.0, 4.0, 0.0, 2.0, 0.5}, {15.0, 4.0, 0.0, 3.0, 0.5},
    { 0.0, 8.0, 0.0, 0.0, 1.0}, { 5.0, 8.0, 0.0, 1.0, 1.0},
    {10.0, 8.0, 0.0, 2.0, 1.0}, {15.0, 8.0, 0.0, 3.0, 1.0}};

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(3, GL_FLOAT, sizeof(VERTEX), v[0].x);   // stride = one whole VERTEX (5 floats)
glTexCoordPointer(2, GL_FLOAT, sizeof(VERTEX), v[0].t);

  10.
static unsigned int vindex[24] = {
    0, 1,  5,  4,   1, 2,  6,  5,   2, 3,  7,  6,
    4, 5,  9,  8,   5, 6, 10,  9,   6, 7, 11, 10};

glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex_obj);
glDrawElements(GL_QUADS, 24, GL_UNSIGNED_INT, vindex);

  11. Environment Mapping
Environment mapping is used to render reflective surfaces. It assumes that the environment is much larger than the object and far away from it.

N — unit surface normal
V — unit direction vector to the viewer
R — unit reflection vector

  12. Environment Mapping Procedure
1. Generate the environment map: place a camera at the center of the object and take a panoramic snapshot of the surrounding environment. Store the image as a texture; this texture is called the environment map.
2. Obtain the reflection vector: R = 2(N·V)N − V.
3. Generate texture coordinates for sampling the environment map using the reflection vector R.

  13. Sphere Environment Mapping (figure: a sphere environment map and a sphere-environment-mapped object).

  14. The surrounding environment is projected onto an enclosing sphere. The image on the sphere surface is then mapped into a circular area of a 2D texture.

Given the unit reflection vector R = (Rx, Ry, Rz), the texture coordinates are:
m = 2 sqrt(Rx² + Ry² + (Rz + 1)²)
s = Rx / m + 1/2
t = Ry / m + 1/2

  15. Cube Environment Mapping (figure: a cube environment map and a cube-environment-mapped object).

  16. The surrounding environment is projected onto an enclosing cube. The images on the six sides of the cube are then stored as six 2D textures, one per face: +x, −x, +y, −y, +z, −z.

  17.
w = max(|Rx|, |Ry|, |Rz|);
if (w == |Rx| and Rx > 0) { choose the +x face; s = −Rz / w; t = −Ry / w; }
if (w == |Rx| and Rx < 0) { choose the −x face; s =  Rz / w; t = −Ry / w; }
if (w == |Ry| and Ry > 0) { choose the +y face; s =  Rx / w; t =  Rz / w; }
if (w == |Ry| and Ry < 0) { choose the −y face; s =  Rx / w; t = −Rz / w; }
if (w == |Rz| and Rz > 0) { choose the +z face; s =  Rx / w; t = −Ry / w; }
if (w == |Rz| and Rz < 0) { choose the −z face; s = −Rx / w; t = −Ry / w; }
s = 0.5 (s + 1);  t = 0.5 (t + 1);

  18. Light Map
A light map is a texture that stores the lighting intensities of a surface.

Scene with base textures × light maps = textured scene modulated by light maps.

For a static scene under static lighting, light maps can be pre-computed.
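"Modulated by" means a per-channel multiply, as in GL_MODULATE texture blending. A minimal sketch of that operation on 8-bit channel values (hypothetical helper):

```c
/* Light-map modulation of one 8-bit color channel: the final texel
   is the base texture value scaled by the light-map intensity,
   where 255 represents full intensity (1.0). */
unsigned char modulate(unsigned char base, unsigned char light)
{
    return (unsigned char)((base * light) / 255);
}
```

A light value of 255 leaves the base texture unchanged, and 0 blacks it out, which is why the dark areas of the light map darken the textured scene.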

  19. Light maps were first introduced in the Quake engine (figure: the scene without the light map, with the light map, and the light map itself).

  20. Bump Mapping (figure: texture-mapped object + height map = bump-mapped object).

  21. Normal texture mapping produces a flat-looking surface. Bump mapping applies a perturbation to the surface normal and uses the perturbed normal in the lighting calculation, creating an illusion of surface bumpiness.

  22. Create a Bump Map from a Height Map
Height map h(x, y): a 2D texture that stores the height perturbation of a surface.
Bump map Nb(x, y): a 2D texture that stores the perturbed normal vectors of a surface.

Calculate the perturbed normal using finite differences:
∂h/∂x ≈ h(x + 1, y) − h(x, y)
∂h/∂y ≈ h(x, y + 1) − h(x, y)
N' = normalize(−∂h/∂x, −∂h/∂y, 1)

  23. Normalization: change the component range from [−1, 1] to [0, 1]:
c = 0.5 (n + 1)
Store nx, ny, nz in the R, G, B components of the bump map. To retrieve the perturbed normal, apply the inverse:
n = 2c − 1
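The range remapping above can be sketched as a pair of inverse helpers (hypothetical names):

```c
/* Pack a normal component from [-1, 1] into [0, 1] for storage in an
   RGB texel: c = 0.5 (n + 1). */
float pack_component(float n)   { return 0.5f * (n + 1.0f); }

/* Unpack a stored texel component back to [-1, 1]: n = 2c - 1. */
float unpack_component(float c) { return 2.0f * c - 1.0f; }
```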

  24. Transform Vectors from Object Space to Texture Space
The bump map stores perturbed surface normals in texture space, so a transformation from object space to texture space is needed. Obtain the basis vectors of texture space:

N — unit normal vector
T — unit tangent vector
B — unit binormal vector

  25. For a curved surface with parametric equation P = P(u, v), T and B follow from the partial derivatives of P with respect to u and v (example figure: a sphere with N, B, T drawn at a surface point).

  26. For a triangle surface with texture coordinates specified at each vertex — vertices P1, P2, P3 with texture coordinates (s1, t1), (s2, t2), (s3, t3) — a point P with texture coordinates (s, t) on the triangle satisfies
P − P1 = (s − s1) T1 + (t − t1) B1.
Replacing P with P2 and P3:
P2 − P1 = (s2 − s1) T1 + (t2 − t1) B1
P3 − P1 = (s3 − s1) T1 + (t3 − t1) B1

  27. Solve the above equations to obtain T1 and B1, then normalize them to unit vectors. T2, B2, T3, B3 can be obtained similarly.

Transform vectors from object space to the local texture space: for the direction-to-light vector L, L' = (L·T, L·B, L·N); for the direction-to-viewer vector V, V' = (V·T, V·B, V·N).
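The object-to-texture-space transform is three dot products against the T, B, N basis. A minimal sketch (hypothetical helper):

```c
/* Transform a direction vector v from object space into the local
   texture (tangent) space spanned by unit vectors t, b, n: each
   component of the result is a dot product with one basis vector. */
void to_texture_space(const float v[3], const float t[3],
                      const float b[3], const float n[3], float out[3])
{
    out[0] = v[0] * t[0] + v[1] * t[1] + v[2] * t[2];  /* v . T */
    out[1] = v[0] * b[0] + v[1] * b[1] + v[2] * b[2];  /* v . B */
    out[2] = v[0] * n[0] + v[1] * n[1] + v[2] * n[2];  /* v . N */
}
```

With T, B, N equal to the object-space axes, the transform is the identity, which is a convenient sanity check.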

  28. Lighting Calculation
Use Nb, L', and V' in the lighting calculation.

  29. Displacement Mapping
Bump mapping only perturbs the surface in the N direction and only uses the perturbed normal for lighting calculations. Displacement mapping perturbs the surface in the N, T, and B directions: it not only uses the perturbed normal for lighting calculations but also actually displaces the surface position.

  30. Parallax Occlusion Mapping

  31. Water Surface Animation and Rendering
Plane wave equation for a surface point P = (x, y, z):
z = A sin(2π (f t + x / Lx + y / Ly))
A — amplitude; f — frequency; Lx — wavelength in the x direction; Ly — wavelength in the y direction.
The surface normal follows from the partial derivatives of z with respect to x and y.

  32. Reflection Rendering
The water surface lies in the xy plane. At a surface point P with normal N, compute the direction-to-camera vector V and the reflection vector R, then use R to sample the environment map to obtain the reflection color.

Transparent effect: use color blending.
