CS-378: Game Technology


Presentation Transcript


  1. CS-378: Game Technology • Lecture #6: Interactive Systems • Prof. Okan Arikan • University of Texas, Austin • Thanks to James O’Brien, Steve Chenney, Zoran Popovic, Jessica Hodgins for lecture notes • 2005-01-1.1

  2. Today • Project meetings • Meetings Friday @ 11:00 • Presentations

  3. Interactive Entertainment • [image: World of Warcraft]

  4. Interactive Programs • [image: Battlefield 2] • Games are interactive programs • Moreover, they are typically immersive in some way • What are the important features of an interactive program? • Which features are particularly important for immersive software like games?

  5. Important Features • User controls the action • Control is “direct” and “immediate” • Program provides constant feedback about its state • The user must know and understand what is happening • The user must receive acknowledgment that their input was received

  6. Immersive Software • Software engages the attention of the user • Response is really immediate • Software provides a consistent alternate reality • The program should do reasonable things, for some definition of reasonable (game designer defines reasonable)

  7. Interactive Program Structure • Event driven programming • Everything happens in response to an event • Events come from two sources: • The user • The system • Events are also called messages • An event causes a message to be sent… • [Diagram: initialize → user does something and/or system does something → system updates → repeat]
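
A minimal C++ sketch of this structure (the Event type, the hand-rolled queue, and processEvent are hypothetical; a real program would pull events from the windowing system):

    #include <queue>

    struct Event { int type; };            // placeholder event record
    std::queue<Event> eventQueue;          // filled by the user and the system

    void processEvent(const Event& e) { /* dispatch on e.type */ }

    void eventLoop(bool& done) {
        // Initialize, then: user and/or system does something -> system updates
        while (!done) {
            while (!eventQueue.empty()) {  // drain all pending events
                processEvent(eventQueue.front());
                eventQueue.pop();
            }
            // ... update application state here ...
        }
    }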

  8. User Events • The OS manages user input • Interrupts at the hardware level … • Get converted into events in queues at the windowing level … • It is generally up to the application to make use of the event stream • User interface toolkits have a variety of methods for managing events • There are two ways to get events: You can ask, or you can be told.

  9. Polling for Events • Most windowing systems provide a non-blocking event query • Does not wait for an event, returns immediately if no events are ready • In FLTK, it’s wait(0) • What type of games might use this structure? • Why wouldn’t you always use it?

    while (not done)
        if (e = checkEvent())
            process event
        …
        draw frame
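
A hedged C++ sketch of the polling loop using FLTK's non-blocking query (drawFrame is a hypothetical stand-in for the game's renderer):

    #include <FL/Fl.H>

    void drawFrame() { /* render one frame of the game (stub) */ }

    void pollingLoop(bool& done) {
        while (!done) {
            Fl::wait(0.0);   // handle any pending events, return immediately
            drawFrame();     // draw whether or not an event arrived
        }
    }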

  10. Waiting for Events • Most windowing systems provide a blocking event function • Waits (blocks) until an event is available • On what systems is this better than the previous method? • What types of games is it most useful for?

    while (not done)
        e = nextEvent()
        process event
        …
        draw frame
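
The corresponding blocking version, again as an FLTK-flavored sketch (Fl::wait() with no argument blocks until it has handled at least one event):

    #include <FL/Fl.H>

    void drawFrame() { /* render one frame (stub) */ }

    void waitingLoop(bool& done) {
        while (!done) {
            Fl::wait();      // block until an event arrives and is dispatched
            drawFrame();     // redraw only in response to events
        }
    }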

  11. Background Work • Stuff often must happen even if the user isn’t doing anything • What stuff? • Why is this a problem with the “wait” technique? • How is it solved? • 2 answers

  12. System Events • Windowing systems provide timer events • The application requests an event at a future time • The system will provide an event sometime around the requested time. Semantics vary: • Guaranteed to come before the requested time • As soon as possible after • Almost never right on (real-time OS?) • Application is not interrupted - it has to look for the event • Exception: alarm signals in UNIX (setitimer)
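
A sketch of the UNIX exception mentioned above: setitimer delivers a periodic SIGALRM that interrupts the application, rather than queueing an event for it to find (POSIX; the handler just sets a flag for the main loop to notice):

    #include <sys/time.h>
    #include <signal.h>

    volatile sig_atomic_t tick = 0;

    void onAlarm(int) { tick = 1; }        // signal handler: record the tick

    void startTimer() {
        signal(SIGALRM, onAlarm);
        struct itimerval tv;
        tv.it_value.tv_sec  = 0;
        tv.it_value.tv_usec = 16667;       // first expiry in ~1/60 s
        tv.it_interval      = tv.it_value; // then repeat at the same period
        setitimer(ITIMER_REAL, &tv, 0);    // SIGALRM now fires periodically
    }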

  13. Work Procedures • A function to be called if no event is available • The blocking event query checks for an event; if none found, call this procedure • Typically hard to use – they must do bounded computation and there’s no guarantee they will ever be called

  14. The Callback Abstraction • A common event abstraction is the callback mechanism • Applications register functions they wish to have called in response to particular events • Translation table says which callbacks go with which events • Generally found in GUI (graphical user interface) toolkits • “When the button is pressed, invoke the callback” • Many systems mix methods, or have a catch-all callback for unclaimed events (FLTK works this way) • Why are callbacks good? Why are they bad?
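
A small FLTK example of the abstraction: register a function, and the toolkit invokes it when the button's event fires (the label change is just a visible stand-in for real work):

    #include <FL/Fl.H>
    #include <FL/Fl_Window.H>
    #include <FL/Fl_Button.H>

    void onPress(Fl_Widget* w, void*) {
        w->label("pressed");               // respond to the button event
    }

    int main() {
        Fl_Window win(200, 100, "callbacks");
        Fl_Button button(50, 30, 100, 40, "press me");
        button.callback(onPress);          // translation: this event -> onPress
        win.end();
        win.show();
        return Fl::run();                  // FLTK dispatches events to callbacks
    }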

  15. Upon Receiving an Event … • Event responses fall into two classes: • Task events: The event sparks a specific task or results in some change of state within the current mode • e.g. Load, Save, Pick up a weapon, turn on the lights, … • Call a function to do the job • Mode switches: The event causes the game to shift to some other mode of operation • e.g. Start game, quit, go to menu, … • Switch event loops, or change callbacks, because events now have different meanings • Software structure reflects this - menu system is separate from run-time game system, for example

  16. Real-Time Loop • At the core of games with animation is a real-time loop: • What else might you need to do? • The number of times this loop executes per second is the frame rate • # frames per second (fps)

    while (true)
        process events
        update animation
        render
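
A C++ sketch of the loop with a frame-rate counter bolted on (the three phase functions are stubs standing in for real game code):

    #include <chrono>
    #include <cstdio>

    void processEvents()   { /* poll input (stub) */ }
    void updateAnimation() { /* advance the simulation (stub) */ }
    void render()          { /* draw the frame (stub) */ }

    void realTimeLoop(bool& done) {
        using clock = std::chrono::steady_clock;
        auto last  = clock::now();
        int frames = 0;
        while (!done) {
            processEvents();
            updateAnimation();
            render();
            ++frames;
            if (clock::now() - last >= std::chrono::seconds(1)) {
                std::printf("%d fps\n", frames);   // the frame rate
                frames = 0;
                last  += std::chrono::seconds(1);
            }
        }
    }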

  17. Lag • Lag is the time between when a user does something and when they see the result - also called latency • Too much lag and causality is distorted • With tight visual/motion coupling, too much lag makes people motion sick • Too much lag makes it hard to target objects (and track them, and do all sorts of other perceptual tasks) • High variance in lag also makes interaction difficult • Users can adjust to constant lag, but not variable lag • From a psychological perspective, lag is the important variable

  18. Computing Lag • Lag is NOT the time it takes to compute 1 frame! • What is the formula for maximum lag as a function of frame rate, fr? • [Timeline diagram: successive frames of process input → update state → render; the event arrives between input-processing phases, and the lag runs from the event time to the end of the render that shows its result]
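
One common answer, read off the timeline above (a back-of-the-envelope sketch, ignoring display refresh and input-device delays): an event that arrives just after a process-input phase waits nearly one full frame before it is read, and its result appears only when the following frame finishes rendering, so

    maximum lag ≈ 1/fr + 1/fr = 2/fr

i.e. about two frame times.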

  19. Frame Rate Questions • What is an acceptable frame rate for twitch games? Why? • What is the maximum useful frame rate? Why? • What is the frame rate for NTSC television? • What is the minimum frame rate required for a sense of presence? How do we know? • How can we manipulate the frame rate?

  20. Frame Rate Answers (I) • Twitch games demand at least 30fps, but the higher the better (lower lag) • Users see an enemy’s motions sooner • Higher frame rates make targeting easier • The maximum useful frame rate is the monitor refresh rate • Time taken for the monitor to draw one screen • But wait, there’s more…

  21. Frame Rate Restrictions • There are synchronization issues (RTR, Sect 2.1) • Buffer swap in graphics hardware is timed with vertical sweep, so ideal frame rate is monitor refresh rate • Can turn off synchronization, but get tearing artifacts on screen • Top half of screen is one image, bottom half is another • Synchronization limits available frame rates • Say your monitor refreshes at 60Hz • Available frame rates are 60, 30, 20, 15, 12, 10, … • Operating system clock limits timer resolution • Linux OS clock is 100Hz, Windows 2000 clock is 1000Hz (defined in CLK_TCK)

  22. Frame Rate Answers (II) • NTSC television draws all the odd lines of the screen, then all the even ones (interlace format) • Full screen takes 1/30th of a second • Use 60fps to improve visuals, but only half of each frame actually gets drawn by the screen • Do consoles only render 1/2 screen each time? • It was once argued that 10fps was required for a sense of presence (being there) • Head mounted displays require 20fps or higher to avoid illness • Many factors influence the sense of presence • Perceptual studies indicate what frame rates are acceptable

  23. NTSC vs. PAL • Two major differences: • Vertical resolution: 525 vs. 625 (not all usable) • Frame rate: 60 Hz vs. 50 Hz (approx) • Issues: • Artwork appearance, particularly for menus and other 2D art • Animation timing: Detach animation clock from frame rate clock, which is good practice anyway

  24. Why is this related?

  25. HDTV • Station controlled resolution • 720p - 1280x720 pixels progressive • 1080i - 1920x1080 pixels interlaced • 1080p - 1920x1080 pixels progressive • Any bandwidth issues? • More pixels, lots more…
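
A back-of-the-envelope sketch of "lots more": uncompressed 1080p at 60 frames/s and 24 bits per pixel is roughly 1920 × 1080 × 3 bytes × 60 ≈ 373 MB/s, versus about 640 × 480 × 3 × 30 ≈ 28 MB/s for NTSC-resolution video, which is why broadcast HDTV depends on heavy compression.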

  26. Reducing Lag • Faster algorithms and hardware are the obvious answer • Designers choose a frame rate and put as much into the game as they can without going below the threshold • Part of design documents presented to the publisher • Threshold assumes fastest hardware and all game features turned on • Options given to players to reduce game features and improve their frame rate • There’s a resource budget: How much time is dedicated to each aspect of the game (graphics, AI, sound, …) • Some other techniques allow for more features and less lag

  27. Decoupling Computation • It is most important to minimize lag between the user actions and their direct consequences • So the input/rendering loop must have low latency • Lag between actions and other consequences may be less severe • Time between input and the reaction of enemy can be greater • Time to switch animations can be greater • Technique: Update different parts of the game at different rates, which requires decoupling them • For example, run graphics at 60fps, AI at 10fps • Very common in real games
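
A C++ sketch of the decoupling (stub functions; the AI runs off its own 10 Hz accumulator while rendering goes as fast as it can):

    #include <chrono>

    void renderFrame() { /* draw at full rate (stub) */ }
    void updateAI()    { /* think at a lower rate (stub) */ }

    void decoupledLoop(bool& done) {
        using clock = std::chrono::steady_clock;
        const auto aiStep = std::chrono::milliseconds(100);  // 10 Hz
        auto nextAI = clock::now();
        while (!done) {
            if (clock::now() >= nextAI) {  // AI on its own, slower clock
                updateAI();
                nextAI += aiStep;
            }
            renderFrame();                 // graphics runs every iteration
        }
    }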

  28. Animation and Sound • The choice of animation or sound need not change at high frequency, but the data must be updated at high frequency • For example, switching from walk to run can happen at low frequency, but joint angles for walking must be updated at every frame • Solution is to package multiple frames of animation and submit them all at once to the renderer • Good idea anyway, makes animation independent of frame rate • Sound is offloaded to the sound card

  29. Texture Mapping • The problem: Colors, normals, etc. are only specified at vertices. How do we add detail between vertices? • Solution: Specify the details in an image (the texture) and specify how to apply the image to the geometry (the map) • Works for shading parameters other than color, as we shall see • The basic underlying idea is the mapping

  30. Basic Mapping • The texture lives in a 2D space • Parameterize points in the texture with 2 coordinates: (s,t) • These are just what we would call (x,y) if we were talking about an image, but we wish to avoid confusion with the world (x,y,z) • Define the mapping from (x,y,z) in world space to (s,t) in texture space • With polygons: • Specify (s,t) coordinates at vertices • Interpolate (s,t) for other points based on given vertices
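
A fixed-function OpenGL sketch of the mapping (assumes a texture is already bound and enabled): (s,t) is given at each vertex, and the hardware interpolates it across the triangle.

    #include <GL/gl.h>

    void texturedTriangle() {
        glBegin(GL_TRIANGLES);
        glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
        glTexCoord2f(0.5f, 1.0f); glVertex3f( 0.0f,  1.0f, 0.0f);
        glEnd();   // interior points get interpolated (s,t)
    }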

  31. Basic Mapping

  32. I assume you recall… • Texture sampling (aliasing) is a big problem • Mipmaps and other filtering techniques are the solution • The texture value for points that map outside the texture image can be generated in various ways • Repeat, Clamp, … • Texture coordinates are specified at vertices and interpolated across triangles • Width and height of texture images are constrained (powers of two, sometimes must be square)
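
For concreteness, a classic OpenGL sketch of that recalled state: wrap modes for coordinates outside [0,1] and a mipmapped minification filter (power-of-two RGB image assumed):

    #include <GL/gl.h>
    #include <GL/glu.h>

    void setupTexture(const unsigned char* pixels, int w, int h) {
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT); // tile in s
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);  // clamp in t
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                        GL_LINEAR_MIPMAP_LINEAR);       // trilinear filtering
        gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, w, h,  // build the mip chain
                          GL_RGB, GL_UNSIGNED_BYTE, pixels);
    }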

  33. Textures in Games • The game engine provides some amount of texture support • Artists are supplied with tools to exploit this support • They design the texture images • They specify how to apply the image to the object • Commonly, textures are supplied at varying resolutions to support different hardware performance • Note that the texture mapping code does not need to be changed - just load different sized maps at run time • Textures are, without doubt, the most important part of a game’s look

  34. Example Texture Tool

  35. Standard Pipeline

  36. Multitexturing • Some effects are easier to implement if multiple textures can be applied • Future lectures: Light maps, bump maps, shadows, …

  37. Multitexturing • Multitexturing hardware provides a pipeline of texture units, each of which applies a standard texture map operation • Fragments are passed through the pipeline with each step working on the result of the previous stage • Texture parameters are specified independently for each unit, further improving functionality • For example, the first stage applies a color map, the next modifies the illumination to simulate bumps, the third modifies opacity • Not the same as multi-pass rendering - all applied in one pass
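
An OpenGL 1.3-style sketch of binding two stages of that pipeline (unit 0: base color map, unit 1: a modulating light map; on some platforms glActiveTexture comes from an extension header):

    #include <GL/gl.h>

    void bindStages(unsigned colorMap, unsigned lightMap) {
        glActiveTexture(GL_TEXTURE0);              // first stage: base color
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, colorMap);

        glActiveTexture(GL_TEXTURE1);              // second stage: lighting
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, lightMap);
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE,
                  GL_MODULATE);                    // multiply previous result
        // Each vertex then supplies glMultiTexCoord2f(GL_TEXTUREi, s, t)
        // for every unit it uses.
    }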

  38. Pixel Shaders • Current generation hardware provides pixel shaders • A pixel shader operates on a fragment • A single pixel (or sub-pixel) that has already been “lit” • It can compute texture coordinates, do general texture look-ups, modify color/depth/opacity, and some other functions • More general than multi-texturing and very powerful

  39. Packing Textures • Problem: The limits on texture width/height make it inefficient to store many textures • For example: long, thin objects • Solution: Artists pack the textures for many objects into one image • The texture coordinates for a given object may only index into a small part of the image • Care must be taken at the boundary between sub-images to achieve correct blending • Mipmapping is restricted • Best for objects that will be at known resolution (weapons, for instance)

  40. Combining Textures

  41. Texture Matrix • Normally, the texture coordinates given at vertices are interpolated and directly used to index the texture • The texture matrix applies a homogeneous transform to the texture coordinates before indexing the texture • What use is this?

  42. Animating Texture (method 1) • Loading a texture onto the graphics card is very expensive • But once there, the texture matrix can be used to “transform” the texture • For example, changing the translation can select different parts of the texture • If the texture matrix is changed from frame to frame, the texture will appear to move on the object • This is particularly useful for things like flame, or swirling vortices, or pulsing entrances, …
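
A minimal OpenGL sketch of method 1: rewrite the texture matrix each frame so the same (s,t) coordinates index a drifting window of the texture.

    #include <GL/gl.h>

    void scrollTexture(float time) {
        glMatrixMode(GL_TEXTURE);               // edit the texture matrix
        glLoadIdentity();
        glTranslatef(0.1f * time, 0.0f, 0.0f);  // slide the texture in s
        glMatrixMode(GL_MODELVIEW);             // back to the usual mode
    }

With GL_REPEAT wrap mode, the image then appears to crawl continuously across the surface.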

  43. Projective Texturing • The texture should appear to be projected onto the scene, as if from a slide projector • Solution: • Equate texture coordinates with world coordinates • Think about it from the projector’s point of view: wherever a world point appears in the projector’s view, it should pick up the texture • Use a texture matrix equivalent to the projection matrix for the projector – maps world points into texture image points • Details available in many places • Problems? What else could you do with it?
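
A hedged OpenGL sketch of the recipe: eye-linear texgen produces eye-space coordinates, and the texture matrix maps them into the projector's image. The caller is assumed to have precomputed projectorMatrix = bias × projector-projection × projector-view (column-major), and to call this while the modelview holds only the camera transform; getting the eye-plane/modelview interaction exactly right takes care in a real renderer.

    #include <GL/gl.h>

    void setupProjector(const float projectorMatrix[16]) {
        glMatrixMode(GL_TEXTURE);
        glLoadMatrixf(projectorMatrix);        // world/eye -> projector image
        glMatrixMode(GL_MODELVIEW);

        // Generate (s,t,r,q) from eye-space position with identity planes.
        static const float sP[4] = {1, 0, 0, 0}, tP[4] = {0, 1, 0, 0},
                           rP[4] = {0, 0, 1, 0}, qP[4] = {0, 0, 0, 1};
        glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
        glTexGenfv(GL_S, GL_EYE_PLANE, sP);
        glTexGeni(GL_T, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
        glTexGenfv(GL_T, GL_EYE_PLANE, tP);
        glTexGeni(GL_R, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
        glTexGenfv(GL_R, GL_EYE_PLANE, rP);
        glTexGeni(GL_Q, GL_TEXTURE_GEN_MODE, GL_EYE_LINEAR);
        glTexGenfv(GL_Q, GL_EYE_PLANE, qP);
        glEnable(GL_TEXTURE_GEN_S); glEnable(GL_TEXTURE_GEN_T);
        glEnable(GL_TEXTURE_GEN_R); glEnable(GL_TEXTURE_GEN_Q);
    }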

  44. What’s in a Texture? • The graphics hardware doesn’t know what is in a texture • It applies a set of operations using values it finds in the texture, the existing value of the fragment (pixel), and maybe another color • The programmer gets to decide what the operations are, within some set of choices provided by the hardware • Examples: • the texture may contain scalar “luminance” information, which simply multiplies the fragment color. What use is this? • the texture might contain “alpha” data that multiplies the fragment’s alpha channel but leaves the fragment color alone. What use is this? • Future lectures will look at creative interpretations of textures – it’s a burgeoning area

  45. Environment Mapping • Environment mapping produces reflections on shiny objects • Texture is transferred in the direction of the reflected ray from the environment map onto the object • Reflected ray: R = 2(N·V)N − V • What is in the map? • [Diagram: viewer, object, reflected ray bouncing from the object into the environment map]
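
A tiny C++ sketch of that reflected-ray formula (N and V assumed unit length, with V pointing from the surface toward the viewer):

    struct Vec3 { float x, y, z; };

    Vec3 reflectRay(const Vec3& N, const Vec3& V) {
        float d = N.x * V.x + N.y * V.y + N.z * V.z;   // N·V
        return { 2*d*N.x - V.x, 2*d*N.y - V.y, 2*d*N.z - V.z };
    }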

  46. Approximations Made • The map should contain a view of the world with the point of interest on the object as the eye • We can’t store a separate map for each point, so one map is used with the eye at the center of the object • Introduces distortions in the reflection, but the eye doesn’t notice • Distortions are minimized for a small object in a large room • The object will not reflect itself • The mapping can be computed at each pixel, or only at the vertices

  47. Environment Maps • The environment map may take one of several forms: • Cubic mapping • Spherical mapping (two variants) • Parabolic mapping • Describes the shape of the surface on which the map “resides” • Determines how the map is generated and how it is indexed • What are some of the issues in choosing the map?

  48. Example

  49. Cubic Mapping • The map resides on the surfaces of a cube around the object • Typically, align the faces of the cube with the coordinate axes • To generate the map: • For each face of the cube, render the world from the center of the object with the cube face as the image plane • Rendering can be arbitrarily complex (it’s off-line) • Or, take 6 photos of a real environment with a camera in the object’s position • Actually, take many more photos from different places the object might be • Warp them to approximate map for all intermediate points • Remember The Abyss and Terminator 2?

  50. Cubic Map Example
