Presentation Transcript
By: Michael Smith
Sandstorm: A Dynamic, Multi-contextual, GPU-based Particle System That Uses Vector Fields for Particle Propagation
Overview
  • Introduction
  • Background
  • Idea
  • Software Engineering
  • Prototype
  • Results
  • Conclusions and Future Work
Introduction
  • The use of Virtual Reality (VR) to visualize scientific phenomena is quite common.
  • VR allows scientists to immerse themselves in the phenomena they are studying.
Introduction
  • Fuzzy phenomena such as dust clouds or smoke need a particle system to be visualized.
  • Vector fields can be used to 'guide' particles according to real scientific data.
  • This is not a new idea; Hilton and Egbert used vector fields this way around 1994.
Introduction
  • VR applications and simulations require a multi-context environment.
  • A main context controls and updates multiple rendering contexts.
  • This multi-contextual environment can cause problems for particle systems.
Introduction
  • GPU offloading techniques allow applications and simulations to offload work to the graphics hardware.
  • This can accelerate non-traditional graphics calculations.
  • GPU offloading can be used to accelerate particle calculations.
Introduction
  • Sandstorm
    • Dynamic
    • Multi-contextual
    • GPU-based
    • Particle System
    • Using Vector Fields for Particle Propagation
Background
  • The Helicopter and Dust Simulation (Heli-Dust) is a scientific simulation of the effect of a helicopter's downdraft on the surrounding desert terrain.
  • It is written using the Dust Framework, a framework that allows the developer to set up a scene using an XML file.
Background
  • Early prototypes of Heli-Dust implemented a very simple particle system.
  • This particle system had no way to guide particles according to observed scientific data.
Background
  • Virtual Reality is a technology that allows a user to interact with a computer-simulated environment, be it a real or an imagined one.
  • It immerses the user in an environment.
Background
  • A depth cue is an indicator from which a human perceives information about depth.
  • Depth cues come in several varieties:
    • Monoscopic
    • Stereoscopic
    • Motion
Background
  • Monoscopic depth cue
    • Information from a single eye or image is available.
    • Information can include:
      • Position
      • Size
      • Brightness
Background
  • Stereoscopic depth cue:
    • Information from two eyes.
    • This information is derived from the parallax between the different images received by each eye.
    • Parallax is the apparent displacement of objects viewed from different locations.
Background
  • Motion depth cue
    • Motion parallax
    • The changing relative position between the head and the object being observed.
    • Objects in the distance move less than objects closer to the viewer.
Background
  • Stereoscopic displays 'trick' the user's eyes into perceiving depth where none exists.
  • They come in all shapes and sizes.
Background
  • Multiple Contexts
    • A main context controls multiple rendering contexts.
    • Because of these multiple contexts, a Virtual Reality application developer must make sure that all context-sensitive information and algorithms are multiple-context safe.
Background
  • There are many Virtual Reality toolkits and libraries.
  • Such toolkits and libraries handle things such as:
    • Generating Stereoscopic Images
    • Setting up the VR environment
    • And some handle distribution methods.
Background
  • The Virtual Reality User Interface, or VRUI, is a virtual reality development toolkit.
  • It was developed by Oliver Kreylos at UC Davis.
  • VRUI's main goal is to shield the developer from the particular configuration of a VR system.
Background
  • VRUI pursues this goal by abstracting three main areas:
    • Display abstraction
    • Distribution abstraction
    • Input abstraction
  • Another feature of VRUI is its built-in menu system.
Background
  • FreeVR is developed and maintained by William Sherman.
  • It is an open-source virtual reality interface/integration library.
  • FreeVR was designed to work on a diverse range of input and output hardware.
  • FreeVR is currently designed to work on shared-memory systems.
Background
  • In 1983, William T. Reeves wrote "Particle Systems – A Technique for Modeling a Class of Fuzzy Objects."
  • This paper introduces the particle system, a modeling method that models an object as a cloud of primitive particles that define its volume.
Background
  • Reeves categorizes the objects modeled by particle systems as "fuzzy": they do not have smooth, well-defined, shiny surfaces.
  • Instead, their surfaces are irregular, complex, and ill-defined.
  • This particle system was used to create the Genesis Effect for the movie Star Trek II: The Wrath of Khan.
Background
  • In his paper, Reeves described a particle system with five steps:
    • Particle Generation
    • Particle Attributes Assignment
    • Particle Dynamics
    • Particle Extinction
    • Particle Rendering
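Reeves' five steps can be sketched as a single CPU-side update loop. This is an illustrative Python sketch only; the attribute names and the fixed emission count are assumptions, not details from Reeves' paper:

```python
import random

def step_particle_system(particles, dt, emit_per_step, lifetime):
    """One pass through Reeves' five steps (illustrative sketch)."""
    # 1. Particle generation: decide how many particles to create
    #    this time interval (here: a fixed number).
    for _ in range(emit_per_step):
        # 2. Attribute assignment: initial position, velocity, age.
        particles.append({
            "pos": [0.0, 0.0, 0.0],
            "vel": [random.uniform(-1, 1), 1.0, random.uniform(-1, 1)],
            "age": 0.0,
        })
    # 3. Particle dynamics: integrate every position forward.
    for p in particles:
        p["pos"] = [x + v * dt for x, v in zip(p["pos"], p["vel"])]
        p["age"] += dt
    # 4. Particle extinction: drop particles past their lifetime.
    particles[:] = [p for p in particles if p["age"] < lifetime]
    # 5. Particle rendering would happen here (omitted).
    return particles
```

Running this loop repeatedly reaches a steady state where extinction balances generation, which is the behavior the five steps are designed to produce.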
Background
  • Particle Generation
    • First the number of particles to be generated per time interval is calculated.
    • Then the particles are generated.
Background
  • Particle attribute assignment: whenever a particle is created, the particle system must determine values for the following attributes:
    • Initial position and velocity
    • Initial size, color, and transparency
    • Initial shape and lifetime
  • The initial position of each particle is determined by a generation shape.
Background
  • Particle dynamics: once all the particles have been created and assigned initial attributes, their positions and/or velocities are updated.
  • Particle extinction: once a particle has lived past its predetermined lifetime, measured in frames, the particle dies.
Background
  • Particle rendering: once the positions and appearances of the particles were determined, the particles were rendered.
  • Two assumptions were made:
    • Particles do not intersect with other surface-based objects.
    • Particles were considered point light sources.
Background
  • In recent years, graphics vendors have replaced areas of fixed functionality with areas of programmability.
  • Two such areas are the Vertex and Fragment Processors.
Background
  • The vertex processor is a programmable unit that operates on incoming vertex values.
  • Some duties of the vertex processor are:
    • Vertex transformation
    • Normal transformation and normalization
    • Texture coordinate generation and transformation.
Background
  • The fragment processor is a programmable unit that operates on incoming fragment values.
  • Some duties of the fragment processor are:
    • Operations on interpolated values.
    • Texture access.
    • Texture application.
    • Fog
Background
  • While a program (a shader) is running on one of these processors, the fixed functionality is disabled.
  • Several programming languages were created to aid in the development of shaders; one such language is the OpenGL Shading Language (GLSL).
Background
  • Vertex and fragment shaders can't create vertices; they only work on data passed to them.
  • Geometry shaders can create any number of vertices.
  • This allows shaders to create geometry without being told to by the CPU.
Background
  • Transform Feedback allows a shader to specify its output buffer.
  • The target output buffer can be the input buffer of another shader.
  • Allows developers to create multi-pass shaders that do not relay information back to the CPU for the other passes.
Background
  • ParticleGS is a geometry-shader-based particle system that does the following:
    • Stores particle information in Vertex Buffer Objects (VBOs).
    • Uses a geometry shader to create particles and store them as vertex information in VBOs.
    • Uses Transform Feedback to send particle data between shaders.
    • Uses a geometry shader to create billboards and point sprites to render particles.
Background
  • In the days before shaders, the GPU was used only for rendering.
  • With the advent of shaders, GPUs can now be used to aid scientific computation.
  • One can 'trick' the GPU into thinking that it is working on rendering information.
Background
  • Uber Flow is a system for real-time animation and rendering of large particle sets using GPU computation.
  • The Million Particle System is a GPU-based particle system that can render a large set of particles.
Background
  • Both particle systems do the following:
    • Store particle information in textures.
    • Use a series of vertex and fragment shaders to update the particle information.
    • Use the CPU to create and send rendering information.
    • Use a series of vertex and fragment shaders to render the information from the CPU.
Idea
  • Sandstorm
    • Dynamic
    • Multi-contextual
    • GPU-based
    • Particle System
    • That uses Vector Fields for Particle Propagation.
Idea
  • Dynamic: Sandstorm should have the ability to change certain attributes on the fly:
    • Rate of emission
    • Size of particles
    • Lifetime of particles
Idea
  • Multi-contextual: as previously stated, 3D VR environments use multiple contexts.
  • Thus Sandstorm must be designed to handle these multiple contexts.
    • Random number generation
    • Between screen consistency
Idea
  • GPU-based: Sandstorm will be designed to leverage today's most powerful and advanced GPUs.
  • Use Geometry shaders to create, update, and render particles.
  • Use Transform Feedback to direct data between shaders.
Idea
  • Vector fields: in order to 'guide' particles according to observed scientific data, vector fields will be used in Sandstorm.
  • However, Sandstorm should not be a vector field simulator; it should only take vector fields as input.
Prototype
  • GPU-based particle system: like most particle systems, Sandstorm has three main phases:
    • Creation and Destruction
    • Update
    • Rendering
Prototype
  • Creating and destroying particles: traditional particle systems create a particle and store it in a dynamically growing data structure.
  • GPU-based particle systems, however, store the particles in a texture.
Prototype
  • A texture must describe a rectangular area that encompasses all of the data.
  • For example, with 19 members, the texture would need to cover an area of 20.
  • Not so with VBOs: a VBO can fit the exact amount of data that is to be used.
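The padding cost of rectangular textures can be illustrated with a small helper. This is a hypothetical calculation for illustration, not code from the thesis:

```python
import math

def texture_dims(n):
    """Smallest near-square texture covering n data members."""
    width = math.ceil(math.sqrt(n))
    height = math.ceil(n / width)
    return width, height

# 19 members force a 5x4 texture: 20 texels, one of them wasted.
# A VBO, by contrast, would hold exactly 19 entries.
```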
Prototype
  • Like the Million Particle System and Uber Flow, Sandstorm stores its particle information using a double-buffered approach.
  • Each frame, one of the buffers is used as the read buffer and the other as the write buffer.
  • Each buffer holds two VBOs: one for the particle positions, the other for the velocities.
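The double-buffered (ping-pong) scheme can be sketched on the CPU as follows. The class and field names are illustrative; the real implementation uses OpenGL VBOs rather than Python lists:

```python
class PingPongBuffers:
    """Double-buffered particle storage: read from one buffer while
    writing the other, then swap roles each frame."""

    def __init__(self, capacity):
        # Each buffer holds two arrays, standing in for the two VBOs:
        # one for particle positions, one for velocities.
        self.buffers = [
            {"pos": [None] * capacity, "vel": [None] * capacity},
            {"pos": [None] * capacity, "vel": [None] * capacity},
        ]
        self.read_index = 0

    @property
    def read(self):
        return self.buffers[self.read_index]

    @property
    def write(self):
        return self.buffers[1 - self.read_index]

    def swap(self):
        """Make this frame's write buffer next frame's read buffer."""
        self.read_index = 1 - self.read_index
```

Swapping indices instead of copying data is the point of the scheme: the update pass always reads last frame's state while writing this frame's.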
Prototype
  • Geometry shaders can emit one or more vertices.
  • At the beginning of the creation/destruction phase, the read buffer is passed to the shader.
  • The shader then determines whether it is dealing with an emitter.
Prototype
  • If the shader is dealing with an emitter, the following happens:
    • The number of particles to be generated is determined.
    • The initial information for the particles is determined.
Prototype
  • Determining the number of particles to emit:
    • Take a, the number of particles already emitted this cycle.
    • Subtract a from p, the number of particles to be emitted per second.
    • Divide the result by the amount of time left in the cycle (each cycle is one second).
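One way to read this calculation is sketched below. This is an interpretation of the slide's wording, not the thesis's exact shader code; the clamping and rounding are assumptions:

```python
def particles_to_emit(already_emitted, per_second, time_left, dt):
    """Number of particles to emit this frame so that `per_second`
    particles are spread over each one-second cycle."""
    remaining = per_second - already_emitted
    if remaining <= 0 or time_left <= 0:
        return 0
    # Emit the remaining particles at a rate that exhausts them by
    # the end of the cycle; dt is this frame's share of that time.
    return min(remaining, round(remaining * dt / time_left))
```

For example, with 90 of 100 particles already emitted and 0.1 s left in the cycle, a 0.1 s frame must emit all 10 remaining particles.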
Prototype
  • Determining the initial information of particles:
    • A random-information texture is used.
    • The texture is translated, scaled, and rotated by a random amount, then sent to the shader.
    • Emitters carry random numbers in their velocity information, which are used for texture lookups.
    • The texture is used to make sure the particles are consistent between contexts.
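The underlying idea is that every context derives the same "random" values from shared inputs instead of from its own RNG state. A Python sketch of that idea follows; the seeding scheme here is illustrative, since Sandstorm actually uses a transformed random texture:

```python
import random

def emitter_randoms(frame, emitter_id, count, seed=1234):
    """Random values derived only from shared inputs, so every
    rendering context computes exactly the same sequence."""
    rng = random.Random(seed * 1_000_003 + frame * 1009 + emitter_id)
    return [rng.random() for _ in range(count)]

# Two contexts asking for the same frame/emitter agree exactly:
context_a = emitter_randoms(42, 0, 3)
context_b = emitter_randoms(42, 0, 3)
```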
Prototype
  • If the creation/destruction shader is passed a particle, something different happens.
  • First, the particle is determined to be alive or dead.
    • If alive, the particle is re-emitted into the buffer.
    • If dead, a blank particle is emitted into the buffer.
Prototype
  • Updating particles: once new particles are created and old ones destroyed, the particles are updated:
    • The delta time between frames is passed to the update shader.
    • A lookup into the vector field (a 3D texture) is done according to the particle's position.
    • The vector field velocity is added to the particle's velocity, and then the particle's position is updated.
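The update step can be sketched on the CPU. This version snaps the lookup to the nearest field cell, whereas the shader's 3D texture lookup interpolates; the list-of-lists field layout is an assumption for illustration:

```python
def update_particle(pos, vel, vector_field, dt):
    """One update step: sample the field near the particle, add the
    field velocity to the particle velocity, advance the position."""
    # Nearest-cell lookup, clamped to the field's bounds; the real
    # shader samples a 3D texture with hardware interpolation instead.
    i = max(0, min(len(vector_field) - 1, int(round(pos[0]))))
    j = max(0, min(len(vector_field[0]) - 1, int(round(pos[1]))))
    k = max(0, min(len(vector_field[0][0]) - 1, int(round(pos[2]))))
    field_vel = vector_field[i][j][k]
    # Field velocity is added to the particle velocity...
    vel = [v + f for v, f in zip(vel, field_vel)]
    # ...and the position is advanced by the frame's delta time.
    pos = [p + v * dt for p, v in zip(pos, vel)]
    return pos, vel
```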
Prototype
  • Once the particles have been updated, they are rendered.
  • The particle positions are passed to the render shader using Transform Feedback.
  • Particles are rendered as either:
    • Textured, deferred-shaded billboards
    • Or points.
Prototype
  • The particle position represents the center of the particle.
  • So four points have to be determined to create a billboard.
  • Two pieces of information are already known: the center of the particle and the vector pointing to the eye.
Prototype
  • Once the vectors are found, they can be added to the particle's position to get the four corner points.
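One common construction of the four billboard corners from the center and the eye vector is sketched below. The thesis shader may differ in details; `world_up` is an assumed parameter, and the construction degenerates if the eye vector is parallel to it:

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def normalize(v):
    """Scale a vector to unit length."""
    n = sum(x * x for x in v) ** 0.5
    return [x / n for x in v]

def billboard_corners(center, to_eye, size, world_up=(0.0, 1.0, 0.0)):
    """Four corners of a camera-facing quad: build right/up vectors
    perpendicular to the eye vector, then offset the center by +/-
    half the size along each."""
    right = normalize(cross(world_up, to_eye))
    up = normalize(cross(to_eye, right))
    h = size / 2.0
    return [
        [c - h * r - h * u for c, r, u in zip(center, right, up)],  # bottom-left
        [c + h * r - h * u for c, r, u in zip(center, right, up)],  # bottom-right
        [c + h * r + h * u for c, r, u in zip(center, right, up)],  # top-right
        [c - h * r + h * u for c, r, u in zip(center, right, up)],  # top-left
    ]
```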
Prototype
  • Once the points are found, they can be emitted to create the billboard.
  • Once the billboard has been emitted, a texture is applied.
  • Once the result of the billboard shader is determined, a deferred shading method can be applied.
Prototype
  • Currently, Sandstorm uses a deferred shading method to blend the particles together.
  • The first step is to accumulate the particles per pixel, so that the more particles lie behind a given pixel, the denser it looks.
  • Once that is done, the result can be textured onto a full-screen quad and rendered.
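The accumulation pass can be sketched on the CPU as a per-pixel particle count. This is illustrative only; Sandstorm performs this step on the GPU:

```python
def accumulate_density(particle_pixels, width, height):
    """Count overlapping particles per pixel; a higher count means a
    denser-looking pixel once the count is mapped to opacity."""
    density = [[0] * width for _ in range(height)]
    for x, y in particle_pixels:
        # Ignore particles that project outside the screen.
        if 0 <= x < width and 0 <= y < height:
            density[y][x] += 1
    return density
```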
Prototype
  • There is a catch though.
  • Using this method requires the user of Sandstorm to:
    • Render the scene into an off-screen buffer, with a depth buffer attached.
    • Give the deferred shader the depth buffer, so that it can blend with the scene, obscuring any solid object and also being obscured by solid objects.
Prototype
  • Vector fields in Sandstorm are represented as 3D textures.
  • A texture was used instead of a VBO, because of existing internal methods for dealing with 3D textures, such as wrapping, indexing, and interpolating.
Prototype
  • When a lookup is done on the vector field to get information for a particle, the following happens:
    • The particle's position is mapped into the texture by dividing its components by the width, height, and depth of the vector field.
    • The sampled value is interpolated between neighboring members of the field.
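The mapping from a particle position to texture coordinates might look like this; the function name and the `origin` parameter are illustrative assumptions:

```python
def field_texcoord(pos, origin, dims):
    """Map a world-space position into [0, 1] texture coordinates by
    offsetting to the field's origin and dividing by its
    width, height, and depth."""
    return [(p - o) / d for p, o, d in zip(pos, origin, dims)]

# The center of a 10x10x10 field at the origin maps to (0.5, 0.5, 0.5),
# where the GPU's 3D texture sampler then interpolates for free.
```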
Prototype
  • As previously stated, when dealing with a multi-contextual environment, one must take care to make context-sensitive data and algorithms conform to that environment.
  • This is handled in Sandstorm by having a controlling class, stored on the main context of the VR environment, control multiple update/render classes.
Dynamic
  • Sandstorm has the ability to change some of its attributes, both at run time and at compile time.
Results
  • A sample application was created to test Sandstorm.
  • The Heli-Dust application was used as a test bed.
  • A basic vector field was used in the sample application.
Results
  • Since Sandstorm is not a vector field simulator, a simple helicopter interaction model was made; as the helicopter throttle increases:
    • The rate of emission increases.
    • The lifetime of the particles increases.
    • The maximum number of particles increases.
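A minimal version of such an interaction model might look like this; all constants and names are illustrative, not values from the thesis:

```python
def throttle_to_params(throttle, max_rate=5000.0, max_life=8.0,
                       max_count=300_000):
    """Linear interaction model: emission rate, particle lifetime,
    and the particle cap all scale with throttle, clamped to [0, 1]."""
    t = min(max(throttle, 0.0), 1.0)
    return {
        "emission_rate": t * max_rate,
        "lifetime": t * max_life,
        "max_particles": int(t * max_count),
    }
```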
Results
  • The sample application was run on the following system, which powered a four-sided CAVE environment:
    • A multi-core shared-memory machine with four quad-core chips
    • 48 GB of RAM
    • An NVIDIA QuadroPlex
    • Running Ubuntu 7.10 Linux
Results
  • Rendered 300,000 deferred-shaded particles at:
    • 15-20 FPS while standing in the particle system
    • ~65 FPS while standing a good distance back from the particle system.
Results
  • Show movies.
Conclusion and Future Work
  • Vector fields can be used to 'guide' particles.
  • Sandstorm can run in a multi-contextual environment.
  • Sandstorm utilizes the latest GPU offloading techniques.
  • Sandstorm can render more than 100,000 particles at above 15 FPS.
Conclusion and Future Work
  • Optimizations:
    • Currently both emitters and particles reside in the same buffers, separating them can limit branching in shaders.
    • Currently buffer sizes are static, allowing them to grow and shrink can increase speed of updating and rendering.
Conclusion and Future Work
  • Other improvements:
    • Collisions between the particles and objects in the scene.
    • Soft particles, motion blur, and light scattering could be used to give the particles more realism.
    • A shader-based physics model could be implemented to allow users to change the behavior of the particles.
Conclusion and Future Work
  • Other work:
    • A vector field simulator could be created to feed Sandstorm dynamically changing vector fields, so that particle motion acts more naturally.
    • A vector field creator/editor could be created to help scientists visualize vector fields before they are used in Sandstorm.