Toward Relativistic Hydrodynamics on Adaptive Meshes

Presentation Transcript


  1. Toward Relativistic Hydrodynamics on Adaptive Meshes
     Joel E. Tohline, Louisiana State University
     http://www.phys.lsu.edu/~tohline
     LSU 2004 Cactus Retreat

  2. Principal Collaborators
     • Simulations to be shown today:
       • Shangli Ou (LSU)
       • Mario D’Souza (LSU)
       • Michele Vallisneri (Caltech/JPL)
     • Code development over the years:
       • John Woodward (Valtech; Dallas, Texas)
       • John Cazes (Stennis Space Center; Stennis, Mississippi)
       • Patrick Motl (Colorado)
     • Science:
       • Juhan Frank (LSU)
       • Lee Lindblom (Caltech)
       • Luis Lehner (LSU)
       • Jorge Pullin (LSU)

  3. Show 3 Movies
     • Nonlinear development of the r-mode in young neutron stars [w/ Lindblom & Vallisneri]
       http://www.cacr.caltech.edu/projects/hydrligo/rmode.html
     • Nonlinear development of the secular bar-mode instability in rapidly rotating neutron stars [w/ Ou & Lindblom]
       http://paris.phys.lsu.edu/~ou/movie/fmode/new/fmode.b181.om4.2e5.mov
     • Mass-transferring binary star systems [w/ D’Souza, Motl, & Frank]
       http://paris.phys.lsu.edu/~mario/models/q_0.409_no_drag_3.8orbs/movies/q_0.409_no_drag_3.8orbs_top.mov

  4. Storyline
     • Present Algorithm: has been producing publishable astrophysical results for over 20 years
       • Entirely home-grown code, outside of the Cactus environment
       • Manual domain decomposition
       • Explicit message passing using MPI
       • Visualizations on serial machines (generally as post-processing)
     • Plans for this calendar year:
       • Move the present algorithm into the Cactus environment
     • Over the next few years, modify the algorithm to:
       • Follow relativistic hydrodynamical flows on an adaptive mesh
       • Accept an evolving space-time metric
       • Visualize results “in parallel” with the dynamical evolution

  7. Present Algorithm
     • Select grid structure and resolution
     • Construct initial configuration
     • Perform domain decomposition
     • While t < tstop
       • Determine Newtonian gravitational accelerations
       • Push fluid around on the grid using Newtonian dynamics
       • If mod[ t, (orbital period / 80) ] = 0
         • Dump 3-D dataset for later visualization
       • EndIf
     • EndWhile
     • Visualize results
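
The loop on slide 7 is simple enough to sketch directly. Below is a minimal, self-contained Python rendering of that structure; every function name and numerical value is a hypothetical stand-in (the production code is a Fortran/MPI hydrocode, not Python), and the slide's mod[t, P/80] test is written as a float-safe threshold check.

```python
import numpy as np

# Hypothetical stand-ins for the main pieces of work in the loop.
def solve_newtonian_gravity(rho):
    # Placeholder for the 3-D Poisson solve (FFT + ADI in the real code).
    return np.zeros_like(rho)

def advance_hydro(rho, accel, dt):
    # Placeholder for one explicit Newtonian hydrodynamics step.
    return rho

def dump_3d_dataset(rho, t):
    # Stands in for writing a 3-D dataset to disk for later visualization.
    print(f"dump at t = {t:.4f}")

rho = np.ones((66, 128, 130))          # density on the fixed cylindrical unigrid
t, dt, t_stop = 0.0, 1.0e-3, 0.1       # made-up times, in arbitrary units
orbital_period = 0.05
dump_interval = orbital_period / 80    # "80 dumps per orbital period"
next_dump = dump_interval

while t < t_stop:
    accel = solve_newtonian_gravity(rho)   # Newtonian gravitational accelerations
    rho = advance_hydro(rho, accel, dt)    # push fluid around on the grid
    t += dt
    if t >= next_dump:                     # the slide's mod[t, P/80] = 0 test
        dump_3d_dataset(rho, t)
        next_dump += dump_interval
# Visualization happens afterwards, as post-processing (see slide 24).
```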

  8. Principal Governing Equations

  9. Principal Governing Equations
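
The equations shown on these slides are not included here. Judging from the rest of the talk (Newtonian gravitational accelerations obtained from a three-dimensional elliptic PDE, and a Newtonian fluid push), they are presumably the standard set for Newtonian self-gravitating hydrodynamics, reconstructed below only as a hedged sketch:

```latex
% Hedged reconstruction (assumption): standard Newtonian self-gravitating hydrodynamics.
\begin{align}
  \frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{v}) &= 0
    && \text{(mass conservation)} \\
  \frac{\partial \mathbf{v}}{\partial t} + (\mathbf{v} \cdot \nabla)\mathbf{v}
    &= -\frac{1}{\rho}\nabla P - \nabla \Phi
    && \text{(Euler equation)} \\
  \nabla^{2} \Phi &= 4 \pi G \rho
    && \text{(Poisson equation: the 3-D elliptic PDE solved for gravity)}
\end{align}
```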

  12. Present Algorithm (repeats the steps of slide 7, with some of them annotated “Serial”)

  13. Present Algorithm (repeats the steps of slide 7, with steps annotated “Parallel”)

  14. Parallel Code’s Chronological Evolution
     • Early 90’s: John Woodward, 8,192-processor MasPar @ LSU
     • Mid-90’s: John Cazes, CM5 @ NCSA; T3D/E @ SDSC
     • Late 90’s: Patrick Motl, MPI on T3E @ SDSC; SP2/3 @ SDSC & LSU
     • 2000: Michele Vallisneri, HP Exemplar @ CACR
     • 2002-03: Mario D’Souza & Shangli Ou, SuperMike (1,024-proc Linux cluster) @ LSU
     • 2004: Shangli Ou, Tungsten (2,560-proc Linux cluster) @ NCSA

  15. Select Grid Structure and Resolution
     • Unigrid, cylindrical mesh
     • Fixed in time
     • Typical resolution:
       • Single star: 66 x 128 x 130 (as shown on the left)
       • Binary system: 192 x 256 x 98
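
For concreteness, a small Python sketch of such a fixed cylindrical unigrid at the single-star resolution follows. The (R, phi, Z) axis ordering, the uniform spacing, and the unit domain extents are all assumptions; the slide states only the zone counts.

```python
import numpy as np

# Fixed-in-time cylindrical unigrid at the quoted single-star resolution.
# Assumptions: (R, phi, Z) ordering as implied by the domain-decomposition
# slide, uniform spacing, unit domain extents.
NR, NPHI, NZ = 66, 128, 130
R_max, Z_max = 1.0, 1.0                                     # hypothetical extents

R   = np.linspace(0.0, R_max, NR)                           # cylindrical radius
phi = np.linspace(0.0, 2.0 * np.pi, NPHI, endpoint=False)   # azimuthal angle
Z   = np.linspace(-Z_max, Z_max, NZ)                        # vertical coordinate

# Field variables (density, momenta, ...) live on this mesh, and the mesh
# itself never changes during the evolution.
rho = np.zeros((NR, NPHI, NZ))
```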

  16. Select Grid Structure and Resolution

  17. Need for Non-unigrid and Adaptive Meshes

  18. Perform Domain Decomposition
     • Grid resolution: 192 x 256 x 96
     • 64 processors
     • Distribute the 192 x 96 (R, Z) grid across an 8 x 8 processor array
     • Leave angular zones “stacked” in memory
     • Result: each processor has data arrays of size 24 x 256 x 12
     • I/O: scramble and unscramble handled manually
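
The arithmetic of that layout is easy to check with a short Python sketch; the helper function and its rank-to-coordinate convention are hypothetical (the real code does this bookkeeping by hand with explicit MPI calls).

```python
# Manual domain decomposition as described on slide 18, with a hypothetical
# rank-to-(R,Z)-block mapping. Array ordering assumed to be (R, phi, Z).
NR, NPHI, NZ = 192, 256, 96        # global grid
PR, PZ = 8, 8                      # 8 x 8 processor array over the (R, Z) plane

def local_block(rank):
    """Index ranges of the sub-array owned by one of the 64 processors."""
    pr, pz = divmod(rank, PZ)      # processor coordinates in the 8 x 8 array
    nr, nz = NR // PR, NZ // PZ    # 24 radial and 12 vertical zones each
    r0, z0 = pr * nr, pz * nz
    # Angular zones stay "stacked" in memory: every rank keeps all 256 of them.
    return (r0, r0 + nr), (0, NPHI), (z0, z0 + nz)

(r_lo, r_hi), (p_lo, p_hi), (z_lo, z_hi) = local_block(rank=37)
print(r_hi - r_lo, p_hi - p_lo, z_hi - z_lo)   # -> 24 256 12, as on the slide
```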

  19. Determine Newtonian Gravitational Accelerations (Three-dimensional Elliptic PDE on cylindrical mesh)

  20. Principal Governing Equations

  21. Determine Newtonian Gravitational Accelerations (Three-dimensional Elliptic PDE on cylindrical mesh)
     • Perform FFT (in memory) in the azimuthal coordinate direction → reduce to a decoupled set of (256) two-dimensional Helmholtz equations
     • Use ADI (alternating direction implicit) to solve each 2-D equation:
       • Data transpose
       • 1-D, in-memory ADI sweep
       • Data transpose
       • 1-D, in-memory ADI sweep
       • Data transpose
       • Etc.
     • Inverse FFT

  22. Determine Newtonian Gravitational Accelerations (same FFT + ADI steps; the figure shows the data laid out in the (Z, m) plane)

  23. Determine Newtonian Gravitational Accelerations (same FFT + ADI steps; the figure shows the data laid out in the (R, m) plane)
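
Structurally, the solver described on slides 21-23 can be sketched in Python as follows: an FFT along the azimuthal direction decouples the 3-D cylindrical Poisson equation into independent 2-D Helmholtz-type problems, one per azimuthal mode, each of which the real code then solves with ADI sweeps and data transposes. The 2-D solver below is only a stub, and G = 1 code units is an assumption.

```python
import numpy as np

FOUR_PI_G = 4.0 * np.pi            # source coefficient, taking G = 1 (assumption)

def solve_helmholtz_2d(rhs_m, m, R, Z):
    # Stub for the ADI solve of the decoupled 2-D equation for mode m:
    #   (1/R) d/dR ( R dPhi_m/dR ) + d^2 Phi_m / dZ^2 - (m^2 / R^2) Phi_m = rhs_m
    # (in the real code: transpose, 1-D ADI sweep, transpose, sweep, ...)
    return np.zeros_like(rhs_m)

def solve_poisson(rho, R, Z):
    rho_m = np.fft.fft(rho, axis=1)            # FFT in the azimuthal direction
    phi_m = np.empty_like(rho_m)
    for m in range(rho_m.shape[1]):            # 256 decoupled 2-D problems
        # (indices above N/2 really correspond to negative wavenumbers;
        #  the stub does not care about that detail)
        phi_m[:, m, :] = solve_helmholtz_2d(FOUR_PI_G * rho_m[:, m, :], m, R, Z)
    return np.fft.ifft(phi_m, axis=1).real     # inverse FFT back to Phi(R, phi, Z)

rho = np.random.rand(192, 256, 96)             # toy density, (R, phi, Z) ordering
Phi = solve_poisson(rho, R=np.linspace(0.01, 1.0, 192), Z=np.linspace(-1.0, 1.0, 96))
```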

  24. Visualize Results
     • Specify isodensity surface(s)
     • Find vertices and polygons on each surface (using the marching-cubes algorithm)
     • Write out vertices & polygons in “OBJ” format
     • Delete the 3-D dataset
     • Utilize “Maya” to render nested surfaces (from a pre-specified viewer orientation)
     • Write out a TIFF image (typically 640 x 480)
     • Generate a .mov
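
A compact Python sketch of that post-processing path follows, using scikit-image's marching_cubes as a stand-in for the code's own marching-cubes routine and a synthetic density field in place of a real dump; the result is a Wavefront "OBJ" file of the kind Maya can import and render.

```python
import numpy as np
from skimage import measure        # stand-in for the code's marching-cubes routine

# Synthetic "star": a Gaussian density blob on a 64^3 Cartesian box.
x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
rho = np.exp(-8.0 * (x**2 + y**2 + z**2))

level = 0.5                        # the specified isodensity value
verts, faces, _normals, _values = measure.marching_cubes(rho, level=level)

# Write vertices & polygons in "OBJ" format, as on the slide.
with open("isosurface.obj", "w") as f:
    for vx, vy, vz in verts:
        f.write(f"v {vx} {vy} {vz}\n")
    for a, b, c in faces:          # OBJ face indices are 1-based
        f.write(f"f {a + 1} {b + 1} {c + 1}\n")
# The 3-D dataset can then be deleted and the OBJ handed to Maya for rendering.
```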

  25. Future Algorithm
     • Select grid structure and resolution and [preferred AMR thorn]
     • [We] Construct initial configuration
     • [Let Cactus] Perform domain decomposition
     • While t < tstop
       • [Call GR Group’s thorn] Determine structure of space-time metric
       • [We (or Whisky thorn)] Push fluid around on the grid using relativistic dynamics
       • If mod[ t, (orbital period / 80) ] = 0
         • Generate vertices and polygons in parallel
         • Spawn “Maya” rendering task on additional processor(s)
       • EndIf
       • [Call AMR thorn] Modify mesh, as necessary
     • EndWhile
     • [Use Cactus thorn] Write article and publish results
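
As with the present algorithm, the planned loop can be given a purely structural Python sketch; the stand-in functions below only mark where the Cactus, GR-group, Whisky, and AMR thorn calls would go, and none of this is actual Cactus code.

```python
# Hypothetical stand-ins for the thorn calls named on slide 25.
def solve_spacetime_metric(state):   return state      # GR group's thorn
def push_fluid_relativistic(state):  return state      # we, or the Whisky thorn
def regrid_if_needed(state):         return state      # the preferred AMR thorn
def spawn_rendering_task(state, t):  print(f"spawn Maya render at t = {t:.4f}")

state = {}                                              # placeholder simulation state
t, dt, t_stop = 0.0, 1.0e-3, 0.1                        # made-up times
orbital_period = 0.05
dump_interval = orbital_period / 80
next_dump = dump_interval

while t < t_stop:
    state = solve_spacetime_metric(state)               # evolving space-time metric
    state = push_fluid_relativistic(state)              # relativistic hydrodynamics
    t += dt
    if t >= next_dump:
        # Vertices/polygons are generated in parallel and rendering is spawned
        # on extra processors, so visualization runs alongside the evolution.
        spawn_rendering_task(state, t)
        next_dump += dump_interval
    state = regrid_if_needed(state)                     # adaptive mesh refinement
```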

  27. THE END
