



Organization of Talk: TSI Project Description, TSI-ISIC Collaborations, Ongoing TSI-SDM Projects



Presentation Transcript


  1. http://www.phy.ornl.gov/tsi/
  Organization of Talk:
  • TSI Project Description
  • TSI-ISIC Collaborations
  • Ongoing TSI-SDM Projects
  • A Look Ahead

  2. Investigator Team
  • Cross-Cutting Team
  • Long-Term Collaborations
  • Structured like SciDAC
  Supernova Science: Blondin, Bruenn, Fuller, Haxton, Hayes, Lattimer, Meyer (Clemson), Mezzacappa, Swesty
  Radiation Transport/Radiation Hydrodynamics: Blondin (NC State), Bruenn (FAU), Hayes (UCSD), Mezzacappa (ORNL), Swesty (SUNYSB)
  Nuclear Structure Computations for EOS and Neutrino-Nucleus/Nucleon Interactions: Dean (ORNL, UT), Fuller (UCSD), Haxton (INT, Washington), Lattimer (SUNYSB), Prakash (SUNYSB), Strayer (ORNL, UT)
  Linear System/Eigenvalue Problem Solution Algorithms for Radiation Transport and Nuclear Structure Computation (TOPS): Dongarra (UT, ORNL), Saied (UIUC, NCSA), Saylor (UIUC, NCSA)
  Visualization (SDM): Baker (NCSA), Toedte (ORNL)
  (ISIC labels on the slide: TOPS, CCA, PERC, TSTT, SDM)

  3. Goal
  • Ascertain the explosion mechanism(s).
  • Reproduce supernova phenomenology (element synthesis; neutrino, gravitational wave, and gamma ray signatures; neutron star kicks; gamma ray burst connection).
  Relevance
  • Dominant source of many elements in the Universe.
  • Given sufficiently well-developed models, supernovae serve as laboratories for fundamental nuclear and particle physics that cannot be explored in terrestrial laboratories.
  • Driving application in computational science (radiation transport, hydrodynamics, nuclear physics, applied mathematics, computer science, visualization).
  Paradigm
  • Supernovae result from stellar core collapse and bounce in massive stars.
  • Radiatively driven (perhaps some are MHD driven, or both).

  4. Convection
  • Need Boltzmann Solution
    • Need Angular Distribution
    • Need Spectrum
  • "Gray" Schemes Inadequate
    • Spectrum Imposed
    • Limited Angular Information (Few Moments)
    • Parameterized (No First-Principles Solution)
  • The bar is high! (10% effects can make or break explosions.)

  5. Transport in Past 2D Models
  [Chart: past 2D models arranged by dimensionality in neutrino energy (0D, 1D) versus space (1D, 2D, 3D), spanning Lightbulb, gray, D, FLD, MGFLD, and MGBT schemes. Entries include Burrows, Hayes, and Fryxell; Janka and Mueller; Mezzacappa et al.; Herant et al.; Swesty; and Fryer and Heger (3D gray models), with TSI Year 1-3 milestones marked.]
  D: Diffusion; FLD: Flux-Limited Diffusion; MGFLD: Multigroup FLD; MGBT: Boltzmann Transport

  6. Explosion Mechanism: Open Questions
  • What is the Recipe for Explosion? Neutrino Heating, Convection, General Relativity, Rotation, Magnetic Fields
  • Are there multiple mechanisms?
    • Neutrino-driven supernovae
    • MHD-driven supernovae
    • Supernovae driven by both neutrinos and MHD effects
  • One mechanism for a class of stars?
  • Is the mechanism tailored to the individual star?

  7. Supernova Science
  • Hydrodynamics: Explicit Differencing; Reactive Flows; Newtonian and General Relativistic
  • Nuclear, Weak Interaction Physics: Thermodynamics (Composition); Neutrino Sources and Interactions
  • Radiation Transport: Implicit Differencing
    • MGFLD: Preconditioners; Sparse System Solvers
    • MGBT: Preconditioners; Sparse System Solvers (Matrix-Free)

  8. ISIC Collaborations: TOPS
  Boltzmann Equation: a nonlinear integro-PDE
  • Implicit Time Differencing, forced by extremely short neutrino-matter coupling (equilibration) time scales relative to neutrino transport time scales
  • Nonlinear Algebraic Equations: linearize, then solve via a multi-D Newton-Raphson method
  • Large Sparse Linear Systems
  Memory Requirements (assuming matrix-free methods): tens of GB up to 1/2 TB
  Progress:
  • Sparse approximate inverses for 2D MGFLD (Saylor, Smolarski, Swesty; J. Comp. Phys.)
  • ADI-like preconditioner for Boltzmann transport (D'Azevedo et al.; Precond 2001, NLAA)
  • AGILE-BOLTZTRAN and V2D codes turned over to TOPS for analysis and development.
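The solve pattern the slide above describes (implicit differencing yields nonlinear algebraic equations, Newton-Raphson linearization yields large sparse systems, and matrix-free methods avoid ever forming the Jacobian) can be sketched compactly. This is a minimal illustration in Python rather than the project's F90 + MPI, with a toy 1D nonlinear diffusion residual standing in for the Boltzmann equation; every name and number here is illustrative, not TSI code.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def residual(u, u_old, dt):
    """Backward-Euler residual for a toy 1D nonlinear diffusion problem."""
    lap = np.zeros_like(u)
    lap[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:]   # second-difference stencil
    return u - u_old - dt * (lap + u**3)          # stiff nonlinear coupling term

def implicit_step(u_old, dt, newton_iters=4, eps=1e-7):
    """One implicit time step: Newton-Raphson outer loop, GMRES inner solve."""
    u = u_old.copy()
    for _ in range(newton_iters):
        F = residual(u, u_old, dt)
        # Matrix-free Jacobian action: J v ~ (F(u + eps*v) - F(u)) / eps,
        # so the sparse Jacobian is never assembled or stored.
        jv = lambda v: (residual(u + eps * v, u_old, dt) - F) / eps
        J = LinearOperator((u.size, u.size), matvec=jv)
        du, _ = gmres(J, -F)   # Krylov solve of the linearized system
        u = u + du
    return u

u = np.linspace(0.0, 1.0, 64)
for _ in range(5):             # a few implicit time steps
    u = implicit_step(u, dt=1e-3)
```

In a production solver, the `gmres` call is where preconditioners such as the sparse approximate inverses and ADI-like schemes listed under Progress would enter.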

  9. ISIC Collaborations: CCTTSS
  TSI Code:
  • F90 + MPI Code
  • Object-Oriented Design for Interoperability and Reuse
  • Application Framework: IBEAM (Interoperability-Based Environment for Adaptive Meshes), a NASA HPCC-funded project (PI: Swesty)
  • AMR: PARAMESH
  • Goal: Develop our framework to be CCA-compliant.
  • Initiated discussions with ANL, LLNL, and ORNL members of CCTTSS.
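As a loose sketch of the provides/uses-ports idea behind CCA compliance, here is the pattern in Python for brevity (the TSI code itself is F90 + MPI, and real CCA ports carry much more machinery); all class names are hypothetical.

```python
from abc import ABC, abstractmethod

class SolverPort(ABC):
    """A 'provides' port: the contract a solver component exposes."""
    @abstractmethod
    def solve(self, rhs): ...

class JacobiSolver(SolverPort):
    """One component providing the port (a trivial placeholder solver)."""
    def solve(self, rhs):
        return [0.5 * r for r in rhs]

class TransportComponent:
    """A component that 'uses' a SolverPort without knowing its provider,
    so solver implementations can be swapped without touching this code."""
    def __init__(self, solver: SolverPort):
        self.solver = solver
    def step(self, rhs):
        return self.solver.solve(rhs)

transport = TransportComponent(JacobiSolver())  # the framework wires ports together
print(transport.step([1.0, 2.0]))
```

The payoff of this separation is the interoperability and reuse the slide names: components from different teams plug together through agreed ports rather than shared internals.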

  10. ISIC Collaborations: PERC
  • Assess Code Performance on Parallel Platforms
  • Identify Code Optimizations to Increase Performance
  TSI Code Suite
  • Hydrodynamics:
    • VH-1 (PPM)
    • ZEPHYR (Finite Difference)
  • Neutrino Transport:
    • AGILE-BOLTZTRAN: 1D General Relativistic Adaptive Mesh Hydrodynamics with 1D Boltzmann Transport
    • V2D: 2D MGFLD Transport Code
    • V3D: 3D MGFLD Transport Code (Under Development)
    • 2D/3D Boltzmann Code (Under Development)
  [Figure: scaling results for VH-1; the VH-1 numerical hydrodynamics algorithm scales well.]

  11. ISIC Collaborations: TSTT
  Adaptive Quadratures (Direction Cosines) for Multidimensional Radiation Transport
  • The greatest challenge to completing 3D Boltzmann simulations is memory.
  • Minimize the number of quadratures to minimize memory needs while maintaining physical resolution. (Also important for 1D/2D MGBT.)
  • An Optimization Problem
  [Figure: results for 1D Boltzmann transport on the Milne problem (D'Azevedo), extended-core and compact-core cases.]
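The trade-off the slide above poses can be seen in miniature: angular moments of a forward-peaked toy intensity I(μ) over the direction cosine μ in [-1, 1], computed with increasing numbers of Gauss-Legendre nodes. Nothing here reflects the actual TSI quadrature sets; it only shows why peaked angular distributions demand more nodes, and hence more memory.

```python
import numpy as np

def moments(intensity, n_nodes):
    """Zeroth and first angular moments via n-node Gauss-Legendre quadrature."""
    mu, w = np.polynomial.legendre.leggauss(n_nodes)   # nodes/weights on [-1, 1]
    I = intensity(mu)
    return np.sum(w * I), np.sum(w * mu * I)           # ~ energy density, flux

forward_peaked = lambda mu: np.exp(4.0 * mu)           # anisotropic toy intensity

for n in (2, 4, 8, 16):
    J, H = moments(forward_peaked, n)
    print(f"{n:2d} nodes: J = {J:.6f}, H = {H:.6f}")   # converges as n grows
```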

  12. Collaboration with Supporting Base Projects: Networking
  • Identify Optimal Paths in Our Collaborative Visualization Server-Client Model
  • Maximize Bandwidth along these Paths (Not Achieved Using Current Protocols)
  • Participated in ORNL Workshop on DOE High-Performance Network R&D and Applications
    • Convey TSI Needs to Networking Team
    • Participate in White Paper to Define and Develop Interface between Efforts

  13. ISIC Collaborations: SDM
  • Use the PROBE environment for staging data between simulation platforms and end-user visualization platforms.
  • Develop new data analysis techniques/tools tailored to our application, allowing (a) data reduction and (b) discovery potential.
  • Use agent technology for distributed data analysis (data analysis must be done in parallel to achieve reasonable throughput).

  14. Latest TSI 2D/3D Models
  • Hydrodynamics only.
  • Focused on understanding 2D/3D flow and its coupling to the shock wave.
  • Convectively stable.
  • 2D model exhibits a bipolar explosion (due to nonlinear flow-shock interaction).
  • 3D model exhibits similar "long-wavelength" behavior. Key finding.
  • New "rolling" flows identified.
  AAS Meeting; Ap. J. submitted
  [Images: 2D and 3D model snapshots.]

  15. SDM: Data Needs
  3D Hydrodynamics Run:
  • 5 Variables (Density, Entropy, Three Fluid Velocities)
  • 1024 x 1024 x 1024 Cartesian Grid
  • 1000 Time Steps
  → 20 Terabyte Dataset
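The 20 TB figure is consistent with simple arithmetic, assuming single-precision (4-byte) values; the precision is an assumption, as the slide does not state it.

```python
n_vars, n_cells, n_steps = 5, 1024**3, 1000
bytes_per_value = 4                      # assumed single precision
total = n_vars * n_cells * n_steps * bytes_per_value
print(f"{total / 1e12:.1f} TB")          # -> 21.5 TB, i.e. roughly 20 TB
```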

  16. SDM: PROBE
  [Diagram: PROBE production and bulk-storage architecture, linking IBM and Compaq supercomputers, Production HPSS and Probe HPSS, Probe nodes (Stingray RS/6000 S80, Marlin RS/6000 H70, an Origin 2000 "Reality Monster," and other Probe nodes) for data reduction and pre-visualization manipulation, a CAVE for rendering, and an external ESnet router.]
  Utilize PROBE until data manipulations, partitioning of manipulations, and bandwidths are known. PROBE is adaptable!

  17. SDM: Data Analysis
  • Data Reduction
  • Scientific Discovery
  [Images: original density distribution at the final time "slice," alongside the same distribution reconstructed from 30 principal components. (The first slice was reconstructed from 3!)]
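A minimal sketch of this kind of PCA reduction and reconstruction, using a truncated SVD on a synthetic 2D "density slice"; the field, grid size, and noise here are illustrative stand-ins, not the TSI data.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 256)
# Synthetic smooth "density slice" plus a little noise.
field = np.exp(-(x[:, None]**2 + x[None, :]**2)) \
        + 0.01 * rng.standard_normal((256, 256))

U, s, Vt = np.linalg.svd(field, full_matrices=False)

def reconstruct(k):
    """Rebuild the field from its k leading principal components."""
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

for k in (3, 30):
    err = np.linalg.norm(field - reconstruct(k)) / np.linalg.norm(field)
    print(f"{k:2d} components: relative reconstruction error {err:.2e}")
```

The data-reduction payoff: storing k components of an m x n slice costs k(m + n + 1) values instead of m times n.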

  18. Integrate into collaborative visualization?
  [Diagram: PCA data reduction on the server, PCA data reconstruction on the client, connected by networking technologies.]

  19. New Windows on the Universe?
  Scientific Discovery
  • Can we use current data analysis tools to better understand and better quantify supernova physics?
  • Can we develop new tools that will provide a new view of supernova physics?

  20. SDM: Agents
  [Diagram: a team of agents divides up the data.]
  • Current data analysis techniques performed on a 10 GB dataset would take 3 years to complete!
  • Need for distributed data analysis:
    • Agents perform analysis on subsets of the data.
    • Merge results via peer-to-peer agent collaboration and negotiation.
  • Both data analysis and visualization can employ agent technology.
  • Plus a GUI/environment for the selection and (distributed) use of data analysis tools and the display of pre- and post-processed data.
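The divide-and-merge pattern the slide describes can be sketched with process-pool workers standing in for agents and a pairwise combine standing in for peer-to-peer negotiation; this is only the shape of the computation, not the actual SDM agent framework.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor
from functools import reduce

def analyze(chunk):
    """Per-agent partial statistics over one data partition."""
    return chunk.size, chunk.sum(), np.sum(chunk**2)

def merge(a, b):
    """Pairwise combination of two agents' partial results."""
    return tuple(x + y for x, y in zip(a, b))

if __name__ == "__main__":
    data = np.random.default_rng(1).standard_normal(1_000_000)
    chunks = np.array_split(data, 8)        # the "team of agents" divides the data
    with ProcessPoolExecutor() as pool:
        parts = list(pool.map(analyze, chunks))
    n, s, s2 = reduce(merge, parts)         # merged global statistics
    print(f"mean = {s / n:.5f}, std = {np.sqrt(s2 / n - (s / n)**2):.5f}")
```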

  21. SDM: A Look Ahead
  We have a testbed!
  • Existing 2D/3D datasets.
  • Three TSI nodes: NCSA, NCSU, ORNL.
  • Testbed for collaborative visualization tools.
  • Testbed for networking.
  • PROBE being used to postprocess the data.
  • PCA has been used successfully for data reduction.
  • Agents have been used in a cross-platform demo utilizing this data.
  Continue to explore possibilities.
  • Continue extensive interactions between TSI modelers and SDM data analysts.
  • Can we integrate data analysis and agent technology (distributed data analysis) with collaborative visualization?
  • Will existing tools/new tools lead to scientific discovery?
    • New views on the data?
    • Better quantification of supernova dynamics?
