
Ensuring Our Nation’s Energy Security



  1. Ensuring Our Nation’s Energy Security: Computational Challenges and Directions in the Office of Science • Science for DOE and the Nation, www.science.doe.gov • Fred Johnson, Advanced Scientific Computing Research • SOS7, March 2003 • [Title-slide image: NCSX]

  2. Outline • Background – Computational science in the Office of Science • SciDAC • Ultrascale Scientific Computing Capability • FY04: Next Generation Computer Architecture • FY05: The future revisited

  3. The Office of Science • Supports basic research that underpins DOE missions. • Constructs and operates large scientific facilities for the U.S. scientific community. • Accelerators, synchrotron light sources, neutron sources, etc. • Five Offices • Basic Energy Sciences • Biological and Environmental Research • Fusion Energy Sciences • High Energy and Nuclear Physics • Advanced Scientific Computing Research

  4. Computational Science is Critical to the Office of Science Mission • Scientific problems of strategic importance typically: • involve physical scales that range over 5-50 orders of magnitude; • couple scientific disciplines, e.g., chemistry and fluid dynamics to understand combustion; • must be addressed by teams of mathematicians, computer scientists, and application scientists; and • utilize facilities that generate millions of gigabytes of data shared among scientists throughout the world. The Scale of the Problem: Two layers of Fe-Mn-Co containing 2,176 atoms correspond to a wafer approximately fifty nanometers (50 × 10^-9 m) on a side and five nanometers (5 × 10^-9 m) thick. A simulation of the properties of this configuration was performed on the IBM SP at NERSC. The simulation ran for 100 hours at a calculation rate of 2.46 teraflops (a teraflop is one trillion floating point operations per second). To explore material imperfections, the simulation would need to be at least 10 times more compute intensive.
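As a rough back-of-the-envelope check (not a figure from the slide itself), the total work of that run follows directly from the quoted rate and duration:

```latex
% Approximate total operation count of the 100-hour NERSC run at 2.46 Tflop/s
\[
  2.46\times10^{12}\ \tfrac{\mathrm{flop}}{\mathrm{s}}
  \;\times\; 100\ \mathrm{h} \;\times\; 3600\ \tfrac{\mathrm{s}}{\mathrm{h}}
  \;\approx\; 8.9\times10^{17}\ \text{floating point operations}
\]
% A run "at least 10 times more compute intensive" would therefore
% approach 10^{19} operations.
```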

  5. Outline • Background – Computational science in the Office of Science • SciDAC • Ultrascale Scientific Computing Capability • FY04: Next Generation Computer Architecture • FY05: The future revisited

  6. Scientific Discovery through Advanced Computing (SciDAC) • SciDAC brings the power of terascale computing and information technologies to several scientific areas -- breakthroughs through simulation. • SciDAC is building community simulation models through collaborations among application scientists, mathematicians, and computer scientists -- research tools for plasma physics, climate prediction, combustion, etc. • State-of-the-art electronic collaboration tools facilitate access to these tools by the broader scientific community, bringing simulation to a level of parity with theory and observation in the scientific enterprise.

  7. Introduction • SciDAC is a pilot program for a “new way of doing science” • spans the entire Office of Science (ASCR, BES, BER, FES, HENP) [funding figures from the slide: $37M, $2M+, $8M+, $3M, $7M] • involves all DOE labs and many universities • builds on 50 years of DOE leadership in computation and mathematical software (EISPACK, LINPACK, LAPACK, BLAS, etc.)
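As a small illustration of the mathematical-software lineage named above (not code from the presentation), the sketch below calls the classic BLAS DAXPY routine through the CBLAS interface; it assumes a CBLAS implementation such as Netlib CBLAS or OpenBLAS is installed and linked.

```c
/* Minimal sketch: the BLAS DAXPY kernel (y <- alpha*x + y) via CBLAS.
   Assumes a CBLAS library is available (e.g., -lcblas or -lopenblas). */
#include <stdio.h>
#include <cblas.h>

int main(void) {
    double x[4] = {1.0, 2.0, 3.0, 4.0};
    double y[4] = {10.0, 10.0, 10.0, 10.0};

    /* y = 2.0 * x + y */
    cblas_daxpy(4, 2.0, x, 1, y, 1);

    for (int i = 0; i < 4; i++)
        printf("y[%d] = %.1f\n", i, y[i]);   /* 12.0, 14.0, 16.0, 18.0 */
    return 0;
}
```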

  8. Addressing the Performance Gap through Software • Peak performance is skyrocketing • In the 1990s, peak performance increased 100x; in the 2000s, it will increase 1000x • But ... • Efficiency for many science applications declined from 40-50% on the vector supercomputers of the 1990s to as little as 5-10% on the parallel supercomputers of today • Need research on ... • Mathematical methods and algorithms that achieve high performance on a single processor and scale to thousands of processors • More efficient programming models for massively parallel supercomputers [Chart: peak performance vs. real performance, 0.1 to 1,000 Tflops, 1996-2004, showing a widening performance gap]
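To put the efficiency figures in perspective (an illustrative calculation; the peak values below are hypothetical, not taken from the chart), sustained throughput is the product of efficiency and peak, so a large drop in efficiency can cancel much of a large gain in peak:

```latex
% Sustained rate = efficiency x peak (hypothetical peak values for illustration)
\[
  0.40 \times 1\ \mathrm{Tflop/s} = 0.4\ \mathrm{Tflop/s}
  \qquad\text{vs.}\qquad
  0.05 \times 10\ \mathrm{Tflop/s} = 0.5\ \mathrm{Tflop/s}
\]
% A 10x increase in peak yields only ~1.25x more sustained performance
% if efficiency falls from 40% to 5%.
```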

  9. It’s Not Only Hardware! Updated version of a chart appearing in “Grand Challenges: High Performance Computing and Communications”, OSTP Committee on Physical, Mathematical, and Engineering Sciences, 1992.

  10. SciDAC Goals • an INTEGRATED program to: • (1) create a new generation of scientific simulation codes that take full advantage of the extraordinary capabilities of terascale computers • (2) create the mathematical and computing systems software to enable scientific simulation codes to effectively and efficiently use terascale computers • (3) create a collaboratory software environment to enable geographically distributed scientists to work effectively together as a TEAM and to facilitate remote access, through appropriate hardware and middleware infrastructure, to both facilities and data with the ultimate goal of advancing fundamental research in science central to the DOE mission

  11. CSE is Team-Oriented • successful CSE usually requires teams with members and/or expertise from at least mathematics, computer science, and (several) application areas • language and culture differences • usual reward structures focus on the individual • incompatible with traditional academia • SciDAC will help break down barriers and lead by example; DOE labs are a critical asset

  12. The Computer Scientist’s View [Slide figure with captions: “Must have Fortran!” “Must have cycles!” “Must study climate!” “Must move data!”]

  13. The Applications Scientist’s View [Slide figure labeled: Computer Scientist, Complexity, Applications Scientist]

  14. Future SciDAC Issues • additional computing and network resources • initial SciDAC focus is on software, but new hardware will be needed within the next two years • both capability and capacity computing needs are evolving rapidly • limited architectural options available in the U.S. today • topical computing may be a cost-effective way of providing extra computing resources • math and CS research will play a key role • expansion of the SciDAC program • many important SC research areas (e.g., materials/nanoscience, functional genomics/proteomics) are not yet included in SciDAC; related efforts include the NSRCs (Nanoscale Science Research Centers) and Genomes to Life (GTL)

  15. Outline • Background – Computational science in the Office of Science • SciDAC • Ultrascale Scientific Computing Capability • FY04: Next Generation Computer Architecture • FY05: The future revisited

  16. Motivation: UltraScale Simulation Computing Capability • Mission need: energy production, novel materials, climate science, biological systems • Systems too complex for direct calculation; descriptive laws absent • Involve physical scales up to 50 orders of magnitude • Couple several scientific disciplines, e.g., combustion, materials science • Experimental data may be costly to develop, insufficient, inadequate, or unavailable • Large data files (millions of gigabytes) shared among scientists throughout the world • History of accomplishments: MPI, math libraries, first dedicated high-performance computing center, SciDAC

  17. ASCAC Statement Without a robust response to the Earth Simulator, the U.S. is open to losing its leadership in defining and advancing the frontiers of computational science as a new approach to science. This area is critical to both our national security and economic vitality. (Advanced Scientific Computing Advisory Committee – May 21, 2002)

  18. Simulation Capability Needs, FY2004-05 Timeframe (application, simulation need, sustained computational capability needed in Tflops, and significance) • Climate Science – Simulation need: calculate chemical balances in the atmosphere, including clouds, rivers, and vegetation. Capability needed: > 50 Tflops. Significance: provides U.S. policymakers with leadership data to support policy decisions; properly represent and predict extreme weather conditions in a changing climate. • Magnetic Fusion Energy – Simulation need: optimize the balance between self-heating of the plasma and heat leakage caused by electromagnetic turbulence. Capability needed: > 50 Tflops. Significance: underpins U.S. decisions about future international fusion collaborations; integrated simulations of burning plasma are crucial for quantifying the prospects for commercial fusion. • Combustion Science – Simulation need: understand interactions between combustion and turbulent fluctuations in burning fluid. Capability needed: > 50 Tflops. Significance: understand detonation dynamics (e.g., engine knock) in combustion systems; solve the “soot” problem in diesel engines. • Environmental Molecular Science – Simulation need: reliably predict chemical and physical properties of radioactive substances. Capability needed: > 100 Tflops. Significance: develop innovative technologies to remediate contaminated soils and groundwater. • Astrophysics – Simulation need: realistically simulate the explosion of a supernova for the first time. Capability needed: >> 100 Tflops. Significance: measure the size, age, and rate of expansion of the Universe; gain insight into inertial fusion processes.

  19. Key Ideas • Deliver a full suite of leadership-class computers for science with broad applicability. • Establish a model for computational sciences (SciDAC and base programs) that couples applications scientists, mathematicians, and computational and computer scientists with computer designers, engineers, and semiconductor researchers. • Develop partnerships with domestic computer vendors to ensure that leadership-class computers are designed, developed, and produced with science needs as an explicit design criterion. • Partner with other agencies. • Partner with industry on applications.

  20. FY 2004 Request to OMB: USSCC (UltraScale Scientific Computing Capability) – Supporting R&D (30%) • Research with Domestic Vendors – Develop ultrascale hardware and software capabilities for advancing science, focusing on faster interconnects and switches. • Continue 2 partnerships begun in FY2003 • Initiate additional partnerships (up to 3) in FY2004, based on competitive review • Operating Systems, Software Environments, and Tools • Address issues to ensure scalability of operating systems to meet science needs • Develop enhanced numerical libraries for scientific simulations • Develop tools to analyze application performance on ultrascale computer systems • University-based Computer Architecture Research – Explore future generations of computer architectures for ultrascale science simulation.

  21. FY 2004 Request to OMB: USSCC (UltraScale Scientific Computing Capability) – Computing and Network Facilities (70%) • Computer architecture evaluation partnerships – Evaluate computer architectures at levels sufficient to ensure that the computer hardware and systems software are balanced for science and likely to scale successfully • Continue the partnership established in FY2002 between ORNL and Cray, Inc. • Initiate one new partnership, composed of scientists and engineers from a domestic computer vendor together with computer scientists and applications scientists supported by the Office of Science • Award the partnership from a competition among invited vendors • Begin installation of the first ultrascale computing system for science

  22. Outline • Background – Computational science in the Office of Science • SciDAC • Ultrascale Scientific Computing Capability • FY04: Next Generation Computer Architecture • FY05: The future revisited

  23. Next Generation Computer Architecture • Goal: identify and address major hardware and software architectural bottlenecks to the performance of existing and planned DOE science applications • Main Activities • Architecture impacts on application performance • OS/runtime research • Evaluation testbeds

  24. Outline • Background – Computational science in the Office of Science • SciDAC • Ultrascale Scientific Computing Capability • FY04: Next Generation Computer Architecture • FY05: The future revisited

  25. How full is the glass? • Support and enthusiasm within the Office of Science • Office of Science Strategic Plan • Interagency cooperation/coordination • NSA SV2 • DOD IHEC • DARPA HPCS • NNSA: program reviews, open source, NAS study, Red Storm,… • DARPA/DOD/SC USSCC meeting • OSTP/NNSA/DOD/SC NGCA meeting • OSTP support • International coordination • Hawaii meeting • ES benchmarking

  26. Agency Coordination Overview Matrix

  27. NNSA Details

  28. NSA Details

  29. DOD and DARPA Details

  30. DARPA High Productivity Computing Systems (HPCS) Program Goal: • Provide a new generation of economically viable high-productivity computing systems for the national security and industrial user community (2007 – 2010) Impact: • Performance (efficiency): speed up critical national security applications by a factor of 10X to 40X • Productivity (time-to-solution) • Portability (transparency): insulate research and operational application software from the system • Robustness (reliability): apply all known techniques to protect against outside attacks, hardware faults, and programming errors HPCS Program Focus Areas • Applications: intelligence/surveillance, reconnaissance, cryptanalysis, weapons analysis, airborne contaminant modeling, and biotechnology • Fill the critical technology and capability gap between today’s HPC (late-1980s technology) and the future (quantum/bio computing)

  31. Computing Metric Evolution • Early Computing Metrics (the GHz race): clock frequency; raw performance (flops) • Current Computing Metrics (the teraflop race, top ten HPC centers): clock frequency; point performance; acquisition price • HPCS “Value”-Based Metrics: system performance relative to application diversity; scalability (flops to petaflops); idea-to-solution; time-to-solution; mean time-to-recovery; robustness (includes security); evolvability; application life-cycle costs; acquisition (facilities and equipment) costs; ownership (facilities, support staff, training) costs

  32. Memory System Performance Limitations: Why applications with limited memory reuse perform inefficiently today • STREAM Add: computes A + B for long vectors A and B (historical data available) • New microprocessor generations “reset” performance to at most 6% of peak • Performance degrades to 1% - 3% of peak as clock speed increases within a generation • Goal: benchmarks that relate application performance to memory reuse and other factors
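For reference, a minimal sketch of a STREAM-style Add kernel is shown below (illustrative only, not the official STREAM benchmark code; the array length, timing method, and bandwidth accounting are simplifying assumptions). Each iteration performs one floating point add while moving three 8-byte words through memory, which is why such kernels are bounded by memory bandwidth rather than peak flops.

```c
/* Illustrative STREAM-style "Add" kernel: a[i] = b[i] + c[i].
   Not the official STREAM benchmark; parameters are assumptions. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 20000000L   /* long vectors, large enough to defeat cache reuse */

int main(void) {
    double *a = malloc(N * sizeof(double));
    double *b = malloc(N * sizeof(double));
    double *c = malloc(N * sizeof(double));
    if (!a || !b || !c) return 1;

    for (long i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

    clock_t t0 = clock();
    for (long i = 0; i < N; i++)
        a[i] = b[i] + c[i];          /* 1 flop per 24 bytes of memory traffic */
    clock_t t1 = clock();

    double sec = (double)(t1 - t0) / CLOCKS_PER_SEC;
    double gb  = 3.0 * N * sizeof(double) / 1e9;   /* read b, read c, write a */
    printf("Add: %.3f s, ~%.2f GB/s, check %.1f\n", sec, gb / sec, a[0] + a[N - 1]);

    free(a); free(b); free(c);
    return 0;
}
```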

  33. Phase I HPCS Industry Teams • Cray, Incorporated • International Business Machines Corporation (IBM) • Silicon Graphics, Inc. (SGI) • Sun Microsystems, Inc. • Hewlett-Packard Company

  34. The Future of Supercomputing • National Academy CSTB study • Co-funded by ASCR and NNSA • 18 month duration • Co-chairs: Susan Graham, Marc Snir • Kick-off meeting 3/6/03 • The committee will assess the status of supercomputing in the United States, including the characteristics of relevant systems and architecture research in government, industry, and academia and the characteristics of the relevant market. … • http://www.cstb.org/project_supercomputing.html

  35. High End Computing Revitalization Task Force • OSTP interagency thrust • HEC an administration priority for FY05 • Task Force to address: • HEC core technology R&D • Federal HEC capability, capacity and accessibility • Issues related to Federal procurement of HEC systems • “It is expected that the Task Force recommendations will be considered in preparing the President’s budget for FY2005 and beyond.” • Kick-off meeting March 10, 2003 • Co-chairs: John Grosh, DOD and Alan Laub, DOE

  36. Links • SciDAC • http://www.osti.doe.gov/scidac • Genomes to Life • http://www.doegenomestolife.org/ • Nanoscale Science, Engineering, and Technology Research • http://www.sc.doe.gov/production/bes/NNI.htm • http://www.science.doe.gov/bes/Theory_and_Modeling_in_Nanoscience.pdf • UltraScale Simulation Planning • http://www.ultrasim.info/
