
Some thoughts on Extreme Scale Computing




Presentation Transcript


  1. Some thoughts on Extreme Scale Computing David J. Dean Senior Advisor Under Secretary for Science Department of Energy

  2. The tools have changed rapidly. These were our supercomputers in the 1970s and 1980s.
1986: X-MP/48, ~220 Mflop/s sustained, 120–150 kW (depending on model), $40M for computer + disks (FY09 $).
Today:
NNSA: Roadrunner at 1.105 PF (LINPACK), LANL; 2.5 MW.
SC/ASCR: Jaguar at 2.331 PF (LINPACK), ORNL; 6.9 MW.
A factor of ~1×10⁷ in speed; a factor of ~18 in power.
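The slide's two ratios can be checked back-of-the-envelope from the figures it quotes. A minimal sketch, assuming the speed factor compares Jaguar's LINPACK number against the X-MP/48's sustained rate, and the power factor compares Roadrunner against the midpoint of the quoted 120–150 kW range (the midpoint is my assumption):

```python
# Figures from the slide: X-MP/48 (~220 Mflop/s sustained, 120-150 kW),
# Jaguar (2.331 PF LINPACK, 6.9 MW), Roadrunner (1.105 PF LINPACK, 2.5 MW).

XMP_FLOPS = 220e6             # ~220 Mflop/s sustained
XMP_POWER_KW = 135.0          # midpoint of 120-150 kW (assumption)

JAGUAR_FLOPS = 2.331e15       # 2.331 PF (LINPACK)
ROADRUNNER_POWER_KW = 2500.0  # 2.5 MW

speed_factor = JAGUAR_FLOPS / XMP_FLOPS            # ~1.06e7, i.e. ~1e7
power_factor = ROADRUNNER_POWER_KW / XMP_POWER_KW  # ~18.5, i.e. ~18

print(f"speed factor: {speed_factor:.2e}")
print(f"power factor: {power_factor:.1f}")
```

Both quoted factors come out as stated: roughly ten million times faster for under twenty times the power.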

  3. Various DOE computing assets serving the DOE mission space Top 500 list, November 2009

  4. Leadership Computing: Scientific Progress at the Petascale
Turbulence: Understanding the statistical geometry of turbulent dispersion of pollutants in the environment.
Nuclear Energy: High-fidelity predictive simulation tools for the design of next-generation nuclear reactors to safely increase operating margins.
Energy Storage: Understanding the storage and flow of energy in next-generation nanostructured carbon-tube supercapacitors.
Fusion Energy: Substantial progress in understanding anomalous electron energy loss in the National Spherical Torus Experiment (NSTX).
Biofuels: A comprehensive simulation model of lignocellulosic biomass to understand the bottleneck to sustainable and economical ethanol production.
Nano Science: Understanding the atomic and electronic properties of nanostructures in next-generation photovoltaic solar-cell materials.
All known sustained petascale science applications to date have been run on the OLCF system.

  5. Process for identifying exascale applications and technology for DOE missions ensures broad community input
• Town Hall Meetings, April–June 2007
• Scientific Grand Challenges Workshops, November 2008 – October 2009: Climate Science (11/08), High Energy Physics (12/08), Nuclear Physics (1/09), Fusion Energy (3/09), Nuclear Energy (5/09), Biology (8/09), Materials Science and Chemistry (8/09), National Security (10/09)
• Cross-cutting workshops: Architecture and Technology (12/09); Architecture, Applied Mathematics and Computer Science (2/10)
• Meetings with industry (8/09, 11/09)
• External panels: ASCAC exascale charge; Trivelpiece panel
MISSION IMPERATIVES. FUNDAMENTAL SCIENCE.

  6. Simulation enables fundamental advances in basic science.
High Energy Physics: Understanding of dark energy and dark matter; testing QCD and physics beyond the Standard Model.
Nuclear Physics: Unification of nuclear physics from the quark-gluon plasma to the basics of nucleon structure to nucleosynthesis; fundamental understanding of fission and fusion reactions.
Facility and experimental design: Effective design of accelerators; probes of dark energy and dark matter; ITER shot planning and device control.
[Slide images: ITER; ILC; Hubble image of lensing; structure of nucleons]
These breakthrough scientific discoveries and facilities require exascale applications and technologies.

  7. Computing applied to problems of national importance
• Climate
• Nuclear Energy
• Smart Grid
• Nuclear weapons
• Materials under extremes
• Combustion
• Competitiveness
Simulations are a key part of the solutions. See the ASCAC (Rosner Committee) and Trivelpiece reports.

  8. Example: Fundamental science challenges for nuclear energy systems
• Microstructural evolution and phase stability
• Mass transport, chemistry, and structural evolution at interfaces
• Chemical behavior in actinide and fission-product solutes
• Solution phenomena
• Nuclear, chemical, and thermomechanical phenomena in fuels and waste forms
• First-principles theory for f-electron complexes and materials
• Predictive capability across length and time scales
• Material failure mechanisms
Understand and control chemical and physical phenomena in multicomponent systems from femtoseconds to millennia, at temperatures to 1000°C and radiation doses to hundreds of displacements per atom.
Basic Research Needs for Advanced Nuclear Energy Systems, Gaithersburg (2006)

  9. Example: The next decade will see nuclear energy models spanning multiple time and length scales. Bridging length and time scales to resolve scientific unknowns in nuclear energy will require 3D simulations at 100x the standard resolution: a 10-exaflop problem. (Science-Based Nuclear Energy Systems Enabled by Advanced Modeling and Simulation at the Extreme Scale)
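The cost blow-up behind "100x resolution" can be sketched with naive scaling arithmetic. This is an illustrative assumption on my part, not the slide's derivation: for an explicit scheme on a uniform 3D grid, refining each spatial dimension 100x multiplies the cell count by 100³, and a CFL-limited time step adds roughly another factor of 100 in time steps:

```python
# Naive scaling sketch (assumptions: uniform 3D grid, explicit time
# stepping with a CFL-limited step that shrinks with cell size).

refinement = 100                    # 100x finer in each spatial dimension
spatial_factor = refinement ** 3    # 1e6 times more grid cells
temporal_factor = refinement        # ~100x more time steps (CFL limit)
work_factor = spatial_factor * temporal_factor  # ~1e8 more total work

print(f"total work grows by ~{work_factor:.0e}x")
```

Under these assumptions the work grows by about eight orders of magnitude, which is how a problem that is tractable today becomes a multi-exaflop problem at 100x resolution.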

  10. Critical Exascale Technology Investments
• System power is a first-class constraint on exascale system performance and effectiveness.
• Memory is an important component of meeting exascale power and applications goals.
• Programming model: early investment in several efforts, to decide in 2013 on an exascale programming model that allows exemplar applications effective access to the 2015 system for both mission and science.
• Investment in exascale processor design to achieve an exascale-like system in 2015.
• An operating system strategy for exascale is critical for node performance at scale and for efficient support of new programming models and runtime systems.
• Reliability and resiliency are critical at this scale and require application-neutral movement of the file system (for checkpointing in particular) closer to the running apps.
• An HPC co-design strategy and implementation requires a set of hierarchical performance models and simulators, as well as commitment from the applications, software, and architecture communities.
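The resiliency bullet above rests on checkpoint/restart: periodically saving application state so a failed run can resume rather than restart. A minimal sketch of the idea, assuming a toy serial "simulation" with a hypothetical state dict (real HPC codes use parallel I/O libraries such as HDF5 or ADIOS, which is exactly why checkpoint bandwidth to the file system matters at scale):

```python
import os
import pickle
import tempfile

def save_checkpoint(path, state):
    """Write state atomically: dump to a temp file, then rename."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, path)  # atomic on POSIX: a crash never leaves a torn file

def load_checkpoint(path):
    """Resume from the last checkpoint, or start fresh."""
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)
    return {"step": 0, "field": [0.0] * 8}  # hypothetical initial state

ckpt = "sim.ckpt"
state = load_checkpoint(ckpt)
while state["step"] < 10:
    state["field"] = [x + 1.0 for x in state["field"]]  # one "simulation" step
    state["step"] += 1
    if state["step"] % 5 == 0:  # interval trades checkpoint I/O vs lost rework
        save_checkpoint(ckpt, state)

print(state["step"])  # 10
```

The checkpoint interval is the key tuning knob: checkpointing too often drowns the run in I/O, too rarely and a failure discards hours of work, which is why the slide pushes the file system closer to the running application.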

  11. Co-design expands the feasible solution space to allow better solutions

  12. Exascale Computing
Need:
• Enable dramatic advances in climate modeling, energy technologies, national security, and science via development of next-generation HPC
Challenges:
• The next 1000x improvement in computing capability cannot be achieved by simply scaling up today's hardware
• Power consumption needs to be dramatically reduced to make exascale feasible
• Millions of processors will present significant challenges for concurrency and resiliency
• New programming models will be required to exploit new architectures
• Applications, programming environment, and hardware must be co-developed
• New architectures will require rethinking applications and the programming environment from the ground up
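The power challenge above follows directly from the machine figures on slide 2. A naive linear scaling of Jaguar's 2009 efficiency (2.331 PF at 6.9 MW) up to one exaflop shows why simply scaling today's hardware is infeasible:

```python
# Naive linear power scaling from Jaguar's figures (2.331 PF at 6.9 MW,
# from slide 2) to a 1-exaflop system, assuming no efficiency improvement.

JAGUAR_FLOPS = 2.331e15
JAGUAR_POWER_MW = 6.9
EXAFLOP = 1e18

naive_power_mw = EXAFLOP / JAGUAR_FLOPS * JAGUAR_POWER_MW
print(f"naive exaflop power: {naive_power_mw:.0f} MW")  # ~2960 MW, i.e. ~3 GW
```

Roughly 3 GW is the output of several large power plants for a single machine, hence the slide's insistence that power consumption be dramatically reduced.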

  13. Exascale Computing: Path Forward
• We have begun exploratory research efforts (FY10–11)
• A concerted program would include:
• Goal: exascale capability by the end of the decade
• Lab/industry/academic partnerships to begin hardware and programming-environment R&D
• Focus on key applications, including climate, nuclear security, and energy simulation topics
• High-level coordination required to ensure multiple research programs are appropriately integrated
