
National Vision for High Performance Computing

Presented before the ORAU Forum on High Performance Computing, March 10, 2004. Raymond L. Orbach, Director, Office of Science.


Presentation Transcript


  1. National Vision for High Performance Computing. Before the ORAU Forum on High Performance Computing, March 10, 2004. Raymond L. Orbach, Director, Office of Science.

  2. “This revolution is caused by two developments: one is the set of instruments such as electron microscopy, synchrotron x-ray sources, lasers, scanning microscopy, and nuclear magnetic resonance devices; the other is the availability of powerful computing and information technology. Together these have brought science finally within reach of a new frontier, the frontier of complexity.” --President’s Science Advisor Jack Marburger before the AAAS in February 2002. “In science of the 21st century, simulation and high-end computation are equal partners with theory and experiment. Scientific leadership, the basis for our economic, physical, and intellectual prosperity depends on this triad, our being first in each component.” --Dr. Raymond L. Orbach before the Rand Graduate School, June 22, 2002.

  3. The science of the future demands that we advance beyond our current computational abilities. Accordingly, we must address the following questions:
  • What new mathematics are required to effectively model systems, such as the Earth’s climate or the behavior of living cells, that involve processes taking place on vastly different time and/or length scales?
  • Which computational architectures and platforms will deliver the most benefit for the science of today and the science of the future?
  • What advances in computer science and algorithms are needed to increase the efficiency with which supercomputers solve problems for the Office of Science?
  • What operating systems, data management, analysis, model development, and other tools are required to make effective use of future-generation supercomputers?
  • Can we develop a new sociology to fully utilize high-end computational power?

  4. WHAT IS THE OFFICE OF SCIENCE DOING? The Office of Science is working to build leadership-class machines that will exceed 50 sustained teraflops, to begin to simulate and understand the fundamental physical, chemical, and biological processes of natural and manmade systems. To meet the need for effective computing performance in the 100-teraflop range and beyond, the Office of Science will support the evaluation, installation, and application of new very high-end computing architectures for computational science. These high-performance computers will provide a new window for researchers to observe the natural world at a fidelity that could only be imagined a few years ago.

  5. WHAT HAPPENS AT 50 TFLOPS SUSTAINED SPEED?
  • The Earth Simulator works on a grid 10 km on a side for climate models, while U.S. computers do no better than 100 km on a side. U.S. simulations therefore average over microclimates: mountains and coastal effects, river flow, cloud and storm systems, and hurricanes. Averaging means that our models cannot credibly predict large-scale fluctuations in climate change, which are critical for long-term drought and flood predictions. 50 Tflops would enable much higher resolution for U.S. climate simulations (a rough estimate of the cost of that refinement is sketched after this slide).
  • We can use simulation to design coal-burning boilers that could dramatically reduce NOx emissions, by almost one million tons per year in the United States alone. Such simulations require increased fidelity in both chemical kinetics and spatial resolution.
  • We can develop simulations of burning plasmas, including ITER, that include electron dynamics and will enable us to predict and avoid major disruptions. Realistic simulations of disruptions in ITER would require several orders of magnitude more space and time points for the calculation, as well as additional physical properties, most importantly a model of the plasma–wall (divertor) interaction.
  • We can computationally determine the acoustic signature of an airplane in one day. Replacing physical prototypes with computation can save industry tens of millions of dollars and years of development time.
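A back-of-the-envelope sketch of why the 100 km to 10 km refinement mentioned above is so costly. The scaling assumptions (compute cost proportional to the number of horizontal cells, time step shrinking linearly with grid spacing under a CFL-type constraint) and the function name relative_cost are illustrative assumptions, not figures from the presentation.

```python
# Rough estimate of how climate-model compute cost grows as the horizontal grid
# is refined from 100 km to 10 km spacing.
# Assumptions (illustrative, not from the slide):
#   - cost is proportional to the number of horizontal cells, i.e. (1/dx)^2
#   - the time step must shrink linearly with dx (CFL-type constraint),
#     contributing one more factor of (1/dx)

def relative_cost(dx_coarse_km: float, dx_fine_km: float) -> float:
    """Factor by which compute cost grows when refining the grid spacing."""
    refinement = dx_coarse_km / dx_fine_km   # 100 km -> 10 km gives 10
    return refinement ** 3                   # r^2 more cells times r more time steps

if __name__ == "__main__":
    factor = relative_cost(100.0, 10.0)
    print(f"Refining from 100 km to 10 km needs roughly {factor:.0f}x more compute")
```

Under these assumptions the finer grid alone demands roughly a thousandfold increase in sustained compute, which is consistent with the slide's point that machines in the tens of sustained teraflops are what make much higher resolution practical.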

  6. INCITE. 10% of NERSC capability made available to the scientific community for high-impact science, with no requirement for a direct relationship to Office of Science programs. INCITE will encourage the development of a new sociology for high-end computation.
  • 4,500,000 CPU hours available (10% of NERSC capability)
  • 53 proposals received
  • 130,508,660 CPU hours requested (see the quick check after this slide)
  • 65% from U.S. academic institutions
  • 12 different scientific disciplines
  • 62% for research not funded by DOE
  Three awards:
  • “Thermonuclear Supernovae: Stellar Explosions in Three Dimensions,” Tomasz Plewa, Center for Astrophysical Thermonuclear Flashes, University of Chicago (2.7 million processor hours).
  • “Fluid Turbulence and Mixing at High Reynolds Number,” Professor P. K. Yeung, Georgia Institute of Technology (1.2 million processor hours).
  • “Quantum Monte Carlo Study of Photoprotection via Carotenoids in Photosynthetic Centers,” William A. Lester, Jr., Lawrence Berkeley National Laboratory and the University of California, Berkeley (1 million processor hours).
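A quick check of what the demand figures on this slide imply. The numbers are taken from the slide; the script itself is just illustrative arithmetic.

```python
# Oversubscription implied by the INCITE figures quoted above (values from the slide).
available_hours = 4_500_000        # 10% of NERSC capability offered
requested_hours = 130_508_660      # total CPU hours requested across 53 proposals

oversubscription = requested_hours / available_hours
print(f"Requests exceeded the available allocation by roughly {oversubscription:.0f}x")
# prints roughly 29x, i.e. demand outstripped supply by about 29 to 1
```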

  7. NGA COMPETITION. On February 23, 2004, the Office of Advanced Scientific Computing Research (ASCR) of the Office of Science (SC), U.S. Department of Energy (DOE), announced its interest in receiving applications for leadership-class scientific computing capability in support of both the ASCR and broader SC research programs, as well as other capability-limited, federally funded computational science activities.
  • $25 million available for a single award in FY 2004, with an additional $25 million in the President’s FY 2005 budget request
  • Proposals due April 2, 2004
  • Award announced on or about April 15, 2004

  8. THE PATH FORWARD
  • A multi-agency High-End Computing Revitalization Task Force organized by the Office of Science and Technology Policy is developing a roadmap to revitalize U.S. supercomputing. Its report is due to the President’s Science Advisor shortly.
  • We have strengthened our partnerships with the Defense Advanced Research Projects Agency and the National Nuclear Security Administration through a Memorandum of Understanding to ensure the best evaluation of the effectiveness of existing and future architectures for various types of scientific problems.
  • The Office of Science will work to determine the potential effects of advances in high-end computing. In particular, we will address:
    * What are the scientific needs and priorities for high performance computing?
    * How well do these requirements match existing and future system architectures?
    * What are the opportunities for expanded HPC applications?
    * What are the drivers or constraints to expanded HPC usage?
