
Strategic Applications to Drive Strategic Technologies for the 21st Century

Keynote Talk at the RCI Annual Member Management Executive Conference, Arlington, Virginia, October 13, 1998.


Presentation Transcript


  1. Strategic Applications to Drive Strategic Technologies for the 21st Century • Keynote Talk at the RCI Annual Member Management Executive Conference • Arlington, Virginia October 13, 1998.

  2. The Emerging Concept of a National Scale Information Power Grid Imagine a national computing and information infrastructure that allowed everyone to access the information resources of the nation in much the same way that one accesses electrical power today -- an “Information Power Grid” -- NASA http://science.nas.nasa.gov/Groups/Tools/IPG

  3. The Grid Links People with Distributed Resources on a National Scale http://science.nas.nasa.gov/Groups/Tools/IPG

  4. The Grid - “Dependable, Consistent, Pervasive Access to [High-end] Resources” • Dependable: • Can Provide Performance and Functionality Guarantees • Consistent: • Uniform Interfaces to a Wide Variety of Resources • Pervasive: • Ability to “Plug In” From Anywhere Source: Ian Foster, ANL

  5. The Grid: Blueprint for a New Computing Infrastructure, Ian Foster, Carl Kesselman (Eds), Morgan Kaufmann, 1999 • Available July 1998; ISBN 1-55860-475-8 • 22 chapters by expert authors including: Andrew Chien, Jack Dongarra, Tom DeFanti, Andrew Grimshaw, Roch Guerin, Ken Kennedy, Paul Messina, Cliff Neuman, Jon Postel, Larry Smarr, Rick Stevens, Charlie Catlett, John Toole, and many others • “A source book for the history of the future” -- Vint Cerf http://www.mkp.com/grids

  6. The National Computational Science Alliance www.ncsa.uiuc.edu

  7. The National Partnership for Advanced Computational Infrastructure www.npaci.edu

  8. The National Science Foundation’s vBNS - Topology October 1998 Source: R. Patterson, R. Butler, NCSA-NLANR

  9. Assembling the Links in the Grid with NSF’s vBNS Connections Program • 27 Alliance sites running, 19 more in progress • 1999: Expansion via Internet2 -- Abilene (Indiana University Abilene NOC); vBNS & Abilene at 2.4 Gbit/s • NCSA runs the NLANR Distributed Applications Support Team for the vBNS • Map legend: StarTAP, NCSA vBNS Backbone Node, vBNS Connected Alliance Site, vBNS Alliance Site Scheduled for Connection Source: Charlie Catlett, Randy Butler, NCSA Grid Team

  10. Qwest Nationwide Network - Backbone for Internet2 Abilene - More Links • Qwest Partnering with Cisco and Nortel http://www.qwest.net/network/Mainmaps.html Source: Randy Butler, NCSA

  11. Grid-Enabled Workshop and Training Facilities Being Deployed Across the Alliance Jason Leigh and Tom DeFanti, EVL; Rick Stevens, ANL

  12. New Alliance Center for Collaboration, Education, Science, and Software (ACCESS) • 7000 Square Feet in the Wash D.C. Metropolitan Area - Construction Completed Sept ‘98 • Remote and Local Access to Alliance Technologies, Leaders, Researchers, and Partners • Collaborative Demonstrations, Training, Digital Video Teleconferencing, and Visitors • Immersadesk, Power Wall, Projected Advanced High Bandwidth Applications • Initiated Using State of Illinois Cost Sharing • FY99 -- Extend Access Centers Throughout the Alliance

  13. Supercomputers, Networks, and Virtual Reality -- From the Heroic to the Routine • Virtual Director in CAVE for Choreography of Data • 1000 Hour SDSC Supercomputer Run to Generate Data • Tens of Thousands of Hours of NCSA SGI Time to Render Data • Cross-Country Transfer of Massive Amounts of Data to IMAX Film Colliding Galaxies (Smithsonian IMAX) - Donna Cox, Bob Patterson, NCSA - From “Cosmic Voyage” - Nominated for Academy Award 1997

  14. The Grid Links Remote Sensors With Supercomputers, Controls, & Digital Archives • Creating Remote Super Telescopes • BIMA and NRAO • Collaborative Web Interface • Real Time Control and Steering Starburst Galaxy M82 Alliance Scientific Instrument Applications Team

  15. Using the Grid to Create Super-Biomedical Instruments www.npaci.edu/Research/index.html NPACI Neurosciences Research Thrust

  16. The NCSA Information Workbench - An Architecture for Web-Based Computing • User input, instructions, and queries flow from the user’s Web browser to the Workbench Server • The server’s format translator, query engine, and program driver dispatch instructions and queries to application programs (which may have varying interfaces and be written in different languages) and to information sources (which may be of varying formats) • Results and information are translated and returned to the user NCSA Computational Biology Group
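The slide above describes a server that mediates between one uniform Web interface and many heterogeneous programs and data formats. A minimal sketch of that pattern (entirely hypothetical, not the actual NCSA Workbench code; all names here are illustrative):

```python
# Sketch of the Workbench Server pattern: one server-side driver translates
# user queries into each tool's native format, runs the tool, and normalizes
# the result. The tools below are stubs standing in for real programs.

from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Tool:
    """An application program with its own input/output conventions."""
    name: str
    to_native: Callable[[str], str]     # translate a workbench query in
    from_native: Callable[[str], str]   # translate the tool's output back
    run: Callable[[str], str]           # the program itself (stubbed here)

@dataclass
class WorkbenchServer:
    tools: Dict[str, Tool] = field(default_factory=dict)

    def register(self, tool: Tool) -> None:
        self.tools[tool.name] = tool

    def query(self, tool_name: str, user_query: str) -> str:
        """Format translator + query engine + program driver in one step."""
        tool = self.tools[tool_name]
        native_query = tool.to_native(user_query)
        native_result = tool.run(native_query)
        return tool.from_native(native_result)

# One stub "tool" whose native format is upper-case input.
seq_tool = Tool(
    name="sequence-tool",
    to_native=lambda q: q.upper(),
    from_native=lambda r: f"result: {r}",
    run=lambda q: q[::-1],              # stand-in for a real analysis program
)
server = WorkbenchServer()
server.register(seq_tool)
print(server.query("sequence-tool", "acgt"))  # result: TGCA
```

The point of the design is that new programs and data sources are added by registering translators, so the browser-facing interface never changes.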

  17. Using a Web Browser to Run Programs and Analyze Data Worldwide • NCSA Biology Workbench Has Over 6,000 Users From Over 20 Countries • Spans genomes, populations & evolution, structure & function, gene products, pathways & physiology, and ecosystems

  18. Alliance Chemical Engineering Team Developing the Chemical Engineer’s Workbench • Couple Supercomputer Models for All Scales -- From O(nm) Through O(cm) and O(m) to O(km) -- Together • Access From a Web Browser!

  19. Using the Grid to Optimize Chemical Plant Operations • Measurements and experimental design produce process data • Process data drives parameter estimation, which updates the process model • The process model informs plant-wide control, which sends control signals back to the plant • Grid Coupling: Sensors, Networks, Data, HPC Models, Controls Alliance Chemical Engineering Applications Team
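The loop on this slide (process data feeding parameter estimation, the updated model feeding plant-wide control, control signals returning to the plant) can be sketched in a few lines. This is illustrative only, not the Alliance team's software; the "plant" is a toy first-order process y = k·u with an unknown gain k:

```python
# Minimal estimation-and-control loop: measure the plant, blend the new
# evidence into the model's gain estimate, then invert the model to pick
# the next control signal that should hit the setpoint.

def run_control_loop(true_gain=2.0, setpoint=10.0, steps=20):
    k_est = 1.0   # initial guess for the unknown plant gain
    u = 1.0       # initial control signal
    for _ in range(steps):
        y = true_gain * u                  # measurement ("process data")
        # Parameter estimation: average new evidence into the model gain.
        k_est = 0.5 * k_est + 0.5 * (y / u)
        # Plant-wide control: invert the model to target the setpoint.
        u = setpoint / k_est
    return k_est, true_gain * u            # final model gain, final output

k_est, y_final = run_control_loop()
print(round(k_est, 3), round(y_final, 3))
```

The estimation error halves each step, so the model gain converges to the true gain and the plant output converges to the setpoint; real plant-wide control replaces each piece (the estimator, the model, the controller) with an HPC-scale component coupled over the Grid.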

  20. Goal - Create Collaborative Interface to Link Multiple Investigators With the Grid • Reactor Simulation screen shows status of simulation, interactive discussion, detailed visualization, and current parameters in solution Ken Bishop, U Kansas, Using NCSA Habanero

  21. The Killer Application for the Grid - Collaborative Tele-Immersion • CAVE and ImmersaDesk: Different Physical Implementations of the Alliance CAVE Software Libraries Image courtesy: Electronic Visualization Laboratory, UIC

  22. Using the vBNS to Link Alliance Virtual Reality Devices Image by Robert Patterson, NCSA

  23. Environmental Hydrology Collaboration: From CAVE to Desktop • Using Java and Java3D to Bring Collaboration and CAVE Capabilities to the Desktop • Java Port of Cave5D, Enhanced With Java3D, Wand Control, and Flock of Birds Position Tracking, Using NT Pietrowicz/NCSA-LES; Hibbard/Wisconsin

  24. A Working Model-Caterpillar’s Collaborative Virtual Prototyping Environment Real Time Linked VR and Audio-Video Between NCSA, Peoria, Houston, and Germany

  25. Goal - Global Enterprise Management • An ATM/IP network links designer, customer, supplier, and manufacturing facility

  26. Bringing the Grid to the Virtual Battlefield NCSA, Beckman Institute, Army Research Lab

  27. The Continuing Exponential Agent of Change • 1985: IBM PC/AT desktop (1 processor) = 1x; Cray X-MP (2 processors, vectors) = 100x • 1998: Compaq desktop (2-processor Pentium II, 333 MHz) = 100x; SGI Origin (128 processors, parallelism) = 10,000x

  28. Development of Computational Methods in Chemistry Awarded the 1998 Nobel Prize for Chemistry • Walter Kohn • University of California at Santa Barbara, USA • John A. Pople • Northwestern University, Evanston, Illinois, USA • “to Walter Kohn for his development of the density-functional theory and to John Pople for his development of computational methods in quantum chemistry” Freon and Ozone www.nobel.se/announcement-98/chemistry98.html

  29. The Evolution of Shared Memory Parallel & Distributed Computing Vector SMPs to Microprocessor SMPs to Clustered Microprocessor SMPs to Microprocessor DSMs to Clustered Microprocessor DSMs 1985 1993 1995 1997 1999 Alliance LES

  30. Clustered Shared Memory Computers are Today’s High End NCSA has 6 x 128 Origin Processors ASC has 4 x 128 ARL has 3 x 128 CEWES has 1 x 128 NAVO has 1 x 128 Los Alamos ASCI Blue Will Have 48 x 128! Livermore ASCI Blue has 1536x4 IBM SP

  31. Evolution of a Red Giant with White Dwarf Core - How High Speed Networks Enhance Analysis • 128-Processor SGI Origin Run for One Week Generated Terabytes of Data • Surface View Data Moved From NCSA over vBNS to U Minnesota - Visualization at SC97 While Week-Long Simulation Runs at NCSA • vBNS Gives 500-Fold Throughput Increase Over Commercial Internet! Porter, Anderson, Habermann, Ruwart, & Woodward, LCSE, Nov. 1997 www.lcse.umn.edu/RedGiant/

  32. Storm and Mesoscale Ensemble Experiment 1998 - Center for Analysis and Prediction of Storms Kelvin Droegemeier, Univ. of Oklahoma, Director, CAPS Run in Morning - Compare with Reality in the Afternoon! CAPS Collaborators: NCAR NSSL AFWA NCEP Ran on PSC T3D-512 http://origin.caps.ou.edu/~samex/arps/19980524/12Z_nc9/13h/refl-2km.gif

  33. Predicting Spring Storms in 1999 and Beyond -- A Grid Based Computational Science Experiment • NCSA and Regional Models (Northeast, Northern Great Plains, Southeast) Running Concurrently • Local NEXRAD Doppler Radars to Initialize Models • Models Accessed Over Web • NCSA Requirement: 5 Hr./day on Origin 128, Lasting Two Months, Spring ‘99 Kelvin Droegemeier, Director CAPS, Univ. Oklahoma

  34. Alliance Middleware for the Grid • SF Express -- Synthetic Theatre of War Simulation, GII Next Generation Winner • Multi-site: DARPA, DOE, DOD Mod, and NSF PACI • Largest Distributed Interactive Simulation Ever Performed: 100,000-vehicle simulation (Tanks, Fighting Vehicles, Armored Personnel Carriers, Trucks) on 1386 processors across 13 computers at 9 sites • Globus Ubiquitous Supercomputing Testbed (GUSTO)

  35. Quantum Simulations Using WebFlow - a High Level Visual Interface for Globus • Now Co-Funded by Sun Microsystems • Alliance Distributed Computing ET Team and Nanomaterials AT Team E. Akarsu, G. Fox, W. Furmanski, T. Haupt (NPAC, Syracuse U), L. Mitas (NCSA)

  36. Harnessing the Unused Cycles of Networks of Workstations • Alliance Nanotechnologies Team Used the Univ. of Wisconsin Condor Cluster - Burned 1 CPU-Year in Two Weeks! • University of Kansas is Installing Condor

  37. NT Workstation Shipments Rapidly Surpassing UNIX Source: IDC, Wall Street Journal, 3/6/98

  38. NCSA / Allstate NT Cluster Data Refinery • 1000 Gigabytes of Allstate Claims Data Flow From a Terabyte “Smart Bucket” Through a Parallel Compute Cluster (Compaq NT Servers) to Visualization Stations and External Networks • Data Mine on Cleaned Gigabyte Samples Source: Allstate & Tilt Thompkins, NCSA

  39. Creating Scalable NT/Intel Servers • “Supercomputer performance at mail-order prices” -- Jim Gray, Microsoft • Andrew Chien, CS UIUC-->UCSD; Rob Pennington, NCSA • Myrinet Network, HPVM, Fast Messages • Microsoft NT OS, MPI API • 192 Hewlett-Packard 300 MHz and 64 Compaq 333 MHz Processors

  40. Solving 2D Navier-Stokes Kernel - Performance of Scalable Systems • Preconditioned Conjugate Gradient Method With Multi-level Additive Schwarz Richardson Preconditioner (2D 1024x1024) • Various Applications Sustaining 7 GF on the 128-Processor NT Supercluster Danesh Tafti, Rob Pennington, NCSA; Andrew Chien (UIUC, UCSD)
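The preconditioned conjugate gradient iteration named on this slide can be sketched compactly. The Alliance code used a multi-level additive Schwarz/Richardson preconditioner on a 1024x1024 2D grid; in this sketch a simple Jacobi (diagonal) preconditioner and a small 1D Poisson-like test matrix stand in so the idea fits on one page:

```python
# Preconditioned conjugate gradient (PCG) for a symmetric positive-definite
# system A x = b. M_inv applies the preconditioner to a residual vector;
# swapping in a stronger preconditioner changes only that one callable.

import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x                 # initial residual
    z = M_inv(r)                  # preconditioned residual
    p = z.copy()                  # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)     # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p # conjugate direction update
        rz = rz_new
    return x

# 1D Poisson-like SPD test matrix; the 2D case differs only in the stencil.
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
diag = np.diag(A)
x = pcg(A, b, lambda r: r / diag)   # Jacobi preconditioner
print(np.allclose(A @ x, b))        # True
```

The iteration itself is cheap (one matrix-vector product per step); the preconditioner is where methods like multi-level additive Schwarz earn their keep, by cutting the iteration count on large grids.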

  41. The Road to Intel’s Merced - The Convergence of Scientific and Commercial Computing • IA-64 Co-Developed by Intel and Hewlett-Packard http://developer.intel.com/solutions/archive/issue5/focus.htm#FOUR
