
TeraGrid Science Gateways



  1. TeraGrid Science Gateways Nancy Wilkins-Diehr TeraGrid Area Director for Science Gateways wilkinsn@sdsc.edu EGU, Vienna, May 3, 2010

  2. TeraGrid is one of the largest investments in shared CI from NSF's Office of Cyberinfrastructure
  • Dedicated high-speed, cross-country network
  • Staff & advanced support
  • 20 petabytes of storage
  • 2 petaFLOPS of computation
  • Visualization

  3. TeraGrid resources today include:
  • Tightly coupled distributed-memory systems, 2 systems in the top 10 at top500.org
    • Kraken (NICS): Cray XT5, 99,072 cores, 1.03 Pflop
    • Ranger (TACC): Sun Constellation, 62,976 cores, 579 Tflop, 123 TB RAM
  • Shared-memory systems
    • Cobalt (NCSA): Altix, 8 Tflop, 3 TB shared memory
    • Pople (PSC): Altix, 5 Tflop, 1.5 TB shared memory
  • Clusters with InfiniBand
    • Abe (NCSA): 90 Tflops
    • Lonestar (TACC): 61 Tflops
    • QueenBee (LONI): 51 Tflops
  • Condor pool (loosely coupled)
    • Purdue: up to 22,000 CPUs
  • Gateway hosting
    • Quarry (IU): virtual machine support
  • Visualization resources
    • TeraDRE (Purdue): 48 nodes with NVIDIA GPUs
    • Spur (TACC): 32 NVIDIA GPUs
  • Storage resources
    • GPFS-WAN (SDSC)
    • Lustre-WAN (IU)
    • Various archival resources
  • New systems:
    • Data analysis and visualization systems
      • Longhorn (TACC): Dell/NVIDIA, CPU and GPU
      • Nautilus (NICS): SGI UltraViolet, 1024 cores, 4 TB global shared memory
    • Data-intensive computing
      • Dash (SDSC): Intel Nehalem, 544 processors, 4 TB flash memory
    • FutureGrid: an experimental computing grid and cloud test-bed to tackle research challenges in computer science
    • Keeneland: an experimental, high-performance computing system with NVIDIA Tesla accelerators
  Source: Dan Katz, U Chicago

  4. So how did the Gateway program develop? A natural result of the impact of the internet on worldwide communication and information retrieval
  • Implications for the conduct of science are still evolving
  • 1980s: early gateways; the National Center for Biotechnology Information BLAST server, with search results sent by email, is still a working portal today
  • 1989: World Wide Web developed at CERN
  • 1992: Mosaic web browser developed
  • 1995: "International Protein Data Bank Enhanced by Computer Browser"
  • 2004: TeraGrid project director Rick Stevens recognized the growth in scientific portal development and proposed the Science Gateway Program
  • Today: Web 3.0 and programmatic exchange of data between web pages
  • Simultaneous explosion of digital information
    • Growing analysis needs in many, many scientific areas
    • Sensors, telescopes, satellites, digital images, video, genomic sequencers
    • The #1 machine on the Top500 today is over 1000x more powerful than all of the entries on the first (1993) list combined
  Only 18 years since the release of Mosaic!

  5. A vt100 in the 1980s and a login window on Ranger today

  6. Why are gateways worth the effort?
  • Increasing range of expertise needed to tackle the most challenging scientific problems
  • How many details do you want each individual scientist to need to know? PBS, RSL, Condor
  • Coupling multi-scale codes
  • Assembling data from multiple sources
  • Collaboration frameworks

  A Condor-G submit description:

      # Full path to executable
      executable=/users/wilkinsn/tutorial/bin/mcell
      # Working directory, where Condor-G will write
      # its output and error files on the local machine.
      initialdir=/users/wilkinsn/tutorial/exercise_3
      # To set the working directory of the remote job, we
      # specify it in this globus RSL, which will be appended
      # to the RSL that Condor-G generates
      globusrsl=(directory='/users/wilkinsn/tutorial/exercise_3')
      # Arguments to pass to executable.
      arguments=nmj_recon.main.mdl
      # Condor-G can stage the executable
      transfer_executable=false
      # Specify the globus resource to execute the job
      globusscheduler=tg-login1.sdsc.teragrid.org/jobmanager-pbs
      # Condor has multiple universes, but Condor-G always uses globus
      universe=globus
      # Files to receive stdout and stderr.
      output=condor.out
      error=condor.err
      # Specify the number of copies of the job to submit to the condor queue.
      queue 1

  The equivalent PBS batch script:

      #!/bin/sh
      #PBS -q dque
      #PBS -l nodes=1:ppn=2
      #PBS -l walltime=00:02:00
      #PBS -o pbs.out
      #PBS -e pbs.err
      #PBS -V
      cd /users/wilkinsn/tutorial/exercise_3
      ../bin/mcell nmj_recon.main.mdl

  And the Globus RSL version:

      +( &(resourceManagerContact="tg-login1.sdsc.teragrid.org/jobmanager-pbs")
          (executable="/users/birnbaum/tutorial/bin/mcell")
          (arguments=nmj_recon.main.mdl)
          (count=128)
          (hostCount=10)
          (maxtime=2)
          (directory="/users/birnbaum/tutorial/exercise_3")
          (stdout="/users/birnbaum/tutorial/exercise_3/globus.out")
          (stderr="/users/birnbaum/tutorial/exercise_3/globus.err")
      )
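  By contrast, a gateway can reduce all three of the job descriptions above to a single call in the portal's back end. Below is a minimal Python sketch of that idea; the submit_mcell function, its parameters, and the generated submit file are invented for illustration and are not a real gateway API (only the condor_submit command it shells out to is the standard Condor CLI).

      import subprocess
      import tempfile
      import textwrap

      def submit_mcell(input_file, workdir,
                       executable="/users/wilkinsn/tutorial/bin/mcell",
                       resource="tg-login1.sdsc.teragrid.org/jobmanager-pbs"):
          """Write a Condor-G submit description and queue it for the user."""
          # Hypothetical helper: the gateway fills in every scheduler detail
          # shown on this slide from a couple of user-facing parameters.
          submit = textwrap.dedent(f"""\
              universe=globus
              globusscheduler={resource}
              executable={executable}
              transfer_executable=false
              arguments={input_file}
              globusrsl=(directory='{workdir}')
              initialdir={workdir}
              output=condor.out
              error=condor.err
              queue 1
          """)
          with tempfile.NamedTemporaryFile("w", suffix=".sub", delete=False) as f:
              f.write(submit)
          # Hand off to Condor-G; assumes condor_submit is installed locally.
          subprocess.run(["condor_submit", f.name], check=True)

      # The scientist sees only this call; PBS, RSL, and Condor stay behind the portal.
      submit_mcell("nmj_recon.main.mdl", "/users/wilkinsn/tutorial/exercise_3")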

  7. Today, there are approximately 35 gateways using the TeraGrid.

  8. Linked Environments for Atmospheric Discovery (LEAD)
  • Providing the tools needed to make accurate predictions of tornadoes and hurricanes
  • Meteorological data
  • Forecast models
  • Analysis and visualization tools
  • Data exploration and Grid workflow

  9. Highlights: LEAD Inspires Students. Advanced capabilities regardless of location
  • A student gets excited about what he was able to do with LEAD:
  • "Dr. Sikora: Attached is a display of 2-m T and wind depicting the WRF's interpretation of the coastal front on 14 February 2007. It's interesting that I found an example using IDV that parallels our discussion of mesoscale boundaries in class. It illustrates very nicely the transition to a coastal low and the strong baroclinic zone with a location very similar to Markowski's depiction. I created this image in IDV after running a 5-km WRF run (initialized with NAM output) via the LEAD Portal. This simple 1-level plot is just a precursor of the many capabilities IDV will eventually offer to visualize high-res WRF output. Enjoy! Eric" (email, March 2007)

  10. Center for Multiscale Modeling of Atmospheric Processes (CMMAP)
  • Improving the representation of cloud processes in climate models
  • Multi-scale modeling framework (MMF) represents cloud processes on their native scales and includes cloud-scale interactions in the simulated physical and chemical processes
  • Gateway in development, but will include simulation on both NSF and DOE resources as well as a digital library
  • Results can be evaluated by comparing simulated and observed cloud-scale processes

  11. Community Climate System Model (CCSM)
  • Makes a world-leading, fully coupled climate model easier to use and available to a wide audience
  • Compose, configure, and submit CCSM simulations to the TeraGrid

  12. The Earth System Grid (ESG)
  • Integrates supercomputers with large-scale data and analysis servers to create a powerful environment for next-generation climate research
  • Results of long-running, high-resolution (read: expensive) simulations made available to national researchers
  • Key piece of infrastructure
  • 12,000 registered users and over 250 TB of data collections

  13. This year in TeraGrid: two great tastes that taste great together. Environmental Science Gateway
  • Built on ESG and CCSM
  • Long-term vision
    • A semantically enabled environment that includes modeling, simulated and observed data holdings, and visualization and analysis for climate as well as related domains
  • Short-term objectives (by July 2010)
    • Extend the Earth System Grid-Curator (ESGC) Science Gateway so that CCSM runs can be initiated on the TeraGrid
    • Integrate data publishing and wide-area transport capabilities so that model run datasets and metadata may be published back into ESG, from both the ESGC and the CCSM Portal
    • Investigate interface strategies that allow ESG to federate with Purdue's climate model archives, so that Purdue holdings become visible and accessible as part of the ESG collections

  14. PolarGrid brings CI to polar ice sheet measurement
  • Cyberinfrastructure Center for Polar Science (CICPS)
    • Experts in polar science, remote sensing, and cyberinfrastructure
    • Indiana, ECSU, CReSIS
  • Satellite observations show disintegration of ice shelves in West Antarctica and speed-up of several glaciers in southern Greenland
  • Most existing ice sheet models, including those used by the IPCC, cannot explain the rapid changes

  15. Components of PolarGrid
  • Expedition grid consisting of ruggedized laptops in a field grid linked to a low-power multi-core base camp cluster
  • Prototype and two production expedition grids feed into a 17-teraflops system at Indiana University and Elizabeth City State University (ECSU), split between research, education, and training
    • Gives ECSU (a minority-serving institution) a top-ranked 5-teraflop high-performance computing system
  • Access to expensive data
  • TeraGrid resources for analysis
    • Large level-0 and level-1 data sets require once-and-done processing and storage
    • Filters applied to level-1 data by users in real time via the web
  • Student involvement
  Source: Geoffrey Fox

  16. Southern California Earthquake Center (SCEC) gateway used to produce a realistic hazard map
  • Probabilistic Seismic Hazard Analysis (PSHA) map for California
  • Created from Earthquake Rupture Forecasts (ERFs); ~7,000 ruptures can have 415,000 variations
  • Warm colors indicate regions with a high probability of experiencing strong ground motion in the next 50 years
  • Ground motion calculated using full 3-D waveform modeling for improved accuracy, resulting in significant CPU use
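  To make the hazard combination concrete, here is a minimal Python sketch of the standard PSHA recipe, with invented numbers rather than SCEC data: each rupture's annual rate from the forecast is weighted by the conditional probability that it exceeds a ground-motion threshold at the site, and the total rate is folded into a 50-year Poisson probability of exceedance. The function is illustrative, not part of the CyberShake codebase.

      import math

      def exceedance_probability(annual_rates, cond_probs, years=50.0):
          """Poissonian probability that ground motion exceeds a threshold
          at one site within `years`, combining every rupture variation.

          annual_rates -- annual occurrence rate of each rupture (from the ERF)
          cond_probs   -- P(motion > threshold | rupture), e.g. from the 3-D
                          waveform simulation of that rupture variation
          """
          # Total annual rate at which the threshold is exceeded at this site.
          rate = sum(r * p for r, p in zip(annual_rates, cond_probs))
          # Poisson assumption: probability of at least one exceedance.
          return 1.0 - math.exp(-rate * years)

      # Toy numbers for three ruptures; real maps combine ~415,000 variations.
      print(exceedance_probability([0.010, 0.002, 0.030], [0.5, 0.9, 0.1]))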

  17. SCEC: Why a gateway?
  • Calculations need to be done for each of the hundreds of thousands of rupture variations
  • SCEC has developed the "CyberShake computational platform": the hardware, software, and people that combine to produce a useful scientific result
  • For each site of interest: two large-scale MPI calculations and hundreds of thousands of independent post-processing jobs with significant data generation
  • Jobs aggregated to appear as a single job to the TeraGrid
  • Workflow throughput optimizations and use of SCEC's gateway "platform" reduced time to solution by a factor of three
  • Computationally intensive tasks and the need for reduced time to solution make TeraGrid a good fit
  Source: S. Callahan et al., "Reducing Time-to-Solution Using Distributed High-Throughput Mega-Workflows – Experiences from SCEC CyberShake".

  18. Future Technical Areas
  • Web technologies change fast; gateways must be able to adapt quickly
  • Gateways and gadgets
    • Gateway components incorporated into any social networking page
    • 75% of 18- to 24-year-olds have social networking pages
    • iPhone apps?
  • Web 3.0
    • Beyond social networking and sharing content
    • Standards and querying interfaces to programmatically share data across sites, e.g. the Resource Description Framework (RDF) and SPARQL, as sketched below
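  As a flavor of what RDF and SPARQL enable, here is a minimal sketch that queries a public SPARQL endpoint from Python using the SPARQLWrapper library; the endpoint and query are illustrative stand-ins, not part of any TeraGrid gateway.

      from SPARQLWrapper import SPARQLWrapper, JSON

      # DBpedia's public endpoint, used here purely as an example data source.
      sparql = SPARQLWrapper("https://dbpedia.org/sparql")
      sparql.setQuery("""
          PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
          SELECT ?label WHERE {
              <http://dbpedia.org/resource/TeraGrid> rdfs:label ?label .
              FILTER (lang(?label) = "en")
          }
      """)
      sparql.setReturnFormat(JSON)
      results = sparql.query().convert()
      # Iterate over the standard SPARQL JSON result bindings.
      for binding in results["results"]["bindings"]:
          print(binding["label"]["value"])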

  19. Gateways can further investments in other projects
  • Increase access to instruments and expensive data collections
  • Increase capabilities to analyze data
  • Improve workforce development: can prepare students to function in today's cross-disciplinary world
  • Increase outreach
  • Increase public awareness
    • The public sees value in investments in large facilities
    • A 2006 Pew study indicates that half of all internet users have been to a site specializing in science
    • Those who seek out science information on the internet are more likely to believe that scientific pursuits have a positive impact on society

  20. But sustained funding is a problem
  • Gateways can be used for the most challenging problems, but scientists won't rely on something they are not confident will be around for the duration
    • We see this with software, but even more so with gateway infrastructure
  • A sustained gateway program can
    • Reduce the duplication of effort that comes from sporadic development across many small programs
    • Increase the diversity of end users
    • Increase the skill-set diversity of developers
    • Bring together teams to address the toughest problems

  21. Thank you for your attention! Questions? Nancy Wilkins-Diehr, wilkinsn@sdsc.edu www.teragrid.org EGU, Vienna, May 3, 2010
