
Preparing Your Application for TeraGrid Beyond 2010




  1. Preparing Your Application for TeraGrid Beyond 2010
     TG09 Tutorial, June 22, 2009

  2. Instructors
     • Lonnie Crosby, NICS: lcrosby1@utk.edu
     • Mark Fahey, NICS: mfahey@utk.edu
     • Anirban Jana, PSC: anirban@psc.edu
     • Byoung-Do Kim, TACC: bdkim@tacc.utexas.edu
     • Lars Koesterke, TACC: lars@tacc.utexas.edu
     • Amitava Majumdar, SDSC: majumdar@sdsc.edu
     • Sergiu Sanielevici, PSC: sergiu@psc.edu
     • Mahidhar Tatineni, SDSC: mahidhar@sdsc.edu

  3. Objectives
     • Summarize what we know about the TeraGrid resources available to you past March 31, 2010
     • Introduce you to the two machines that are available now and are known to be funded by NSF beyond 2010
     • Use hands-on exercises to illustrate the principles of using these two systems with MPI, OpenMP, and hybrid parallel codes (a minimal hybrid sketch follows this slide)
     • Engage in discussions (to be continued after we all go home) about what all this means for your research project
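     To make "hybrid parallel" concrete, here is a minimal MPI + OpenMP sketch in Fortran, the tutorial's example language. It is illustrative only and not one of the tutorial handouts; the program name hybrid_hello and the exact output format are our own choices. The general shape, an OpenMP thread team inside each MPI rank with MPI calls funneled through the master thread, is what "hybrid" refers to in the bullet above.

        ! Minimal hybrid MPI + OpenMP sketch (illustrative; not a tutorial handout).
        ! Each MPI rank spawns an OpenMP team and reports its rank and thread IDs.
        program hybrid_hello
          use mpi
          use omp_lib
          implicit none
          integer :: ierr, rank, nranks, tid, nthreads, provided

          ! Request MPI_THREAD_FUNNELED: only the master thread makes MPI calls.
          call MPI_Init_thread(MPI_THREAD_FUNNELED, provided, ierr)
          call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
          call MPI_Comm_size(MPI_COMM_WORLD, nranks, ierr)

        !$omp parallel private(tid, nthreads)
          tid      = omp_get_thread_num()
          nthreads = omp_get_num_threads()
          print '(a,i0,a,i0,a,i0,a,i0)', 'rank ', rank, ' of ', nranks, &
                ', thread ', tid, ' of ', nthreads
        !$omp end parallel

          call MPI_Finalize(ierr)
        end program hybrid_hello

     Compile with the site's MPI Fortran wrapper and the compiler's OpenMP flag (for example, mpif90 -fopenmp with GNU compilers; other compilers use different flags), set the per-rank thread count with OMP_NUM_THREADS, and launch with the site's launcher: ibrun on Ranger, aprun on Kraken.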

  4. Prerequisites
     • You have your own machine to work on.
     • You are familiar with the basics of MPI and OpenMP parallel programming.
     • You are familiar with the basics of building applications and submitting jobs in a batch supercomputing environment.
     The examples are in Fortran; C equivalents are given for selected instructions. (A prerequisite-level OpenMP sketch follows this slide.)
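     As a rough gauge of the assumed baseline, the sketch below shows a prerequisite-level OpenMP construct in Fortran: a worksharing loop with a reduction. It is illustrative, not taken from the handouts; the program name omp_sum and the harmonic-series example are our own choices.

        ! Prerequisite-level OpenMP sketch (illustrative): a worksharing loop
        ! with a reduction, summing the first n terms of the harmonic series.
        program omp_sum
          implicit none
          integer, parameter :: n = 1000000
          integer :: i
          real(8) :: total

          total = 0.0d0
        !$omp parallel do reduction(+:total)
          do i = 1, n
             total = total + 1.0d0 / real(i, kind=8)
          end do
        !$omp end parallel do

          print '(a,i0,a,f12.6)', 'harmonic sum of first ', n, ' terms: ', total
        end program omp_sum

     For reference, the C counterpart of this directive is #pragma omp parallel for reduction(+:total).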

  5. Where is TeraGrid Going?
     • Current Program: Nominal end date for TeraGrid Phase II is March 2010
     • TeraGrid Phase III: eXtreme Digital Resources for Science and Engineering (XD)
       Follow-on program for TeraGrid, providing integrating services for the machines available to you 2011-2015
       Planning studies are underway; proposals to be submitted in July 2010
     • Extension proposal for the current program
       To span the year 4/1/2010 to 3/31/2011
       Decision by NSF expected in August 2009

  6. Known, Planned, and Proposed Resources for 2010
     • Available Now, Continuing into XD:
       • Ranger (TACC): Sun Constellation, 62,976 cores, 579 Tflop/s, 123 TB RAM
       • Kraken (NICS): Cray XT5, 66,048 cores, 608 Tflop/s, > 1 Pflop/s in 2009
     • Planned for 2010, Continuing into XD:
       • Large Shared Memory System at PSC
     • Planned for 2010-2011:
       • Flash & Virtual Shared Memory System at SDSC
     • Proposed for Extension until 6/30/2010:
       • Pople (PSC): Shared memory system
     • Proposed for Extension until 3/31/2011:
       • Abe (NCSA): 90 Tflop/s InfiniBand cluster
       • Steele (Purdue): 67 Tflop/s InfiniBand cluster
       • Lonestar (TACC): 61 Tflop/s InfiniBand cluster
       • QueenBee (LONI): 51 Tflop/s InfiniBand cluster
       • Lincoln (NCSA): GPU-based cluster
       • Quarry (IU): Virtual service hosting

  7. New Resources Expected in 2011
     • Track 2d being competed:
       • data-intensive HPC system
       • experimental HPC system
       • pool of loosely coupled, high-throughput resources
       • experimental, high-performance grid test bed
     • eXtreme Digital (XD) High-Performance Remote Visualization and Data Analysis Services
       • service and possibly resources; up to 2 awards (?)
     • Blue Waters (Track 1) @ NCSA:
       • 1 Pflop/s sustained on qualifying applications in 2011

  8. So, what can we do to prepare?
     • Today, we focus on TACC Ranger and NICS Kraken
     • These are the largest current TeraGrid systems (~90% of July 2010 TRAC allocations) and will continue into the XD period
     • The principles for getting your codes to make good use of these machines will apply to most systems expected to enter TG/XD
     • Watch http://www.teragrid.org/XDTransition/ to keep informed of developments, including further training opportunities for Ranger, Kraken, and the other planned and proposed resources.

  9. Other Useful Information
     • Requesting a TeraGrid allocation, including advanced user support if desired: http://www.teragrid.org/userinfo/access/allocations.php
     • Panel Session on the XD Transition: Thursday 6/25, 1:30 to 2:30 PM, Potomac 1/2

  10. Today’s Agenda
     • Overview of TACC Ranger
     • Ranger Hands-On
     • Overview of NICS Kraken
     • Kraken Hands-On
     • Discussion and Wrap-Up
     Breaks:
       10:00 AM to 10:30 AM, ABCD Foyer
       Noon to 1:00 PM, Lunch in Independence A
       3:00 PM to 3:30 PM, ABCD Foyer
     Adjourn at 5 PM

  11. Questions, Please!
