
PITTSBURGH SUPERCOMPUTING CENTER



Presentation Transcript


  1. RESOURCES & SERVICES (TERAGRID FOCUS AREAS)
  • Marvel: 0.3 TF HP GS1280 SMP; OS: Tru64 Unix; 2 nodes (128 processors); each node: 64 x 1.15 GHz EV67 processors, 256 GB memory
  • TCS (LeMieux): 6 TF HP Alpha ES45 cluster; OS: Tru64 Unix; 750 nodes (3,000 processors); each node: 4 x 1 GHz processors, 4 GB memory
  • Red Storm Program: 10 TF Cray XT3; OS: Linux/Catamount; >2,000 2.4 GHz AMD processors; 1 GB memory per processor
  • Archive storage: 150 TB online; 2.4 PB archive
  • Services: visualization, dissemination, GIG participation, security, user support coordination, software integration, EOT, infrastructure
  • Senior GIG staffing: Ralph Roskies (ESC), Michael Levine (RP-PI), Sergiu Sanielevici (Area Director, User Support Coordination)
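The quoted peak ratings are consistent with a simple peak-FLOPS estimate from the processor counts and clock speeds. A quick check, assuming 2 floating-point operations per clock per processor (typical for Alpha and AMD cores of that era; this rate is an assumption, not a figure from the slides):

```python
# Sanity-check the quoted peak ratings from processor count x clock speed,
# assuming 2 floating-point ops per cycle per processor (an assumption).
FLOPS_PER_CYCLE = 2

systems = {
    # name: (processors, clock in GHz, quoted peak in TF)
    "Marvel":     (128,  1.15, 0.3),
    "TCS":        (3000, 1.0,  6.0),
    "Red Storm":  (2000, 2.4, 10.0),
}

for name, (procs, ghz, quoted_tf) in systems.items():
    peak_tf = procs * ghz * FLOPS_PER_CYCLE / 1000  # GFLOPS -> TFLOPS
    print(f"{name}: computed {peak_tf:.2f} TF vs quoted {quoted_tf} TF")
```

All three computed peaks (0.29, 6.0, and 9.6 TF) round to the slide's quoted figures.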

  2. SCIENCE: TERAGYROID
  • The project couples cutting-edge grid technologies, high-performance computing, visualization, and computational steering to produce a major advance in soft condensed matter simulation.
  • Interesting effects appear only for particular parameter values: use TeraGrid and the UK RealityGrid to generate many small computations, then migrate the interesting ones to large systems such as PSC's TCS.
  • PSC contributions: led the effort to organize and execute the TeraGyroid project; enabled the largest lattice Boltzmann calculation ever (1024³).
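The "many small computations, then migrate the interesting ones" workflow is, in miniature, a coarse parameter sweep followed by a promotion step. A minimal sketch of the pattern; the scoring function and threshold are hypothetical stand-ins, not the actual TeraGyroid codes:

```python
# Sketch of the sweep-and-promote pattern: run many cheap, coarse
# simulations across a parameter grid, then flag the interesting
# parameter points for full-resolution runs on a large system.

def coarse_run(params):
    """Cheap low-resolution simulation at one parameter point (stub)."""
    a, b = params
    return abs(a - b)  # stand-in for some measured order parameter

def interesting(score, threshold=2):
    """Hypothetical promotion criterion."""
    return score < threshold

sweep = [(a, b) for a in range(10) for b in range(10)]
candidates = [p for p in sweep if interesting(coarse_run(p))]
# Only the candidates would be resubmitted at full resolution
# on a large system such as PSC's TCS.
print(f"{len(candidates)} of {len(sweep)} parameter points promoted")
```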

  3. TECHNOLOGY
  • End-to-end application performance: use PSC-developed network/host tuning capabilities (Web100, Net100).
  • Cray XT3: leadership-class computing; novel technology for extraordinary scalability; distinguishing feature: huge inter-processor bandwidth.
  • Lustre file system: candidate for the TeraGrid global file system.
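Web100 and Net100 instrument TCP on end hosts so connections can be tuned for the whole path; the classic tuning target is the bandwidth-delay product, the amount of data that must be in flight to keep a long path full. A minimal sketch of that calculation, with illustrative link parameters rather than PSC's actual paths:

```python
# Bandwidth-delay product (BDP): the socket buffer a TCP connection
# needs to keep a path full. Link speed and RTT below are illustrative.

def bdp_bytes(bandwidth_bps, rtt_seconds):
    """Buffer needed to fill a path: bandwidth x round-trip time, in bytes."""
    return int(bandwidth_bps * rtt_seconds / 8)

# Example: a 1 Gb/s path with a 60 ms cross-country round-trip time.
buf = bdp_bytes(1e9, 0.060)
print(f"required socket buffer: {buf / 1e6:.1f} MB")  # 7.5 MB
```

A default socket buffer of a few tens of kilobytes would leave such a path almost entirely idle, which is why per-connection instrumentation and tuning matter.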

  4. SYNERGISTIC ACTIVITIES: CELL SIMULATIONS (NIH-funded)
  • Large-scale cellular simulations need the ability to launch, manage, and integrate results from thousands of independent runs.
  • MCell: co-authored by Joel Stiles, PSC.
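The launch/manage/integrate pattern for thousands of independent runs is a task farm. A minimal single-machine sketch using Python's standard library; `run_once` is a hypothetical stand-in for one stochastic simulation, and at TeraGrid scale each run would be a separate batch job on the grid rather than a local worker:

```python
# Task-farm sketch: launch many independent seeded runs, then
# integrate their results. run_once is a placeholder for a real
# stochastic simulation such as an MCell-style Monte Carlo run.
from concurrent.futures import ThreadPoolExecutor
import random

def run_once(seed):
    """One independent, reproducible Monte Carlo run (stub)."""
    rng = random.Random(seed)
    return rng.random()  # stand-in for a simulation observable

def farm(n_runs, workers=4):
    """Launch n_runs independent runs and integrate the results."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(run_once, range(n_runs)))
    # Integration step: here a simple mean; real workflows aggregate
    # trajectories, histograms, or concentration time series.
    return sum(results) / len(results)

print(f"mean over 100 runs: {farm(100):.3f}")
```

Because each run is seeded independently, the farm is embarrassingly parallel: the same code scales from local workers to thousands of grid jobs.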

  5. TECHNOLOGY: SCALING
  • TCS "LeMieux" at PSC: reliably providing thousands of processors per job.

  6. SCIENCE: TORNADO PREDICTION
  • Three quarters of tornado warnings are false alarms; the most realistic tornado simulation ever done will reduce tornado false alarms (K. Droegemeier and M. Xue).
  • Wind speeds up to 260 mph; 25-meter horizontal resolution; 20-meter vertical resolution.
  • 2048 TCS processors for 24 hours; generated 20 TB of data.
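The run parameters on the slide imply some useful back-of-the-envelope numbers, computed here directly from the figures given:

```python
# Derived quantities from the slide's figures: total processor-hours
# consumed and the average rate at which the run produced output data.
procs = 2048
hours = 24
data_tb = 20

cpu_hours = procs * hours                       # total processor-hours
avg_rate_mb_s = data_tb * 1e6 / (hours * 3600)  # TB over the run -> MB/s
print(f"{cpu_hours} processor-hours, ~{avg_rate_mb_s:.0f} MB/s sustained output")
```

Roughly 49,000 processor-hours and a sustained output rate above 200 MB/s, which is why runs like this stress both the compute and the storage sides of a system.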
