
The role of ARC in scaling up the Hungarian NGI


Presentation Transcript


  1. The role of ARC in scaling up the Hungarian NGI • Péter Stefán, NIIFI • NorduGrid 2011, May 9–12, Sundvolden, Norway

  2. The Infrastructure

  3. NIIF and Scientific Computing • 2001 – the beginning • Sun E10k • 60 Gflops; SMP; 96 UltraSparc processors; 48 GB memory • TOP500 (#428) • Upgrade in multiple steps (last in 2009) • Sun F15k • ~900 Gflops; 216 processor cores; 400 GB memory • 2002 – ClusterGrid – a CPU-scavenging infrastructure

  4. NIIF and Scientific Computing • Utilization: ~100% • Users: ~130 research groups (scientific projects) • Areas: • chemistry, physics, biology, astronomy, geology, IT, mathematics, life sciences, etc. • Numerous publications and papers • Made us natural customers of grid solutions

  5. Infrastructure reengineering • Establishing funding: ÚMFT – 4 M EUR (2007–2009) • Collecting user demands, surveying technology trends (2009) • Procurement (2010) • Implementation (2011)

  6. Main objectives • Multiple computers (4) • Distributed over multiple sites (4 sites) • Different architectures (PRACE) • Two orders of magnitude increase in performance

  7. Sites and Interconnection • University of Debrecen • University of Pécs • University of Szeged • NIIFI, Budapest

  8. Site: University of Debrecen • SGI Altix ICE 8400EX • Clustered system • Intel Xeon (Westmere-EP) processors • 18 Tflop/s • 1536 cores (3.33 GHz) • redundant QDR InfiniBand interconnect • 6 TB memory • ~500 TB storage space • Linux • water-cooled racks • Nvidia Quadro FX5800-based visualisation subsystem

  9. Site: University of Pécs • SGI UltraViolet 1000 (SGI UV) • ccNUMA (SMP) architecture • Intel Xeon X7542 (Nehalem-EX) processors • 10.5 Tflop/s • 1152 cores • NUMAlink 5 interconnect • 6 TB memory • ~500 TB storage space • Linux • water-cooled racks • Nvidia Quadro FX5800-based visualisation subsystem
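
  Because the UV 1000 is a ccNUMA machine, all 1152 cores see the 6 TB of memory as a single shared address space, so one process can in principle scale across the whole system with plain shared-memory threading. A minimal OpenMP sketch of the pattern such an architecture rewards; the array size and the first-touch idiom are illustrative, not taken from the talk:

    #include <omp.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        const size_t n = 1UL << 28;   /* 2 GiB of doubles; sized for illustration only */
        double *a = malloc(n * sizeof *a);
        if (!a) return 1;

        /* First-touch initialization: on ccNUMA, each memory page is placed on
           the NUMA node of the thread that writes it first, so initializing in
           parallel keeps later accesses local to each thread's node. */
        #pragma omp parallel for schedule(static)
        for (size_t i = 0; i < n; i++)
            a[i] = 1.0;

        double sum = 0.0;
        #pragma omp parallel for reduction(+:sum) schedule(static)
        for (size_t i = 0; i < n; i++)
            sum += a[i];

        printf("threads=%d sum=%.0f\n", omp_get_max_threads(), sum);
        free(a);
        return 0;
    }

  Built with gcc -std=c99 -fopenmp; the same code runs unchanged on the 48-core fat nodes at Szeged and Budapest, just with fewer threads.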

  10. Site: University of Szeged • Hewlett-Packard CP4000BL • Fat-node cluster (blade) • AMD Opteron 6174 (Magny-Cours) processors (12 cores/processor) • 14 Tflop/s • 2112 cores • 48 cores/node! (SMP-like) • 5.6 TB memory • redundant QDR InfiniBand mesh interconnect • ~250 TB storage space • Linux • Nvidia Quadro FX5800-based visualisation subsystem

  11. Site: NIIFI, Budapest • Hewlett-Packard CP4000SL • Fat-node cluster • AMD Opteron 6174 (Magny-Cours) processors (12 cores/processor) • 5 Tflop/s • 768 cores • 24 cores/node • 2 TB memory • redundant QDR InfiniBand mesh interconnect • ~50 TB storage space • water-cooled racks • Linux

  12. The (planned) Software Stack

  13. Software Stack • Linux operating system (unfortunately a mix of SuSE and Red Hat) • Global FS across the 4 compute sites (Ibrix, Lustre); NFS otherwise • Parallelization libraries: OpenMP, PVM, MPI • Local scheduler: SGE (see the job-script sketch below) • Popular applications: Gaussian, Matlab, FFT, CPMD, Gromacs, RasMol, etc.
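
  To make the stack concrete, here is a minimal sketch of how a user might submit an MPI job through the local SGE scheduler. The queue name and parallel-environment name are hypothetical, site-specific settings, and my_mpi_app is a placeholder binary:

    #!/bin/bash
    #$ -N mpi-test            # job name (illustrative)
    #$ -q parallel.q          # queue name: hypothetical, site-specific
    #$ -pe mpi 48             # parallel environment and slot count: PE name is site-specific
    #$ -cwd                   # run in the submission directory
    #$ -j y                   # merge stderr into stdout

    # SGE sets $NSLOTS to the number of granted slots
    mpirun -np $NSLOTS ./my_mpi_app

  The script would be submitted with qsub job.sh and monitored with qstat, the standard SGE commands.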

  14. Software Stack • User authentication: • central, replicated LDAP • grid authentication: X.509, SLCS • Grid middleware – ARC: • computing service, A-REX (all sites) • grid management service (all sites) • information service, ISIS (3 sites) • monitoring service, WS-Mon (1 site) • users are urged to access resources via the ARC client (see the usage sketch below) • potentially: A-Hash, Librarian, Bartender (2 grid sites), Shepherd, Hopi (1 site)
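
  Since users are urged to go through the ARC client, the day-to-day workflow is short. A sketch using the ARC 1.x command-line tools; the job is deliberately trivial and the cluster hostname is hypothetical. First, a minimal xRSL job description, saved as hello.xrsl:

    &(executable="/bin/echo")
     (arguments="Hello from the Hungarian NGI")
     (stdout="hello.out")
     (jobname="arc-smoke-test")

  A proxy is then derived from the user's X.509 credentials and the job is submitted to a site's A-REX:

    arcproxy                                     # short-lived proxy from X.509 credentials
    arcsub -c example-arex.niif.hu hello.xrsl    # hostname is hypothetical
    arcstat -a                                   # status of all active jobs
    arcget -a                                    # fetch outputs of finished jobs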

  15. Software Stack • Application portals to help users configure HPC applications • Professional user support on offer

  16. Summary • High-quality HPC infrastructure as a foundation • ~50 Tflops (TOP500 #165) • A necessary tool for serving national scientific computing • Important for joining the ERA • ARC is a crucial component of the stack

  17. Thanks + Questions!
