
Presentation Transcript


  1. High-Performance Computing Infrastructure for South East Europe's Research Communities
  2nd HellasHPC Workshop, Athens, October 2010
  Ioannis Liabotis, Project Technical Coordinator, GRNET, iliaboti at grnet dot gr
  The HP-SEE initiative is co-funded by the European Commission under the FP7 Research Infrastructures contract no. 261499

  2. HP-SEE
  • Contract n°: RI-261499
  • Project type: CP & CSA
  • Call: INFRA-2010-1.2.3: VRCs
  • Start date: 01/09/2010
  • Duration: 24 months
  • Total budget: 3,885,196 €
  • Funding from the EC: 2,100,000 €
  • Total funded effort: 539.5 person-months (PMs)
  • Web site: www.hp-see.eu
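
  A quick sanity check of the stated figures (a back-of-the-envelope sketch, not part of the slides): the EC contribution covers roughly 54% of the total budget, and the funded effort works out to about 3,900 € of EC funding per person-month.

```python
# Back-of-the-envelope check of the HP-SEE budget figures from slide 2.
total_budget_eur = 3_885_196   # total project budget
ec_funding_eur = 2_100_000     # EC contribution
funded_effort_pm = 539.5       # total funded effort in person-months

ec_share = ec_funding_eur / total_budget_eur
ec_per_pm = ec_funding_eur / funded_effort_pm

print(f"EC co-funding share: {ec_share:.1%}")                  # ~54.1%
print(f"EC funding per person-month: {ec_per_pm:,.0f} EUR")    # ~3,893 EUR/PM
```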

  3. HP-SEE Partnership: 14 contractors; the Third Party / JRU mechanism is used for associated universities and research centres.

  4. Context: the Timeline

  5. SEE eInfrastructure activities – past 6 years
  • SEEREN1/2: regional inter-NREN connectivity and GEANT links [DG INFSO]
  • BSI: Southern Caucasus links [DG INFSO]
  • SEELIGHT: lambda facility in SEE [Greek HiperB]
  • Result: sustainable national & regional networks, most countries in GEANT
  • SEEGRID1/2: regional Grid infrastructure, building NGIs and user communities
  • SEE-GRID-SCI: eInfrastructure for large-scale environmental science user communities (meteorology, seismology, environmental protection); inclusion of the Caucasus [DG INFSO]
  • Result: sustainable national Grids, all countries within the European Grid Initiative
  • HP-SEE: regional HPC interconnection and 2nd generation Caucasus link
  • Expected result: sustainable national HPC centres, long-term sustainable (hierarchical) model in collaboration with PRACE and DEISA
  • SEERA-EI: regional programme managers' collaboration towards a common eInfrastructure vision, strategy and regional funds [DG RTD]
  • Result: ensuring long-term national-level funds and regional funds to complement EC funds

  6. Context: the Model – converged communication & service infrastructure for South-East Europe (layer diagram):
  • User / Knowledge layer: Seismology, Meteorology, Environment; Comp. physics, Comp. chemistry, Life sciences
  • HP-SEE
  • SEE-GRID & EGI
  • SEE-LIGHT & BSI & GEANT

  7. HP-SEE Project Objectives
  • Objective 1 – Empowering multi-disciplinary virtual research communities
  • Objective 2 – Deploying integrated infrastructure for virtual research communities, including a GEANT link to the Southern Caucasus
  • Objective 3 – Policy development and stimulating regional inclusion in pan-European HPC trends
  • Objective 4 – Strengthening the regional and national human network

  8. Work Organization - PERT

  9. Existing infrastructure – Blue Gene/P
  • IBM Blue Gene/P – two racks, 2048 PowerPC 450 processors (32-bit, 850 MHz), a total of 8192 cores
  • Double-precision, dual-pipe floating-point acceleration on each core
  • A total of 4 TB of random access memory
  • 16 I/O nodes currently connected via fibre optics to a 10 Gb/s Ethernet switch
  • Theoretical peak performance: Rpeak = 27.85 Tflops
  • Energy efficiency: 371.67 MFlops/W (top 10 of the Green500 list)
  • Application software: GotoBLAS, GAMESS, NAMD, CPMD, LAMMPS, GROMACS, OpenAtom
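
  As a sketch (not taken from the slide), the quoted Rpeak is consistent with each PowerPC 450 core retiring 4 double-precision flops per cycle, and the Rpeak and MFlops/W figures together imply a power draw of roughly 75 kW:

```python
# Sanity check of the Blue Gene/P figures on slide 9 (sketch, assuming the
# PowerPC 450 double FPU sustains 4 double-precision flops per cycle).
cores = 8192
clock_ghz = 0.850
flops_per_cycle = 4                     # assumption: 2 fused multiply-adds per cycle

rpeak_tflops = cores * clock_ghz * flops_per_cycle / 1000
print(f"Rpeak ~ {rpeak_tflops:.2f} TFlops")        # ~27.85 TFlops, matches the slide

efficiency_mflops_per_w = 371.67
power_kw = rpeak_tflops * 1e6 / efficiency_mflops_per_w / 1000
print(f"Implied power draw ~ {power_kw:.0f} kW")   # ~75 kW
```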

  10. Existing infrastructure – BG HPC Cluster at IICT-BAS
  • HP Cluster Platform Express 7000 enclosures with 36 BL280c blades, each with dual Intel Xeon X5560 @ 2.8 GHz (576 cores in total) and 24 GB RAM per blade
  • 8 controlling nodes HP DL380 G6 with dual Intel X5560 @ 2.8 GHz, 32 GB RAM
  • Non-blocking DDR interconnect via Voltaire Grid Director 2004
  • Two SAN switches for redundant access
  • MSA2312fc with 48 TB of storage, Lustre filesystem
  • More than 92% efficiency on LINPACK (>3 TFlops measured, 3.2 TFlops peak performance)
  • SL 5, gLite, Torque + Maui, Eucalyptus Walrus
  • Software: MPICH, MPICH2, OpenMPI, Maple, GROMACS, FFTW, Elmer, BLAS, LAPACK
  • Middleware of choice for HPC: OpenSSH
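
  A small sketch (figures taken from the slide, not an official benchmark report) showing how the quoted >92% LINPACK efficiency follows from the measured and peak numbers:

```python
# LINPACK efficiency check for the IICT-BAS cluster figures on slide 10.
# Sketch only: Rpeak is derived from 36 blades x 2 sockets x 4 physical cores,
# assuming 4 double-precision flops per cycle per Nehalem core.
blades, sockets, cores_per_socket = 36, 2, 4
clock_ghz, flops_per_cycle = 2.8, 4

rpeak_tflops = blades * sockets * cores_per_socket * clock_ghz * flops_per_cycle / 1000
rmax_tflops = 3.0                     # "more than 3 TFlops" measured on HPL

print(f"Rpeak ~ {rpeak_tflops:.2f} TFlops")                  # ~3.23 TFlops
print(f"HPL efficiency ~ {rmax_tflops / rpeak_tflops:.1%}")  # ~93%, i.e. above 92%
```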

  11. HPC infrastructure – HU
  • SUN Fire supercomputer cluster run by the National Information Infrastructure Development Institute (NIIF)
  • The resource currently consists of two SUN Fire (SMP) machines, totalling 216 cores and providing 0.6 Tflop/s of computational power, as well as 20 TB of primary and 40 TB of secondary storage in total
  • Planned:
  • NIIF has secured funding to increase the compute power to 30 Tflop/s and the data storage capacity to 0.5 PB
  • The current central SMP infrastructure will also be gradually upgraded
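
  For scale (simple arithmetic on the slide's numbers, not an official projection), the planned upgrade is roughly a 50x increase in compute and about an 8x increase in storage:

```python
# Rough scale of the planned NIIF upgrade, using only the figures on slide 11.
current_tflops, planned_tflops = 0.6, 30.0
current_storage_tb = 20 + 40          # primary + secondary storage
planned_storage_tb = 500              # 0.5 PByte
cores = 216

print(f"Compute increase: ~{planned_tflops / current_tflops:.0f}x")          # ~50x
print(f"Storage increase: ~{planned_storage_tb / current_storage_tb:.1f}x")  # ~8.3x
print(f"Current per-core performance: ~{current_tflops * 1e3 / cores:.1f} GFlop/s/core")  # ~2.8
```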

  12. HPC infrastructure – RO
  • The computing centres at IFIN-HH and UPB:
  • The IBM BladeCenter cluster at IFIN-HH, with an HPL benchmark Rpeak of 4 TFLOPS, contains both IBM PowerXCell 8i and AMD Opteron 2356 processors, counting a total of 368 cores and 592 GB RAM, and uses InfiniBand 4X 10 Gbps technology
  • The Biocomputing cluster, with Rpeak = 2.7 TFLOPS, is based on Intel Xeon E5430 (quad-core) processors and Myrinet 2000 2 Gbps technology
  • Application software: NAMD, CHARMM, VASP, ATLAS, GotoBLAS, FFTW, GAUSSIAN, Turbomole, Matlab, etc.
  • Planned:
  • The total HPC processing power is expected to double next year
  • A GPU cluster at ISS will reach a processing power of 100 TFLOPS
  • The Ministry of Communications and Information Society recently started a project for the creation of a National Centre of Supercomputing; beginning in 2010, the centre will host a BlueGene supercomputer
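
  Purely as arithmetic on the slide's figures (an illustrative sketch, not a statement from the project), the two existing clusters total about 6.7 TFLOPS of peak capacity, so "doubling next year" would put the total on the order of 13-14 TFLOPS, before counting the separate 100 TFLOPS GPU cluster at ISS:

```python
# Combined peak capacity of the RO clusters listed on slide 12 (sketch only).
ifin_hh_tflops = 4.0        # IBM BladeCenter cluster at IFIN-HH
biocomputing_tflops = 2.7   # Biocomputing cluster

current_total = ifin_hh_tflops + biocomputing_tflops
print(f"Current combined peak: {current_total:.1f} TFLOPS")             # 6.7 TFLOPS
print(f"After the expected doubling: ~{2 * current_total:.1f} TFLOPS")  # ~13.4 TFLOPS
```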

  13. Planned infrastructure
  • Serbia joined the PRACE initiative in December 2008, represented by the Institute of Physics Belgrade (IPB), which acts as the coordinating institution for all HPC activities
  • IPB has focused its HPC activities on the establishment of the Blue Danube National Supercomputing and Data Storage Facility, in collaboration with the Ministry of Science and Technological Development of the Republic of Serbia
  • The Serbian government will invest 10 million euro over a 7-year period; the hardware is planned to be purchased in several phases
  • In Greece, GRNET is currently coordinating the nascent Greek HPC initiative and a pilot project for assessing HPC user communities in the country
  • The overall goal is to acquire a machine in the range of 100 TFlops Rmax performance according to the LINPACK benchmark
  • The detailed specification of the supercomputer will be derived from the particular requirements of the Greek user community that will benefit from this investment
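
  To give a feel for what a 100 TFlops Rmax target implies (a rough sketch under an assumed HPL efficiency range, not a project specification), the corresponding theoretical peak would sit somewhere around 110-135 TFlops:

```python
# Rough Rpeak implied by the 100 TFlops Rmax (LINPACK) goal on slide 13.
# Sketch only: the 75-90% HPL efficiency range is an assumption typical of
# clusters of that era, not a figure taken from the slides.
rmax_target_tflops = 100.0

for hpl_efficiency in (0.75, 0.85, 0.90):
    rpeak = rmax_target_tflops / hpl_efficiency
    print(f"At {hpl_efficiency:.0%} HPL efficiency: Rpeak ~ {rpeak:.0f} TFlops")
```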

  14. Introduction to VRCs
  • Comp. Physics: 6 countries, 8 applications
  • Comp. Chemistry: 6 countries, 7 applications
  • Life Sciences: 5 countries, 7 applications

  15. Description of Computational Physics applications

  16. Description of Computational Chemistry applications

  17. Description of Life Sciences applications

  18. Long-term vision…
  • Being on technological par with the rest of Europe
  • Enabling local scientists to use their full potential
  • Serving as a role model for regional developments
  • Leading the way in wider contexts
