
HPC in France and Europe Overview of GENCI and PRACE






Presentation Transcript


  1. HPC in France and Europe Overview of GENCI and PRACE Stéphane REQUENA, CTO GENCI

  2. Supercomputing - driving Science and Industry through simulation • Aerospace • Materials / Inf. Tech (Spintronics, Nano-science) • Ageing Society (Medicine, Biology) • Energy (Plasma Physics, Fuel Cells) • Environment (Weather / Climatology, Pollution / Ozone Hole, Virtual power plant) • Automotive • Multimedia • Finance. Franco-British Workshop on Big Data in Science

  3. HPC is a « key technology ». A $200M effort by 6 agencies • Supercomputers: an indispensable tool to solve the most challenging problems via simulations • Access to world-class computers: essential to be competitive in science and engineering • Providing competitive HPC services: a continuous endeavour • This has been acknowledged by leading industrial nations → Europe: PRACE → France: GENCI

  4. GENCI (Grand Equipement National de Calcul Intensif) • Missions: • To implement a national HPC strategy in France and to provide the 3 national HPC academic centres with supercomputers • To contribute to the creation of the European HPC ecosystem • To promote numerical simulation and HPC in academia and industry (06/11/12)

  5. GENCI: powering the 3 national HPC centres (TGCC at Bruyères-le-Châtel, IDRIS at Orsay, CINES at Montpellier) • Coordination and optimization of investments in HPC • Common allocation of HPC hours via calls for proposals • An 80x increase in 5 years • Resources divided into scientific areas

  6. A huge effort for increasing French HPC capacities

  7. PRACE: a European Research Infrastructure (RI) and ESFRI list item • The PRACE RI has been in operation since April 2010 • PRACE AISBL created with 20 countries, head office in Brussels • Now 25 member countries • The PRACE RI has been providing services since August 2010 • Now 6 Tier0 systems available • 4.3 billion core hours awarded to 159 projects through a single pan-European peer-review process • Funding secured for 2010-2015 • 400 million € from France, Germany, Spain and Italy, provided as Tier0 services on a TCO basis • 130 million € of additional funding = 70 million € from EC FP7 preparatory and implementation projects + 60 million € from PRACE members: • Technical, organizational and legal support for PRACE • Prepared the creation of the AISBL as a legal entity • Established the PRACE brand • Provided extensive HPC training • Deployed and evaluated promising architectures • Ported and petascaled applications. 1st Council, June 9, 2010. PRACE-3IP kick-off in Paris.

  8. 2012: PRACE is providing nearly 15 PFlop/s... JUQUEEN: IBM BlueGene/Q at GCS partner FZJ (Forschungszentrum Jülich). MareNostrum: IBM at BSC. FERMI: IBM BlueGene/Q at CINECA. CURIE: Bull bullx at GENCI partner CEA. HERMIT: Cray at GCS partner HLRS (High Performance Computing Center Stuttgart). SuperMUC: IBM at GCS partner LRZ (Leibniz-Rechenzentrum).

  9. PRACE boosts Science: ...to face the tempest! The UPSCALE project aims to continue developing our climate modelling capability and goes for even higher global resolution, all the way to 12 km, which is not even envisioned for Met Office global weather forecasting before 2015. AWARD: 144M CPU HOURS on the Cray XE6 system Hermit at GCS@HLRS. Credits: Prof. Pier Luigi Vidale, Univ. Reading, U.K. Also in Nature Climate Change, July 2012.

  10. CURIE: the French PRACE supercomputer, in honour of Marie Curie. Global peak performance of 2 PFlop/s: > 92,000 Intel cores, 360 TB memory, 15 PB Lustre @ 250 GB/s, 120 racks, < 200 m², 2.5 MW, 50 km of cables • CURIE, France's commitment to PRACE, is overseen by GENCI • Located in and operated by CEA DAM teams • A modular and balanced architecture: a cluster of SMP nodes with fat, thin and hybrid nodes • Complementary to other PRACE Tier0 systems • Fully available since March 8, 2012

  11. Example of recent results on CURIE: Understanding the evolution of the Universe (1/2) • Grand challenge conducted by Observatoire de Paris and the DEUS Consortium (http://www.deus-consortium.org) • Goal: perform 3 FULL Universe simulations, from the Big Bang to nowadays, using 3 different dark energy distributions • Influence of dark matter w.r.t. the evolution of the Universe • Directly linked with the 2011 Physics Nobel Prize • Data will be used to feed the next EU EUCLID telescope • Unprecedented HPC requirements: • > 550 billion particles, 8192³ mesh in a 21 h⁻¹ Gpc box • RAMSES code (CEA) and a dedicated workflow toolchain • 76k cores, > 300 TB of main memory • Specific memory, MPI and parallel I/O optimisations
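The parallel I/O optimisations mentioned above typically boil down to every process writing its own disjoint region of one shared snapshot file. As a minimal sketch of that access pattern (the rank loop and 1024-byte slabs are invented for illustration; a real run would use MPI-IO collective writes, not sequential `seek`/`write` calls):

```python
import os
import tempfile

def write_rank_slab(path, rank, slab, slab_bytes):
    # Each "rank" writes its slab at a disjoint offset; this is the
    # core idea behind MPI-IO style parallel checkpoint writes, where
    # every process targets its own region of one shared file.
    with open(path, "r+b") as f:
        f.seek(rank * slab_bytes)
        f.write(slab)

# Simulate 4 ranks, each owning a 1024-byte slab of particle data.
n_ranks, slab_bytes = 4, 1024
path = os.path.join(tempfile.mkdtemp(), "snapshot.bin")
with open(path, "wb") as f:
    f.truncate(n_ranks * slab_bytes)  # pre-size the shared file
for rank in range(n_ranks):  # under MPI these writes run concurrently
    write_rank_slab(path, rank, bytes([rank]) * slab_bytes, slab_bytes)

with open(path, "rb") as f:
    data = f.read()
assert data[0] == 0 and data[-1] == n_ranks - 1
```

Because the offsets never overlap, the writes need no locking, which is what lets a parallel file system such as Lustre stripe them across many servers.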

  12. Example of recent results on CURIE: Understanding the evolution of the Universe (2/2) • WORLDWIDE record, finished 2 months ago • First FULL Universe ΛCDM (LCDM) and RPCDM simulations performed on CURIE thin nodes • 3 runs for a total of 92 hours elapsed on 76,032 cores; the last run lasted 29 hours without any failure. Wow, CURIE is very stable! • A strong need for sustained I/O rates on the Lustre scratch file system • We have here a Big Data problem • A total of 10 PB of full data (scratch and raw data) generated • 4 PB of raw results after the simulation • 1.2 PB of refined data after post-processing for the 3 dark energy simulations need to be made available to worldwide scientists!
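A back-of-envelope calculation gives a feel for the sustained I/O rate these numbers imply, assuming decimal units (1 PB = 10^15 bytes) and, purely for illustration, that the 10 PB of scratch plus raw data were produced over the 92 elapsed hours:

```python
# Rough average write rate implied by 10 PB over 92 hours of runs.
total_bytes = 10e15            # 10 PB, decimal units (assumption)
elapsed_s = 92 * 3600          # 331,200 seconds of production
rate_gbs = total_bytes / elapsed_s / 1e9
print(f"average sustained write rate: about {rate_gbs:.0f} GB/s")  # about 30 GB/s
```

Even this crude average lands well above what a typical scratch file system of the era could sustain without careful striping, which is why the slide calls it a Big Data problem.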

  13. Explosion of computational data: Another example from climatology • Evolution of the global climate • 5th IPCC campaign, French production on a dedicated NEC SX9: > 1 TB/day • Strong issues with storage, post-processing and archiving of data • And the future is...
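To make the archiving issue concrete, a one-line estimate of what a steady > 1 TB/day production rate accumulates to (the 5-year horizon is an arbitrary illustration, and decimal units are assumed):

```python
# Archive growth at a sustained 1 TB/day (1 PB = 1000 TB assumed).
tb_per_day = 1.0
years = 5
total_pb = tb_per_day * 365 * years / 1000.0
print(f"at least {total_pb:.3f} PB archived over {years} years")  # 1.825 PB
```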

  14. One conclusion: Data is exploding • Observational/experimental data • Particle accelerators and detectors (LHC@CERN) • Genome sequencers and personalized medicine • Next-gen satellites and (radio)telescopes • Sensors in weather forecasting/climatology or oil & gas • Finance, insurance, … • Computational data • Increase in HPC resources (PRACE = 15 PF in 2012) • Increase of space and time resolution of models • Multi-physics and multi-scale simulations • Rise of uncertainty quantification, ensemble simulations, … • With problems relative to • Size of data (number and size of files, (un)structured data, formats, …) • Uncertainty of data and fault tolerance • Metadata issues • Post-processing (20% of time) • Dissemination of refined data to worldwide communities over decades. That means: deploying LONG-LASTING and SUSTAINABLE Research Infrastructures

  15. Another conclusion: People are aware of that! • On the hardware and system software side • Multi-level storage with new I/O devices: a mix of flash-based memory (SSD, PCM, …) with hard drives • Asynchronous I/O and active I/O (servers embedded into I/O controllers) • Next generation of parallel file systems (Lustre, GPFS, Xyratex, …) • Flops will be "almost free" → post-processing at the same time as computation • A lot of European projects and R&D initiatives • PRACE implementation projects: data management, remote visualisation, portals, … • EUDAT: data services between computing/data centres and end-user communities • EESI2: cartography of Exascale R&D efforts • The French INRIA BlobSeer R&D project, … • But a lot of applications will need to be rewritten/adapted • The complete I/O strategy needs to be rethought • New methods for data analysis/exploration are needed (MapReduce, Hadoop, NoSQL, …?) • Raw data will stay in the computing/data centre and ONLY refined data will go out • Network bandwidth will need to increase • Use of remote visualisation
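The MapReduce idea named on the slide, reducing raw in-centre data to a small refined result that can be shipped out, can be sketched in a few lines of pure Python. The region/mass records are invented for illustration; a real Hadoop job would distribute the map and shuffle phases across nodes:

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit (key, value) pairs, e.g. binning particles by region.
    for region, mass in records:
        yield region, mass

def shuffle(pairs):
    # Shuffle: group values by key (Hadoop does this between nodes).
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: collapse each group to one refined value (total mass).
    return {key: sum(values) for key, values in groups.items()}

records = [("A", 2.0), ("B", 1.0), ("A", 3.0), ("B", 4.0)]
totals = reduce_phase(shuffle(map_phase(records)))
assert totals == {"A": 5.0, "B": 5.0}
```

The point of the pattern for HPC centres is exactly the slide's last bullets: the bulky `records` never leave the centre; only the small `totals` dictionary does.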

  16. HPC: en route to international synergies • Global European HPC ecosystem integration. HPC enables scientific discoveries and innovation for both research and industry. We mustn't follow the trends, we must anticipate them! • To face future societal or industrial challenges • To prepare users for future parallel architectures and applications • To increase the involvement of scientists and engineers in these techniques
