Advanced computing for fusion simulation and modelling Pär Strand


  1. Advanced computing for fusion simulation and modelling • Pär Strand • Chalmers University of Technology • EU Fusion for ITER Applications - EUFORIA • 5th EGEE User Forum, 2010-04-14, Uppsala

  2. Overview • Brief Background – Fusion • Modelling Challenges - Use Cases. • EUFORIA approach to supporting modellers • Summary

  3. Fusion • Energy source for the sun and other stars • Provides a potential source of base-load energy production • We have been working on this for more than 50 years • It has turned out to be a very difficult problem. "Every time you look up at the sky, every one of those points of light is a reminder that fusion power is extractable from hydrogen and other light elements, and it is an everyday reality throughout the Milky Way Galaxy." --- Carl Sagan, Spitzer Lecture, October 1991

  4. Fusion • Two main lines of research • Inertial confinement • Implosion of small pellets • NIF at LLNL • Magnetic confinement • Two main types of configuration studied: • Stellarator – W7-X • Currently under construction in Greifswald, Germany • Tokamak – ITER • Under construction in Cadarache, France

  5.-8. ITER [image slides]

  9. ITER has developed from a long sequence of experiments, BUT it is the first tokamak where modelling is set to play an important role in both design and exploitation. This is even more true for the next stage: a first DEMO reactor.

  10. The "fast track" to fusion energy hinges on several parallel developments, among them the "Numerical Tokamak": a comprehensive simulation package with predictive capabilities.

  11. Relation to experiments [diagram: the modelling platform sits between experiments and e-infrastructure. A user gateway provides universal access; physics analysis and exploration tools connect to the experiment; "in house" operational core tools and "semi-open" modules, applications and workflows run on local clusters, device-independent grid and HPC; the Universal Access Layer and exp2ITM handle data exchanges, which require machine descriptions and data mappings; local data stays in a local data layer.]

  12. Modelling Challenges • Fusion codes come with a variety of computational requirements: • Local calculations (serial or small clusters) • Capacity computing • MC, ray tracing, parameter scans • Intermediate-level grid-parallel applications (~128 cores) • Capability computing (pushing the limits of existing HPC) • There is no single "best" computing paradigm • A single application may require different architectures or infrastructures depending on its particular use (see the sketch below) • Plasmas are complex and non-linear, and dynamic responses may force higher-fidelity modelling, with on-the-fly changes of application and computing-infrastructure requirements • The physics is coupled • It is rare that a single physics code suffices for deep understanding: coupled applications are a necessity!
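To make the "no single best paradigm" point concrete, here is a minimal Python sketch, purely illustrative and not EUFORIA code, of how a launcher might route one physics module to a suitable infrastructure. All module and function names are hypothetical:

```python
# Minimal sketch: dispatch a fusion module to a suitable infrastructure
# based on its computational profile. Thresholds are illustrative only.

def dispatch(module_name: str, n_cores: int, embarrassingly_parallel: bool):
    """Pick an execution target for one physics module."""
    if n_cores == 1:
        return ("local", module_name)           # serial: run on the local cluster
    if embarrassingly_parallel or n_cores <= 128:
        return ("grid", module_name)            # MC / ray tracing / parameter scans
    return ("hpc", module_name)                 # capability computing on DEISA-class HPC

# A turbulence code needs far more cores than a 1D transport solver:
print(dispatch("ets_transport", 1, False))      # -> ('local', 'ets_transport')
print(dispatch("eirene_mc", 64, True))          # -> ('grid', 'eirene_mc')
print(dispatch("gem_turbulence", 2048, False))  # -> ('hpc', 'gem_turbulence')
```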

  13. EUFORIA scope is "edge and core turbulence and transport" [figure: 1D and 2D simulations; the real problem is 3D in space and 2-3D in velocity]

  14. Models describing the plasma vary in complexity • varying dimensionality • extreme ranges in both time and space scales

  15. Models describing the plasma vary in complexity [figure: the physics coverage of EUFORIA]

  16. Models describing the plasma vary in complexity [figures: turbulence simulation in 3D space (can also be kinetic); edge transport simulation in 2D space]

  17. Developing a new paradigm for fusion computing [diagram: scientific workflow coupling visualization, grid and HPC] • Building on e-infrastructure tools, middleware and installations • Integrating tools and physics models together with a "fusion simulation ontology" • At least initially, building on fusion de facto standards for data access and communication

  18. EUFORIA: 14 member institutes, 522 person-months, covering • Management • Training • Dissemination • Grid and HPC infrastructure & support • Code adaptation & optimization • Workflows • Visualization

  19. Work plan outline [timeline, Jan 2008 - Dec 2010: application porting to grid and HPC; grid appliance and grid testbed deployment; proof-of-concept runs migrating to mixed fusion workflows; standardization and integration throughout development and deployment]

  20. Grid Infrastructure • Centres with infrastructure in EUFORIA • IFCA in Santander, Spain • KIT in Karlsruhe, Germany • Chalmers University, Sweden • CIEMAT in Trujillo and Madrid, Spain • Current status: ~3100 CPUs, ~2 TB online

  CPUs  Free  TotalJobs  Running  Waiting  ComputingElement
  ----------------------------------------------------------
   640   137       2027      197     1830  svea-gl2.c3se.chalmers.se:2119/jobmanager-pbs-svea
   568   268          0        0        0  iwrce2.fzk.de:2119/jobmanager-lcgpbs-i2gpar
   228   139          0        0        0  ce01-tic.ciemat.es:2119/jobmanager-lcgpbs-euforia
  1616  1616          0        0        0  gridce01.ifca.es:2119/jobmanager-sge-euforia
    48    48          0        0        0  ce-euforia.ceta-ciemat.es:2119/jobmanager-lcgpbs-euforia

  Avail Space (kB)  Used Space (kB)  Type  SE
  ----------------------------------------------------------
              1000  n.a.             n.a.  storm.ifca.es
              1000  n.a.             n.a.  storm.ifca.es
         114395396  100297660        n.a.  griddpm01.ifca.es
         114395091  100297964        n.a.  griddpm01.ifca.es
        1820000000  21058536         n.a.  iwrse2.fzk.de
          13488960  5528882          n.a.  svea-gl3.c3se.chalmers.se

  The EUFORIA grid supports parallel applications and Interactive Grid (i2g) extensions, and also shares resources with the Fusion VO. (A small parsing sketch follows below.)
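Status listings like the one above come straight from the grid information system. As a hedged illustration (not an official EUFORIA tool), a few lines of Python suffice to digest such a computing-element table; the column layout is assumed from the slide:

```python
# Illustrative sketch: extract free CPUs per site from an
# lcg-infosites-style computing-element listing.
ce_listing = """\
640   137   2027  197  1830  svea-gl2.c3se.chalmers.se:2119/jobmanager-pbs-svea
568   268   0     0    0     iwrce2.fzk.de:2119/jobmanager-lcgpbs-i2gpar
228   139   0     0    0     ce01-tic.ciemat.es:2119/jobmanager-lcgpbs-euforia
1616  1616  0     0    0     gridce01.ifca.es:2119/jobmanager-sge-euforia
48    48    0     0    0     ce-euforia.ceta-ciemat.es:2119/jobmanager-lcgpbs-euforia
"""

for line in ce_listing.splitlines():
    total, free, jobs, running, waiting, ce = line.split()
    host = ce.split(":")[0]                    # drop port and jobmanager suffix
    print(f"{host:40s} {free:>5s}/{total:<5s} CPUs free")
```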

  21. HECToR (EPCC, UK): Cray XT4, integrated with an X2 vector system in a single machine. XT4: 11,328 cores; 2.8 GHz Opteron processors; 33.2 TB RAM; 59 Tflops theoretical peak. X2: 112 vector processors; 2.87 Tflops theoretical peak. No. 29 in the Top 500 (54.65 Tflops LINPACK). Louhi (CSC, Finland): Cray XT4; 4,048 cores; 4.5 TB RAM; 37.68 Tflops peak. No. 70 in the Top 500 (26.80 Tflops LINPACK). MareNostrum (BSC, Spain): IBM cluster; 10,240 cores; 2.3 GHz PPC 970 processors; 20 TB RAM; 94.21 Tflops peak. No. 26 in the Top 500 (63.83 Tflops LINPACK). With DEISA: ~3M CPU hours in 2009.

  22. Example Fusion Workflow [Kepler diagram: EQUIL, ICRH, ECRH, NEUTRALS, NEO, TRANSP, ELM(t), ELM(c), SAWTEETH, CORE2EQ, SOURCE_COMBINER, TRANSPORT_COMBINER and ETS, with visualization on grid and HPC] • One example of a workflow; many variants are possible • Different modules have very different computational requirements • from sub-second per iteration on a single CPU to hour(s) per iteration on a large number of CPUs • Need to visualize the results • (A minimal coupling sketch follows below.)
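The coupling pattern behind such a workflow can be reduced to a few lines of Python. This is a toy illustration of the idea, modules exchanging standardized data structures inside a time loop, not the actual ETS/Kepler implementation; all names are made up:

```python
# Toy sketch of the coupling idea: each physics module consumes and
# produces a shared state (standing in for CPOs), so modules can be
# chained or swapped freely.

def equil(state):      state["equilibrium"] = "solved"; return state
def sources(state):    state["sources"] = ["ICRH", "ECRH", "NEUTRALS"]; return state
def transport(state):  state["profiles"] = "evolved one time step"; return state

state = {"time": 0.0}
for step in range(3):                   # the transport solver iterates in time
    for module in (equil, sources, transport):
        state = module(state)           # each stage may run locally, on grid, or on HPC
    state["time"] += 0.1
print(state)
```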

  23. Coupling codes and applications (adapted from David De Roure): N modules integrated in N different applications vs. N modules coupled into a dynamic application framework. The number of coupled codes/physics modules in typical fusion applications can be large: • the Chain1 model at JET couples ~85 applications in a DAG-like structure • detailed predictive transport modelling involves ~25 different modules with multiple versions and iterative dependencies • We still need to support single standalone applications!

  24. (Subset of) codes adapted for grid use • Different code domains and different parallel strategies: • GEM: linear & non-linear turbulence, gyrofluid code (core transport). MPI. • BIT1: divertor code (SOL transport). Parameter scan. • EIRENE: neutral transport for tokamaks & stellarators (neutral transport). MC code. • DAB (Distributed Asynchronous Bees): tool for the optimization of any concept in fusion devices. Asynchronous algorithm. • Plus the previously gridified codes (taken from the EGEE code platform), suitable for workflows: • ISDEP (MC transport), MISHKA & HELENA (equilibrium and MHD), VMEC (3D equilibrium, suitable for tokamaks and stellarators).

  25. Enhancing grid usage • Application-specific support tools for enhancing grid usage: • TAPAS (Tool for Automatic Parameter Studies) and TAPAS4GRID allow automatic parameter scans on HPC and grids. Used for EMC3-EIRENE and BIT1 (see the sketch below). • S.t.a.r.t. (Simple Tool for Application Runs and Transfer of data) provides a command-line interface that hides all the complexity of job management on the grid. It also simplifies the inclusion of required libraries for submitted jobs. • These tools are also building blocks for improved inclusion of applications in workflows.
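What a TAPAS-style automatic parameter scan amounts to can be hedged into a short sketch: generate one grid job per parameter combination. `submit` is a hypothetical stand-in for the real submission machinery, and the parameter values are illustrative:

```python
# Sketch of an automatic parameter scan: one job per point in the
# Cartesian product of the scanned parameters.
import itertools

densities    = [1e19, 5e19, 1e20]   # m^-3, illustrative values
temperatures = [1.0, 5.0, 10.0]     # keV, illustrative values

def submit(code, params):
    # Stand-in for the actual grid submission (JDL generation, proxy, ...).
    print(f"submitting {code} with {params}")

for n, T in itertools.product(densities, temperatures):
    submit("BIT1", {"density": n, "temperature_keV": T})
```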

  26. Generic Application Structure • Large-scale fusion applications tend to be long-lived: how do we build software that lasts 30+ years?

  27. European Fusion Implementation • Modelling tools are either standalone applications OR modules, i.e., subroutines. "Automagic" tools wrap modules into workflow components (Kepler actors); a conceptual sketch follows below.
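As a conceptual sketch of the wrapping idea only (Kepler actors are in fact Java, and the real ITM tools generate wrappers from module metadata), a generic adapter might look like this; the `Actor` interface and `helena_equilibrium` signature are hypothetical:

```python
# Illustration of "automagic wrapping": a generic adapter turns a plain
# physics subroutine (data structure in, data structure out) into a
# workflow component with a uniform interface.

class Actor:
    """Uniform workflow-component interface around a physics routine."""
    def __init__(self, physics_routine):
        self.routine = physics_routine

    def fire(self, input_cpo: dict) -> dict:
        # A real wrapper would also handle data transport and error reporting.
        return self.routine(input_cpo)

def helena_equilibrium(cpo):            # hypothetical module signature
    cpo["psi"] = "equilibrium flux map"
    return cpo

actor = Actor(helena_equilibrium)
print(actor.fire({"device": "JET"}))
```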

  28. KEPLER Transport Workflow [screenshot: scientific workflow with visualization on grid and HPC] (P. Huynh, V. Basiuk)

  29. KEPLER Transport Workflow (continued) • End users only care about available resources, suitability to the problem, and performance; the particular details of the underpinning hardware/software configuration are in general not of interest • Hide, or at least simplify, complexity and promote usability from the end-user point of view • Lesson learned: users are much more forgiving towards HPC access complexity than they appear to be towards the usability thresholds of grid access protocols. (P. Huynh, V. Basiuk)

  30. The EUFORIA services in the broader view [diagram: EUFORIA bridging the EGEE grid and DEISA HPC infrastructures]

  31. Launching simple grid jobs [screenshot: a composite Kepler actor]

  32. Example of a composite actor in detail [screenshot: sub-blocks for certificates, job submit, job status and get output]. This allows arbitrarily complex workflows to be built! (A sketch of the job lifecycle follows below.)
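The lifecycle the composite actor hides can be sketched as follows; the function names mirror its sub-blocks but are hypothetical stand-ins, not a real grid middleware API:

```python
# Sketch of the grid-job lifecycle a composite actor encapsulates:
# proxy certificate, submit, poll status, retrieve output.
import time

def init_proxy():        return "proxy-ok"       # certificate handling
def job_submit(jdl):     return "job-42"         # returns a job identifier
def job_status(job_id):  return "Done"           # would normally poll the RAS
def get_output(job_id):  return f"{job_id}.tar.gz"

init_proxy()
job = job_submit("helena.jdl")
while job_status(job) not in ("Done", "Aborted"):
    time.sleep(30)                               # poll until the job finishes
print("output:", get_output(job))
```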

  33. Grid workflow: Kepler on the local cluster launches a simple Kepler workflow on the grid [diagram: Gateway, then RAS on euforia.efda-itm.eu, then the grid]

  34. Workflow + fusion codes on grid and HPC • MHD equilibrium codes • HELENA on grid • ILSA on HPC • Demo D10, shown today: a EUFORIA-EGEE-DEISA collaboration

  35. Data Management • The data model (CPOs) is not directly supported by the Kepler workflow engine • Data must be managed outside the workflow: the Universal Access Layer (UAL) • Built on the fusion de facto standard MDSplus for transport and storage • Supports in-memory, file (HDF5) and database (MDSplus) access • Challenges: • larger datasets (order GBs and beyond) • efficiency in distributed environments? • the access model inherits the concept of centralized data server(s) and cataloguing • UAL: • an extensible user API built on the CPOs ("standardized data structures") • a transport layer supporting local and remote access • a set of backend storage options: MDSplus, HDF5, (memory cache), ...! (An illustrative sketch follows below.)
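A conceptual sketch of the UAL usage pattern, CPOs at the API level with a pluggable backend underneath, is given below. Method names and signatures are illustrative only, not the real UAL API:

```python
# Conceptual sketch: the user API speaks CPOs, while a storage backend
# (MDSplus, HDF5, memory) is selected underneath.

class UAL:
    def __init__(self, backend="mdsplus"):
        self.backend = backend          # "mdsplus" | "hdf5" | "memory"
        self.store = {}                 # stand-in for the real storage layer

    def put_cpo(self, shot, run, name, cpo):
        self.store[(shot, run, name)] = cpo

    def get_cpo(self, shot, run, name):
        return self.store[(shot, run, name)]

ual = UAL(backend="hdf5")
ual.put_cpo(shot=12345, run=1, name="equilibrium", cpo={"psi_axis": 0.0})
print(ual.get_cpo(12345, 1, "equilibrium"))
```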

  36. ITER – an International Collaboration. "The ITER aim is to demonstrate that it is possible to produce commercial energy from fusion." • Experimental facility: • ~10 Gbit/s during discharges of 500-1000 s • 20 PB/year (lower-bound estimate) • International partners: • data replication, with at least two offsite replicas • (near) real-time data streaming and inline modelling data to remote centres • "semi" remote operation • middleware interoperability: agreement on a single technology (most interfaces will be centrally managed/decided!) • resource sharing/policies • IPR is a challenging issue • Nuclear installation: • security • licensing • First plasma 2019, full operation 2025 (!) • ~3000-4000 remote participants (a back-of-envelope check of the data rates follows below)
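A back-of-envelope check of the data-rate numbers on this slide, assuming the quoted 10 Gbit/s is sustained over a full 1000 s discharge:

```python
# 10 Gbit/s over a 1000 s discharge is on the order of a terabyte.
rate_gbit_s = 10
discharge_s = 1000
bytes_per_discharge = rate_gbit_s * 1e9 / 8 * discharge_s
print(f"{bytes_per_discharge / 1e12:.2f} TB per discharge")   # ~1.25 TB
```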

  37. Towards fully 3D CFD: the EMC3-EIRENE code (IPP Greifswald – FZ Jülich). Frerichs, H., Reiter, D. et al., Comput. Phys. Commun. 181 (2010) 61-70; and Nuclear Fusion 50 (2010), in print. • Initially developed for stellarator applications (W7-AS, W7-X, LHD), it was advanced to a more flexible grid structure to allow divertor tokamak + RMP applications. • First self-consistent 3D plasma and neutral gas transport simulations for poloidal divertor tokamak configurations with RMPs. • Simulation results for ITER-similar-shape plasmas at DIII-D show a strong 3D spatial modulation of plasma parameters, e.g. in Te. • EMC3-EIRENE code verification (by benchmarks with 2D tokamak edge codes) and validation (TEXTOR, DIII-D, JET, LHD experiments) are ongoing. • EMC3-EIRENE is currently foreseen for contractual ITER RMP design studies (jointly by FZJ and IPP, 2010...). [figure: electron temperature in DIII-D with RMPs; contours at Te = 60, 120 and 200 eV]

  38. ASCOT calculations for ITER • The ASCOT group used the EUFORIA-DEISA resources to simulate fusion alpha particles in ITER with two computing-time-intensive upgrades: • an experiment-based anomalous diffusion profile • full orbit integration near the wall, for precise wall hits • The goal was to provide insight into the optimal location of Fast Ion Loss Detectors in ITER, which is still under discussion. [figure: 3D power distribution on the wall under anomalous diffusion] (Tuomas Koskela)

  39. Fusion has turned out to be somewhat more complex to master than initially believed. Contrary to popular belief, we ARE making rapid progress!

  40. Summary • The fusion community is benefitting from e-infrastructures, using input from EUFORIA and collaborating projects to do better science! • Improvements and porting of single codes • to grid • to HPC • Using workflows • Using visualization tools • Running on the infrastructure provided by • the grid (the EUFORIA VO, sustainable into EGI) • DEISA and project partners • Training and outreach

  41. Partners • Chalmers University of Technology (coordinator), Sweden • Max Planck Institute for Plasma Physics (IPP), Germany • Consejo Superior de Investigaciones Científicas (CSIC), Spain • Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT), Spain • Forschungszentrum Karlsruhe (FZK), Germany • Finnish IT Center for Science (CSC), Finland • Åbo Akademi University (ABO), Finland • University of Edinburgh (UEDIN), United Kingdom • Barcelona Supercomputing Center (BSC), Spain • French Atomic Energy Commission (CEA), France • University of Strasbourg, France • University of Ljubljana (UOL), Slovenia • Poznań Supercomputing and Networking Center (PSNC), Poland • Italian National Agency for New Technologies, Energy and the Environment (ENEA), Italy • Thanks to EGEE and DEISA!
