
European Taskforce on Integrated Tokamak Modelling






Presentation Transcript


1. Joint ORNL/Indiana University Workshop on Computational Frameworks for Fusion. European Taskforce on Integrated Tokamak Modelling: A Brief Overview. Oak Ridge, June 7. Pär Strand / Bernard Guillerminet, with A. Bécoulet, H. Wilson, M. Romanelli, G. Huysmans, F. Porcelli, D. Coster, B. Scott, T. Hellsten and contributors to the TF-ITM work programme. Task Force Leader: A. Bécoulet. Deputies: M. Romanelli, P. Strand, (H. Wilson). EFDA CSU Contact Person: K. Thomsen.

2. SCOPE. EFDA(03)-21/4.9.2 (June 24th, 2003), executive summary: The aim of the task force is to co-ordinate the development of a coherent set of validated simulation tools for the purpose of benchmarking on existing tokamak experiments, with the ultimate aim of providing a comprehensive simulation package for ITER plasmas. The remit of the Task Force would extend to the development of the necessary standardized software tools for interfacing code modules and for accessing experimental data. In the medium term, this task force’s work would support the development of ITER-relevant scenarios in current experiments, while in the long term it would aim to provide a validated set of European modelling tools for ITER exploitation.

3. What do we mean by “integrated modelling”?
• Physics integration: integration of MHD, transport, exhaust, energetic particle physics, etc. Need to foster interactions between different physics areas.
• Code integration: creating a set of validated, benchmarked codes; standardised inputs/outputs to allow modules from different codes to be linked (a minimal interface sketch follows below); a standardised framework for code development and use.
• Discipline integration: success of the TF relies on cross-discipline interactions, with input from theoreticians to build/improve the appropriate mathematical models, modellers to construct efficient, accurate codes for the models, experimentalists to provide data to validate models, and computer scientists to provide framework tools and methods.
• Involvement of each community will be important for the success of the TF; all disciplines benefit from such an integration process.
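As an illustration of the "standardised inputs/outputs" point above, the sketch below shows one possible shape of a common module interface in Python. It is purely illustrative: the class and method names (PlasmaState, PhysicsModule, advance) are assumptions made for this example, not an agreed Task Force interface.

```python
# Minimal sketch (hypothetical names): a standardised module interface so that
# physics modules extracted from different codes can be exchanged behind one
# common call signature.
from dataclasses import dataclass, field

@dataclass
class PlasmaState:
    """Illustrative standardised state exchanged between modules."""
    rho: list = field(default_factory=list)   # normalised radial grid
    te: list = field(default_factory=list)    # electron temperature profile [eV]
    ne: list = field(default_factory=list)    # electron density profile [m^-3]

class PhysicsModule:
    """Any transport, source or MHD module is wrapped to expose the same call."""
    def advance(self, state: PlasmaState, dt: float) -> PlasmaState:
        raise NotImplementedError

class DummyTransport(PhysicsModule):
    def advance(self, state, dt):
        # placeholder physics: returns the state unchanged
        return state
```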

4. The objectives of the long term work plan:
1) Structure the EU modelling effort around ITER scenario prediction and, later, around ITER experimental analysis and the physics design of new fusion devices.
2) Structure the EU modelling effort around existing fusion devices.
3) Address the modelling issues of fusion plasma physics which require a high degree of integration (physics integration, code integration, discipline integration) and collaboration between the EFDA partners.
4) Identify the theory and modelling development needs, as well as experimental requests linked to modelling activity, and promote them among EFDA partners and EU fusion devices.
5) Strengthen the collaborative modelling activity between the EU and other ITER partners, and promote EU modelling activity at the ITER level.
6) Provide EU modellers with a code platform structure, easily enabling coupling between codes and models, providing access to device geometries and databases, and strengthening systematic code comparisons and confrontations between data and simulations.
7) Implement a systematic verification and experimental validation procedure for the task force modelling activities.
8) Implement systematic documentation of the task force modelling tools.
9) Implement systematic reporting of the task force modelling results.

  5. The Task Force Project Structure

6. Equilibrium and linear MHD stability (IMP#1)
• Guido Huysmans (leader), Lynton Appel (deputy)
• Experimental equilibrium reconstruction codes:
• standardise such codes (common interfaces, verification and benchmarking)
• standardise input data format and links to equilibrium geometry
• develop extended equilibrium reconstruction models (e.g. rotation effects)
• Equilibrium codes and linear MHD stability:
• high-precision equilibrium codes for refining experimental reconstructions
• standardised, validated linear ideal MHD stability codes
• extended linear MHD stability codes (resistive, 2-fluid, etc.)
• identify experimental benchmarks (e.g. triggers for MHD events)

7. Non-linear MHD and disruptions (IMP#2)
• Franco Porcelli (leader), Sergei Sharapov (deputy)
• To develop models for a range of non-linear MHD phenomena:
• ELMs, including heat loss to the SOL and subsequent heat loads
• NTMs, including threshold, mode rotation, trigger, control, etc.
• resistive wall modes, including rotation, feedback stabilisation, etc.
• sawteeth, including trigger, crash and impact on confinement
• locked mode threshold
• To model the disruption process:
• the quench phase
• halo current fraction and disruption
• runaway electrons

8. Transport code and discharge evolution (IMP#3)
• David Coster (leader), Vincent Basiuk, D. Kalupnin, Vassili Parail, Grigori Pereverzev (deputies, code representatives)
• To provide a modular core transport code, with modules for:
• transport equation solvers (capable of dealing with stiff transport models; a minimal solver sketch follows below)
• sources and sinks
• MHD phenomena
• To provide a modular SOL/edge transport code:
• flexible cross-field transport capability (diffusive, “blobby”, etc.)
• sources/sinks
• plasma-wall interactions
• stability
• Integrated discharge evolution:
• core-edge integration through the “pedestal” region (e.g. transition from 1D to 2D)
• full discharge evolution, including the plasma control system
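To make the "stiff transport models" requirement concrete, here is a minimal sketch of what a transport equation solver module might do: one backward-Euler (implicit) step of a 1-D diffusion equation, the kind of scheme that stays stable when the transport coefficients are stiff functions of the profiles. All names and numbers are illustrative and not taken from any Task Force code.

```python
# A minimal sketch (not the Task Force code) of a "transport equation solver"
# module: one implicit (backward Euler) step of d(te)/dt = chi * d2(te)/dx2 + S.
import numpy as np

def implicit_diffusion_step(te, dx, dt, chi, source):
    """Advance the profile by one implicit time step with fixed boundary values."""
    n = len(te)
    A = np.zeros((n, n))
    r = chi * dt / dx**2
    for i in range(1, n - 1):
        A[i, i - 1] = -r
        A[i, i] = 1.0 + 2.0 * r
        A[i, i + 1] = -r
    A[0, 0] = A[-1, -1] = 1.0          # boundary rows: keep edge values fixed
    rhs = te + dt * source
    rhs[0], rhs[-1] = te[0], te[-1]
    return np.linalg.solve(A, rhs)

te = np.linspace(2000.0, 100.0, 21)     # illustrative temperature profile [eV]
te = implicit_diffusion_step(te, dx=0.05, dt=1e-3, chi=1.0, source=np.zeros(21))
```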

9. Transport processes and micro-stability (IMP#4)
• Bruce Scott (leader), Maurizio Ottaviani (deputy)
• Linear micro-stability:
• validate existing codes
• extend models to assess the importance of additional physics
• Plasma turbulence:
• benchmark different codes against each other
• use output to suggest appropriate turbulence diagnostics to test codes
• explore mechanisms for transport barrier formation
• address the pedestal region, and core-edge integration
• model impurity and helium ash transport
• Neoclassical transport:
• validate and extend existing models for transport, currents and flow damping
• provide models for drifts in the SOL

10. Heating, current drive and fast particles (IMP#5)
• Torbjörn Hellsten (leader), Yves Peysson, Fulvio Zonca (deputies)
• Heating and current drive:
• validate existing codes against experimental data and each other
• couple codes to explore synergistic effects between different schemes
• extend codes to deal with non-stationary, 3-D situations, e.g. for instability control
• extend and validate antenna codes (coupling rf to the plasma, e.g. with ELMs!)
• model torques due to NBI and/or rf schemes
• Fast particle instabilities and losses:
• test codes against data and each other (including proposing experiments)
• assess the impact of fast particles on large-scale MHD (NTMs, sawteeth, etc.)
• fast particle losses (direct orbit loss, including ripple, and MHD-driven losses)

11. The 2005-2006 work programme schedule [schedule chart spanning 2005-2007; recoverable entries:] prototype platform and platform release, with V&V procedures (CPP); data management, universal access layer and V&V support (DCP); code identification, standardisation, V&V and documentation; extended linear MHD codes (IMP#1); edge MHD, core MHD and disruptions (IMP#2); edge transport, core transport and integrated discharge evolution (IMP#3); code catalogue; linear micro-stability, turbulence and neoclassical transport (IMP#4); H&CD, fast particle instabilities and losses (IMP#5).

12. Integration path
• Not all project areas are, however, at the same level of readiness for direct integration; different approaches are needed depending on status.
• Equilibrium reconstruction and linear MHD is probably the most mature area.
• The work on micro-turbulence and instabilities is less so. The focus here will initially be on establishing physical and computational standards:
• determination of necessary/sufficient physical content
• establishing computational robustness before experimental validation

13. Integration path
• Burning plasma (BP) related effects tend to tie traditionally separated theory/modelling efforts more closely together:
• self-consistent equilibrium for evolving profiles in turbulence codes, e.g. Pfirsch-Schlüter currents --> Shafranov shift, and avoiding ‘double counting’ in GK codes
• dynamic re-alignment of grids to a changing magnetic equilibrium
• integration of turbulence codes with non-linear MHD: they are trying to solve the same part of the dynamical system!

14. A possible structure for Task Force codes [architecture diagram; recoverable content:]
• Code Platform Project: framework tools, interface technology, standard “numerical” tools and networking tools; user code/script interfaces.
• Integrated Projects: provide physics content to the “user”; drive and specify the platform capabilities.
• Data Coordination Project: data i/o layer giving “device independent” access to data, with plug-ins for different access methods (MDS+, HDF5, ???); data and database coordination for detailed device specification, verification tests and validation experiments; hardware management; databases. (A minimal access-layer sketch follows below.)
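The "plug-ins for different access methods" idea could look roughly like the sketch below: user code requests a signal through one call, and a backend registered per access method does the retrieval. The class and function names are hypothetical and the backends are left as stubs; this is not the actual universal access layer.

```python
# Minimal sketch of "device independent" data i/o: user code asks for a signal
# by name; a plug-in registered per access method (MDSplus, HDF5, ...) retrieves
# it. Names are illustrative only and backend calls are deliberately stubbed.
class DataAccessLayer:
    def __init__(self):
        self._backends = {}

    def register(self, scheme, reader):
        self._backends[scheme] = reader          # e.g. "mds", "hdf5"

    def read(self, device, shot, signal, scheme="mds"):
        return self._backends[scheme](device, shot, signal)

def mds_reader(device, shot, signal):
    # a real plug-in would open the MDSplus tree for (device, shot) here
    raise NotImplementedError("MDSplus plug-in not implemented in this sketch")

ual = DataAccessLayer()
ual.register("mds", mds_reader)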

15. Code Platform Project (CPP)
• Bernard Guillerminet (leader), Mladen Stanojevic (deputy)
• Where possible, existing codes will be dismantled to form a suite of modules.
• The code platform will provide a framework into which modules are “plugged” to provide user-designed, purpose-built codes.
• The platform will also aim to provide:
• interfaces with databases (joint effort with the DCP)
• interfaces with actual (and planned) machine geometry (coils, H&CD systems, etc.) (joint effort with the DCP)
• a simulation data storage and retrieval facility (joint effort with the DCP)
• help with preparing input files
• diagnostics codes to interpret experimental data, to aid tests of code predictions against data (joint effort with the DCP)
• monitoring of a simulation run
• checking the self-consistency of the chosen modules

16. Data Co-ordination Project (DCP)
• Pär Strand (interim leader), Francesco Iannone (deputy)
• Responsible for:
• developing the tools for data access
• developing and managing the databases required for the Task Force activities
• developing and maintaining V&V guidelines and coordinating experimental validation activities with the Integrated Modelling Projects
• Main elements of the project:
• universal access layer (consistent with the Code Platform): software providing access to databases (remote and local); could ultimately incorporate data processing software
• data and database management
• support to the IMPs in experimental validation
• emerging technologies (e.g. assess grid technology for data sharing)

17. Status of the projects
• IMP#1, the CPP and the DCP are (almost) fully manned and active.
• The remaining IMPs, staffed at leadership level, are to be initiated “soon”.
Rationale:
• IMP#1 provides fundamental functionality (equilibrium and geometry) for all other activities.
• IMP#1 is also a pilot project for TF work structures and collaborative methods.
• The DCP is to provide interim solutions for data storage and access (active) and to work towards a flexible integrated database/data access solution for the TF needs (in planning/testing phases).
• The CPP is to provide interface and coupling technology for the IMPs and interface tools for users. It is in the preparatory and planning stages; detailed technology choices are yet to be made.
• A lot of preparatory work and material (code catalogues, V&V procedures, ...) has been produced leading up to the project structure.

18. Pilot Project: Equilibrium reconstruction
• Tools (initial):
• a set of equilibrium reconstruction codes: EFIT, EQUINOX, CLISTE, CEDRES
• high-resolution equilibrium and flux surface coordinate mapping codes: CHEASE, HELENA, CAXE
• linear ideal MHD stability codes: CASTOR, MISHKA, KINX
• Ongoing activities:
• definition of standard interfaces between codes (an illustrative sketch follows below)
• abstraction of machine and diagnostic descriptions (codes to be made machine independent)
• reduction (recombination) and modularization of the different versions
See: http://crppwww.epfl.ch/~lister/euitmschemas/methods.htm and http://crppwww.epfl.ch/~lister/euitmschemas/schemas.htm
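To give a feel for what a standard interface between these codes might involve, the following is a hypothetical, much-simplified container that a reconstruction code could write and a high-resolution or stability code could read. The field names are invented for this sketch; the actual agreed definitions are the schemas at the URLs above.

```python
# Illustrative sketch only: a common container that reconstruction codes (EFIT,
# EQUINOX, CLISTE, ...) could write and downstream codes (CHEASE, HELENA,
# CASTOR, ...) could read, making the chain machine independent.
from dataclasses import dataclass, field

@dataclass
class Equilibrium:
    device: str                     # machine name, e.g. "JET"
    shot: int
    time: float                     # [s]
    psi_axis: float                 # poloidal flux at the magnetic axis [Wb]
    psi_boundary: float             # poloidal flux at the plasma boundary [Wb]
    r_grid: list = field(default_factory=list)   # major radius grid [m]
    z_grid: list = field(default_factory=list)   # vertical grid [m]
    psi_rz: list = field(default_factory=list)   # psi on the (R, Z) grid [Wb]
```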

  19. Equilibrium reconstruction and linear MHD (IMP#1)

20. Complementary approaches?
• US (highly simplified representation): build a progressively more comprehensive BP simulation capability through a series of FIIs; the platform design can be tailored to needs and requirements.
• EFDA (equally simplified): rapidly build a BP simulation capability (whole device modelling, WDM) based on improved and tested versions of existing tools, and deepen the physics fidelity through continuous model improvements and basic physics studies; a hierarchy of models of different physics fidelity can be used to provide targeted simulations for physics studies.
• Different approaches to a shared long-term goal:
• At what level do we need to harmonize and coordinate “technology tools” and decisions?
• What level of compatibility do we really need to share physics modules and to benchmark simulations?

21. The “Integrator”: the transport project
• All areas of interest for BP will benefit from improved coordination, integration and expansion of the physics content of the Integrated Modelling Projects.
• To fulfil longer term goals we need, in addition to more detailed and comprehensive physics models, to integrate existing and newly developed models into a more closely coupled entity.
• IMP#3 (Transport) has a strong overlap of physics packages with the other IMPs:
• equilibrium / linear MHD stability (IMP#1)
• sawteeth / ELM models (IMP#2)
• transport models (IMP#4)
• sources (NB, IC, EC, LH) (IMP#5)
• PWI (TF-PWI)
• The transport project thus provides the “integrator” role for the TF.

22. Transport project activities
• Existing codes: ASTRA, CRONOS, JETTO, RITM (1d codes); EDGE2D-NIMBUS, SOLPS (B2-EIRENE), … (2d codes); …
• These codes already provide a number of physics “packages”. We need to:
• identify these packages
• define standard interfaces
• extract the physics packages
• recode them to the standard interfaces (a minimal adapter sketch follows below)
• Existing codes can then switch to using the standard interface to expand their physics options.
• A prototype effort has already started as a joint effort with JET’s Integration Project (Parail).
• The physics packages would also be available for the CPP and other new/extended codes.
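The "recode to standard interfaces" step could, in the simplest case, amount to wrapping an extracted routine behind the agreed call signature. The sketch below is a toy illustration of that pattern; the routine, class and field names are invented and the physics is a placeholder.

```python
# Sketch of the "extract a physics package and recode it to a standard
# interface" step: the legacy routine keeps its own argument list, and a thin
# adapter exposes it through an agreed interface. Names are hypothetical.
def legacy_chi_model(te, ne, grad_te):
    """Stand-in for a physics package pulled out of an existing 1-D code."""
    return [0.5 + abs(g) * 1e-4 for g in grad_te]   # toy heat diffusivity [m^2/s]

class TransportCoefficientModule:
    """Assumed standard interface: plasma state in, transport coefficients out."""
    def coefficients(self, state):
        grad_te = [b - a for a, b in zip(state["te"], state["te"][1:])] + [0.0]
        return {"chi_e": legacy_chi_model(state["te"], state["ne"], grad_te)}

module = TransportCoefficientModule()
coeffs = module.coefficients({"te": [2000.0, 1500.0, 800.0], "ne": [1e19] * 3})
```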

23. Transport project activities
• As modules are extracted and standardised, there are opportunities for:
• cleaning up code [need standards and procedures]
• documenting code [need formats]
• code-code comparisons
• verification
• validation
• Strong input is needed from:
• the Code Platform Project, which defines the future platform these modules will fit into (decisions to be made soon: input from this meeting will be important!)
• the Data Coordination Project, for data for validation and for support on data structures, data access and tools
• Extremely important in order to obtain a “validated suite of tools for ITER exploitation”.

24. Framework platform: define the platform requirements
• Based on the actors’ (code developers, code users, administrators) needs and use cases.
• Principles for a long-lasting architecture: open source, formal or industry standards, components and layers.
• A high-level draft of the requirements is to be released.
• Find a few simulators or frameworks that:
• can fulfil our essential requirements
• are extensible enough to meet future requirements
• are future proof (how long a lifetime do we need? DEMO in 35 years!)
• Evaluate existing frameworks/tools to see which “best” fit the requirements.

25. Framework platform: existing code base
• Mainly in Fortran XX, with some C and C++; a valuable code base also exists in IDL and Matlab.
• Different structures and traditions at different labs and countries (in particular with respect to numerical/commercial libraries).
• The code base varies (widely) in quality and modularity (as do the developers).
• Computational resources and needs vary between end users; so far there is no real HPC centre in Europe, so we need to rely on existing and distributed resources.
• Code developers need tools to convert their code into appropriate components.
• A support structure (expertise and tools) is needed to couple different physics models (bridging different dimensionalities, boundaries, discretizations and solution methods) without introducing too much noise; this may need to be developed on a case-by-case basis (a small bridging sketch follows below).
• A software environment to work within.
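As one concrete example of the "bridging different dimensionalities" problem, the sketch below hands a 1-D core profile over to the boundary points of a 2-D edge grid by simple interpolation. The grids, names and profile are invented for illustration; real coupling schemes also have to conserve fluxes and control noise, which this toy does not address.

```python
# Sketch of one "bridging" task: interpolate a 1-D core Te profile onto the
# coupling-surface points of a 2-D edge grid. All values are made up.
import numpy as np

def core_to_edge_boundary(rho_core, te_core, rho_edge_boundary):
    """Interpolate the core Te profile onto the edge-grid boundary points."""
    return np.interp(rho_edge_boundary, rho_core, te_core)

rho_core = np.linspace(0.0, 1.0, 51)
te_core = 2000.0 * (1.0 - rho_core**2) + 100.0       # toy core profile [eV]
edge_boundary_rho = np.linspace(0.93, 0.97, 32)      # illustrative coupling region
te_at_boundary = core_to_edge_boundary(rho_core, te_core, edge_boundary_rho)
```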

26. Framework platform – end user needs
• Describe the geometry and experimental input of the device: => tools to handle 3D objects, magnetic geometry, and other inputs.
• Describe the coupling: => tools for meshing, setting boundaries, solvers and methods, etc.
• Describe the time scheduling: => tools to generate the workflow (branch, loop, …); a minimal workflow sketch follows below.
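The "generate the workflow" requirement is illustrated below with a deliberately tiny scheduler: named steps executed inside a time loop, with a branch on a failure flag. A platform tool would generate and edit such workflows rather than hard-code them; everything named here is hypothetical.

```python
# Minimal sketch of "describe the time scheduling": a workflow is a list of
# named steps run inside a time loop, with a simple branch on a failure flag.
def run_workflow(steps, state, t_end, dt):
    t = 0.0
    while t < t_end:                       # the outer time loop
        for name, step in steps:
            state = step(state)            # each step transforms the state
            if state.get("failed"):        # branch: abandon the simulation
                return state
        t += dt
    return state

steps = [("equilibrium", lambda s: s), ("transport", lambda s: s)]
final = run_workflow(steps, {"failed": False}, t_end=1.0, dt=0.1)
```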

27. Framework platform
• Possible test projects:
• port CRONOS (partially)
• 3D views of the Tore Supra pumped limiter
• List of candidates:
• Salome http://www.salome-platform.org
• ???

28. Summary
• The Task Force is active within a new project structure designed to meet the requirement of providing a “validated suite of software tools for ITER exploitation”.
• Physics projects are activated or in planning:
• equilibrium and linear MHD stability
• non-linear MHD and disruptions
• transport and discharge evolution
• micro-turbulence
• H&CD and fast particles
These are aimed at providing the physics background for comprehensive BP device modelling, and at serving as incubators for improvements in existing tools and drivers for new (needs-driven) theory/modelling efforts. Modularity is a key concept.
• The Integrated Modelling Projects will provide a range of models and codes of different physics fidelity, as standalone tools but also as modular packages.

29. Summary
• Full device modelling is to be supported and (continuously) improved through detailed physics studies (WDM supported by FIIs).
• The physics projects are supported through the Code Platform Project and the Data Coordination Project.
• The platform (framework) tools are not yet decided; a prototyping effort is underway.
• The code base is mainly Fortran XX, C and C++, with some tools and codes using IDL and Matlab code structures.
• There is an initial need to maintain a large degree of compatibility with existing efforts (to leverage development resources).
• Quality requirements on the end product mean that a number of “legacy” codes and modules need to be revisited (acceptable coding practices, reduction of the code base).
• A distributed pool of users in a heterogeneous computing environment requires portability and scalability of the framework.
• There is the possibility of large datasets and load.
• A wide array of requirements, no clear solution, or many possible approaches… your input is wanted!

30. A common framework for databases?
• Problems expanding the ITPA Confinement and ITB databases: issues with divergence and extensions.
• Important to have:
• flexibility (adding new data and types of data)
• structured data
• tight definitions of signals, including units and signs
• common data access methods
• Benefits of common definitions, structure and data access.
• A centralised relational DB would allow users to search for specific discharges.

31. Framework for data generation (Lister, Roach and Walters)
• XML, XSLT, XSD, HTML
• Definition of data: take the master definition and create the database definition.
• Creation of local files: take the database definition and local information to create an XML file.
• Filling of data: fill the MDSplus tree from the XML file and the local data store.
(A minimal parsing sketch follows below.)
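A rough, assumption-laden sketch of the first two steps: read a master XML definition of signals and combine it with a per-machine mapping to decide what would be written into the local store. The XML layout, signal names and mapping are invented for this example; the real schemas are those cited on slide 18, and the MDSplus fill step is not implemented here.

```python
# Minimal sketch of the definition -> local-file chain: parse a master XML
# definition of signals and combine it with local source information.
import xml.etree.ElementTree as ET

MASTER = """<signals>
  <signal name="ip" unit="A" description="plasma current"/>
  <signal name="te0" unit="eV" description="central electron temperature"/>
</signals>"""

LOCAL_MAP = {"ip": "magnetics/ip", "te0": "thomson/te(0)"}   # per-machine sources

def build_fill_list(master_xml, local_map):
    entries = []
    for sig in ET.fromstring(master_xml).findall("signal"):
        name = sig.get("name")
        if name in local_map:
            entries.append({"name": name, "unit": sig.get("unit"),
                            "source": local_map[name]})
    return entries   # a real tool would now write these into an MDSplus tree

print(build_fill_list(MASTER, LOCAL_MAP))
```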

32. JAMS architecture [flow diagram flattened in the transcript; recoverable labels: Edge Transport, Pre-process, Core Transport, Analysis; codes: Helena, Jetto, Prejet, Edge2d, Mishka, Cronos, Modex, Solps, Elite, Kinezero, Astra; infrastructure: UTC, JET, JAMS catalogue system, Catman, PPF, EX, MDSplus, ITER EU ?]
