
Coordinated Activities on Evaluation of Collisional Data for Fusion Applications


Presentation Transcript


  1. Coordinated Activities on Evaluation of Collisional Data for Fusion Applications H.-K. Chung and B. J. Braams Atomic and Molecular Data Unit, Nuclear Data Section Division of Physical and Chemical Sciences, IAEA, Vienna October 1-4, 2012 ICAMDATA-8, Gaithersburg, MD

  2. Contributors to this presentation (direct): I. Murakami, D. Kato, T. Nakano, Y. Itikawa, Y. Nakamura, Alex M. Imai, H. Takagi, T. Kato, F. Koike (Japan); G. W. F. Drake, D. R. Schultz, A. Kramida, P. Krstic, E. Landi, C. Ballance (USA); J.-S. Yoon, M.-Y. Song, H. Cho, C.-G. Kim, J.-O. Choi (Korea); J. Yan, G. Liang (China); M. O'Mullane, N. Mason, K. Aggarwal (UK); D. Reiter, D. Coster, J. Roth (Germany); V. Kumar (India); S. Buckman (Australia); V. Shevelko (Russia); G. Karwasz (Poland); S. Lisgo (ITER)

  3. Outline • IAEA Atomic and Molecular Data Unit • Needs of Evaluated Data for Fusion Applications • IAEA Coordinated Activities on Evaluation of Collisional Data for Fusion Applications • Future Activities

  4. Who we are and why we are here: the IAEA Atomic and Molecular Data Unit

  5. IAEA: Accelerate and enlarge the contribution of atomic energy to peace, health and prosperity. The IAEA (155 Member States, about 2200 staff) assists its Member States, in the context of social and economic goals, in planning for and using nuclear science and technology for various peaceful purposes, including the generation of electricity. The IAEA A+M Data Unit was formed in 1977, following a 1976 meeting at Culham Laboratory, UK, to review progress and achievements of A+M/PSI data for the fusion program and to stimulate international cooperation in the measurement, compilation and evaluation of A+M/PSI data for fusion.

  6. International Coordination of A+M/PSI Data Research for Fusion (slide diagram): the Unit links fusion plasma modelling, measurements and theories with data production, data compilation and data evaluation through Consultants Meetings (CM), Technical Meetings (TM), Coordinated Research Projects (CRP), publications (INDC, APID, Bulletin) and databases (AMBDAS, GENIE, ALADDIN, Wiki…).

  7. Online A+M/PSI Data Services: http://www-amdis.iaea.org

  8. Network Collaboration for A+M/PSI Data for Fusion (slide diagram of IAEA coordination through CRPs, publications, the knowledgebase, databases and meetings)
  • Data Users (fusion laboratories): ITER; EFDA; JET, UKAEA; ASDEX-Upgrade, IPP; TEXTOR, Jülich, FZJ; KSTAR, NFRI; NIFS; JAEA; PPPL; ORNL
  • Data Centres & Evaluators (Data Centre Network): ADAS, H. Summers; CRAAMD, Y. Jun; IAEA, B. J. Braams; JAEA, T. Nakano; KAERI, Y. Rhee; Kurchatov, Yu. Martynenko; NIFS, I. Murakami; NIST, W. L. Wiese; NFRI, J. Yoon; ORNL, D. R. Schultz
  • Data Producers (Code Centre Network): Curtin Univ., I. Bray; Kitasato Univ., F. Koike; Univ. Autonoma de Madrid, I. Rabadan; Univ. P. & M. Curie, Paris, A. Dubois; Univ. of Bari, M. Capitelli; Kurchatov Institute, A. Kukushkin; Lebedev Institute, L. Vainshtein; FZJ, D. Reiter; Ernst-Moritz-Arndt Univ., R. Schneider; NIST, Y. Ralchenko; PPPL, D. Stotler; LANL, J. Abdallah Jr.; IAEA, B. J. Braams; HULLAC, M. Klapisch; CNEA, P. D. Fainstein

  9. Data Users Need: Evaluated and Recommended Data • Code Centre Network Meeting (October 2010) • The CCN is organized to improve online code capabilities so that needed data can be provided to data users, particularly plasma modellers • Data users (D. Coster, D. Reiter, R. Schneider, D. Elder) participated in order to interact with the code producers • Discussions: • Online codes generate too many data sets without quality information • Data users need complete sets and/or recommended data • At every meeting, data evaluation / quality was an issue • IAEA Meetings • VAMDC (Compilation & Distribution) → SUP@VAMDC

  10. From the user's perspective: Needs of Evaluated Data for Fusion Applications

  11. Typical edge transport code runtime (for the same model, same equations, same grid size): TEXTOR (R=1.75 m), Jülich, GER: 1 day; JET (R=2.96 m), Oxford, UK: 1-2 weeks; ITER (R=6.2 m), Cadarache, FRA: 3 months. The runtime grows with machine size because plasma chemistry becomes more important (increased non-linearity and non-locality in the sources).

  12. Requirement for Plasma Modelling: reliable data sets for AMNS (atomic, molecular, nuclear, surface) data • Data needs to be “verified” by an expert • Data needs to be “robust” • Data needs to be “comprehensive” • Data needs to be easy to use • Version control: know what data was used for a particular run, and which runs used a particular version of the data (see the sketch after this slide) • Needs to be efficient • Needs to be able to address special needs. Together these require a Recommended & Internationally Agreed Library for Atomic, Molecular, Nuclear and Surface Data.
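One way to meet the version-control requirement above, sketched in Python with hypothetical file names and a hypothetical metadata layout (not an IAEA, ITER or ADAS tool), is to fingerprint every AMNS data file that a run reads and store the fingerprints in a small provenance record for that run:

```python
# Minimal sketch: tie a modelling run to the exact versions of its AMNS data
# files by recording a content hash for each file. File names, run IDs and the
# JSON layout are hypothetical and only for illustration.
import hashlib
import json
from datetime import datetime, timezone

def dataset_fingerprint(path: str) -> str:
    """Return a SHA-256 fingerprint of a data file, used here as its version tag."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def record_run(run_id: str, data_files: list[str], out_path: str) -> None:
    """Write a provenance record linking a run to the data versions it used."""
    record = {
        "run_id": run_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "data_versions": {p: dataset_fingerprint(p) for p in data_files},
    }
    with open(out_path, "w") as f:
        json.dump(record, f, indent=2)

# Example (hypothetical files):
# record_run("iter-edge-0042", ["amns_be_rates.dat"], "run0042_provenance.json")
```

Querying in the other direction (which runs used a particular data version) then reduces to searching the stored fingerprints.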

  13. IAEA activities on evaluation and recommendation: a series of meetings to organize community efforts for the evaluation and recommendation of A+M/PSI data.

  14. Data Centre Network Meeting (2011) • Data evaluation tasks are difficult: • Lack of man-power: experts retiring or leaving the field • Few young people enter the field (no publications, no funding) • Evaluation requires multiple data sets: there are too many or too few • Very few benchmark experiments for collisional data • Even fewer uncertainty estimates for theoretical data • Conclusions: • Data should first be collected and made available for evaluation • Evaluation activities should be organized in the community • Evaluation guidelines should be established in the community • A list of recommended data sets should be available as the final product • Current status: • NIST: critical evaluation of atomic structure and transition probabilities • NFRI/KRISS: national efforts to establish standard data sets • NIFS/JAEA: evaluated data libraries, collaboration • IAEA/ORNL: ALADDIN, individual consultancies

  15. Coordination Meetings for Evaluation: http://www-amdis.iaea.org/DCN/Evaluation/

  16. IAEA-NFRI TM on Data Evaluation: http://www-amdis.iaea.org/meetings/NFRI2012/ • 24 Presentations (2.5 days) and 1 day Technical Discussions • Topics (Focused on Reaction Data) • Current Evaluated Databases (Kramida, Landi, Mason) • Evaluation Methods and Experiences (Itikawa, Kumar, Cho, Karwasz) • Error Propagation and Sensitivity Analysis (O’Mullane, Ballance, Reiter, Krstic) • Theoretical Data Evaluation (Aggarwal, Liang, Takagi, Song) • Experimental Data Evaluation (Nakamura, Buckman, Shevelko, Imai) • Data Centres Evaluation Activities (Yoon, Murakami, Mason, Chung) • Participants from Australia, China, Germany, India, Japan, Korea, Poland, Russia, UK, USA and IAEA.

  17. Summary of Discussions • Community Role • Involve the community in data evaluation • Engage the young generation for transfer of knowledge • Define terminology and vocabulary used in evaluation • Define common workflow guidelines • Technical Issues • Assessment of theoretical data • Assessment of experimental data • Error propagation and sensitivity analysis

  18. Community Role: Consensus Building • Change of notion: Databases → Data research • Engage the younger generation (in early career) • The culture should teach that data evaluation is a critical part of scientific work • Publication issues: review publications are good for career development • Funding issues: evaluation reveals gaps in the understanding of the field and identifies interesting problems • Disseminate materials to train students and researchers in “critical analysis skills” • Disseminate the standard definitions of terminology adopted by international organizations (IAEA, IUPAC, IUPAP, BIPM, ISO, WHO, FAO, etc.) • Agree on the procedure of evaluation towards standard reference data

  19. Define Terminology: the Uncertainty Approach. It is NOT AN ERROR but AN UNCERTAINTY
  • Terminology in metrology: VIM (Vocabulaire International de Métrologie, Bureau International des Poids et Mesures), 2007; GUM (Guide to the Expression of Uncertainty in Measurement), 2008
  • Measurement and uncertainty: the objective of a measurement is to determine the value of the measurand (GUM); in general, a measurement has imperfections that give rise to an error in its result
  • Error approach: Error = Measurement result - True value, where the true value is the value consistent with the definition of the given particular quantity
  • Uncertainty approach: Error = Measured value - Reference value, where the reference (assigned) value can be a true quantity value of the measurand, in which case the measurement error is unknowable, or an appropriate known quantity value such as a conventional quantity value or a specified target quantity value to be realized in a production process. The result is reported as a value and an uncertainty.
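To make the GUM vocabulary concrete, here is a minimal Python sketch with invented numbers: independent standard uncertainty components are combined in quadrature and the result is reported as a value with an expanded uncertainty at coverage factor k = 2, the usual GUM way of stating a measurement result.

```python
# Minimal GUM-style sketch (illustrative numbers only): combine independent
# standard uncertainty components in quadrature and report the measured value
# together with an expanded uncertainty (coverage factor k = 2).
import math

def combined_standard_uncertainty(components):
    """Root-sum-square of independent standard uncertainty components."""
    return math.sqrt(sum(u * u for u in components))

measured_value = 3.42e-16           # hypothetical cross section, cm^2
u_components = [0.05e-16,           # e.g. statistical scatter
                0.08e-16,           # e.g. target density normalization
                0.03e-16]           # e.g. detector efficiency calibration

u_c = combined_standard_uncertainty(u_components)
U = 2.0 * u_c                       # expanded uncertainty, k = 2

print(f"result: ({measured_value:.2e} +/- {U:.2e}) cm^2 (k = 2)")
```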

  20. Any measurement has an uncertainty: the result is a value and an uncertainty, and it depends on the time, method, place, person and procedure of the measurement.

  21. Uncertainty Approach based on VIM & GUM: The true value lies within the uncertainty range

  22. Common Workflow Guidelines for Evaluation of Collisional Data (modelled on the NIST workflow for critical evaluation of data on wavelengths and energy levels) Advantages: • Easier to expand the evaluators' network to include early-career researchers • Introduces more rigorous procedures for evaluation and increases the dependability of the evaluation Disadvantages: • The quality of the evaluation depends critically on the experience of the evaluators • Different people may reach different conclusions using the same guidelines, so the results may not be reproducible Solutions: • Collaboration can help reduce the disadvantages • Evaluation by scientific advisors and editorial panels would be a strong mechanism for producing the evaluated data library

  23. Evaluation by an Editorial Board Panel • Evaluation and recommendation require community consensus with an endorsement from the IAEA or other international authorities • Establishment of the evaluation guidelines: these will evolve with time and experience, with broad collaboration from the community • Group evaluation: 4-5 panelists, both early-career and senior, with broad backgrounds (experimentalists, theoreticians, producers and users), like the editorial board of a journal • Self-evaluation: data producers with deep knowledge in some cases; may work better for theoretical data sets • Merits of group evaluation: • Facilitates knowledge transfer to the younger generation • Review papers can be written about the evaluation work • Makes the data research project visible to the community

  24. An example of evaluation towards Standard Reference Data: Data Compilation → 1st evaluation (experts) → Evaluation (NFRI) → Final Evaluation (Panel Decision).

  25. Summary of Discussions • Community Role • Involve the community in data evaluation • Engage the young generation for transfer of knowledge • Define terminology and vocabulary used in evaluation • Define common workflow guidelines • Technical Issues • Assessment of theoretical data • Assessment of experimental data • Error propagation and sensitivity analysis

  26. Theoretical Data Evaluation • There are no established criteria for assessing theoretical data • A critical need: guidelines for uncertainty estimates of theoretical data • One should not try to give a straight recipe for assessing uncertainties; however, there are several approaches to start with • There are prescriptions, such as energy grids for resonances and partial waves • One may enlarge a model to check convergence and estimate uncertainties based on assumptions within the model (see the sketch after this slide) • Comparisons with experiments: this can be dangerous • Comparisons among different theories: if one theory is demonstrably better than the others, it may be given benchmark status • For scattering data we should perhaps aim for “ideas” or “suggestions” rather than “guidelines” • Theoreticians may already have an idea of the uncertainty estimates • Journal policies can change the culture: e.g. the PRA editorial policies
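As one concrete illustration of the convergence-based bullet above, the following sketch (hypothetical cross sections, not evaluated data) enlarges the model step by step and takes the spread of the last refinements as a rough uncertainty estimate. It is an example of an "idea" in the sense of this slide, not an agreed guideline.

```python
# Minimal sketch with invented numbers: estimate the uncertainty of a
# theoretical cross section from its convergence as the model is enlarged
# (e.g. more coupled states or partial waves). The spread of the last
# refinement steps is taken as a rough uncertainty estimate.

model_sizes = [10, 20, 40, 80]              # e.g. number of coupled states
cross_sections = [1.32, 1.21, 1.17, 1.155]  # hypothetical results (arb. units)

best_value = cross_sections[-1]
# Differences between successive calculations measure the remaining convergence error
steps = [abs(b - a) for a, b in zip(cross_sections, cross_sections[1:])]
uncertainty_estimate = max(steps[-2:])      # spread over the last two refinements

print(f"recommended value: {best_value} +/- {uncertainty_estimate} (convergence-based)")
```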

  27. Experimental Data Evaluation • Check list: • Uncertainty estimates or error assessment are critical • Self-consistency is checked (a minimal sketch follows this slide) • Experimental techniques are evaluated • The reputation of the data producer is considered • Anomalies in some experimental processes (ro-vibrational / metastable) are noted • Wish list: • Evaluation by a group of “established” experts with broad expertise • Provide recommended values where possible • Include a comparison with theory and an assessment of the overall status • Evaluation will lead to an understanding of the gaps in the field • Establish “benchmarks” where possible
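One simple way to implement the self-consistency check in the list above is to test whether two independent measurements of the same quantity agree within their combined uncertainty, for example through a normalized difference; the sketch below uses invented numbers purely for illustration.

```python
# Minimal sketch (hypothetical values): check whether two independent
# measurements of the same cross section agree within their combined
# standard uncertainty using a normalized difference.
import math

def normalized_difference(x1, u1, x2, u2):
    """(x1 - x2) divided by the combined standard uncertainty of both values."""
    return (x1 - x2) / math.sqrt(u1 ** 2 + u2 ** 2)

a = (2.10, 0.10)   # measurement 1: value, standard uncertainty (arb. units)
b = (2.35, 0.12)   # measurement 2

z = normalized_difference(a[0], a[1], b[0], b[1])
# |z| <= 2 is commonly read as agreement at roughly the 95 % level
print(f"normalized difference: {z:.2f} -> {'consistent' if abs(z) <= 2 else 'discrepant'}")
```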

  28. Error Propagation and Sensitivity Analysis: uncertainties in the “Data” and in the “Data Processing Toolbox”? The processing chain (slide diagram): fundamental data from atomic structure and collision codes and from experiment → CR transition matrix A = A_excitation + A_radiative + A_ionization + A_recombination + A_charge-exchange + … → processed data (rate coefficients, effective rates, population coefficients, cooling rates, beam stopping rates, …). The toolbox involves the velocity distribution (Boltzmann solver or Maxwellian), Monte Carlo methods, linear algebra and ODE solvers. Sensitivity and error propagation to the final model results (partial and integro-differential equations, …) is left to the modellers and spectroscopists. A Monte Carlo propagation sketch follows this slide.
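To illustrate the propagation problem described on this slide, the sketch below builds a small collisional-radiative rate matrix from invented rates for a three-level system, solves for the steady-state populations, and propagates an assumed relative uncertainty of the individual rate coefficients by Monte Carlo sampling. All rates and uncertainties are hypothetical; the point is only the mechanics of pushing data uncertainties through the CR matrix.

```python
# Minimal sketch (hypothetical rates): assemble A = A_excitation + A_radiative + ...,
# solve for steady-state level populations, and propagate assumed rate-coefficient
# uncertainties by Monte Carlo sampling.
import numpy as np

rng = np.random.default_rng(0)

def total_matrix(rate_matrices):
    """Full rate matrix: element [i, j] is the rate from level j to level i;
    the diagonal holds minus the total loss rate out of each level."""
    A = sum(rate_matrices)
    np.fill_diagonal(A, 0.0)
    np.fill_diagonal(A, -A.sum(axis=0))
    return A

def steady_state(A):
    """Solve A n = 0 with the normalization sum(n) = 1."""
    M = A.copy()
    M[-1, :] = 1.0                  # replace the last equation by the normalization
    b = np.zeros(A.shape[0])
    b[-1] = 1.0
    return np.linalg.solve(M, b)

# Hypothetical 3-level system; [i, j] holds the rate (s^-1) from level j to level i
excitation   = np.array([[0.0, 0.0, 0.0], [2.0e3, 0.0, 0.0], [5.0e2, 1.0e3, 0.0]])
deexcitation = np.array([[0.0, 3.0e3, 8.0e2], [0.0, 0.0, 2.0e3], [0.0, 0.0, 0.0]])
radiative    = np.array([[0.0, 1.0e6, 2.0e5], [0.0, 0.0, 5.0e5], [0.0, 0.0, 0.0]])
rel_unc = 0.2                       # assumed 20 % relative uncertainty on every rate

samples = []
for _ in range(2000):
    perturbed = [R * rng.lognormal(mean=0.0, sigma=rel_unc, size=R.shape)
                 for R in (excitation, deexcitation, radiative)]
    samples.append(steady_state(total_matrix(perturbed)))

samples = np.array(samples)
print("mean populations:", samples.mean(axis=0))
print("std. deviations :", samples.std(axis=0))
```

The same pattern extends to effective rates or cooling rates: each Monte Carlo sample yields one value of the derived quantity, and the spread of the samples provides the propagated uncertainty.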

  29. Future activities: we need the community's feedback and support.

  30. Near-term goals • Priorities for evaluation: • Electron scattering on Be^q+ (IAEA CRP) • Electron scattering on CH4 (NFRI group) • Charge exchange and electron loss for H on Be^q+ • Action items: • Develop an evaluators' network: key people identified • Inventorise the data sets that are now used by fusion plasma modellers • Sketch out guidelines for uncertainty assessment of theoretical data • Organize a workshop (SUP@VAMDC) • NFRI to organize a data evaluation group for demonstration • The ITER project should recognize the need for standard reference data (SRD) for the A+M processes used in the design

  31. Long-term goal: a global network towards an Internationally Agreed Data Library for Fusion and other Plasma Applications (slide diagram): Data Users express the data needs; Data Producers carry out experimental and theoretical data production; Data Evaluators perform data compilation, evaluation and recommendation.

  32. The development of a standard library: document the data sets used by the fusion community → Priority List of Critically Needed Data → Database of Available Data for Evaluation → Basis of the Evaluated Data Library.

  33. Evaluation of evaluated data sets: a prototype of an evaluated data library

  34. Summary • The series of IAEA meetings, including the Joint IAEA-NFRI TM on Data Evaluation, were highly successful in drawing consensus from the participants on coordinated data evaluation activities by the community. • Disseminate the concepts of VIM3 (International Vocabulary of Metrology), GUM (Guide to the Expression of Uncertainty in Measurement) and “critical assessment skills” • Engage the younger generation in the process • Collaborate with colleagues in the community • Change the culture about data research through publications • The IAEA A+M Data Unit will actively participate in organizing and coordinating the community effort in data evaluation, ultimately towards a standard data library for fusion applications. • Assess the needs of the user communities • Collaborate with SUP@VAMDC • We urge you, the community, to join us in the data evaluation activities, which will benefit data users, producers and evaluators in the future.
