
The PRISM infrastructure for Earth system models



  1. The PRISM infrastructure for Earth system models Eric Guilyardi, CGAM/IPSL and the PRISM Team • Background and drivers • PRISM project achievements • The future

  2. Why a common software infrastructure? • Earth system modelling expertise is widely distributed, both geographically and thematically

  3. Why a common software infrastructure? • Earth system modelling expertise widely distributed • Scientific motivation = facilitate sharing of scientific expertise and of models • Technical motivation = the technical challenges are large compared with the available effort • Need to keep scientific diversity while increasing efficiency – scientific and technical • Need for concerted effort in view of initiatives elsewhere: • The Frontier Project, Japan • The Earth System Modelling Framework, US

  4. PRISM concept: "Share Earth System Modelling software infrastructure across the community" To: • share development, maintenance and support • aid performance on a variety of platforms • standardize the model software environment • ease use of different climate model components

  5. Expected benefits • high-performance ESM software, developed by dedicated IT experts, available to institutes/teams at low cost: • - helps scientists to focus on science • - helps keep scientific diversity (survival of smaller groups) • Easier to assemble ESMs based on community models • shared infrastructure = increased scientific exchanges • computer manufacturers inclined to contribute: • - efficiency (porting, optimisation) on a variety of platforms • - next-generation platforms optimized for ESM needs • - easier procurements and benchmarking • - reduced computing costs

  6. Software structure of an Earth System Model (layer diagram): scientific core / coupling infrastructure / supporting software / running environment, with the layers to share indicated (i.e. f90)

  7. The long term view: towards standard ESM support library(ies) (diagram). Today, the Earth System model (science + support + environment) is maintained by the modeller, who also acts as IT expert, directly on top of the Fortran compiler and hardware. Tomorrow, the Earth System model (science only) will sit on a standard support library (incl. environment) above the Fortran compiler and hardware, leaving the modeller free for climate science work.

  8. The PRISM project • PRogram for Integrated Earth System Modelling • 22 partners • 3 years, Dec 2001 - Nov 2004 • €5 million funding under FP5 of the EC (~80 person-years) • Coordinators: G. Brasseur and G. Komen

  9. System specifications (diagram). Inputs to the PRISM infrastructure: the science - general principles, constraints from physical interfaces, …; the modellers/users - requirements, beta testing, feedback; the technical developments - coupler and I/O, compile/run environment, GUI, visualisation and diagnostics; the community models - atmosphere, atmospheric chemistry, ocean, ocean biogeochemistry, sea ice, land surface, … Let’s NOT re-invent the wheel!

  10. System specifications - the people: Reinhard Budich (MPI, Hamburg); Andrea Carril (INGV, Bologna); Mick Carter (Hadley Centre, Exeter); Patrice Constanza (MPI/M&D, Hamburg); Jérôme Cuny (UCL, Louvain-la-Neuve); Damien Declat (CERFACS, Toulouse); Ralf Döscher (SMHI, Stockholm); Thierry Fichefet (UCL, Louvain-la-Neuve); Marie-Alice Foujols (IPSL, Paris); Veronika Gayler (MPI/M&D, Hamburg); Eric Guilyardi* (CGAM, Reading and LSCE); Rosalyn Hatcher (Hadley Centre, Exeter); Miles Kastowsky (MPI/BGC, Jena); Luis Kornblueh (MPI, Hamburg); Claes Larsson (ECMWF, Reading); Stefanie Legutke (MPI/M&D, Hamburg); Corinne Le Quéré (MPI/BGC, Jena); Angelo Mangili (CSCS, Zurich); Anne de Montety (UCL, Louvain-la-Neuve); Serge Planton (Météo-France, Toulouse); Jan Polcher (LMD/IPSL, Paris); René Redler (NEC CCRLE, Sankt Augustin); Martin Stendel (DMI, Copenhagen); Sophie Valcke (CERFACS, Toulouse); Peter van Velthoven (KNMI, De Bilt); Reiner Vogelsang (SGI, Grasbrunn); Nils Wedi (ECMWF, Reading). * Chair

  11. PRISM achievements (so far): • Software environment (the tool box): • a standard coupler and I/O software, OASIS3 (CERFACS) and OASIS4 • a standard compiling environment (SCE) at the scripting level • a standard running environment (SRE) at the scripting level • a Graphical User Interface (GUI) to the SCE (PrepIFS, ECMWF) • a GUI to the SRE for monitoring the coupled model run (SMS, ECMWF) • standard diagnostic and visualisation tools • Adaptation of community Earth System component models (GCMs) and demonstration coupled configurations • A well co-ordinated network of expertise • Community buy-in and trust-building
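  As a concrete illustration of the first toolbox item, here is a minimal sketch of a component model instrumented with the OASIS3 PSMILe ("proto") interface, following the pattern of the OASIS3 toy models: initialise, describe the partition, declare the coupling fields, exchange in the time loop, terminate. The component and field names ('toyocn', 'SST_OCN', 'TAU_ATM'), the grid size and the coupling period are hypothetical, and argument details should be checked against the OASIS3 user guide.

    ! Sketch of an ocean component coupled through the OASIS3 PSMILe.
    ! Names and sizes are illustrative only.
    program toyocn
      use mod_prism_proto                 ! PRISM_Out, PRISM_In, PRISM_Real, ...
      use mod_prism_def_partition_proto
      use mod_prism_put_proto
      use mod_prism_get_proto
      implicit none
      integer, parameter :: nx = 180, ny = 90     ! local grid (hypothetical)
      integer :: compid, partid, id_sst, id_tau, ierr
      integer :: ig_paral(3), var_nodims(2), var_shape(4)
      integer :: istep, isec
      real(kind=8) :: sst(nx,ny), tau(nx,ny)      ! kind must match the PSMILe build

      ! 1. Attach this executable to the coupler under its component name
      call prism_init_comp_proto (compid, 'toyocn', ierr)

      ! 2. Describe the local partition (serial layout: style 0, offset 0, size nx*ny)
      ig_paral = (/ 0, 0, nx*ny /)
      call prism_def_partition_proto (partid, ig_paral, ierr)

      ! 3. Declare the coupling fields; the symbolic names must match the
      !    coupler configuration (the OASIS3 namcouple)
      var_nodims = (/ 2, 1 /)                     ! rank 2, 1 bundle
      var_shape  = (/ 1, nx, 1, ny /)
      call prism_def_var_proto (id_sst, 'SST_OCN', partid, var_nodims, &
                                PRISM_Out, var_shape, PRISM_Real, ierr)
      call prism_def_var_proto (id_tau, 'TAU_ATM', partid, var_nodims, &
                                PRISM_In, var_shape, PRISM_Real, ierr)
      call prism_enddef_proto (ierr)

      ! 4. Time loop: the PSMILe itself decides, from the configured coupling
      !    period, whether a call really sends/receives or is a no-op
      sst = 290.0d0
      do istep = 0, 23                            ! one day, hourly steps
         isec = istep * 3600
         call prism_get_proto (id_tau, isec, tau, ierr)
         ! ... scientific core advances sst using tau here ...
         call prism_put_proto (id_sst, isec, sst, ierr)
      end do

      ! 5. Detach cleanly from the coupler
      call prism_terminate_proto (ierr)
    end program toyocn

  Because the put/get calls are driven by the external configuration, the same instrumented source can serve many coupled (or uncoupled) experiments without code changes, which is what makes a single standard coupler practical across the community models.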

  12. The PRISM shells (diagram): the inner shell is the scientific core with its historic I/O; around it sits the PSMILe (coupling and I/O); the outer shells are the Standard Compile Environment and the Standard Running Environment.

  13. Adapting Earth System Components to PRISM - levels of adaptation (diagram): + PSMILe (PRISM Model Interface Library) + PMIOD (Potential Model I/O Description), then the SCE, the SRE, and the User Interface.
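  The shell structure can be pictured with a small illustrative sketch (not actual PRISM source): the scientific core calls one neutral routine per exchanged field, and the compile environment selects, at build time, whether that routine is backed by the historic file I/O (inner shell) or by the PSMILe. The module name, the USE_PSMILE macro and the field id are hypothetical.

    ! Illustrative "shell" around the scientific core: same call site,
    ! two possible backings chosen when the model is compiled.
    module forcing_shell
      implicit none
      integer :: id_flux = -1   ! coupling field id, set at start-up in a coupled build
    contains
      subroutine get_surface_flux (isec, flux)
        integer,      intent(in)  :: isec        ! model time in seconds
        real(kind=8), intent(out) :: flux(:,:)
        integer :: ierr
    #ifdef USE_PSMILE
        ! Coupled build: the field arrives from the partner component via
        ! the coupler (id_flux was declared with prism_def_var_proto)
        call prism_get_proto (id_flux, isec, flux, ierr)
    #else
        ! Stand-alone build: keep the historic behaviour, e.g. read a
        ! climatological forcing file (placeholder value here)
        flux = 0.0d0
        ierr = 0
    #endif
      end subroutine get_surface_flux
    end module forcing_shell

  Compiled without the macro, the model runs stand-alone exactly as before; compiled with it and linked against the PSMILe, the same scientific core becomes a PRISM component, which is the point of leaving the inner shell untouched.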

  14. Configuration management and deployment (diagram): the User Interface drives the SCE, which builds the binary executables, and the SRE with its Driver, which runs them; files move via transfer disks.

  15. PRISM GUI remote functionality (diagram): the user reaches the PrepIFS/SMS web services over the Internet; configurations are deployed from the PRISM repositories (CSCS, MPI) and transferred to instrumented sites of different architectures (A, B, C), each running its own Driver.

  16. Standard scripting environments • Standard Compiling Environment (SCE) • Standard Running Environment (SRE)

  17. Data processing and visualisation

  18. CGAM contribution (Jeff Cole): assembled coupled models, platforms, demonstration experiments

  19. Development coordinators • The coupler and I/O - Sophie Valcke (CERFACS) • The standard environments - Stefanie Legutke (MPI) • The user interface and web services - Claes Larsson (ECMWF) • Analysis and visualisation - Mick Carter (Hadley Centre) • The assembled models - Stefanie Legutke (MPI) • The demonstration experiments - Andrea Carril (INGV)

  20. Community buy-in • Growing! • Workshops and seminars • 15 pioneer models adapted (institutes’ involvement) • 9 test supercomputers instrumented • Models distributed under the PRISM environment (ECHAM5, OPA 9.0) • Community programmes relying on the PRISM framework (ENSEMBLES, COSMOS, MERSEA, GMES, NERC, …) • To go further: • PRISM perspective: maintain and develop the tool box • Institute perspective: get the timing of, and involvement in, the next steps right

  21. Collaborations • Active collaborations: • ESMF (supporting software, PMIOD, MOM4) • FLUME (PRISM software) • PCMDI (visualisation, PMIOD) • CF group (CF names) • NERC (BADC & CGAM) (meta-data, PMIOD) • M&D, MPI (data) • Earth Simulator (install PRISM system V.0) PRISM has put Europe in the loop for community-wide convergence on basic standards in ES modelling

  22. PRISM Final Project Meeting De Bilt, October 7-8, 2004

  23. The future • PRISM has delivered a tool box, a network of expertise and demonstrations • Community buy-in is growing • Key need for sustainability of: • tool box maintenance/development (new features) • the network of expertise • PRISM sustained initiative (set-up meeting held in Paris, Oct 27, 2004)
