
Common Modeling Infrastructure: ESMF to NUOPC to GIP



  1. Common Modeling Infrastructure: ESMF to NUOPC to GIP Cecelia DeLuca, NOAA ESRL/CIRES, May 18, 2010

  2. Outline • Common Modeling Infrastructure • ESMF • Part 1: Prototype • Part 2: Clean Up • Part 3: NUOPC and Applications • GIP • Summary

  3. Origins • The Common Modeling Infrastructure Working Group (late 1990s) • Chaired by Steve Zebiak/IRI and Robert Dickinson/GA Tech • Brought together research and operational groups, several of which had developed institutional frameworks: GEMS at NASA, the Flexible Modeling System at GFDL • Members were motivated by and participated in reports and papers calling for common infrastructure [1,2,3] • Experimented with the Kalnay rules for physics interoperability [4] (see the sketch below) • Formulated a collective response to a NASA solicitation calling for an Earth System Modeling Framework, ESMF (2001)
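
To make the idea behind the Kalnay rules concrete, here is a minimal Python sketch of plug-compatible physics, the pattern those rules promote: every parameterization accepts the same state and returns tendencies, so schemes can be swapped without touching the host model. All names, schemes, and constants below are invented for illustration; they are not from the Kalnay et al. paper.

```python
from typing import Callable, Dict, List

import numpy as np

# A column state: named profile arrays (temperature, humidity, ...).
State = Dict[str, np.ndarray]
# A parameterization maps (state, dt_seconds) -> tendencies per variable.
Parameterization = Callable[[State, float], State]

def simple_radiation(state: State, dt: float) -> State:
    """Toy radiation scheme: relax temperature toward 250 K."""
    relax_rate = 1.0 / (20 * 86400.0)  # invented 20-day timescale
    return {"T": -relax_rate * (state["T"] - 250.0)}

def apply_physics(state: State, schemes: List[Parameterization], dt: float) -> None:
    """Host-model hook: any scheme with the common signature plugs in."""
    for scheme in schemes:
        for var, tendency in scheme(state, dt).items():
            state[var] += dt * tendency

state = {"T": np.full(30, 288.0), "q": np.full(30, 0.01)}
apply_physics(state, [simple_radiation], dt=1800.0)
```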

  4. ESMF Part 1: The Prototype • First round: Three linked proposals to NASA Earth Science Technology Office (PIs Killeen/NCAR, da Silva/NASA, Marshall/MIT, 2002) • Focused on a layered architecture: ESMF scope included a utility layer (parallel communication, time management, error handling) and a coupling layer, with user code sandwiched in between:
  • ESMF Superstructure: Components Layer (Gridded Components, Coupler Components)
  • User Code: Model Layer
  • ESMF Infrastructure: Fields and Grids Layer, Low Level Utilities
  • External Libraries: BLAS, MPI, NetCDF, …
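
The sandwich can be sketched conceptually in a few lines of Python. This is purely illustrative, assuming a toy Clock and invented lifecycle names; it is not the ESMF API, which is a Fortran/C library with far richer component, state, and time-management types.

```python
from abc import ABC, abstractmethod

class Clock:
    """Stand-in for an infrastructure-layer time manager (toy, not ESMF)."""
    def __init__(self, start: float, stop: float, step: float):
        self.current, self.stop, self.step = start, stop, step
    def is_done(self) -> bool:
        return self.current >= self.stop
    def advance(self) -> None:
        self.current += self.step

class GriddedComponent(ABC):
    """The standard interface every user model implements (model layer)."""
    @abstractmethod
    def initialize(self, clock: Clock) -> None: ...
    @abstractmethod
    def run(self, clock: Clock) -> None: ...
    @abstractmethod
    def finalize(self) -> None: ...

def drive(component: GriddedComponent, clock: Clock) -> None:
    """Superstructure: sequences any conforming component's lifecycle."""
    component.initialize(clock)
    while not clock.is_done():
        component.run(clock)
        clock.advance()
    component.finalize()
```

Any model wrapped to this interface can be driven, nested, or coupled by the same superstructure code, which is the interoperability point of the layered design.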

  5. ESMF Part 1: Goals Aim for models that are: • Scalable in complexity: Models are built from modular components, and can be nested within larger applications • Performance-portable: ESMF high-performance communication libraries offer a consistent interface across computer architectures • Exchangeable: Standard component interfaces enable interoperability

  6. ESMF Part 1: Successes • ESMF scope and architecture defined [5] • The GEOS-5 atmospheric GCM used ESMF extensively • Hierarchical architecture, shown at right • Each box is a component with standard interfaces • Many functions filled in by NASA GEMS • ESMF created a network of technical collaborators

  7. ESMF Part 2: The Clean-Up • Second round support came from the DoD Battlespace Environments Institute, NASA Modeling Analysis and Prediction Program, and NOAA NWS • ESMF v3 (start 2005) • Restructured the development team • Rewrote central data structures for greater performance and flexibility • ESMF v4 (start 2006) • Rewrote the grid and regridding software

  8. ESMF Part 2: Successes • Performance, portability, robustness • Unit/system test suite, regression tested on 30+ platforms, performance overhead negligible (typically <3%), bug fixes • Capability • Multiple modes of coupling, logically rectangular or unstructured grids • Adoption • Used in CCSM4, GEOS-5, COAMPS, GFS, NEMS, TIMEGCM, and other codes

  9. ESMF Part 2: Multi-Agency Governance
  Executive Management:
  • Executive Board (meets annually): strategic direction, organizational changes, board appointments, reporting, joint meetings
  • Review Committee: external review
  • Interagency Working Group: stakeholder liaison, programmatic assessment & feedback, reporting
  Working Project:
  • Joint Specification Team (meets weekly): requirements definition; design, code, and other reviews; external code contributions; collaborative design; beta testing
  • Change Review Board (meets quarterly): functionality change requests, development priorities, release review & approval, potential standardization tasks
  • Core Development Team (works daily): software project management, software development of ESMF, development of the NUOPC Layer (NEW), testing & maintenance, distribution & user support
  • NUOPC Content Standards Committee (NEW, meets monthly): conventions for physical constants, documentation, metadata, etc.; community standards; input into NUOPC standards
  Inputs flowing among these groups include resource constraints, proposed NUOPC standards, and the implementation schedule.

  10. ESMF Part 3: NUOPC and Apps • National Unified Operational Prediction Capability (NUOPC) aims to develop an operational multi-model ensemble for numerical weather prediction • ESMF as a technical foundation for component interoperability • The level of interoperability desired requires greater specification than ESMF alone provides • Solution: create the NUOPC Layer, with areas of activity outlined in a NUOPC Common Model Architecture (CMA) report [6] • New committees: CMA (Chairs Lapenta/NCEP and McCarren/Navy) and the Content Standards Committee, or CSC (Chair Campbell/NRL)

  11. NUOPC Layer Encode interoperability rules in code and guidance documents: • Code templates, including component and coupler templates, for describing software structure that is not part of ESMF proper • Additional rules (e.g. component sequencing, data access) encoded in ESMF • Content standards, including metadata and physical constants, expressed in schema, code modules, and/or guidance documents • Usage conventions, where rules cannot or should not be encoded in software, outlined in guidance documents • Compliance verification software, to automate checks on component compliance
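
As a rough illustration of the last bullet, compliance verification software can mechanically inspect a component for required entry points and metadata. The rules checked below (three lifecycle methods plus an export_fields attribute) are hypothetical stand-ins, not the actual NUOPC conventions; only the idea of automated checking carries over.

```python
def check_compliance(component) -> list:
    """Return a list of rule violations; an empty list means compliant."""
    violations = []
    # Hypothetical rule 1: the component exposes the standard lifecycle.
    for method in ("initialize", "run", "finalize"):
        if not callable(getattr(component, method, None)):
            violations.append(f"missing required entry point: {method}")
    # Hypothetical rule 2: the component declares the fields it exports.
    metadata = getattr(component, "export_fields", None)
    if not isinstance(metadata, (list, tuple)) or not metadata:
        violations.append("missing 'export_fields' metadata list")
    return violations
```

Run during testing, check_compliance(my_component) yields an empty list for a conforming component and a readable list of violations otherwise, which is what lets compliance be regression-tested rather than reviewed by hand.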

  12. Anticipated Results • Standardized implementation of ESMF across NOAA, Navy, and NASA applications • Demonstrably improved level of interoperability, aiming for the target level described in the CMA report (Appendix 1) • Reconciliation of NASA/MAPL, NEMS, and Navy ESMF infrastructure, with the resulting NUOPC Layer supported by the ESMF core team

  13. Development Strategy CMA report: • Determine the level of interoperability desired • Recommend general solutions (REC in CMA report) Post-CMA report: • CMA determines application milestones • ESMF Change Review Board prioritizes development tasks in each REC area • For each development task: • Design of solution and verification strategy for adoption • Implementation of framework code, tests, and documentation • Implementation of compliance checks • Beta release and implementation in application prototypes • Refinement of code in response to feedback • Production release and implementation in operations

  14. Application Milestones (est.)
  • April 2010: Single-column atmospheric model (single component)
  • April 2011: Coupled atmosphere-ocean (multi-component)
  • April 2012: Ensemble implementation (ensemble)

  15. Reconciliation Strategy • For each application milestone: • Compare the NASA GEOS-5, NEMS, and NRL (COAMPS and NOGAPS) implementations • Migrate common, merged functionality into the ESMF or NUOPC software distribution, test and document • Update prototype application codes, including NOGAPS • Refine and implement in production code

  16. First Step: Single Column Model • Motivation and approach • Define and execute an inter-agency project to exercise the CMA/CSC interoperability standards • A development tool that benefits all participating modeling centers • Outcome will serve as a foundation for building the NUOPC Layer • Next steps will be extending the NUOPC Layer to coupled systems and then ensembles • The Single Column Model (SCM) • An SCM is a one-dimensional, time-dependent version of a fully three-dimensional modeling system (see the toy sketch below) • A tool generally used for the development of physics code • Useful for testing new parameterizations • Computationally efficient
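
As a toy illustration of the SCM idea, the sketch below steps one column of temperature forward under a prescribed large-scale forcing plus a single parameterization. Every value and scheme here is invented, but the structure (prescribed dynamics, swappable physics, fast turnaround) is what makes SCMs useful for parameterization testing.

```python
import numpy as np

nlev, dt, nsteps = 30, 1800.0, 48           # levels, step (s), one day
T = np.full(nlev, 288.0)                     # column temperature (K)
forcing = -1.0e-5 * np.ones(nlev)            # prescribed advective cooling (K/s)

def boundary_layer_mixing(T: np.ndarray) -> np.ndarray:
    """Toy parameterization: diffuse temperature toward neighboring levels."""
    dTdt = np.zeros_like(T)
    dTdt[1:-1] = 1.0e-4 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    return dTdt

# The SCM loop: large-scale forcing replaces the 3-D dynamics, so the
# physics scheme can be exercised in isolation at trivial cost.
for _ in range(nsteps):
    T += dt * (forcing + boundary_layer_mixing(T))

print(f"column-mean temperature after one day: {T.mean():.2f} K")
```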

  17. NUOPC Layer Task Estimates (detailed in slides 26-29)

  18. Initial Prioritization from CSC Examples • Priority 1: • Convention for data ownership • Convention for use of Clocks • Determine Component and Field metadata • Priority 2: • Establish portability requirements and implement • Implement Component and Coupler templates • Conventions for the intake of externally calculated interpolation weights
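
For instance, a convention for Clock use and data ownership might be encoded along the following hypothetical lines (this is not the actual CSC convention): the driver owns the clock, and components receive a read-only view, so ownership and the right to advance time are unambiguous. The wrapper below assumes a toy clock like the one in the earlier sandwich sketch.

```python
class ReadOnlyClock:
    """Wrapper exposing a driver-owned clock without mutation rights."""
    def __init__(self, clock):
        self._clock = clock          # the driver retains the real clock
    @property
    def current_time(self):
        return self._clock.current   # components may read the time...
    # ...but no advance() is exposed: only the driver may step time forward.
```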

  19. Risks • Failure to implement acceptable solutions • Maximize communication and involvement of application groups in development • Failure to adopt framework and conventions in applications • Recognize good faith involvement (e.g. contact user support with questions and problems when they occur) • Reserve resources for implementation in application codes • Implement automated tests for compliance wherever possible

  20. Beyond ESMF: GIP • Global Interoperability Program (2009) • Focuses on development of infrastructure for a range of application areas in Earth science modeling: • Climate simulation • Application of climate information • Weather and water forecasting • Training modelers • Focuses on modeling workflows from configuration to data dissemination

  21. GIP: Building Connections Entries are representative, not comprehensive!

  22. GIP: Campaigns • High-priority activities that focus community efforts • Used in GIP to define projects and assess impacts

  23. GIP: Status • Projects submitted for FY11 • Include: • Increasing usability of NCEP forecast models • Distribution of climate model data in GIS formats • Examining the NUOPC Layer in CCSM • Summer School in Atmospheric Modeling (focus on federal models) • Core support for ESMF • International involvement, with links to the E.U.-based METAFOR metadata and IS-ENES projects More at http://gip.noaa.gov

  24. Summary • Common Modeling Infrastructure has been evolving for more than a decade: • CMIWG, ESMF, NUOPC, GIP • Capabilities, scope, and adoption are increasing • Science collaborations (e.g. BEI, NUOPC) are starting to build upon interface standards and tools • International networks are emerging • Many successes, but still more to do to improve interoperability!

  25. References
  [1] Dickinson, R.E., S.E. Zebiak, J.L. Anderson, M.L. Blackmon, C. DeLuca, T.F. Hogan, M. Iredell, M. Ji, R. Rood, M.J. Suarez and K.E. Taylor (2002): How Can We Advance Our Weather and Climate Models as a Community? Bulletin of the American Meteorological Society, 83(3), 431-434.
  [2] Improving the Effectiveness of U.S. Climate Modeling, National Research Council of the National Academies, National Academies Press, 2001.
  [3] High-End Climate Science: Development of Modeling and Related Computing Capabilities, Report to the USGCRP from an ad hoc Working Group on Climate Modeling, December 2000.
  [4] Kalnay, E., M. Kanamitsu, J. Pfaendtner, J. Sela, M. Suarez, J. Stackpole, J. Tuccillo, L. Umscheid and D. Williamson (1989): Rules for Interchange of Physical Parameterizations. Bulletin of the American Meteorological Society, 70, 620-622.
  [5] Hill, C., C. DeLuca, V. Balaji, M. Suarez, and A. da Silva (2004): Architecture of the Earth System Modeling Framework. Computing in Science and Engineering, 6(1), 18-28.
  [6] Final Report from the National Unified Operational Prediction Capability (NUOPC) Interim Committee on Common Model Architecture (CMA), June 2009.

  26. All Years • Nightly regression testing and release management (REC 3.1.1, 1 FTE) • Compiler and platform updates (REC 3.1.1, 0.2 FTE) • Functional updates in response to feature requests and bug reports (REC 3.1.1, 3 FTE) • Routine support requests (REC 3.5.1, 0.8 FTE) • Longer-term adoption support (REC 3.7.1, 1 FTE) • Performance evaluation and reporting (REC 3.3.1, 0.5 FTE) • Tutorials and training (REC 3.6.1, 0.5 FTE) • Project administration, including boards and meetings, contracts and finance, staffing, planning, reporting (REC 3.1.1, 1 FTE) • Project operations, including updates to website, repository, trackers and other tools, project metrics, code backup, computer accounts (REC 3.1.1, 6.1.4, 1 FTE) TOTAL ~9 FTE (1 + 0.2 + 3 + 0.8 + 1 + 0.5 + 0.5 + 1 + 1 = 9)

  27. Year 1 Development • Finalize and implement the organizational plan, including reporting, management and staffing for distributed development and support teams, and funding vehicles. • Set up joint website, trackers, lists, and other communication and management infrastructure, initial code distribution infrastructure, and initial repository access and policies. • Prototype the component template and highest-level coupler template, document them, and distribute them via the web. • This activity must address aspects of the common physical architecture. • Examine the relationship of NUOPC templates to MAPL and develop an interoperability plan. • Other code and convention development activities as prioritized by the ESMF Change Review Board.

  28. Year 2 Development • Migrate ESMF code to Subversion • Assess and evolve NUOPC-wide code distribution and repository strategy. • Finalize development of the component and highest-level coupler templates and distribute. • Prototype diagnostics, postprocessing and IO templates and distribute. • Refine and distribute common physical constants module. • Finalize component, field, and grid metadata packages. • Develop initial conventions for configuration files, working closely with GFDL, AFWA, etc. • Other code and convention development activities as prioritized by the ESMF Change Review Board.

  29. Year 3 Development • Finalize development of diagnostics, postprocessing, and IO templates and distribute. • Refine conventions for configuration files. • Clean up documentation and prepare training materials. • Other code and convention development activities as prioritized by the ESMF Change Review Board.
