
WRF ESMF Development



  1. WRF ESMF Development
  Tom Henderson, John Michalakes
  National Center for Atmospheric Research, Mesoscale and Microscale Meteorology
  hender@ucar.edu
  4th ESMF Community Meeting -- 21 July 2005

  2. Acknowledgements
  • Significant support
    • Air Force Weather Agency, esp. Jerry Wegiel
    • DoD HPCMO
    • USWRP (NOAA and NSF)
  • Many collaborators/contributors
    • NCAR, NOAA, DoD, universities, vendors

  3. WRF Software Framework Overview
  Implementation of WRF architecture:
  • Hierarchical organization (driver, mediation, and model layers)
  • Multiple dynamical cores (ARW core, NMM core)
  • Plug-compatible physics
  • Abstract interfaces (APIs) to external packages
  • Performance-portable
  • Metaprogramming "Registry" for managing model state
  [Architecture diagram: the driver layer provides top-level control, memory management, nesting, parallelism, and the external APIs; the mediation layer contains the ARW and NMM cores and the physics interfaces; the model layer contains the plug-compatible physics packages.]

  4. WRF Registry
  • Specify model state arrays, I/O, coupling, interprocess communication (when needed), nesting interpolation, etc. in an ASCII text file called the "Registry"
  • The WRF build reads the Registry file and automatically generates source code for:
    • Data members for WRF "domain" objects
    • I/O, coupling, and communication calls
    • Actual and dummy arguments in mediation-layer calls
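
For illustration, a Registry "state" entry looks roughly like the sketch below. The exact column layout differs between WRF versions, so treat this as an approximation rather than the authoritative format; the symbol and attribute values shown are illustrative.

    # Illustrative WRF Registry "state" entry (column layout approximate)
    # keyword  type  symbol  dims  use     ntl  stagger  io   dname  description          units
    state      real  u       ikjb  dyn_em  2    X        irh  "U"    "x-wind component"   "m s-1"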

  5. New Feature: Moving Nests
  • John Michalakes augmented two-way interactive nesting in WRF to allow reorientation of a nested domain with respect to its parent
  • Automatic movement algorithms, such as the U. Miami vortex-following scheme
  • Automatic ingest of nest-resolution terrain and other lower-boundary data to initialize the leading edge of a moving nest
  • HYCOM coupling (UNDER DEVELOPMENT)
  • Supports 2 or more nest levels in a telescoping configuration
  • Parallel and efficient: small additional overhead (~2%) on top of the 5-8% overhead for non-moving two-way nesting
  • With S. Chen, J. Cangialosi, W. Zhao (RSMAS, U. Miami) and S. Gopal at NCEP; software infrastructure development supported by NOAA/NCEP (for use with the NMM core)
  • Fully implemented in the ARW core for use in RAINEX and real-time hurricane forecasting this coming season…
  • The following animations were done by John Michalakes

  6. Five-Day Hurricane Ivan 12 km / 4 km Moving Nest
  • Two-way interacting nest with high-resolution terrain ingest at the leading edge
  • 12 km outer domain: 400 x 301 x 35, dt = 72 sec
  • 4 km moving nest: 331 x 352 x 35, dt = 24 sec
  • Observed best track shown for comparison
  • Run time: 8.6 hours on a 64-processor IBM Power4 (AFWA), including 20 minutes of I/O
  [Slides 6-8 show frames of the moving-nest animation; the annotation text is the same on each.]


  9. Goals
  • Generic interoperability with other ESMF components...
    • WRF + "ocean model" (+ "wave model")
      • Hurricane applications – HWRF
    • WRF + 3DVAR / 4DVAR
      • Decouple requirements
  • WRF + "land model" for regional climate modeling
    • CLM == CCSM's land model
    • First cut: call CLM as a WRF subroutine (CAM-like)
      • Jimy Dudhia (MMM), Mariana Vertenstein (CGD)
      • Prototype complete
    • Second cut: couple two ESMF gridded components
      • Need CLM ESMF component – early 2006
      • Compare performance "apples-to-apples"

  10. Work Completed
  • Completed draft of the WRF-ESMF Integration Plan
    • Reviewed by WRF SE working group (WG-2)
  • Completed "stand-alone" component
    • Finished "init", "run", "final", "set services"
    • Built trivial "AppDriver"
    • Tested vs. standard WRF
    • Included in the WRF 2.1 release
    • Useful for testing, not useful otherwise…
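
For reference, the "set services" step amounts to registering the component's init/run/finalize routines with ESMF. The sketch below assumes the ESMF 2.x-era Fortran interface that was current at the time (module ESMF_Mod and the ESMF_SETINIT / ESMF_SETRUN / ESMF_SETFINAL / ESMF_SINGLEPHASE constants); the wrf_component_* routine names are hypothetical stand-ins, not the actual names in the WRF source.

    ! Sketch: entry-point registration for a stand-alone WRF gridded component.
    ! (ESMF 2.x-era interface assumed; WRF routine names are hypothetical.)
    subroutine wrf_register(gcomp, rc)
      use ESMF_Mod
      implicit none
      type(ESMF_GridComp)  :: gcomp
      integer, intent(out) :: rc
      external :: wrf_component_init1, wrf_component_run, wrf_component_finalize

      call ESMF_GridCompSetEntryPoint(gcomp, ESMF_SETINIT, &
                                      wrf_component_init1,    ESMF_SINGLEPHASE, rc)
      call ESMF_GridCompSetEntryPoint(gcomp, ESMF_SETRUN, &
                                      wrf_component_run,      ESMF_SINGLEPHASE, rc)
      call ESMF_GridCompSetEntryPoint(gcomp, ESMF_SETFINAL, &
                                      wrf_component_finalize, ESMF_SINGLEPHASE, rc)
    end subroutine wrf_register

A trivial "AppDriver" then only needs to call ESMF_Initialize, create the gridded component, invoke ESMF_GridCompSetServices / Initialize / Run / Finalize, and call ESMF_Finalize.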

  11. Work In Progress
  • Create and populate ESMF "import" and "export" states
    • Attach references to ESMF_State objects to WRF "domain" objects at the top level
    • Implement operations on ESMF_State objects as a new external I/O package via the WRF I/O & Coupling API
      • No changes to the WRF software framework
      • ESMF-specific code only in the new top-level driver and in the external I/O package
    • Customize the contents of ESMF_State objects via the WRF Registry and WRF namelist
  • Prototype code complete, testing in progress
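
The key point is that ESMF-awareness stays behind the WRF I/O & Coupling API. The fragment below is a conceptual sketch only; the module and routine names (esmf_io_state_cache, ext_esmf_store_states) are hypothetical illustrations of the approach, not the actual package code.

    ! Conceptual sketch: the external ESMF "I/O" package caches references to the
    ! import/export states handed down from the top level, so that later read/write
    ! calls made through the WRF I/O & Coupling API can be redirected into those
    ! states.  All names here are hypothetical.
    module esmf_io_state_cache
      use ESMF_Mod
      implicit none
      type(ESMF_State), save :: cached_import, cached_export
    contains
      subroutine ext_esmf_store_states(importState, exportState)
        type(ESMF_State), intent(in) :: importState, exportState
        cached_import = importState    ! handle (shallow) copies -- see slides 16-19
        cached_export = exportState
      end subroutine ext_esmf_store_states
    end module esmf_io_state_cache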

  12. Work In Progress
  • Couple WRF with a simple component via a simple coupler
    • "Dummy ocean model"
      • Read SST from a file and send it to WRF
      • Receive SST from WRF and compare it to the file data for a self-test
    • Coding under way
    • Test to validate prototype code
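
The self-test is simple in principle: whatever SST field the dummy ocean sends out should come back from WRF essentially unchanged. A minimal sketch of that comparison is below; the routine name and tolerance are illustrative, not the actual test code.

    ! Illustrative self-test for the dummy ocean component: compare the SST
    ! received back from WRF against the SST originally read from file.
    subroutine dummy_ocean_selftest(sst_from_file, sst_from_wrf, nx, ny, passed)
      implicit none
      integer, intent(in)  :: nx, ny
      real,    intent(in)  :: sst_from_file(nx,ny), sst_from_wrf(nx,ny)
      logical, intent(out) :: passed
      real, parameter :: tol = 1.0e-4   ! illustrative round-trip tolerance

      passed = all( abs(sst_from_wrf - sst_from_file) <= tol )
      if (.not. passed) print *, 'dummy ocean self-test FAILED'
    end subroutine dummy_ocean_selftest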

  13. Future Plans
  • Couple to other ESMF components
    • HYCOM, CLM, 3D/4DVAR, LIS (NASA), …
    • Currently limited to "trivial" cases due to lack of support for general curvilinear coordinates in ESMF
      • Now a high priority for the core team
    • Will need "moving nests" for coupled hurricane applications
    • Also, the need for a "single executable" makes use of ESMF more difficult than existing WRF coupling methods for some models
  • Extension to "PSMF"
    • Planetary WRF
    • NASA, CalTech, etc.

  14. Future Plans
  • Maximize coupler re-use by minimizing dependencies on gridded components
  • Climate and Forecast (CF) metadata conventions
    • www.cgd.ucar.edu/cms/eaton/cf-metadata/index.html
    • Use for all WRF I/O and coupling
    • Specify CF "standard_name" via the WRF Registry
    • Use Balaji's solution for Arakawa grids
  • Model metadata
    • Time, etc.
    • GO-ESSP: go-essp.gfdl.noaa.gov

  15. A Few Issues
  • Orthodox Canonical Form
  • Time management
  • Attaching ESMF objects to pre-existing component objects
  • Destruction of ESMF objects

  16. Orthodox Canonical Form
  • A key best practice for statically typed object-oriented languages that do not provide native garbage collection, such as C++ (and Fortran 90)
    • Coplien: Advanced C++ Programming Styles and Idioms
    • Meyers: Effective C++ and More Effective C++
  • Objects must have an assignment operator and a copy constructor
  • ESMF "deep" objects do not have assignment operators
    • See ESMF Reference Manual, Section 8.2
    • Some do not have copy constructors, but eventually will

  17. Orthodox Canonical Form
  • Lack of assignment operators causes confusion even in ESMF demo code!
    • From CoupledFlowDemo.F90:

          ! Make our own local copy of the clock
          localclock = clock

    • Thanks to Nancy for fixing this after I whined…
  • All ESMF deep objects must have assignment operators with uniformly well-defined behavior!
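
The confusion is easy to reproduce outside ESMF: in Fortran 90, default assignment of a derived type whose components are pointers copies the pointers, not their targets. The small stand-alone example below (not ESMF code) shows why "localclock = clock" yields an alias rather than an independent copy.

    ! Why plain assignment of a "deep" handle type behaves like a reference copy.
    program shallow_copy_demo
      implicit none
      type handle
        integer, pointer :: data(:)
      end type handle
      type(handle) :: a, b

      allocate(a%data(3))
      a%data = (/ 1, 2, 3 /)
      b = a                   ! default assignment copies the pointer, not the target
      b%data(1) = 99          ! ... so this also changes a%data(1)
      print *, a%data(1)      ! prints 99: a and b share the same underlying data
    end program shallow_copy_demo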

  18. Options for Deep Object Assignment
  • Option 1: Assignment behaves like standard Fortran assignment – it looks like a deep copy
    • Easy to explain and use
    • The difference between "deep" and "shallow" becomes purely a matter of performance
    • Future changes from shallow to deep will not break user code
    • Difficult to implement efficiently
      • Optimize using the copy-on-write idiom via reference counting (Meyers, and many others)
      • Is it possible to re-use an existing implementation?
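
For Option 1, the copy-on-write idiom keeps assignment cheap while preserving deep-copy semantics: assignment only shares the underlying data and bumps a reference count, and a real copy is made just before a shared object would be modified. The Fortran 90/95 module below is a minimal sketch of the idea (illustrative names; deallocation when the count drops to zero is omitted), not a proposed ESMF implementation.

    ! Copy-on-write via reference counting (minimal sketch).
    module cow_mod
      implicit none

      type payload                                 ! the shared, reference-counted data
        real, pointer :: data(:) => null()
        integer       :: refcount = 0
      end type payload

      type cow_array                               ! the cheap-to-copy user handle
        type(payload), pointer :: p => null()
      end type cow_array

      interface assignment(=)
        module procedure cow_assign
      end interface

    contains

      subroutine cow_create(a, n)                  ! allocate a new, unshared payload
        type(cow_array), intent(out) :: a
        integer, intent(in) :: n
        allocate(a%p)
        allocate(a%p%data(n))
        a%p%data     = 0.0
        a%p%refcount = 1
      end subroutine cow_create

      subroutine cow_assign(lhs, rhs)              ! "deep-looking" but cheap assignment
        type(cow_array), intent(inout) :: lhs
        type(cow_array), intent(in)    :: rhs
        if (associated(lhs%p)) lhs%p%refcount = lhs%p%refcount - 1
        lhs%p => rhs%p                             ! share the payload ...
        lhs%p%refcount = lhs%p%refcount + 1        ! ... and count the new reference
      end subroutine cow_assign

      subroutine cow_set(a, i, val)                ! write access triggers the real copy
        type(cow_array), intent(inout) :: a
        integer, intent(in) :: i
        real,    intent(in) :: val
        type(payload), pointer :: fresh
        if (a%p%refcount > 1) then                 ! still shared: copy before writing
          allocate(fresh)
          allocate(fresh%data(size(a%p%data)))
          fresh%data     = a%p%data
          fresh%refcount = 1
          a%p%refcount   = a%p%refcount - 1
          a%p => fresh
        end if
        a%p%data(i) = val
      end subroutine cow_set

    end module cow_mod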

  19. Options for Deep Object Assignment
  • Option 2: Assignment behaves like a reference copy (like assignment of Fortran logical unit numbers)
    • Easier to implement
    • Not very helpful to users – this can already be done via Fortran pointers!
  • Option 3: Assignment fails
    • Unless the assignment is a return value from a "Create" call
    • Insane for sure!
    • But still better than nothing

  20. Initialization of the "Master Clock"
  • Gridded components may already have the expertise to initialize their own clocks
    • Dependencies
      • Time step: horizontal resolution, physical processes, etc.
      • Start and stop times: input and boundary data sets, time step, …
    • Avoid duplicating this expertise in couplers/drivers
  • There is no "empty" ESMF_Clock (in the way there is an "empty" ESMF_State)
    • Instead, pass time information back to the driver/coupler through the "init" interfaces as exportState metadata
    • The driver/coupler resolves inconsistencies between components and creates the top-level clock (or aborts)
    • "Model metadata" conventions are useful here
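
For concreteness, once the components have agreed on the time information, building the top-level clock in the driver is only a few calls. The sketch below assumes ESMF 2.x-era interfaces (ESMF_Mod, positional ESMF_ClockCreate arguments, ESMF_CAL_GREGORIAN); the dates are placeholders and the 72-second step echoes the 12 km example earlier.

    ! Sketch: driver/coupler builds the master clock from agreed-on time information.
    program build_master_clock
      use ESMF_Mod
      implicit none
      type(ESMF_Clock)        :: masterClock
      type(ESMF_Time)         :: startTime, stopTime
      type(ESMF_TimeInterval) :: timeStep
      integer :: rc

      call ESMF_Initialize(defaultCalendar=ESMF_CAL_GREGORIAN, rc=rc)

      call ESMF_TimeSet(startTime, yy=2005, mm=7, dd=21, h=0, rc=rc)   ! placeholder dates
      call ESMF_TimeSet(stopTime,  yy=2005, mm=7, dd=26, h=0, rc=rc)
      call ESMF_TimeIntervalSet(timeStep, s=72, rc=rc)                 ! e.g. the 12 km domain step
      masterClock = ESMF_ClockCreate("coupled run clock", timeStep, &
                                     startTime, stopTime, rc=rc)

      call ESMF_Finalize()
    end program build_master_clock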

  21. A Few More Issues
  • Attaching ESMF objects to pre-existing component objects
    • ESMF_VMGetCurrent() – extend to other types
    • Attach pointers, not "copies"
    • Proposal: pass pointers into "init", "run", "final" and let the user decide via the dummy argument declaration
  • Destruction of ESMF objects
    • Need inquiry ability: who must delete this?
    • Need the ability to say "it's OK to let ESMF delete this" (delegation)
    • Must pass in pointers to "Create" or "Set"

  22. Questions?
