
Adoption and field tests of the M.I.T. General Circulation Model (MITgcm) with ESMF


Presentation Transcript


  1. Adoption and field tests of the M.I.T. General Circulation Model (MITgcm) with ESMF. Chris Hill, ESMF Community Meeting, MIT, July 2005

  2. Outline
  • MITgcm – a very quick overview
    • algorithmic features
    • software characteristics
  • Adopting ESMF
    • strategy
    • steps
  • Field test applications
    • MITgcm coupling with everything (including itself!) – interoperating with NCAR, GFDL and UCLA atmosphere models; an intermediate-complexity coupled system.
    • high-end parameterization as a coupled problem.
  • Next steps

  3. MITgcm algorithmic characteristics
  [Figure: flow regimes across scales from ~1000 km down to ~20 m, spanning hydrostatic to non-hydrostatic dynamics]
  • General orthogonal curvilinear coordinate finite-volume dynamical kernel.
  • Flexible, scalable domain decomposition: from 1 CPU to 2000+ CPUs.
  • Applies across a wide range of scales, hydrostatic to non-hydrostatic.
  • A pressure-height isomorphism allows the same kernel to model either the ocean or the atmosphere (see the sketch below).
  • Many optional packages spanning biogeochemistry, atmospheric physics, boundary layers, sea ice etc.
  • Adjoints to most parts, for assimilation/state estimation and sensitivity analysis.
  ...and more: see http://mitgcm.org
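  As a rough sketch of that isomorphism (the notation here is a standard textbook statement, not taken from the slides): MITgcm is written in a generic vertical coordinate r, and hydrostatic balance takes the same form in both fluids,

    $$\text{ocean } (r = z):\quad \frac{\partial p}{\partial z} = -\rho g
      \qquad\qquad
      \text{atmosphere } (r = p):\quad \frac{\partial \Phi}{\partial p} = -\frac{1}{\rho}$$

  so a single kernel, phrased in terms of r and a generalized pressure-like potential, serves for either ocean or atmosphere simulation.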

  4. MITgcm software characteristics
  • Fortran (what else?)
  • Approx. 170K executable statements.
  • Generic driver code (superstructure), coupling code, computational kernel code, and parallelism, I/O etc. support code (infrastructure) are modularized. This aligns with ESMF’s “sandwich” architecture.
  • Target hardware: my laptop to the largest supercomputers (Columbia, Blue Genes). It tries to be portable!
  • OSes: Linux, HPUX, Solaris, AIX etc.
  • Parallelism: MPI binding, threads binding (dormant), platform-specific parallelism library support, e.g. active messages, shmem (dormant).
  • Distributed openly on the web. Supported through a user+developer mailing list and website. Users all over the world.

  5. Outline
  • MITgcm – a very quick overview
    • algorithmic features
    • software characteristics
  • Adopting ESMF
    • strategy
    • steps
  • Field test applications
    • MITgcm coupling with everything (including itself!) – interoperating with NCAR, GFDL and UCLA atmosphere models; an intermediate-complexity coupled system.
    • high-end parameterization as a coupled problem.
  • Next steps

  6. Adoption strategy
  • Currently only in-house (i.e. the ESMF binding is not part of the default distribution). A practical consideration: many MITgcm user systems do not have ESMF installed.
  • A set of ESMF experiments is maintained in the MITgcm CVS source repository, kept up to date with the latest ESMF (with a one-to-two-week lag).
  • These experiments use
    • the ESMF component model ( init(), run(), finalize() )
    • clocks, configuration attributes, field communications (see the clock sketch below)
    • primarily sequential-mode component execution (more on this later)
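  The slides do not show the clock code itself; as a minimal sketch of the kind of ESMF clock setup such a driver uses, written against a recent ESMF Fortran API (the 2005-era library used "use ESMF_Mod" and somewhat different constant names):

    ! Minimal sketch: build an ESMF clock for a sequential driver loop.
    program clock_sketch
      use ESMF
      implicit none
      type(ESMF_Clock)        :: clock
      type(ESMF_Time)         :: startTime, stopTime
      type(ESMF_TimeInterval) :: timeStep
      integer                 :: rc

      call ESMF_Initialize(rc=rc)

      ! One-hour coupling timestep; the one-year window is illustrative.
      call ESMF_TimeIntervalSet(timeStep, h=1, rc=rc)
      call ESMF_TimeSet(startTime, yy=2005, mm=7, dd=1, rc=rc)
      call ESMF_TimeSet(stopTime,  yy=2006, mm=7, dd=1, rc=rc)
      clock = ESMF_ClockCreate(timeStep, startTime, stopTime=stopTime, &
                               name="driver clock", rc=rc)

      ! Drive the components in sequence until the clock runs out.
      do while (.not. ESMF_ClockIsStopTime(clock, rc=rc))
        ! call ESMF_GridCompRun(...) / ESMF_CplCompRun(...) here
        call ESMF_ClockAdvance(clock, rc=rc)
      end do

      call ESMF_Finalize(rc=rc)
    end program clock_sketch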

  7. Adoption steps – top level
  • Introduction of internal init(), run(), finalize() (see the registration sketch below).
  • Development of couplers (and stub components to test against)
    • coupler_init(), coupler_run()
  • Development of drivers
    • driver_init(), driver_run()
  • The code can be seen in the CVS repository at mitgcm.org, under “MITgcm_contrib/ESMF”.
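  The actual binding lives in the repository noted above; purely to illustrate the pattern, this is how a gridded component registers init/run/finalize entry points with ESMF (recent API; the module and routine names here are hypothetical):

    ! Sketch of ESMF entry-point registration.
    module mitgcm_comp_mod
      use ESMF
      implicit none
    contains
      subroutine SetServices(gcomp, rc)
        type(ESMF_GridComp)  :: gcomp
        integer, intent(out) :: rc
        ! Tell ESMF which user routines implement each phase.
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_INITIALIZE, &
                                        userRoutine=my_init, rc=rc)
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_RUN, &
                                        userRoutine=my_run, rc=rc)
        call ESMF_GridCompSetEntryPoint(gcomp, ESMF_METHOD_FINALIZE, &
                                        userRoutine=my_final, rc=rc)
      end subroutine SetServices

      ! All three phases share the standard ESMF user-routine signature.
      subroutine my_init(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS   ! set up grids, allocate fields, fill exportState
      end subroutine my_init

      subroutine my_run(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS   ! step the model forward one coupling interval
      end subroutine my_run

      subroutine my_final(gcomp, importState, exportState, clock, rc)
        type(ESMF_GridComp)  :: gcomp
        type(ESMF_State)     :: importState, exportState
        type(ESMF_Clock)     :: clock
        integer, intent(out) :: rc
        rc = ESMF_SUCCESS   ! release resources
      end subroutine my_final
    end module mitgcm_comp_mod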

  8. Outline
  • MITgcm – a very quick overview
    • algorithmic features
    • software characteristics
  • Adopting ESMF
    • strategy
    • steps
  • Field test applications
    • MITgcm coupling with everything (including itself!) – interoperating with NCAR, GFDL and UCLA atmosphere models.
    • high-end parameterization as a coupled problem.
  • Next steps

  9. Field test: M.I.T. General Circulation Model (MITgcm) to NCAR Community Atmospheric Model (CAM). (Kluzek, Hill)
  • Versions of CAM and MITgcm were adapted to
    • have init(), run(), finalize() interfaces
    • accept, encode and decode ESMF_State variables
  • A coupler component that maps the MITgcm grid to the CAM grid was written (the regrid pattern is sketched after this slide).
  • Grids: 180x90 on 1x16 PEs and 128x64 on 1x16 PEs.
  • Runtime steps:
    1. MITgcm prepares its export state.
    2. The export state passes through the parent to the coupler.
    3. The coupler returns a CAM-gridded SST array, which is passed as an import state to the CAM gridded component.
  • Uses the ESMF_GridComp, ESMF_CplComp and ESMF_Regrid sets of functions.
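  This is not the project's actual coupler (that is in MITgcm_contrib/ESMF); it is a minimal sketch of the regrid pattern a coupler_run() of this kind follows under a recent ESMF API, with the field name "SST" and the saved route handle as assumptions:

    ! Sketch of a coupler run phase: pull SST from the ocean export
    ! state, regrid it to the atmosphere grid, place it in the state
    ! destined for the atmosphere.
    subroutine coupler_run(ccomp, importState, exportState, clock, rc)
      use ESMF
      type(ESMF_CplComp)   :: ccomp
      type(ESMF_State)     :: importState, exportState
      type(ESMF_Clock)     :: clock
      integer, intent(out) :: rc
      type(ESMF_Field)              :: sstOcn, sstAtm
      type(ESMF_RouteHandle), save  :: rh
      logical, save                 :: first = .true.

      call ESMF_StateGet(importState, itemName="SST", field=sstOcn, rc=rc)
      call ESMF_StateGet(exportState, itemName="SST", field=sstAtm, rc=rc)

      if (first) then
        ! Precompute interpolation weights once; reuse every timestep.
        call ESMF_FieldRegridStore(srcField=sstOcn, dstField=sstAtm, &
             routehandle=rh, regridmethod=ESMF_REGRIDMETHOD_BILINEAR, rc=rc)
        first = .false.
      end if

      ! Apply the stored weights: ocean grid -> atmosphere grid.
      call ESMF_FieldRegrid(sstOcn, sstAtm, routehandle=rh, rc=rc)
    end subroutine coupler_run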

  10. Field test: M.I.T. General Circulation Model (MITgcm) to GFDL Atmosphere/Land/Ice (ALI). (Smithline, Zhou, Hill)
  • Versions of MOM and MITgcm were adapted into components that
    • work within init(), run(), finalize() interfaces
    • accept, encode and decode ESMF_State variables
  • A coupler component that maps the MITgcm grid to the ALI grid was written.
  • The MITgcm component is substituted for the MOM component, together with the MITgcm-ALI coupler (the driver-level swap is sketched below).
  • Grids: 144x90 on 16x1 PEs and 128x60 on 1x16 PEs.
  • Runtime steps:
    1. MITgcm prepares its export state.
    2. The export state passes through the parent to the coupler.
    3. The coupler returns an ALI-gridded SST array, which is passed to ALI.
  • Uses the ESMF_GridComp, ESMF_CplComp and ESMF_Regrid sets of functions.
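  Again just an illustrative fragment, not the project's code: because both ocean models present the same ESMF component interface, swapping MOM for MITgcm at the driver level reduces to registering a different SetServices routine (the useMITgcm flag and the SetServices names are hypothetical):

    ! Sketch: the ocean slot in a driver. Which model fills the slot is
    ! decided by one SetServices call; everything downstream is identical.
    type(ESMF_GridComp) :: ocean
    integer :: rc

    ocean = ESMF_GridCompCreate(name="ocean", rc=rc)
    if (useMITgcm) then
      call ESMF_GridCompSetServices(ocean, mitgcm_SetServices, rc=rc)
    else
      call ESMF_GridCompSetServices(ocean, mom_SetServices, rc=rc)
    end if

    ! From here on the driver neither knows nor cares which model it has:
    call ESMF_GridCompInitialize(ocean, importState=ocnImp, &
                                 exportState=ocnExp, clock=clock, rc=rc)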

  11. SI experiment: M.I.T. General Circulation Model (MITgcm) ECCO-assimilation ocean, and POP, coupled to the UCLA atmosphere.
  [Figure: observational analysis feeding two 3-month forecasts, A and B]
  • Uses the ESMF_GridComp, ESMF_CplComp and ESMF_Regrid sets of functions.

  12. New app: high-end resolution embedding as a coupled problem.
  [Figure: a coarse model domain decomposed across processors 0-7]
  • For a climate-scale ocean simulation, domain decomposition is limited in the number of processors to which it can usefully scale. For a ~1° model there may be no scaling beyond ~64 CPUs.
  • This limit arises because parallelism costs (communication overhead, overlap computations) come to exceed parallelism benefits.
  • Increasing resolution is hard because explicit-scheme timesteps shrink with resolution (see the estimate below) – not good for millennial simulations.
  • Question: is there anything besides ensembles of runs we can do with a thousand-plus-processor system?
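  A back-of-envelope version of that argument (the numbers are illustrative, not from the slides): an explicit scheme is bound by a CFL-type condition,

    $$\Delta t \lesssim \frac{\Delta x}{c}$$

  so halving the grid spacing $\Delta x$ doubles the number of timesteps per simulated year while quadrupling the number of horizontal grid columns; the cost of a fixed-length simulation grows roughly as $(1/\Delta x)^3$, which extra processors on a decomposition that has already stopped scaling cannot recover.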

  13. New app: high-end resolution embedding as a coupled problem.
  • What about embedding local sub-models, running concurrently on separate processors but coupled to the coarse-resolution run?
  [Figure: coarse model on processors 0-7 with embedded sub-models on processors 65-319]

  15. Implementation with ESMF
  • ESMF provides nice tools for developing this embedded system:
    • the component-model abstraction for managing the different pieces
    • parallel regrid/redist, a great tool for N-to-M coupling.
  • regrid()/redist() precompute data flows at initialization. At each timestep, resolving the data transport between ~300-400 components takes about 15 lines of user code (sketched below).
  [Figure: component hierarchy – top component on processors 0-7, sub-components 64-67 through 316-319, and sub-sub-components]
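  The actual 15 lines are in the MITgcm_contrib/ESMF code; as a rough stand-in, the per-timestep transport with precomputed route handles looks something like this (nSub, the field arrays and the route-handle arrays are assumptions):

    ! Sketch: per-timestep data transport across many embedded
    ! sub-components, using route handles precomputed at init time
    ! with ESMF_FieldRedistStore():
    !   toSub(i)   moves coarse-grid data to sub-model i
    !   fromSub(i) moves sub-model i results back
    integer :: i, rc
    do i = 1, nSub
      ! Coarse model -> embedded sub-model i
      call ESMF_FieldRedist(coarseField, subField(i), &
                            routehandle=toSub(i), rc=rc)
    end do
    ! ... run the sub-models concurrently ...
    do i = 1, nSub
      ! Embedded sub-model i -> coarse model
      call ESMF_FieldRedist(subField(i), coarseField, &
                            routehandle=fromSub(i), rc=rc)
    end do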

  16. MITgcm with ESMF: next steps
  • Continued work in-house. Directions:
    • embedding with dynamic balancing
    • high-resolution ocean and coupled work
  • ESMF in the default MITgcm distribution
    • Most MITgcm user systems do not have ESMF installed yet. This will take time to change – how long?
    • Hopeful that this will evolve over the next year.

  17. Summary
  • ESMF implementation functionality has grown significantly over the last year:
    • optimized regrid/redist scaling
    • concurrent components
  • Performance is always within a factor of 2 of custom code at the infrastructure level; at the superstructure level (code driver, coupling), ESMF overhead is comparable to our own code.

  18. coupler_init(), coupler_run()

  19. driver_init(), driver_run()