

Earth System Modeling Framework Overview

GFDL FMS Suite

MITgcm

NASA GSFC PSAS

NCEP Forecast

NSIPP Seasonal Forecast

NCAR/LANL CCSM

Chris Hill, MIT [email protected]

TOMS, 2003, Boulder



Talk Outline

  • Project Overview

  • Architecture and Current Status

    • Superstructure layer

      • Design

      • Adoption

    • Infrastructure layer

      • What is it for

      • What it contains

  • Next steps….

  • Open Discussion



Technological Trends

In climate research and NWP... increased emphasis on detailed representation of individual physical processes; requires many teams of specialists to contribute components to an overall modeling system.

In computing technology... increase in hardware and software complexity in high-performance computing, as we shift toward the use of scalable computing architectures.

[Figure: time-mean air-sea CO2 flux; MITgcm constrained by observations + ocean carbon (MIT, Scripps, JPL).]



Community Response

  • Modernization of modeling software

    Abstraction of underlying hardware to provide a uniform programming model that runs efficiently across vector, single-microprocessor and multiple-microprocessor architectures. Distributed software development model characterized by many contributing authors; use of high-level language features for abstraction, to facilitate the development process and software sharing. Modular design for interchangeable dynamical cores and physical parameterizations, and development of community-wide standards for components.

  • Development of prototype infrastructures

    GFDL (FMS), NASA/GSFC (GEMS), NCAR/NCEP (WRF), NCAR/DOE (MCT), MIT (Wrapper), ROMS/TOMS, etc.

    ESMF aims to unify and extend these efforts.



ESMF Goals and Products

STATED GOAL: to increase software reuse, interoperability, ease of use and performance portability in climate, weather, and data assimilation applications; this implies unified “standards”.

PRODUCTS:

  • Coupling superstructure and utility infrastructure software

  • Synthetic code suite for validation and demonstration

  • Set of 15 ESMF-compliant applications (including CCSM, WRF, GFDL models; MITgcm, NCEP and NASA data assimilation systems)

  • Set of 8 interoperability experiments




ESMF Interoperability Demonstrations



Talk Outline

  • Project Overview

  • Architecture and Current Status

    • Component based approach

    • Superstructure layer

      • Design

      • Adoption

    • Infrastructure layer

      • What is it for

      • What it contains

  • Next steps….

  • Open Discussion



ESMF overall structure

  • ESMF uses a component based approach

  • The framework provides an upper “superstructure” layer and a lower “infrastructure” layer

  • User written code (simulation algorithms, DA algorithms …) is sandwiched between the two layers.

    • User code provides standard interfaces that are called from the superstructure layer

    • User code uses facilities in the infrastructure for parallelism, I/O, interpolation
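The sandwich structure above can be sketched in miniature. The following Python sketch is purely illustrative (ESMF itself is Fortran/C++; every name here is hypothetical): the driver plays the superstructure, calling user code only through its standard interfaces, while the user code calls down into an infrastructure layer.

```python
# Illustrative sketch of the ESMF "sandwich" (hypothetical names, not the
# ESMF API): a driver (superstructure) calls standard interfaces on a user
# component, which in turn calls infrastructure utilities.

class Infrastructure:
    """Stand-in for the lower layer: parallelism, I/O, interpolation."""
    @staticmethod
    def halo_exchange(field_name):
        return f"halo({field_name})"

class OceanComponent:
    """User-written code: provides the standard init/run/finalize interfaces."""
    def init(self):
        self.state = {"sst": 0.0}
    def run(self, clock_step):
        Infrastructure.halo_exchange("sst")     # user code uses infrastructure
        self.state["sst"] += 0.1 * clock_step   # toy "simulation algorithm"
    def finalize(self):
        return self.state

def driver(component, nsteps):
    """Superstructure: talks to the component only via standard interfaces."""
    component.init()
    for _ in range(nsteps):
        component.run(1)
    return component.finalize()

print(driver(OceanComponent(), 5))
```

The point of the pattern is that the driver never touches the component's internals, so any component exposing the same three interfaces is interchangeable.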



ESMF Programming Model

1. ESMF provides an environment for assembling components.

   SUPERSTRUCTURE LAYER: Application Components, Gridded Components, Coupler Components.

2. ESMF provides a toolkit that components use to

   • ensure interoperability
   • abstract common services

   INFRASTRUCTURE LAYER: Component: run(), checkpoint(); Grid: regrid(), transpose() + Metrics; Field: halo(), import(), export() + I/O; Layout, PEList, Machine Model.

3. Gridded Components, Coupler Components and Application Components are user written.



Superstructure Layer: Assembles and connects components

Since each ESMF application is also a component, entire ESMF applications may be treated as Gridded Components and nested within larger applications.

Example: an atmospheric application containing multiple coupled components within a larger climate application.

[Diagram: climate_comp contains ocn2atm_coupler, ocn_comp and atm_comp; atm_comp in turn contains phys2dyn_coupler, atm_phys and atm_dyn, laid out across PEs.]



Superstructure Layer:Controlling subcomponents

Components must provide a single externally visible entry point which will register the other entry points with the Framework. Components can:

- Register one or more Initialization, Run, Finalize, and Checkpoint entry points.

- Register a private data block which can contain all data associated with this instantiation of the Component; particularly useful when running ensembles.

[Diagram: a higher-level component invokes ESMF Framework Services, which call the component's public cmp_register() subroutine; cmp_register() registers the private cmp_init(), cmp_run() and cmp_final() subroutines.]
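The registration scheme above can be sketched as follows. This is an illustrative Python sketch, not the ESMF API (Framework, register_entry_point and the atm_register names are all hypothetical): one public register function hands the framework the component's entry points, and each instantiation gets its own private data block, which is what makes ensembles cheap.

```python
# Illustrative sketch (hypothetical names, not the ESMF API): a component
# exposes a single externally visible register() entry point; the framework
# calls it to learn the init/run entry points, and each instantiation keeps
# a private data block - useful when running ensembles.

class Framework:
    def __init__(self):
        self.entry_points = {}
        self.private_blocks = {}

    def register_entry_point(self, comp_id, phase, fn):
        self.entry_points[(comp_id, phase)] = fn

    def set_private_block(self, comp_id, data):
        self.private_blocks[comp_id] = data

    def call(self, comp_id, phase):
        data = self.private_blocks[comp_id]     # each instance sees its own data
        return self.entry_points[(comp_id, phase)](data)

def atm_register(fw, comp_id, perturbation):
    """The single externally visible entry point."""
    fw.register_entry_point(comp_id, "init", lambda d: d.update(t=0.0))
    fw.register_entry_point(comp_id, "run", lambda d: d.update(t=d["t"] + d["dt"]))
    fw.set_private_block(comp_id, {"dt": 1.0 + perturbation})

fw = Framework()
for member, eps in [("ens0", 0.0), ("ens1", 0.25)]:   # two ensemble members,
    atm_register(fw, member, eps)                      # same code, private state
    fw.call(member, "init")
    fw.call(member, "run")

print(fw.private_blocks["ens0"], fw.private_blocks["ens1"])
```

Because all per-instance state lives in the private block rather than in module globals, the same component code can be registered many times side by side.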



Superstructure Layer:Passing data between components I

Gridded Components do not have access to the internals of other Gridded Components. They have 2 options for exchanging data with other Components; the first is to receive Import and Export States as arguments.

States contain flags for “is required”, “is valid”, “is ready”, etc.

[Diagram: a coupler connects ocn_component and atm_component; each component's run method receives the states as arguments.]

subroutine ocn_run(comp, ImportState, ExportState, Clock, rc)

subroutine atm_run(comp, ImportState, ExportState, Clock, rc)
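The first option, exchanging data only through Import/Export State arguments, can be sketched as below. This is an illustrative Python sketch with hypothetical names (State, ocn_run, atm_run are not the ESMF API); it also shows the kind of readiness flags mentioned above.

```python
# Illustrative sketch (hypothetical names, not the ESMF API): components
# never see each other's internals; a coupler copies data from the ocean's
# Export State into the atmosphere's Import State, and States carry flags
# such as "is ready".

class State(dict):
    def __init__(self):
        super().__init__()
        self.flags = {}
    def put(self, name, value):
        self[name] = value
        self.flags[name] = "is ready"

def ocn_run(import_state, export_state):
    export_state.put("sst", 290.0)          # ocean exports sea-surface temp (K)

def atm_run(import_state, export_state):
    assert import_state.flags.get("sst") == "is ready"
    return import_state["sst"] - 273.15     # atmosphere uses imported SST (degC)

def coupler():
    ocn_export, atm_import = State(), State()
    ocn_run(State(), ocn_export)
    atm_import.put("sst", ocn_export["sst"])  # the coupler mediates the copy
    return atm_run(atm_import, State())

print(coupler())
```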



Superstructure Layer:Passing data between components II

Gridded Components using Transforms do not have to return control to a higher-level component to send or receive State data from another component. They can receive function pointers, which are methods that can be called on the states.

In the coupler:

call ESMF_CompRun(atm, xform)
call ESMF_CompRun(ocn, xform)

In the components:

call ESMF_StateXform(ex_state, xform)   ! ocn_component
call ESMF_StateXform(xform, im_state)   ! atm_component



Superstructure Layer: Parallel Communication

All inter-component communication within ESMF is local.

This means: Coupler Components must be defined on the union of the PEs of all the components that they couple. In this example, in order to send data from the ocean component to the atmosphere, the coupler mediates the send.

[Diagram: climate_comp contains atm2ocn_coupler spanning the PEs of ocn_comp and atm_comp; atm_comp contains phys2dyn_coupler, atm_phys and atm_dyn.]



Superstructure Layer: Summary

Provides a means to connect components together

  • Components can be connected in a hierarchy

Provides a general purpose mechanism for passing data between components

  • Data is self-describing

Provides a general purpose mechanism for “parent” components to control “child” components

  • Stepping forward, backward, initializing state, etc.



Infrastructure Layer

  • A standard software platform for enabling interoperability (developing couplers, ensuring performance portability).

  • A set of reusable software for Earth science applications; streamlined development for researchers.

[Diagram: interchangeable components (NCAR Atmosphere, My Sea Ice, GFDL Ocean, NSIPP Land) plugged into the same infrastructure.]



Infrastructure Layer Scope

Support for:

  • Physical Grids
  • Regridding
  • Decomposition/composition
  • Communication
  • Calendar and Time
  • I/O
  • Logging and Profiling

[Diagram: User Code sandwiched between the ESMF Superstructure above and the ESMF Infrastructure below.]



Field and Grid

grid = ESMF_GridCreate(…, layout, …)

field_u = ESMF_FieldCreate(grid, array)

[Diagram: an ESMF_Field holds metadata, an ESMF_Grid and an ESMF_Array.]

  • Creates a field distributed over a set of decomposition elements (DEs).

  • Domain decomposition is determined by the DELayout, layout.

  • Each object (grid and field_u) has an internal representation.

  • Other parts of the infrastructure layer use the internal representation, e.g.

    • Regrid() – interpolation/extrapolation + redistribution over DEs

    • Redistribution() – general data rearrangement over DEs

    • Halo() – specialized redistribution



Regrid

  • Function mapping field’s array to a different physical and distributed grid.

  • RegridCreate() – creates a Regrid structure to be used/re-used

    regrid = ESMF_RegridCreate(src_field, dst_field, method, [name], [rc])

    The source and destination fields can be empty of field data (RegridCreate() uses only the grid metrics)

    • PhysGrid, DistGrid info used for setting up regrid

    • Resulting regrid can be used for other fields sharing same Grid

  • Method specifies interpolation algorithm. For example, bilinear, b-spline, etc…



Regrid Interface (cont)

  • RegridRun() – performs actual regridding

    call ESMF_RegridRun(src_field, dst_field, regrid,[rc])

    • Communication and interpolation handled transparently.

  • RegridDestroy() – frees up memory

    call ESMF_RegridDestroy(regrid,[rc])
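The create/run split above, precomputing interpolation weights once and then applying them to any field on the same grid, can be sketched as follows. This is an illustrative Python sketch (regrid_create and regrid_run are hypothetical names, not the ESMF API), using 1-D linear interpolation as a stand-in for the bilinear case.

```python
# Illustrative sketch of the RegridCreate()/RegridRun() pattern (hypothetical
# names, not the ESMF API): create precomputes sparse interpolation weights
# from the grid coordinates alone; run applies them to any field.

def regrid_create(src_coords, dst_coords):
    """Precompute (index, weight) pairs for 1-D linear interpolation."""
    weights = []
    for x in dst_coords:
        # find the source interval containing x (clamped at the ends)
        i = max(0, min(len(src_coords) - 2,
                       sum(1 for s in src_coords if s <= x) - 1))
        frac = (x - src_coords[i]) / (src_coords[i + 1] - src_coords[i])
        weights.append((i, frac))
    return weights

def regrid_run(src_field, weights):
    """Apply the stored weights - reusable for every field on the same grid."""
    return [src_field[i] * (1 - f) + src_field[i + 1] * f for i, f in weights]

src_x = [0.0, 1.0, 2.0, 3.0]
regrid = regrid_create(src_x, [0.5, 1.5, 2.5])        # build once...
print(regrid_run([10.0, 20.0, 30.0, 40.0], regrid))   # ...reuse per field
```

The expensive part (searching the grids and building weights) happens once; regridding another field on the same pair of grids is then a cheap weighted sum, which is exactly why the resulting regrid object is worth keeping around.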



Redistribution

  • No interpolation or extrapolation.

  • Maps between distributed grids; the field's array and physical grid are unchanged.

  • Example: layout() created two distributed grids, one decomposed in X and one decomposed in Y.

    • The Redistribution() function maps array data between the two distributions (a transpose/corner turn).

    • Communication is handled transparently.
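The transpose/corner-turn example can be sketched concretely. This is an illustrative Python sketch (not the ESMF API): the same global array is re-decomposed from row blocks (decomposed in X) to column blocks (decomposed in Y), with no change to the data values themselves.

```python
# Illustrative sketch of a redistribution / "corner turn" (not the ESMF API):
# a global 4x4 array moves from a decomposition in X (row blocks, one per DE)
# to a decomposition in Y (column blocks), with no interpolation.

def redistribute_rows_to_cols(row_blocks, n_des):
    """row_blocks[d] holds full rows on DE d; return column blocks instead."""
    global_array = [row for block in row_blocks for row in block]
    ncols = len(global_array[0])
    per_de = ncols // n_des
    return [[row[d * per_de:(d + 1) * per_de] for row in global_array]
            for d in range(n_des)]

# Global 4x4 array split over 2 DEs as row blocks:
row_blocks = [[[1, 2, 3, 4], [5, 6, 7, 8]],
              [[9, 10, 11, 12], [13, 14, 15, 16]]]
col_blocks = redistribute_rows_to_cols(row_blocks, 2)
print(col_blocks[0])   # DE 0 now owns the first two columns of every row
```

In a real framework the gather into a global array never happens; each DE sends just the sub-blocks the other decomposition needs, but the end state is the same.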



Halo

  • Fields have distributed index spaces with exclusive {E}, compute {C} and local {L} regions, where {E} ⊆ {C} ⊆ {L}.

  • Halo() fills points not in {E} or {C} from a remote {E}, e.g.

call ESMF_FieldHalo(field_foo, status)

[Diagram: {E}, {C} and {L} regions on neighbouring decomposition elements DE3 and DE4; halo points on one DE are filled from the exclusive region of the other.]
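The halo update can be sketched for a 1-D decomposition. This is an illustrative Python sketch (halo_update is a hypothetical name, not the ESMF API): each DE owns an exclusive region plus ghost points, and the update fills each ghost from the neighbouring DE's exclusive data.

```python
# Illustrative sketch of a halo update on a 1-D decomposition (not the ESMF
# API): each DE's local array is [ghost..., exclusive..., ghost...]; the
# update fills ghosts from the neighbouring DE's exclusive region.

def halo_update(de_arrays, nghost=1):
    for d, arr in enumerate(de_arrays):
        if d > 0:   # left ghosts come from the left neighbour's exclusive edge
            arr[:nghost] = de_arrays[d - 1][-2 * nghost:-nghost]
        if d < len(de_arrays) - 1:   # right ghosts from the right neighbour
            arr[-nghost:] = de_arrays[d + 1][nghost:2 * nghost]

# Two DEs, exclusive data 1..4 and 5..8, ghost cells initialised to 0:
des = [[0, 1, 2, 3, 4, 0], [0, 5, 6, 7, 8, 0]]
halo_update(des)
print(des)
```

After the update, DE 0's right ghost holds 5 (DE 1's first exclusive point) and DE 1's left ghost holds 4; the physical-boundary ghosts stay untouched, which is where boundary conditions would apply instead.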


More functions in the reference manual, e.g.



Bundle and Location Stream

  • ESMF_Bundle

    • collection of fields on the same grid

  • ESMF_LocationStream

    • Like a field, but…

    • unstructured index space with an associated physical grid space

    • useful for observations e.g. radiosonde, floats

  • Functions for create(), regrid(), redistribute(), halo() etc…



ESMF Infrastructure Utilities

  • Clock, Alarm, Calendar – ensure consistent time between components

  • I/O – field-level I/O in standard forms: netCDF, binary, HDF, GRIB, BUFR

  • Logging, Profiling – consistent monitoring and messaging

  • Attribute – consistent parameter handling

  • Machine model and comms – hardware and system software hiding; platform customizable



Time

  • Standard type for any component

  • Calendar (support for range of calendars)

! initialize calendar
call ESMF_CalendarInit(gregorianCalendar, ESMF_CAL_GREGORIAN, rc)

! initialize stop time to 13May2003, 2:00 pm
call ESMF_TimeInit(inject_stop_time, &
                   YR=int(2003,kind=ESMF_IKIND_I8), &
                   MM=off_month, DD=off_day, H=off_hour, M=off_min, &
                   S=int(0,kind=ESMF_IKIND_I8), &
                   cal=gregorianCalendar, rc=rc)

do while (currTime .le. inject_stop_time)
   :
   call ESMF_ClockAdvance(localclock, rc=rc)
   call ESMF_ClockGetCurrTime(localclock, currTime, rc)
end do
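The control flow of the stop-time loop above can be mirrored with Python's standard datetime types (the real calls are Fortran; this just illustrates the pattern, with the same 13 May 2003, 2:00 pm stop time and an assumed 30-minute step).

```python
# The clock/stop-time pattern above, sketched with Python's datetime.
# The 30-minute timestep and the 12:00 start time are illustrative choices.
from datetime import datetime, timedelta

inject_stop_time = datetime(2003, 5, 13, 14, 0, 0)   # 13 May 2003, 2:00 pm
curr_time = datetime(2003, 5, 13, 12, 0, 0)
time_step = timedelta(minutes=30)

steps = 0
while curr_time <= inject_stop_time:   # same test as the do-while loop
    curr_time += time_step             # ESMF_ClockAdvance analogue
    steps += 1

print(steps, curr_time.isoformat())
```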



Time Representations

type(ESMF_Calendar) :: calendar1

call ESMF_CalendarInit(calendar1, ESMF_CAL_GREGORIAN, rc)

- ESMF_CAL_GREGORIAN (3/1/-4800 to 10/29/292,277,019,914)

- ESMF_CAL_JULIAN (+/- 106,751,991,167,300 days)

- ESMF_CAL_NOLEAP

- ESMF_CAL_360DAY

- ESMF_CAL_USER
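What a fixed-length calendar such as ESMF_CAL_360DAY buys can be shown in a few lines. This is an illustrative sketch (the function names are hypothetical, not the ESMF API): with twelve 30-day months, date arithmetic reduces to exact integer arithmetic, so adding a 90-day interval always advances exactly three months.

```python
# Illustrative sketch of 360-day-calendar arithmetic (hypothetical names,
# not the ESMF API): every month has 30 days, so conversion to and from a
# day count is exact integer arithmetic.

def to_days_360(yr, mm, dd):
    """Days since the calendar epoch in a 360-day calendar (12 x 30 days)."""
    return yr * 360 + (mm - 1) * 30 + (dd - 1)

def from_days_360(days):
    yr, rem = divmod(days, 360)
    mm, dd = divmod(rem, 30)
    return yr, mm + 1, dd + 1

d = to_days_360(2003, 5, 15) + 90   # add a 90-day interval
print(from_days_360(d))             # exactly three months later
```

Climate models often prefer such calendars precisely because intervals like "90 days" and "3 months" coincide, which a Gregorian calendar cannot guarantee.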



I/O

  • Field level binary, netCDF, HDF, GRIB, bufr, extensible…

    • Currently I/O piped through 1 PE

      call ESMF_FieldAllGather(field_u, outarray, status)
      if (de_id .eq. 0) then
         write(filename, 20) "U_velocity", file_no
         call ESMF_ArrayWrite(outarray, filename=filename, rc=status)
      endif
      call ESMF_ArrayDestroy(outarray, status)
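The gather-then-write pattern above (all DEs contribute, only DE 0 touches the file) can be sketched as below. This is an illustrative Python sketch with hypothetical names (all_gather and write_field are not the ESMF API), using an in-memory sink in place of a real file.

```python
# Illustrative sketch of "pipe I/O through one PE" (hypothetical names, not
# the ESMF API): every DE contributes its local chunk, the data is gathered,
# and only DE 0 performs the write.
import io

def all_gather(local_chunks):
    """Stand-in for ESMF_FieldAllGather: concatenate every DE's local data."""
    return [x for chunk in local_chunks for x in chunk]

def write_field(de_id, gathered, sink):
    if de_id == 0:                      # only one PE touches the file
        sink.write(",".join(str(x) for x in gathered))

chunks = [[1.0, 2.0], [3.0, 4.0]]       # local data on DE 0 and DE 1
gathered = all_gather(chunks)
sink = io.StringIO()
for de_id in range(len(chunks)):        # every DE calls, only DE 0 writes
    write_field(de_id, gathered, sink)
print(sink.getvalue())
```

The obvious cost is that one PE holds the whole field in memory, which is exactly why the "improve parallelization" item appears under I/O next steps later in the talk.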



Internal Classes

  • Machine model

    • Captures system attributes, CPU, mem, connectivity graph

    • Useful for defining decomposition, load-balance, performance predictions.

  • Comms

    • Communication driver, allows bindings to MPI, shared memory, vendor system libraries



Comms Performance Test

Using the right mix of communication mechanisms (green) on a Compaq system gives 2× the realized bandwidth in the large-message limit. [Chart omitted.]



Talk Outline

  • Project Overview

  • Architecture and Current Status

    • Superstructure layer

      • Design

      • Adoption

    • Infrastructure layer

      • What is it for

      • What it contains

  • Next steps….

  • Open Discussion



Timeline



May 2003 Release

The focus for the May 2003 ESMF release was on developing sufficient infrastructure and superstructure to achieve the initial set of interoperability experiments.

These are:

  • FMS B-grid atmosphere coupled to MITgcm ocean

  • CAM atmosphere coupled to NCEP analysis

  • NSIPP atmosphere coupled to DAO analysis



Regrid Next Steps

  • Support for all ESMF Grids

  • Support for regridding methods including:

    • Bilinear
    • Bicubic
    • 1st-order conservative
    • 2nd-order conservative
    • Rasterized conservative
    • Nearest-neighbor
    • Distance-weighted average
    • Spectral transforms
    • 1-d interpolations (splines)
    • Index-space (shifts, stencils)
    • Adjoints of many of the above



Distributed Grid

  • Regular 2d already supported

  • Next steps

    • Generalized 1d decomposition

    • Extend support for 2d and quasi-regular decompositions

    • Spectral grid decompositions

    • Composite grids

Physical Grid Next Steps

  • Larger set of metrics, grids

  • High level routines for rapid definition of common grids.



I/O Next Steps

  • Broaden format set: binary, netCDF, HDF, GRIB, BUFR

  • Improve parallelization

Time/Logging/Profiling Next Steps

  • Full support for alarms

  • Broader functionality



Summary

  • ESMF Current Status

    • Comprehensive class structure available in version 1.0

    • Over the coming year significant extension of functionality will take place.

    • Feedback and comments on version 1.0 welcome

      http://www.esmf.ucar.edu



Talk Outline

  • Project Overview

  • Architecture and Current Status

    • Superstructure layer

      • Design

      • Adoption

    • Infrastructure layer

      • What is it for

      • What it contains

  • Next steps….

  • Open Discussion



Questions from Hernan



Questions from Hernan, cont..



Questions from Hernan, cont..



Questions from Hernan, cont..



Last but not least - an interesting potential benefit of “component” based approaches

Component based approaches could provide a foundation for driving high-end applications from “desktop” productivity environments.

For example, driving a parallel ensemble GCM run from Matlab becomes conceivable!

To learn more visit us at MIT!!!



Time Manager

ESMF Infrastructure Utility detailed example

Earl Schwab, ESMF Core Team, NCAR



What is Time Manager?

  • Clock for time simulation

  • Time representation

  • Time calculator

  • Time comparisons

  • Time queries

  • F90 API, C++ implementation



Clock for time simulation

type(ESMF_Clock) :: clock

call ESMF_ClockInit(clock, timeStep, startTime, stopTime, rc=rc)

do while (.not. ESMF_ClockIsStopTime(clock, rc))
   ! Do application work
   .
   .
   .
   call ESMF_ClockAdvance(clock, rc=rc)
end do



Clock (cont.)

  • Clock queries/commands

    call ESMF_ClockGetCurrTime(clock, currTime, rc)

    call ESMF_ClockSetCurrTime(clock, currTime, rc)

    call ESMF_ClockGetTimeStep(clock, timeStep, rc)

    call ESMF_ClockSetTimeStep(clock, timeStep, rc)

    call ESMF_ClockGetAdvanceCount(clock, advanceCount, rc)

    call ESMF_ClockGetStartTime(clock, startTime, rc)

    call ESMF_ClockGetStopTime(clock, stopTime, rc)

    call ESMF_ClockGetPrevTime(clock, prevTime, rc)

    call ESMF_ClockSyncToWallClock(clock, rc)



ESMF within the broader computational hierarchy

  • Earth System Modeling Framework – climate, weather, data assimilation

  • CCSM Coupled System, NASA GEMS, GFDL FMS, MITgcm, etc. – a single Earth system model or modeling system

  • Common Component Architecture – all HPC applications



Time Representation

type(ESMF_Calendar) :: calendar1

call ESMF_CalendarInit(calendar1, ESMF_CAL_GREGORIAN, rc)

- ESMF_CAL_GREGORIAN (3/1/-4800 to 10/29/292,277,019,914)

- ESMF_CAL_JULIAN (+/- 106,751,991,167,300 days)

- ESMF_CAL_NOLEAP

- ESMF_CAL_360DAY

- ESMF_CAL_USER



Attributes

  • Flexible parameter specs, configuration e.g. file of parameters






Time Representation (cont.)

type(ESMF_Time) :: time1

call ESMF_TimeInit(time1, YR=int(2003,kind=ESMF_IKIND_I8), &
                   MM=5, DD=15, H=15, cal=calendar1, rc=rc)

- YR, MM, DD, H, M, S are F90 optional arguments

type(ESMF_TimeInterval) :: timeInterval1

call ESMF_TimeIntervalInit(timeInterval1, &
                           D=int(90,kind=ESMF_IKIND_I8), rc=rc)

- D, H, M, S are F90 optional arguments



Time Calculator

  • Time differencing

  • Time increment/decrement by a time interval

  • Time interval arithmetic (+, -, *, /)



Time Calculator (cont.)

call ESMF_TimeInit(time1, YR=int(2003,kind=ESMF_IKIND_I8), &
                   MM=5, DD=15, cal=gregorianCalendar, rc=rc)

call ESMF_TimeInit(time2, YR=int(2003,kind=ESMF_IKIND_I8), &
                   MM=3, DD=26, cal=gregorianCalendar, rc=rc)

call ESMF_TimeIntervalInit(timeInterval1, &
                           D=int(90,kind=ESMF_IKIND_I8), rc=rc)

timeInterval2 = time2 - time1

time1 = time1 + timeInterval1        ! Uses F90 overloaded

timeInterval3 = timeInterval1 * 2    ! operators

double precision :: ratio

ratio = timeInterval1 / timeInterval2
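The same overloaded-operator style exists in Python's standard library, where datetime and timedelta already behave much like ESMF_Time and ESMF_TimeInterval. The sketch below mirrors the example values above (this is an illustration of the operator semantics, not the ESMF API; note time2 precedes time1, so the difference is a negative interval).

```python
# The overloaded-operator style above, mirrored with Python's datetime and
# timedelta (illustrative; not the ESMF API).
from datetime import datetime, timedelta

time1 = datetime(2003, 5, 15)
time2 = datetime(2003, 3, 26)
time_interval1 = timedelta(days=90)

time_interval2 = time2 - time1            # time differencing (negative here)
time1 = time1 + time_interval1            # increment a time by an interval
time_interval3 = time_interval1 * 2       # interval arithmetic
ratio = time_interval1 / time_interval2   # interval / interval -> float

print(time1, time_interval3, ratio)
```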



Time Comparisons

>, <, >=, <=, ==, /= : F90 overloaded operators between any two times or time intervals

if (time1 < time2) then

end if

if (timeInterval1 .ge. timeInterval2) then

end if



Time Queries

call ESMF_TimeGet(time1, YR=yr, MM=mm, DD=dd, H=h, M=m, S=s, rc=rc)

call ESMF_TimeIntervalGet(timeInterval1, D=d, H=h, M=m, S=s)

call ESMF_TimeGetDayOfYear(time1, dayOfYear, rc)! double or integer

call ESMF_TimeGetDayOfMonth(time1, dayOfMonth, rc)

call ESMF_TimeGetDayOfWeek(time1, dayOfWeek, rc)

call ESMF_TimeGetMidMonth(time1, midMonth, rc)

call ESMF_TimeGetString(time1, string, rc) ! 2003-05-14T12:20:19 (ISO 8601)

call ESMF_TimeIntervalGetString(timeInterval1, string, rc) ! P1DT12H0M0S (ISO)

call ESMF_TimeGetRealTime(time1, rc)
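The two ISO 8601 string forms shown in the comments above can be reproduced with Python's standard library (illustrative sketch; the function names are hypothetical stand-ins for the ESMF calls). The timestamp form comes straight from datetime; the duration form P1DT12H0M0S is assembled by hand from a timedelta.

```python
# The ISO 8601 string forms above, sketched in Python (hypothetical function
# names, not the ESMF API). Assumes non-negative intervals.
from datetime import datetime, timedelta

def time_get_string(t):
    return t.isoformat()                 # e.g. 2003-05-14T12:20:19

def interval_get_string(td):
    days = td.days
    hours, rem = divmod(td.seconds, 3600)
    minutes, seconds = divmod(rem, 60)
    return f"P{days}DT{hours}H{minutes}M{seconds}S"

print(time_get_string(datetime(2003, 5, 14, 12, 20, 19)))
print(interval_get_string(timedelta(days=1, hours=12)))
```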



More information

  • ESMF User’s Guide

  • ESMF Reference Manual

  • ESMF Requirements Document

  • Time Manager F90 API & examples source code

    esmf_1_0_0_r/src/Infrastructure/TimeMgr/interface

    esmf_1_0_0_r/src/Infrastructure/TimeMgr/examples



Location Within ESMF

1. ESMF provides an environment for assembling components: Application Components, Gridded Components, Coupler Components.

2. ESMF provides a toolkit that components use to

   • ensure interoperability
   • abstract common services

   INFRASTRUCTURE LAYER: Component: run(), checkpoint(); Grid: regrid(), transpose() + Metrics; Field: halo(), import(), export() + I/O; Layout, PEList, Machine Model.



ESMF Infrastructure Fields and Grids



Creating an ESMF Field

  • Fortran Array: real, dimension(100,100) :: arr

  • Array Attach: esArr = ESMF_ArrayCreate(arr,…)

  • Field Attach: ESMF_FieldAttachArray(esFld, esArr,…)

[Diagram: the resulting ESMF_Field holds metadata, an ESMF_Grid and the attached ESMF_Array.]



Creating an ESMF Grid

call ESMF_DistGridCreate…(…)

call ESMF_PhysGridCreate…(…)

[Diagram: an ESMF_Grid combines a DistGrid (decomposition) and a PhysGrid (physical coordinates).]

Infrastructure Internal Organization

Two tiers: a class hierarchy for data and for communications.

[Diagram: F90 data classes (Bundle, Regrid, Field, Grid, PhysGrid, DistGrid) layered over C++ communication classes (DELayout, Array, Route, MachineModel, Comm), plus the utilities (Time, Alarm, Calendar, LogErr, I/O, Attributes).]

