
A Community Terrain-following Ocean Modeling System


Presentation Transcript


  1. A Community Terrain-following Ocean Modeling System 2003 Terrain-Following Ocean Models Users Workshop PMEL, Seattle, WA, August 5, 2003

  2. Developers and Collaborators: Hernan G. Arango (Rutgers University), Alexander F. Shchepetkin (UCLA), W. Paul Budgell (IMR, Norway), Bruce D. Cornuelle (SIO), Emanuele Di Lorenzo (SIO), Tal Ezer (Princeton University), Mark Hadfield (NIWA, New Zealand), Kate Hedstrom (University of Alaska, ARSC), Robert Hetland (TAMU), John Klinck (Old Dominion), Arthur J. Miller (SIO), Andrew M. Moore (University of Colorado), Christopher Sherwood (USGS/WHOI), Rich Signell (SACLANT), John C. Warner (USGS/WHOI), John Wilkin (Rutgers University)

  3. Executive Committee: Dale B. Haidvogel (Rutgers University), James C. McWilliams (UCLA), Robert Street (Stanford University). ONR Support: Manuel Fiadeiro, Terri Paluszkiewicz, Charles Linwood Vincent

  4. Objectives
  • To design, develop, and test an expert ocean modeling system for scientific and operational applications over a wide range of scales, from coastal to global
  • To provide a platform for coupling with operational atmospheric models, sediment models, and ecosystem models
  • To support multiple levels of nesting and composed grids
  • To provide tangent linear and adjoint models for variational data assimilation, ensemble forecasting, and stability analysis
  • To provide a framework for massively parallel computations

  5. Approach
  • Use state-of-the-art advances in numerical techniques, subgrid-scale parameterizations, data assimilation, nesting, computational performance, and parallelization
  • Modular design, with ROMS as a prototype
  • Test and evaluate the computational kernel and various algorithms and parameterizations
  • Build a suite of test cases and application databases
  • Provide web-based support to the user community and a link to the primary developers

  6. Accomplishments
  • ROMS/TOMS 2.0 released to beta testers on January 16, 2003, and to the full user community on June 30, 2003
  • Built tangent linear and adjoint models and tested them on realistic applications off the US West and East Coasts: eigenmodes and adjoint eigenmodes, singular vectors, pseudospectra, forcing singular vectors, stochastic optimals, and ensemble forecasting

  7. The model is used in oceanographic studies in over 30 countries by universities, government agencies, and research organizations. (Relief image from NOAA; animation by Rutgers)

  8. Ocean Modeling Web Site http://www.ocean-modeling.org/

  9. Kernel Attributes
  • Free-surface, hydrostatic, primitive-equation model
  • Generalized, terrain-following vertical coordinates
  • Boundary-fitted, orthogonal, curvilinear horizontal coordinates on an Arakawa C-grid
  • Non-homogeneous predictor/corrector time-stepping algorithm
  • Accurate discretization of the baroclinic pressure-gradient term
  • High-order advection schemes
  • Continuous, monotonic reconstruction of vertical gradients to maintain high-order accuracy

  10. Vertical Terrain-following Coordinates. [Figure: depth (m) vs. longitude along an Adriatic section from Vieste (Italy) to Dubrovnik (Croatia)]
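The terrain-following levels shown on this slide can be sketched numerically. The following Python illustration assumes the Song and Haidvogel (1994) s-coordinate stretching used in ROMS; the parameter values (theta_s, theta_b, hc) and the function names are hypothetical, chosen only to make the sketch concrete:

```python
import math

def stretching(s, theta_s=5.0, theta_b=0.4):
    """Song & Haidvogel (1994) stretching curve C(s), for s in [-1, 0]."""
    return ((1.0 - theta_b) * math.sinh(theta_s * s) / math.sinh(theta_s)
            + theta_b * (math.tanh(theta_s * (s + 0.5))
                         / (2.0 * math.tanh(0.5 * theta_s)) - 0.5))

def z_levels(h, zeta=0.0, N=30, hc=10.0, theta_s=5.0, theta_b=0.4):
    """Depths (m, negative downward) of the N rho-point levels over a water
    column of depth h, free surface zeta, and critical depth hc."""
    z = []
    for k in range(1, N + 1):
        s = (k - N - 0.5) / N          # s-coordinate at rho points, in (-1, 0)
        C = stretching(s, theta_s, theta_b)
        z.append(zeta * (1.0 + s) + hc * s + (h - hc) * C)
    return z
```

With theta_s large, levels concentrate near the surface and bottom; with hc small, near-surface resolution follows the terrain closely, which is the behavior the figure illustrates.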

  11. Curvilinear Transformation. [Figure: example grids in Cartesian, spherical, and polar coordinates]

  12. Code Design
  • Modular, efficient, and portable Fortran (F90/F95) code with dynamic allocation of memory via de-referenced pointer structures
  • Configuration options managed with the C preprocessor
  • Multiple levels of nesting and composed grids
  • Lateral boundary condition options: closed, periodic, and radiation
  • Arbitrary number of tracers (active and passive)
  • NetCDF input and output data structures
  • Support for parallel execution on both shared- and distributed-memory architectures

  13. Model Grid Configuration. [Figure: composed and nested grid examples]

  14. Parallel Framework • Coarse-grained parallelization

  15. Parallel Tile Partitions. [Figure: an Nx x Ny grid partitioned into 8 x 8 tiles]
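The idea behind the 8 x 8 tile partition can be sketched as follows. This is a generic balanced decomposition in Python, not the actual ROMS tiling routine; the function name and row-major tile numbering are assumptions for illustration:

```python
def tile_bounds(Nx, Ny, NtileI, NtileJ, tile):
    """Return (Istr, Iend, Jstr, Jend), the 1-based interior index range
    owned by one tile of an Nx x Ny grid split into NtileI x NtileJ tiles.

    Tiles are numbered row-major: tile = j * NtileI + i.
    """
    i = tile % NtileI
    j = tile // NtileI
    chunk_i, extra_i = divmod(Nx, NtileI)
    chunk_j, extra_j = divmod(Ny, NtileJ)
    # the first `extra` tiles in each direction get one extra point,
    # keeping the load balanced when Nx or Ny is not evenly divisible
    Istr = i * chunk_i + min(i, extra_i) + 1
    Iend = Istr + chunk_i + (1 if i < extra_i else 0) - 1
    Jstr = j * chunk_j + min(j, extra_j) + 1
    Jend = Jstr + chunk_j + (1 if j < extra_j else 0) - 1
    return Istr, Iend, Jstr, Jend
```

For a 512 x 512 grid split 8 x 8, each tile owns a 64 x 64 block; every interior point belongs to exactly one tile, which is what makes the coarse-grained parallelization race-free away from tile edges.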

  16. Parallel Framework
  • Coarse-grained parallelization
  • Shared-memory, compiler-dependent directives (OpenMP 2.0 standard)
  • Distributed-memory (MPI)
  • Optimized for cache-bound computers
  • ZIG-ZAG cycling sequence of tile partitions
  • Few synchronization points
  • Serial and parallel I/O (via NetCDF)
  • Efficient from 4 to 64 threads
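The ZIG-ZAG cycling of tile partitions mentioned above can be sketched as a serpentine visiting order. This is a minimal illustration of the idea, not the ROMS implementation; the function name is hypothetical:

```python
def zigzag_order(NtileI, NtileJ):
    """Serpentine (boustrophedon) visiting order of row-major-numbered
    tiles: left-to-right on even rows, right-to-left on odd rows, so each
    tile is adjacent to the previous one and data still resident in cache
    from that neighbor can be reused."""
    order = []
    for j in range(NtileJ):
        row = range(NtileI) if j % 2 == 0 else range(NtileI - 1, -1, -1)
        order.extend(j * NtileI + i for i in row)
    return order
```

For a 4 x 2 tiling this yields [0, 1, 2, 3, 7, 6, 5, 4]: the sweep turns around at the end of each row instead of jumping back to the far side of the grid, which is why the scheme helps on cache-bound machines.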

  17. [Figure] (Ezer)

  18. The cost of saving output and global averaging is much higher for the MPI code when run on the shared-memory SGI machine. (Ezer)

  19. Subgrid-Scale Parameterizations
  • Horizontal mixing of tracers along level, geopotential, or isopycnic surfaces
  • Transverse, isotropic stress tensor for momentum
  • Local, Mellor-Yamada level 2.5 closure scheme
  • Non-local, K-profile surface and bottom closure scheme
  • Generic Length Scale turbulence closures (GOTM)

  20. Boundary Layers
  • Air-sea interaction boundary layer from COARE (Fairall et al., 1996)
  • Oceanic surface boundary layer (KPP; Large et al., 1994)
  • Oceanic bottom boundary layer (inverted KPP; Durski et al., 2001)
  • Wave/current/sediment bed boundary layer (Styles and Glenn, 2000; Blaas; Sherwood)

  21. Modules
  • Lagrangian drifters (Klinck, Hadfield, Capet)
  • Tidal forcing (Hetland, Signell)
  • River runoff (Hetland, Signell, Geyer)
  • Sea ice (Budgell, Hedstrom)
  • Fasham-type biology model (Moisan, Di Lorenzo, Shchepetkin, Frenzel, Fennel, Wilkin)
  • EcoSim bio-optical model (Bissett, Wilkin)
  • Sediment erosion, transport, and deposition (Warner, Sherwood, Blaas)

  22. Ongoing and Future Work
  • One- and two-way nesting
  • Wetting and drying capabilities
  • Sediment model
  • Bottom boundary layer models
  • Ice model
  • Parallelization of the adjoint model
  • Variational data assimilation
  • Parallel I/O
  • Framework (ESMF)
  • Web-based dynamic documentation
  • Test cases
  • WRF coupling

  23. One-Way Nesting

  24. North Atlantic Basin
  • 1/10 degree resolution (1002 x 1026 x 30)
  • Levitus climatology
  • NCEP daily winds: 1994-2000
  • COADS monthly heat fluxes
  • Requirements:
    - Memory: 11 GB
    - Input data disk space: 16 GB
    - Output data disk space: 280 GB
    - 32 processors, Origin 3800 (4 x 16)
    - CPU: 46 hours per day of simulation
    - Wall clock: 153 days for the 7-year simulation
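The quoted timing figures are mutually consistent, which a quick back-of-the-envelope check confirms. The sketch below assumes 365-day years and that the 46 CPU-hours per simulated day spread evenly over the 32 processors (i.e., ideal scaling):

```python
cpu_hours_per_sim_day = 46.0   # quoted CPU cost per simulated day
processors = 32                # quoted processor count
sim_days = 7 * 365             # 7-year run, assuming 365-day years

# wall-clock hours per simulated day under ideal scaling
wall_hours_per_sim_day = cpu_hours_per_sim_day / processors   # ~1.44 h

wall_days_total = wall_hours_per_sim_day * sim_days / 24.0
print(round(wall_days_total))   # -> 153, matching the quoted wall clock
```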

  25. Bathymetry (m). [Figure: 1/10 degree resolution bathymetry derived from ETOPO5; r-factor = 3.2]
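Bathymetry for terrain-following models is typically smoothed until a slope-stiffness measure falls below a threshold; one standard measure is the Beckmann and Haidvogel ratio r = |h1 - h2| / (h1 + h2) between adjacent depths. A minimal sketch (1-D only, function name hypothetical; the slide's quoted "r-factor = 3.2" may refer to a different stiffness measure):

```python
def r_factor(h):
    """Maximum Beckmann-Haidvogel stiffness ratio r = |h1-h2|/(h1+h2)
    over adjacent points of a 1-D profile of positive depths h."""
    return max(abs(a - b) / (a + b) for a, b in zip(h, h[1:]))
```

A flat bottom gives r = 0; a jump from 100 m to 300 m gives r = 0.5, a value that would normally trigger further smoothing before the pressure-gradient discretization is trusted.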

  26. Free-Surface (m)

  27. Temperature at 100 m

  28. US East Coast
  • 30 km resolution (192 x 64 x 30)
  • Initialized from the North Atlantic Basin simulation
  • NCEP daily winds
  • COADS monthly heat fluxes with an imposed daily shortwave radiation cycle
  • One-way nesting: boundary conditions from 3-day averages
  • Flather/Chapman OBC for 2D momentum
  • Clamped OBC for 3D momentum and tracers
  • Rivers
  • Fasham-type biology model
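The Flather open boundary condition named above radiates free-surface deviations out of the domain at the shallow-water wave speed. A minimal Python sketch of the relation, with a sign convention assumed for a western boundary (the sign flips on the opposite boundary, and the companion Chapman condition for the free surface itself is omitted):

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def flather_ubar(ubar_ext, zeta, zeta_ext, h):
    """Flather (1976) condition for the normal barotropic velocity at an
    open boundary: the prescribed external velocity ubar_ext is corrected
    so that deviations of the model free surface zeta from the external
    value zeta_ext radiate outward at speed sqrt(g*h)."""
    return ubar_ext - math.sqrt(G / h) * (zeta - zeta_ext)
```

When the model surface matches the external data, the boundary velocity reduces to the prescribed external value; any mismatch is converted into an outgoing gravity-wave signal instead of reflecting back into the domain.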

  29. Temperature at 100 m One-Way Coupling (Wilkin)

  30. Potential Temperature at 50 m (Celsius) 30 km Resolution (Wilkin)

  31. Surface Temperature Surface Chlorophyll

  32. Publications
  Ezer, T., H.G. Arango and A.F. Shchepetkin, 2002: Developments in Terrain-Following Ocean Models: Intercomparisons of Numerical Aspects, Ocean Modelling, 4, 249-267.
  Haidvogel, D.B., H.G. Arango, K. Hedstrom, A. Beckmann, P. Malanotte-Rizzoli, and A.F. Shchepetkin, 2000: Model Evaluation Experiments in the North Atlantic: Simulations in Nonlinear Terrain-Following Coordinates, Dyn. Atmos. Oceans, 32, 239-281.
  MacCready, P. and W.R. Geyer, 2001: Estuarine Salt Flux through an Isohaline Surface, J. Geophys. Res., 106, 11629-11639.
  Malanotte-Rizzoli, P., K. Hedstrom, H.G. Arango, and D.B. Haidvogel, 2000: Water Mass Pathways Between the Subtropical and Tropical Ocean in a Climatological Simulation of the North Atlantic Ocean Circulation, Dyn. Atmos. Oceans, 32, 331-371.
  Marchesiello, P., J.C. McWilliams and A.F. Shchepetkin, 2003: Equilibrium Structure and Dynamics of the California Current System, J. Phys. Oceanogr., 34, 1-37.
  Marchesiello, P., J.C. McWilliams, and A.F. Shchepetkin, 2001: Open Boundary Conditions for Long-Term Integration of Regional Ocean Models, Ocean Modelling, 3, 1-20.
  Moore, A.M., H.G. Arango, A.J. Miller, B.D. Cornuelle, E. Di Lorenzo, and D.J. Neilson, 2003: A Comprehensive Ocean Prediction and Analysis System Based on the Tangent Linear and Adjoint Components of a Regional Ocean Model, Ocean Modelling, submitted.
  Penven, P., C. Roy, A. Colin de Verdiere and J. Largier, 2000: Simulation and Quantification of a Coastal Jet Retention Process Using a Barotropic Model, Oceanol. Acta, 23, 615-634.
  Penven, P., J.R.E. Lutjeharms, P. Marchesiello, C. Roy and S.J. Weeks, 2001: Generation of Cyclonic Eddies by the Agulhas Current in the Lee of the Agulhas Bank, Geophys. Res. Lett., 27, 1055-1058.
  Shchepetkin, A.F. and J.C. McWilliams, 2003: The Regional Ocean Modeling System: A Split-Explicit, Free-Surface, Topography-Following-Coordinates Ocean Model, J. Comp. Phys., submitted.
  Shchepetkin, A.F. and J.C. McWilliams, 2003: A Method for Computing Horizontal Pressure-Gradient Force in an Oceanic Model with a Non-Aligned Vertical Coordinate, J. Geophys. Res., 108, 1-34.
  She, J. and J.M. Klinck, 2000: Flow Near Submarine Canyons Driven by Constant Winds, J. Geophys. Res., 105, 28671-28694.
  Warner, J.C., H.G. Arango, C. Sherwood, B. Butman, and R.P. Signell, 2003: Performance of Four Turbulence Closure Methods Implemented Using a Generic Length Scale Method, Ocean Modelling, revised and resubmitted.

  33. Modular Design

  34. Code Design

  35.
#include "cppdefs.h"
      MODULE mod_ocean
        USE mod_kinds
        implicit none
        TYPE T_OCEAN
          real(r8), pointer :: rubar(:,:,:)
          real(r8), pointer :: rvbar(:,:,:)
          real(r8), pointer :: rzeta(:,:,:)
          real(r8), pointer :: ubar(:,:,:)
          real(r8), pointer :: vbar(:,:,:)
          real(r8), pointer :: zeta(:,:,:)
#ifdef SOLVE3D
          real(r8), pointer :: pden(:,:,:)
          real(r8), pointer :: rho(:,:,:)
          real(r8), pointer :: ru(:,:,:,:)
          real(r8), pointer :: rv(:,:,:,:)
          real(r8), pointer :: t(:,:,:,:,:)
          real(r8), pointer :: u(:,:,:,:)
          real(r8), pointer :: v(:,:,:,:)
          real(r8), pointer :: W(:,:,:)
          real(r8), pointer :: wvel(:,:,:)
# ifdef SEDIMENT
          real(r8), pointer :: bed(:,:,:,:)
          real(r8), pointer :: bed_frac(:,:,:,:)
          real(r8), pointer :: bottom(:,:,:)
# endif
#endif
        END TYPE T_OCEAN
        TYPE (T_OCEAN), allocatable :: OCEAN(:)
      CONTAINS

  36.
      SUBROUTINE allocate_ocean (ng, LBi, UBi, LBj, UBj)
        USE mod_param
#ifdef SEDIMENT
        USE mod_sediment
#endif
        integer, intent(in) :: ng, LBi, UBi, LBj, UBj
        IF (ng.eq.1) allocate ( OCEAN(Ngrids) )
        allocate ( OCEAN(ng) % rubar(LBi:UBi,LBj:UBj,2) )
        allocate ( OCEAN(ng) % rvbar(LBi:UBi,LBj:UBj,2) )
        allocate ( OCEAN(ng) % rzeta(LBi:UBi,LBj:UBj,2) )
        allocate ( OCEAN(ng) % ubar(LBi:UBi,LBj:UBj,3) )
        allocate ( OCEAN(ng) % vbar(LBi:UBi,LBj:UBj,3) )
        allocate ( OCEAN(ng) % zeta(LBi:UBi,LBj:UBj,3) )
#ifdef SOLVE3D
        allocate ( OCEAN(ng) % pden(LBi:UBi,LBj:UBj,N(ng)) )
        allocate ( OCEAN(ng) % rho(LBi:UBi,LBj:UBj,N(ng)) )
        allocate ( OCEAN(ng) % ru(LBi:UBi,LBj:UBj,0:N(ng),2) )
        allocate ( OCEAN(ng) % rv(LBi:UBi,LBj:UBj,0:N(ng),2) )
        allocate ( OCEAN(ng) % t(LBi:UBi,LBj:UBj,N(ng),3,NT(ng)) )
        allocate ( OCEAN(ng) % u(LBi:UBi,LBj:UBj,N(ng),2) )
        allocate ( OCEAN(ng) % v(LBi:UBi,LBj:UBj,N(ng),2) )
        allocate ( OCEAN(ng) % W(LBi:UBi,LBj:UBj,0:N(ng)) )
# ifdef SEDIMENT
        allocate ( OCEAN(ng) % bed(LBi:UBi,LBj:UBj,Nbed,MBEDP) )
        allocate ( OCEAN(ng) % bed_frac(LBi:UBi,LBj:UBj,Nbed,NST) )
        allocate ( OCEAN(ng) % bottom(LBi:UBi,LBj:UBj,MBOTP) )
# endif
#endif
        RETURN
      END SUBROUTINE allocate_ocean

  37.
      SUBROUTINE initialize_ocean (ng, tile)
        USE mod_param
#ifdef SEDIMENT
        USE mod_sediment
#endif
        integer, intent(in) :: ng, tile
        integer :: IstrR, IendR, JstrR, JendR, IstrU, JstrV
        real(r8), parameter :: IniVal = 0.0_r8
#include "tile.h"
#ifdef DISTRIBUTE
        IstrR=LBi
        IendR=UBi
        JstrR=LBj
        JendR=UBj
#else
# include "set_bounds.h"
#endif
        OCEAN(ng) % rubar(IstrR:IendR,JstrR:JendR,1:2) = IniVal
        OCEAN(ng) % rvbar(IstrR:IendR,JstrR:JendR,1:2) = IniVal
        OCEAN(ng) % rzeta(IstrR:IendR,JstrR:JendR,1:2) = IniVal
        OCEAN(ng) % ubar(IstrR:IendR,JstrR:JendR,1:3) = IniVal
        OCEAN(ng) % vbar(IstrR:IendR,JstrR:JendR,1:3) = IniVal
        OCEAN(ng) % zeta(IstrR:IendR,JstrR:JendR,1:3) = IniVal
        ...
        RETURN
      END SUBROUTINE initialize_ocean
      END MODULE mod_ocean
