
Grid refinement in ROMS




  1. Grid refinement in ROMS
  [Figure: three nested grids at 5000 m, 1000 m, and 200 m resolution.]

  2. Two-way grid refinement
  SST for the USeast grid (5 km) and the Carolinas grid (1 km). Grid refinement of the ocean and wave models is required to allow increased resolution in coastal areas.

  3. Parent grid (just look at rho points for now)
  [Figure: rho points indexed 0 to L in xi (Lm interior points) and 0 to M in eta (Mm interior points).]

  4. Child grid connectivity (for Nref=5)
  The lower-left and upper-right parent psi points identify the child region (create_nested_grid.m).
  [Figure: parent grid rho points, child grid rho points, and the parent-child interface; relative to the inner child region the child grid carries 4 extra rho points to the left and bottom and 3 to the right and top, so child indices run from -3 to L+2 and from -3 to M+2.]
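The index arithmetic on this slide can be sketched numerically. This is a hypothetical illustration, not the ROMS source: the function name and the assumption that the inner child region spans Nref times the parent psi-point span (with the 4 left/bottom and 3 right/top extra rho points noted above) are mine.

```python
def child_grid_extent(istr, iend, jstr, jend, nref):
    """Sketch of child-grid sizing for a refined ROMS grid.

    (istr, jstr) and (iend, jend) are the lower-left and upper-right
    parent psi points of the child region; nref is the refinement
    ratio (3 or 5).  Returns the inner child rho-point counts, plus
    the padded totals including 4 extra points on the left/bottom
    and 3 on the right/top.
    """
    inner_x = nref * (iend - istr)   # inner child rho points in xi
    inner_y = nref * (jend - jstr)   # inner child rho points in eta
    total_x = inner_x + 4 + 3        # pad: 4 left/bottom, 3 right/top
    total_y = inner_y + 4 + 3
    return (inner_x, inner_y), (total_x, total_y)
```

For example, a child spanning 10 parent cells in each direction at Nref=5 would have a 50 x 50 inner region under these assumptions.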

  5. Time stepping
  initial.F:

    ! Read in initial conditions from initial NetCDF file.
    !
          CALL get_state (ng, iNLM, 1, INIname(ng), IniRec, Tindex)
          ...
    #ifdef REFINED_GRID
    !---------------------------------------------------------------------
    !  Compute indices of children grid locations in the parent grid.
    !  For 2-way, this needs to be done for all grids, not just ng>1.
    !---------------------------------------------------------------------
          IF (ng.lt.NestedGrids) THEN
            CALL init_child_hindices (ng, TILE)
          END IF
          IF (ng.gt.1) THEN
            CALL init_parent_hindices (ng, TILE)
          END IF
    !---------------------------------------------------------------------
    !  Obtain initial boundary conditions from the parent data.
    !---------------------------------------------------------------------
          IF (ng.gt.1) THEN
            CALL get_2dparent_data (ng, TILE)
    # ifdef SOLVE3D
            CALL get_3dparent_data (ng, TILE)
    # endif
          END IF
    #endif

  main3d.F (dtp and dtc):

    # ifdef REFINED_GRID
    !  Get data from the parent grid at the time of the parent.  Only
    !  need to do this once per child loop.
          IF (ng.gt.1) THEN
            IF (get_refdata(ng).eq.TRUE.) THEN
              CALL get_2dparent_data (ng, TILE)
              CALL get_3dparent_data (ng, TILE)
            END IF
          END IF
    !  Put the child data back into the parent grid.
          IF ((ng.lt.Ngrids).and.(iic(ng).gt.1)) THEN
            CALL step3dref_t (ng, TILE)
            CALL set_2dchild_data (ng, TILE)
            CALL set_depth (ng, TILE)
            CALL set_3dchild_data (ng, TILE)
          END IF
    # endif
          ...
    # ifdef REFINED_GRID
    !  Interpolate the parent data to the child time.
          IF (ng.gt.1) THEN
            CALL set_2dparent_data (ng, TILE)
            CALL set_3dparent_data (ng, TILE)
          END IF
    # endif
          ... (main model time step computations) ...

  [Figure: time-stepping diagram; the parent (grid 1) advances with time step dtp at iic=0, 1, 2, ... and the child (grid 2) takes several substeps of dtc per parent step.]

  6. How to prepare a refined grid application for ROMS
  1) Create the child grid
  2) Interp better bathy to the child grid
  3) Match parent-child bathy + mask
  4) 3D init and climatology
  5) Surface forcings
  6) ROMS input file
  7) coawst.bash
  8) Run it

  7. 1) Create the child grid: Tools/mfiles/mtools/create_nested_grid.m
  1) Enter the parent (coarse) grid file
  2) Enter the child (fine) grid file
  3) Set Istr, Jstr, Iend, Jend
  4) Set scale (3 or 5)
  5) Set create_child_grid=1
  This calls parentchild_grid.m.

  8. 2) Interp better bathy to the child grid
  The bathymetry in the grid you just created is interpolated from the parent. You need to get higher-resolution bathymetry from somewhere (see the discussion earlier today), and you need to do the smoothing as well.

  9. 3) Match parent-child bathy + mask: Tools/mfiles/mtools/create_nested_grid.m (again)
  You might want to make copies of the grids before you run this again, as it will modify the masking and bathymetry in both files.
  Run it again, but this time set:
  merge_par_child_bathy=1 (this calls parentchild_bathy.m)
  merge_par_child_mask=1 (this calls parentchild_mask.m)

  10. 4) 3D init and climatology: COAWST/Tools/mfiles/roms_clm/roms_master_climatology_coawst_mw.m
  5) Surface forcings: you can use the same surface forcing files as the parent (or make new ones).

  11. 6) ROMS input file (this one happens to be for a 5-grid application)
  For the first grid, Lm is 2 points less than the total length.
  For all other grids, Lm is 7 points less than the total length. The same rules apply to Mm.
  N, Nbed, and the number of tracers are the same for all grids.
  The grids can be tiled differently, but the total number of processors must be the same for each grid.
  Each child time step must divide evenly into its parent's time step.
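The sizing and time-step rules above can be checked mechanically. A minimal sketch (the function name and argument layout are mine, not part of ROMS):

```python
def check_refined_inputs(grids):
    """Check the input-file rules for a set of refined ROMS grids.

    grids: list of dicts with keys total_L, total_M, Lm, Mm, dt
    (seconds), ordered parent first.  Rules from the slides: the
    first grid's Lm/Mm are 2 less than the total length, every other
    grid's are 7 less, and each child dt must divide evenly into its
    parent's dt.  Returns a list of error messages (empty if OK).
    """
    errors = []
    for i, g in enumerate(grids):
        pad = 2 if i == 0 else 7
        if g["Lm"] != g["total_L"] - pad or g["Mm"] != g["total_M"] - pad:
            errors.append(f"grid {i + 1}: Lm/Mm should be total minus {pad}")
        if i > 0 and grids[i - 1]["dt"] % g["dt"] != 0:
            errors.append(f"grid {i + 1}: dt must divide evenly into parent dt")
    return errors
```

A parent with dt = 240 s and a child with dt = 48 s passes the divisibility rule, while dt = 50 s would not.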

  12. 6) roms input file

  13. 6) roms input file

  14. 6) roms input file

  15. 6) roms input file

  16. 6) roms input file

  17. 7) coawst.bash
  Set NestedGrids to the TOTAL number of grids.

  18. 8) Run it
  Set np = the number of procs; use the same number for each grid.

  19. How does the coupled modeling system work? And: setting up a coupled application.

  20. Coupled Modeling System: Model Coupling Toolkit (MCT)
  Mathematics and Computer Science Division, Argonne National Laboratory, http://www-unix.mcs.anl.gov/mct/
  MCT is an open-source package that provides MPI-based communications between all nodes of a distributed-memory modeling component system. Download and compile it as libraries that are linked to.
  [Figure: Model A running on M nodes, Model B running on N nodes, Model C, ...; MCT provides communications between all models (it also works within a model).]
  Warner, J.C., Perlin, N., and Skyllingstad, E. (2008). Using the Model Coupling Toolkit to couple earth system models. Environmental Modelling & Software.

  21. Libraries: MCT v2.6.0 or higher (distributed)
  1) cd to the MCT dir
  2) ./configure (this makes Makefile.conf; you can edit this file)
  3) make
  4) make install
  5) Set environment variables:
  setenv MCT_INCDIR COAWST/Lib/MCT/include
  setenv MCT_LIBDIR COAWST/Lib/MCT/lib
  (or wherever you installed them; see the last slide)

  22. Compilers dir (side note)

  23. Model organization
  master.F: mpi_init; init_file (# procs per model); then each model runs its { init, run, finalize } sequence (ROMS and SWAN).

  24. init, run, and finalize
  init:
    ROMS: MPI_INIT; init_param, init_parallel, init_scalars, init_coupling (grid decomp); roms_init
    SWAN: SWINITMPI; SWINIT, SWREAD (grid); init_coupling
  run (sync. point):
    ROMS: roms_run -> main3d ... ocean_coupling ...
    SWAN: SWMAIN -> swanmain ... waves_coupling ...
  finalize:
    ROMS: roms_finalize -> close_io, mpi_finalize
    SWAN: SWEXITMPI -> close_io, mpi_finalize

  25. Grid decomposition (during initialization)
  For both SWAN and ROMS: each tile is on a separate processor, and each tile registers with MCT.

  26. init_coupling
  [Figure: ROMS init_coupling and SWAN init_coupling, steps 1-3, processed by each ROMS tile and each SWAN tile.]

  27. Synchronization (run phase)
  ROMS ocean_coupling and SWAN waves_coupling exchange data through MCT; the exchange is processed by each ROMS tile and each SWAN tile.

  28. Let's look at the fields exchanged between models.

  29. ATM - OCN interactions
  Option 1, #define ATM2OCN_FLUXES: use the momentum and heat fluxes computed in WRF for both ROMS and WRF. ATM sends Ustress, Vstress, Swrad, Lwrad, LH, and HFX; in ROMS, stflx_temp = Swrad + Lwrad + LH + HFX.
  Option 2, #define BULK_FLUXES: ATM sends Uwind, Vwind, Patm, RH, Tair, cloud, rain, evap, Swrad, and Lwrad; ROMS uses the WRF variables in the COARE algorithm, and LH + HFX are computed in bulk_fluxes.
  Salt flux, #define EMINUSP: stflx_salt = evap - rain. #define ATM_PRESS adds Patm.
  Symbols: Integration and Application Network (ian.umces.edu/symbols), University of Maryland Center for Environmental Science.
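The flux bookkeeping on this slide reduces to two sums, shown here as a sketch. The variable names mirror the slide; the sign conventions and units are illustrative only, not the ROMS internals:

```python
def surface_fluxes(swrad, lwrad, lh, hfx, evap, rain):
    """Combine atmosphere-supplied terms into ROMS surface fluxes.

    With ATM2OCN_FLUXES the net surface heat flux is the sum of the
    shortwave, longwave, latent, and sensible components; with
    EMINUSP the salt (freshwater) flux is evaporation minus rain.
    """
    stflx_temp = swrad + lwrad + lh + hfx   # net surface heat flux
    stflx_salt = evap - rain                # E minus P freshwater flux
    return stflx_temp, stflx_salt
```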

  30. ATM interactions
  [Figure: OCN provides SST to ATM; WAV provides Hwave, Lpwave, and Tpsurf. The surface fluxes of momentum, heat, and moisture = f(Hwave, Lpwave, Tpsurf).]

  31. How to create a coupled application
  1) Create all the input, BC, init, forcing, etc. files for each model as if it were running separately. I recommend that you run each model separately first.
  2) Modify the cppdefs in your header file.
  3) SCRIP (if the grids differ)
  4) coupling.in
  5) coawst.bash
  6) Run it as coawstM

  32. 1) Use each model separately
  WRF (6 km grid): 27 vertical levels; dt = 36 s. Physics: Lin microphysics; RRTM longwave and Dudhia shortwave radiation; Mellor-Yamada-Janjic (MYJ) PBL; Kain-Fritsch (KF) cumulus scheme.
  ROMS (5 km and 1 km grids): 16 vertical levels; dt = 240 s and 48 s. Physics: GLS turbulence closure; COARE bulk fluxes; BCs from HYCOM.
  These models are on different grids.

  33. 2) south_car.h

  34. 3) SCRIP - grid interpolation: http://climate.lanl.gov/Software/SCRIP/
  [Figure: ocean grid (5 km) nested inside the atm grid (6 km); GFS data; HFLX and SST exchanges.]
  The ocean model provides higher resolution and the coupled response of SST to the atmosphere. But the ocean grid is limited in spatial coverage, so the atmosphere model must combine data from different sources, which can create a discontinuity in the forcing. The atmosphere model provides heat flux to cover the entire ocean grid. SCRIP interpolation weights are needed to remap the data fields, using a flux-conservative remapping scheme.
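Conceptually, the weights SCRIP produces are a sparse matrix: each destination cell value is a weighted sum of source cells. A minimal sketch of applying such weights (the triplet layout is illustrative; real SCRIP weight files are NetCDF):

```python
def apply_remap(weights, src, ndst):
    """Apply sparse remapping weights: dst[i] += w * src[j] per link.

    weights: iterable of (dst_index, src_index, w) links, as would be
    read from a SCRIP interpolation-weights file.
    src: source-grid field values; ndst: destination-grid size.
    """
    dst = [0.0] * ndst
    for i, j, w in weights:
        dst[i] += w * src[j]
    return dst

# With conservative/fracarea weights, a destination cell covering two
# equal-area source cells gets their average:
field = apply_remap([(0, 0, 0.5), (0, 1, 0.5)], [10.0, 30.0], 1)
```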

  35. Libraries: SCRIP v1.6 (distributed)
  Used when 2 or more models are not on the same grid.
  1) cd to the COAWST/Lib/SCRIP/source dir
  2) Edit the makefile
  3) make

  36. 3) SCRIP
  Prepare the SCRIP input files by first converting the WRF grid to a standard netcdf file type that SCRIP likes: COAWST/Tools/mfiles/mtools/scrip_wrf.m

  37. 3) SCRIP
  Likewise, convert the ROMS grid to the netcdf file type that SCRIP likes: COAWST/Tools/mfiles/mtools/scrip_roms.m

  38. 3) SCRIP input file: scrip_in
  grid1_file and grid2_file were created with the Matlab m-files on the last 2 slides.
  interp_file1 and interp_file2 will be the new SCRIP interpolation weights files.
  You need to use conservative and fracarea!
  Run the program as ./scrip
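For reference, a scrip_in fragment along these lines (the field names follow SCRIP's standard remap_inputs namelist; the file and map names are placeholders, not from the slides):

```
&remap_inputs
    num_maps = 2
    grid1_file = 'roms_5km_grid_scrip.nc'
    grid2_file = 'wrf_6km_grid_scrip.nc'
    interp_file1 = 'ocn2atm_weights.nc'
    interp_file2 = 'atm2ocn_weights.nc'
    map1_name = 'ROMS to WRF conservative mapping'
    map2_name = 'WRF to ROMS conservative mapping'
    map_method = 'conservative'
    normalize_opt = 'fracarea'
/
```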

  39. 3) SCRIP
  You need to run SCRIP for each grid pair. So if you have 1 WRF grid driving 2 ROMS grids, you need 2 sets of weights.
  [Figure: WRF grid 1 covering ROMS grid 1 (5000 m) and ROMS grid 2 (1000 m).]

  40. 4) coupling.in (this is a ROMS+WRF app)
  Set the # of procs for each model (total = 56).
  Set the coupling interval; for now, leave it the same for all models.
  Input file names: only 1 for WRF and 1 for ROMS; multiple for SWAN.
  The SCRIP weights are listed here.
  Set which WRF grid to couple to.

  41. 4a) ocean.in
  Set the # of procs for the ocean model: coupling.in listed 20, and 5 x 4 = 20.
  The dt of 240 s needs to divide evenly into the coupling interval of 1200 s.

  42. 4b) namelist.input
  The dt of 30 s needs to divide evenly into the coupling interval of 1200 s.
  Set the # of procs for the atm model: coupling.in listed 36, and 6 x 6 = 36.
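Slides 41 and 42 impose the same two checks on each model's input file: the tiling must multiply out to the processor count given in coupling.in, and the model dt must divide the coupling interval. A small sketch (a hypothetical helper, not part of COAWST):

```python
def check_model_setup(nprocs_in_coupling, ntile_i, ntile_j, dt,
                      coupling_interval):
    """True when a model's tiling and dt are consistent with coupling.in.

    nprocs_in_coupling: processor count listed for this model in
    coupling.in; ntile_i x ntile_j: the model's tiling; dt and
    coupling_interval in seconds.
    """
    tiles_ok = ntile_i * ntile_j == nprocs_in_coupling
    dt_ok = coupling_interval % dt == 0   # dt divides the interval
    return tiles_ok and dt_ok
```

With the numbers from these slides, the ocean model (20 procs, 5 x 4 tiling, dt = 240 s) and the atm model (36 procs, 6 x 6 tiling, dt = 30 s) both pass against a 1200 s coupling interval.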

  43. 5) coawst.bash
  Set the number of nested ROMS/SWAN grids, the app name, etc.

  44. 6) Run it as coawstM
  Use the total number of procs from coupling.in; there is only 1 executable.

  45. Processor allocation
  stdout reports the processor allocation. (This screenshot looks like it is from a different run, but you get the idea.)

  46. Processor allocation
  Lines beginning "Timing for ..." are WRF; lines like "1 179 52974 02:59:00" are ROMS.
  This is where the model coupling synchronization occurs, so you could probably re-allocate more nodes to WRF.

  47. JOE_TC test case examples
  The JOE_TC test cases are distributed applications for testing ROMS+WRF coupling:
  JOE_TCw = WRF only
  JOE_TCs = same grid, ROMS + WRF coupled
  JOE_TCd = different grids for ROMS and WRF; needs SCRIP weights
