
Report on POP & CICE of RACM components


Presentation Transcript


  1. Report on POP & CICE of RACM components Jaromir Jakacki, IO PAS Boulder, CO, 2010

  2. Main tasks:
  • Extended domain
  • The code was added in tag 11 and was tested for functionality with compset H and grid ar9v2_ar9v2
  • Input file names for the extended domain are currently hardwired in the code
  • New grid
  • The new grid is prepared, but a domain-difference error appears between the ice and ocean models (both models read the same file, linked rather than copied)
  • Bathymetry for the new grid (regular domain) and the land mask (extended domain) are done
  • Extended-domain climatological SST and SSS data are prepared for 12 months
  • The mask between the regular and extended domains is also done
  • Successful 10-year integration of pop/cice/datm/slnd (in reality 14 years, but 4 of them with an error)

  3. Sea ice at 2003-08-01_00:00:00 – old grid, area of T-points

  4. New grid – method (figure labels: old grid, north pole)

  5. New grid - comparison

  6. New bathymetry and land mask

  7. Mask for merging domains

  8. Extended domain – interpolation of data (at the coupling time step)
  1) Read in the data (monthly SST and SSS (PHC 3.0) and the mask)
  2) Interpolate the data in ocn_comp_mct.F90 at each coupling time step using a simple method – linear interpolation, Y = A*X + B. For the two bracketing months, Y1 = A*X1 + B and Y2 = A*X2 + B, so A = (Y2 - Y1)/(X2 - X1) and B = Y2 - A*X2, where Y is the output temperature or salinity (a minimal sketch of this step follows below)
  3) Merge the interpolated data with the ocean sea surface temperature
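  As a minimal sketch of step 2, the standalone Fortran below reproduces the per-point linear interpolation in time, assuming the two bracketing monthly fields are already on the extended-domain grid; the routine and variable names are illustrative, not the ones used in ocn_comp_mct.F90.

   subroutine interp_monthly(n, x1, x2, y1, y2, rday, yout)
      implicit none
      integer, intent(in)  :: n              ! number of grid points
      real(8), intent(in)  :: x1, x2         ! mid-month days of the two bracketing months
      real(8), intent(in)  :: y1(n), y2(n)   ! monthly climatology (SST or SSS) at those days
      real(8), intent(in)  :: rday           ! current model day at the coupling time step
      real(8), intent(out) :: yout(n)        ! interpolated field
      real(8) :: a(n), b(n)

      a    = (y2 - y1) / (x2 - x1)   ! slope     A = (Y2 - Y1)/(X2 - X1)
      b    = y2 - a*x2               ! intercept B = Y2 - A*X2
      yout = a*rday + b              ! Y = A*X + B evaluated at the current day
   end subroutine interp_monthly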

  9. Extended domain (current version has 72 points) – merged temperature and salinity (just for testing): SST_OUT = SST_CLIM*mask + SST_POP*(1 - mask)
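  A minimal sketch of this merge, assuming the mask holds weights between 0 and 1 (1 where the climatology should be used, 0 over the active POP ocean); the routine and array names are illustrative.

   subroutine merge_sst(n, mask, sst_clim, sst_pop, sst_out)
      implicit none
      integer, intent(in)  :: n                        ! number of grid points
      real(8), intent(in)  :: mask(n)                  ! assumed merge weight: 1 = climatology, 0 = POP ocean
      real(8), intent(in)  :: sst_clim(n), sst_pop(n)  ! interpolated climatology and POP sea surface field
      real(8), intent(out) :: sst_out(n)               ! merged field used at the coupling step

      ! SST_OUT = SST_CLIM*mask + SST_POP*(1 - mask)
      sst_out = sst_clim*mask + sst_pop*(1.0d0 - mask)
   end subroutine merge_sst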

  10. Examples of data interpolated onto the extended domain: temperature and salinity

  11. Examples of results from the ice component after 10 years of integration

  12. Examples of results from the ice component after 10 years of integration

  13. Future work
  • Finishing work on the new grid
  • Testing it on the regular and extended domains
  • Checking the new mask during integration
  • Beginning the long-term integration?
  Thank you

  14. Interpolated and merged extended domain (2nd result) – POP SST and climatology SST (PHC)

  15. Interpolated extended domain (1st result)

  16. Extended domain
  • Monthly climatology data (temperature and salinity from PHC 3.0) were interpolated onto the extended domain
  • Code was added to read these data and the mask (ocn_init_mct routine)
  • Code was added to interpolate and merge the data with the ocean domain at the coupling time step (ocn_setdef_mct routine) – see the skeleton below
  • All of this was tested and seems to be working fine; only the mask will require a little more work
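  For orientation, a rough skeleton of where these pieces sit; apart from the routine names ocn_init_mct and ocn_setdef_mct taken from the slide above, everything here is an assumed outline, not the actual POP code.

   subroutine ocn_init_mct_sketch()
      ! at initialization: read the 12 monthly PHC 3.0 SST/SSS climatology
      ! fields and the merge mask for the extended domain
      ! (input file names are currently hardwired, see slide 2)
   end subroutine ocn_init_mct_sketch

   subroutine ocn_setdef_mct_sketch()
      ! at each coupling time step:
      !  1) select the two monthly fields bracketing the current model day
      !  2) interpolate them linearly in time (slides 8 and 23)
      !  3) merge the result with the POP sea surface fields using the mask (slide 9)
   end subroutine ocn_setdef_mct_sketch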

  17. Interpolated and merged extended domain (2nd result) – POP SSS and climatology SSS

  18. Examples of results from ocn and ice components

  19. Future plans
  • We are going to switch to another grid (it will not exactly match the current one)
  • The mask for merging SST and SSS requires some work, but only the mask; everything is ready for running, and the mask can be replaced at any time
  • After switching to the new grid we will be able to begin the integration – I am planning to start working with the ocn/ice/wrf/slnd model using a converted old ocean restart file

  20. New grid

  21. Important things
  • NetCDF in the ocean model – optional compilation of io_netcdf.F90, io.F90, io_types.F90 and io_binary.F90
  • Diagnostic output in all models (performance) – there is a debugging level from 0 to 4 (not sure), but there are a lot of write statements from all cores
  • Sometimes the model hangs after saving the ocean restart file (I have no idea why – there is no output, and the jobs are killed because of wall time) – about 20% of jobs
  • Computer resources – the waiting time for one 256-processor job is about 2 days!

  22. Performance chart for all models (average time [sec] vs. number of cores)

  23. Code for linear interpolation (just for checking – maybe somebody will see a mistake – not for presentation)

   ! mid-month days used as interpolation abscissae (12 months plus wrap-around values at both ends)
   data data_days /-14.5,15.5,45,74.5,105,135.5,166,196.5, &
                   227.5,258,288.5,319,349.5,380.5/

   ! interpolation of temperature: copy the two bracketing monthly SST fields
   call mct_aVect_copy(avSST12,av_xp1,temp_flds(xp1),'fld')
   call mct_aVect_copy(avSST12,av_xp2,temp_flds(xp2),'fld')
   ! slope A and intercept B of Y = A*X + B, evaluated at the current day rday
   av_A%rAttr(1,:)=(av_xp2%rAttr(1,:)-av_xp1%rAttr(1,:))/(data_days(xp2)-data_days(xp1))
   av_B%rAttr(1,:)=av_xp2%rAttr(1,:)-av_A%rAttr(1,:)*data_days(xp2)
   av_l_temp%rAttr(1,:)=av_A%rAttr(1,:)*rday+av_B%rAttr(1,:)

   ! interpolation of salinity: same procedure with the monthly SSS fields
   call mct_aVect_copy(avSSS12,av_xp1,salt_flds(xp1),'fld')
   call mct_aVect_copy(avSSS12,av_xp2,salt_flds(xp2),'fld')
   av_A%rAttr(1,:)=(av_xp2%rAttr(1,:)-av_xp1%rAttr(1,:))/(data_days(xp2)-data_days(xp1))
   av_B%rAttr(1,:)=av_xp2%rAttr(1,:)-av_A%rAttr(1,:)*data_days(xp2)
   av_l_salt%rAttr(1,:)=av_A%rAttr(1,:)*rday+av_B%rAttr(1,:)
