TOPAZ operations, products, ongoing developments

Presentation Transcript


  1. TOPAZ operations, products, ongoing developments
     Laurent Bertino, Knut Arild Lisæter, Goran Zangana, NERSC
     OPNet meeting, Geilo, 6th Nov. 2007

  2. The TOPAZ model system
     • TOPAZ3: Atlantic and Arctic domain
     • HYCOM, 11–16 km resolution, 22 hybrid layers
     • Coupled EVP ice model
     • EnKF with 100 members, assimilating:
       – Sea level anomalies (CLS)
       – Sea surface temperatures
       – Sea-ice concentrations (SSM/I)
       – Sea-ice drift (CERSAT)
     • Runs weekly since Jan 2003
     • ECMWF forcing (T799)

  3. Principle of the EnKF
     • To assimilate an observation, one needs to know the error statistics:
       – Which model variables to update?
       – Over which area and depth range?
     • We ignore most of that, but:
       – We assume we know the sources of the errors
       – We set arbitrary error statistics for them
       – The errors are then represented (emulated) by an ensemble
     • The error statistics, and hence the impact of the assimilation, depend on our prior assumptions about the error sources (see the sketch below).
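
A minimal sketch of the ensemble-based update may help fix ideas: the ensemble anomalies supply exactly the cross-covariances that decide which state variables an observation updates, and over which area. This is an illustrative stochastic EnKF in Python/NumPy, not the operational TOPAZ code; all names and the linear observation operator are assumptions.

```python
import numpy as np

def enkf_analysis(X, y, H, obs_err_var, rng=np.random.default_rng(0)):
    """Illustrative stochastic EnKF analysis step.

    X : (n, N) ensemble of states (n state variables, N members)
    y : (m,)   observation vector
    H : (m, n) linear observation operator (TOPAZ also handles
               nonlinear operators, e.g. for SLA)
    obs_err_var : assumed observation error variance
    """
    n, N = X.shape
    m = y.size
    A = X - X.mean(axis=1, keepdims=True)   # ensemble anomalies
    HA = H @ A                              # anomalies in observation space
    # Ensemble estimates of P H^T and H P H^T -- these encode which
    # variables, and how far away, a given observation will update:
    PHt = A @ HA.T / (N - 1)
    HPHt = HA @ HA.T / (N - 1)
    R = obs_err_var * np.eye(m)
    K = PHt @ np.linalg.inv(HPHt + R)       # Kalman gain
    # Perturbed observations, one realization per member:
    Y = y[:, None] + rng.normal(0.0, np.sqrt(obs_err_var), (m, N))
    return X + K @ (Y - H @ X)              # updated ensemble
```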

  4. Sequential data assimilation: a recursive Monte Carlo method
     [Diagram: members 1 to 100 cycle through forecast and analysis steps; observations enter at each analysis, and a 10-day forecast is issued from the analysed ensemble]
     • Initial uncertainty
     • Model uncertainty
     • Measurement uncertainty
     (A toy version of the cycle follows below.)
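
To make the recursion concrete, here is a toy cycle built on the `enkf_analysis` sketch above. A random walk stands in for HYCOM and all numbers are synthetic; only the forecast–analysis structure mirrors the real system.

```python
import numpy as np

rng = np.random.default_rng(1)

def propagate(X, model_err_std=0.1):
    # Toy stand-in for a weekly HYCOM integration: persistence plus a
    # random perturbation representing model and forcing uncertainty.
    return X + rng.normal(0.0, model_err_std, X.shape)

n, N = 3, 100                             # 3 state variables, 100 members
X = rng.normal(0.0, 1.0, (n, N))          # initial uncertainty
H = np.eye(n)                             # observe everything, for simplicity
for week in range(4):
    X = propagate(X)                                         # forecast step
    y = rng.normal(0.0, 0.3, n)                              # synthetic obs
    X = enkf_analysis(X, y, H, obs_err_var=0.3**2, rng=rng)  # analysis step
    print(f"week {week}: mean spread = {X.std(axis=1).mean():.3f}")
```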

  5. Operator's week
     Tuesday:
     • Get ocean data + QC
     • Start assimilation + QC
     • Log status to text file
     Wednesday:
     • Start 10-day forecast + QC
     • Generate products
     • Start ensemble forecast
     Friday:
     • Generate "best guess" products
     • Cleanup
     Automatic (cron daemon):
     • Download and convert ECMWF files (daily)
     • Transfer files to OPeNDAP
     • Backup files to archive
     • E-mails standard output to the "calldesk", mersea-contact@nersc.no

  6. The State Space
     • 2D variables (800 x 880 grid cells): barotropic pressure, u/v velocity, ice concentration, ice thickness
     • 3D variables (800 x 880 x 22 grid cells): temperature, salinity, u/v current, layer thickness
     • TOTAL: n = 81,000,000 state variables
     • 100 members in double precision = 60 GB (arithmetic check below)
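
As a back-of-the-envelope check of these figures (the field counts are read off the slide above):

```python
nx, ny, nz = 800, 880, 22
n_2d = 5   # barotropic pressure, u, v, ice concentration, ice thickness
n_3d = 5   # temperature, salinity, u, v current, layer thickness

n = nx * ny * (n_2d + n_3d * nz)
print(f"{n:,}")                 # 80,960,000 -- i.e. n ~ 81 million

members, bytes_per_value = 100, 8              # double precision
gib = n * members * bytes_per_value / 2**30
print(f"{gib:.1f} GiB")         # ~60.3 GiB, matching the quoted 60 GB
```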

  7. The observations
     • Sea level anomalies – SLA (satellite radar altimeters), from CLS: a nonlinear function of the state variables; 100,000 observations every week
     • Sea surface temperature – SST (satellite, optical), from NOAA: 8,000 observations every week
     • Sea-ice concentrations (satellite, microwave), from NSIDC: 40,000 observations every week
     • Sea-ice drift (AMSR-E, QuikSCAT), from CERSAT/Ifremer: 80,000 observations every week
     • TOTAL: m = 228,000 observations
     • Coming up: in-situ profiles (10,000 obs.), HR SST (120,000 obs.), HR ice concentrations (160,000 obs.), ...

  8. Computations: 2 recursive steps
     Propagation (HYCOM):
     • Embarrassingly parallel: 1 job per member (see the sketch below)
     • 13 GB per job, 1 node x 1 h; 100 x 16 CPUs (4x8 OpenMP/MPI), SMT is used
     • 1600 CPU hours / week
     Analysis (EnKF):
     • Sequential over the 3 datasets
     • MPI parallelization needed on Njord (<13 GB memory constraint): 8 CPUs, 1 h 30 min per dataset
     • MPI-parallel output
     • Post-processing jobs to assemble the EnKF output: 16 CPUs, 30 min per dataset
     • 72 CPU hours / week
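
"Embarrassingly parallel" here means the 100 member runs are fully independent, so each can be submitted as its own job. A hypothetical sketch of that pattern, with a process pool and an `echo` standing in for the real batch scheduler and HYCOM executable:

```python
import subprocess
from concurrent.futures import ProcessPoolExecutor

def run_member(member_id):
    # Stand-in for submitting one HYCOM job; the command and flags are
    # hypothetical, not the operational scripts.
    cmd = ["echo", f"hycom_forecast --member {member_id:03d}"]
    return subprocess.run(cmd, capture_output=True, text=True).stdout

if __name__ == "__main__":
    # No member depends on any other, so any degree of parallelism works.
    with ProcessPoolExecutor(max_workers=16) as pool:
        for out in pool.map(run_member, range(1, 101)):
            print(out, end="")
```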

  9. Incoming data volume
     Ocean observations:
     • SSM/I (NSIDC): 7 MB per week
     • Merged SLA maps (Aviso): 3.8 MB per week
     • SST (NOAA): 0.5 MB per week; OSTIA: 10 MB
     • In-situ (Coriolis): 5–10 MB per week
     • Negligible download / processing time (less than a minute)
     Atmospheric fields:
     • ECMWF T799 (1/4 deg): 1-day analysis + 10-day forecast, 5 variables
     • 1.2 GB per day
     • About 1 h for download & pre-processing: a potential issue for daily runs

  10. The Products
     • MERSEA standard products
     • Forecast for the Tara expedition

  11. MERSEA product generation
     • Conversion to NetCDF: 1.2 GB per week (see the sketch below)
     • The past forecast is overwritten by the new best guess
     • File server external to NERSC (Parallab)
     • 90 GB produced every year; limited by disk space
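
For illustration, writing one daily-average field to NetCDF with the netCDF4-python library might look like the sketch below. The file, dimension and variable names are assumptions, not the MERSEA naming conventions.

```python
import numpy as np
from netCDF4 import Dataset

temp = np.random.rand(15, 400, 400).astype("f4")   # placeholder data

with Dataset("topaz_class1_example.nc", "w", format="NETCDF4") as ds:
    ds.createDimension("depth", 15)
    ds.createDimension("y", 400)
    ds.createDimension("x", 400)
    var = ds.createVariable("temperature", "f4", ("depth", "y", "x"),
                            zlib=True)  # compression eases the disk-space limit
    var.units = "degC"
    var[:] = temp
```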

  12. MERSEA products (all derived from HYCOM daily averages; documented in MERSEA WP5)
     • Class 1: 3D daily fields (U, V, T, S) at 15 fixed depths; 2D surface fields (SSH, sea ice, ...); 25 km output grid
     • Class 2: 2D daily sections and moorings; same variables as Class 1; 33 fixed depths
     • Class 3: time series of integrated variables; water mass transports (Atlantic water, ...); sea-ice transports; other: MOSF, ice area, ice volume
     • Class 4: model-minus-observations; Coriolis T/S profiles; SSM/I ice concentrations; next: tide gauges, SST, ...
     (A vertical-interpolation sketch for Class 1 follows below.)
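
Class 1 fields live on fixed depths while HYCOM carries 22 hybrid layers, so each water column must be interpolated vertically. A one-column sketch with a synthetic profile; the layer depths, fixed-depth list and clamping behaviour here are illustrative only:

```python
import numpy as np

# Synthetic layer mid-depths (m) and a temperature profile on 22 layers:
layer_depths = np.cumsum(np.full(22, 150.0)) - 75.0
layer_temp = 15.0 * np.exp(-layer_depths / 800.0)

fixed_depths = np.array([0, 10, 20, 30, 50, 75, 100, 150, 200,
                         300, 500, 700, 1000, 1500, 2000], dtype=float)

# np.interp needs increasing x; outside the layer range it clamps to the
# end values (a real converter would mask those points instead).
class1_temp = np.interp(fixed_depths, layer_depths, layer_temp)
print(np.round(class1_temp, 2))
```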

  13. Examples of output: sea-ice minimum 2007

  14. RMS errors – example for the Barents Sea ice
     • Analysis better than forecast
     • Forecast better than persistence (see the sketch below)
     • See http://topaz.nersc.no
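
The comparison measures each product against the same observations; "persistence" means holding the last analysis fixed over the forecast horizon. A minimal sketch with synthetic ice-concentration errors, sized so the expected ordering shows up:

```python
import numpy as np

def rms(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

rng = np.random.default_rng(2)
obs = rng.uniform(0.0, 1.0, 1000)               # "observed" ice concentration
analysis = obs + rng.normal(0.0, 0.05, 1000)    # analysis stays closest
forecast = obs + rng.normal(0.0, 0.10, 1000)
persistence = obs + rng.normal(0.0, 0.15, 1000)

for name, field in [("analysis", analysis), ("forecast", forecast),
                    ("persistence", persistence)]:
    print(f"{name:12s} RMS = {rms(field, obs):.3f}")
```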

  15. Forecasts for the Tara expedition [K. A. Lisæter]
     • TOPAZ successive forecasts in red; actual positions of Tara from DAMOCLES in black
     • Updated on Google Earth (a KML sketch follows below)
     • A non-MERSEA product
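
Displaying a drift track in Google Earth only requires writing a KML file. A hypothetical sketch with made-up coordinates (the operational export differs):

```python
# (lat, lon) pairs along a hypothetical forecast track near the pole:
track = [(89.0, 10.0), (88.8, 12.5), (88.5, 15.0)]

# KML coordinates are "lon,lat,altitude" triplets:
coords = " ".join(f"{lon},{lat},0" for lat, lon in track)
kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>TOPAZ forecast drift track</name>
      <LineString><coordinates>{coords}</coordinates></LineString>
    </Placemark>
  </Document>
</kml>"""

with open("tara_forecast.kml", "w") as f:
    f.write(kml)
```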

  16. Arctic sea-ice area minimum

  17. Forecasting the ice minimum in TOPAZ

  18. Conclusion
     • First operational application of the EnKF in Europe (so far the only others are in the USA and Canada)
     • Installed on met.no's facilities (Njord):
       – Code and restart files provided to met.no
       – Scheduling shell scripts: self-documented, but the documentation is not up to date
     • Some tasks can be done in parallel:
       – Setting up the OPeNDAP/THREDDS server
       – Viewing the differences with NERSC TOPAZ on a LAS
       – Scheduling the jobs in the operational suite: start from the main script and descend into its dependencies
