
PT postprocessing Fieldextra

Jean-Marie Bettems, 09.09.2010, COSMO GM, Moscow



  1. PT postprocessing Fieldextra
  Jean-Marie Bettems, 09.09.2010, COSMO GM, Moscow

  2. Fieldextra – Identity card (1)
  • Some examples of typical model pre-/post-processing tasks:
    • merge surface temperature from IFS over sea and from COSMO over land to produce a single field suited for the assimilation cycle
    • interpolate the Swiss radar composite onto the COSMO-2 grid for feeding the latent heat nudging process
    • compute EPS probabilities from COSMO-LEPS members
    • compute neighbourhood probabilities from COSMO-7
    • compute convective indices like KO or CAPE
    • compute time series of W_SO, stratified by soil type, region averaged, and for a set of standard depths
    • create a single XML file with time series of parameters from COSMO-2 / -7 / -EPS and IFS for a set of locations
  → use fieldextra!

  3. Fieldextra – Identity card (2)
  • Fortran program designed as a generic tool to manipulate NWP model data and gridded observations
  • Built as a toolbox...
    • implements a set of primitive operations, which can be freely combined and iterated
    • controlled by Fortran namelists
  • File-based input/output...
    • supports both GRIB1 and GRIB2 (input/output)
    • supports local extensions of the GRIB standard
    • understands the naming conventions of COSMO output
    • rich set of output formats in addition to GRIB (CSV, XML ...)
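The toolbox design above (primitive operations freely combined, driven by Fortran namelists) can be pictured with a minimal control-file sketch. The group and key names below are illustrative assumptions only, not the authoritative fieldextra namelist syntax; consult the fieldextra user manual for the real groups and keys.

```fortran
! Hypothetical sketch of a namelist-driven fieldextra job.
! Group and key names are assumptions for illustration only.
&RunSpecification
  verbosity = "moderate"            ! diagnostic level (assumed key)
/
&Process
  in_file  = "laf2010090900"        ! COSMO output, naming convention understood
  out_file = "product.grb2"         ! GRIB2 product (GRIB1/GRIB2 both supported)
/
&ProcessData
  in_field = "T_2M"                 ! primitive operations would be chained here
/
```

The point is the workflow, not the exact syntax: one namelist file declares inputs, the operations to apply, and the outputs to produce.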

  4. Fieldextra – Identity card (3)
  • Primary focus is the production environment
    • high quality standard (code design, testing)
    • robust handling of exceptions
    • comprehensive diagnostics
    • optimized code:
      • input/output (read model output once, produce as many products as desired)
      • memory footprint
      • time criticality
    • inter-process communication (supports parallel production suites)

  5. Fieldextra – Identity card (4)
  • More than 60k lines of Fortran 2003
  • Linked with the DWD grib library (GRIB1), the ECMWF GRIB API (GRIB2), JasPer (JPEG in GRIB2) and some COSMO modules
  • Standalone package available on the COSMO web site
  • Single-threaded code
  • Portable code (used on SGI Origin, IBM Power, Cray Opteron …)
  • Documented code (examples, user manual, developer manual …)
  • Resources allocated at MeteoSwiss for further development

  6. Fieldextra – Identity card (5)
  • Core non-graphical NWP production tool at MeteoSwiss
  • Official COSMO post-processing software
  • Official tool for the EUMETNET SRNWP interoperability project

  7. Activities since last COSMO GM – Summary
  • Releases
    • 10.0.0  15 Nov. 2009 (first official COSMO release)
    • 10.1.0  31 Jan. 2010 (MeteoSwiss internal only)
    • 10.2.0  26 Aug. 2010
  • Focus 10.0.0 – 10.2.0
    • Improve user experience (package, namelist, diagnostics, ...)
    • Support for additional models besides COSMO
    • Robust implementation of GRIB2 input and output

  8. Activities since last COSMO GM – Highlights (1)
  • Create standalone distribution package
    • source code, incl. libraries
    • resource files (dictionary, locations ...)
    • examples, with reference results
    • documentation (installation, user, developer)
    • support for PathScale and GNU compilers
  • Support of GRIB2 for input and output data
    • supports product templates 0 & 8 (deterministic), 1 & 11 (EPS member), 2 & 12 (EPS derived), 5 & 9 (probability)
    • supports both simple and JPEG data packing
    • based on ECMWF GRIB API v1.9.0
    • on a single core as efficient as the DWD grib library for GRIB1
    • still missing features (local section, kilometric grid ...)

  9. Activities since last COSMO GM – Highlights (2)
  • Extend set of recognized products (model, product category, vertical coord., EPS information)

                            GME   IFS   COSMO
        determinist         YES   YES   YES
        eps member          NA    NO    YES (1)
        eps mean            NA    NO    YES (2)
        eps median          NA    NO    YES (2)
        eps probability     NA    NO    YES (2)
        neighb. probability YES   YES   YES

        (1) both COSMO-LEPS and COSMO-DE EPS suites
        (2) only COSMO-LEPS

  • Operators to compute HFL and HHL by integrating the hydrostatic equation (IFS, GME)
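As background for the HFL/HHL operators above: integrating the hydrostatic equation between two pressure levels yields the standard hypsometric relation. This is the textbook form, not necessarily fieldextra's exact discretization.

```latex
% Hydrostatic equation with the ideal gas law (p = \rho R_d T_v):
\frac{\partial p}{\partial z} = -\rho g
\quad\Rightarrow\quad
z_{k} - z_{k+1} = \frac{R_d\,\overline{T_v}}{g}\,
\ln\!\left(\frac{p_{k+1}}{p_{k}}\right)
```

Summing these thicknesses upward from the surface gives the half-level heights (HHL); full-level heights (HFL) then follow by interpolation between adjacent half levels.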

  10. Activities since last COSMO GM – Highlights (3)
  • Other new features
    • localisation of operators (fxtr_operator_generic, fxtr_operator_specific) and of output routines (fxtr_write_generic, fxtr_write_specific)
    • new operator to interpolate model fields from half to full levels
    • introduce strict_usage in &RunSpecification
    • introduce soft_memory_limit in &GlobalSettings
    • implement flag-file based inter-process communication (force stop, ready files)
    • and more ...
  • Bug corrections, code clean-up, memory optimization
  • See the HISTORY file for the complete list of modifications (pay attention to the ... ATTENTION section!)
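The two new namelist switches named above might be set as follows. Only the group and key names come from the slide; the value types, semantics, and units shown are assumptions.

```fortran
! Sketch only: strict_usage and soft_memory_limit are the documented keys,
! but the values, types and units below are assumptions.
&RunSpecification
  strict_usage = .true.        ! assumed: reject questionable namelist usage
/
&GlobalSettings
  soft_memory_limit = 2000.    ! assumed: upper bound on memory use, in MB
/
```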

  11. Priority task postprocessing – Description of individual sub-tasks (2010 – 2011)
  • [EXTENDED] Support usage outside of MeteoSwiss [0.05 FTE, MCH resources]
    > support for initial installation at interested centers, bug fixes
    > organize one-day tutorial at DWD
  • [EXTENDED] Support of GME and ICON on native grid [??? FTE, no resources available]
    > utility routines to work with GME/ICON grids are available in INT2LM
    > fieldextra currently assumes a regular grid, this should be relaxed
  • [DELAYED, MODIFIED] Implement NetCDF support (out) [0.05 FTE, MCH resources]
    > use support from ECMWF GRIB API to implement NetCDF output
    > necessity to implement NetCDF input has to be re-evaluated
  • [DELAYED] Consolidate interface COSMO/fieldextra (fieldextra code) [0.015 FTE, MCH resources]
    > use latest COSMO modules, support new reference atmosphere

  12. Priority task postprocessing – Description of individual sub-tasks (2010 – 2011)
  • [DELAYED] Provide a common library of modules for physical and mathematical constants, meteorological functions, etc. [0.2 FTE, DWD resources]
  • [DELAYED] Extend fieldextra functionality [0.075 FTE, MCH resources; ??? FTE from other centres]
    > review all formulas in fieldextra and use the common library when possible
    > implement vorticity on p-surfaces using metric terms from COSMO
    > implement parameters requested by COSMO members
  • [NEW] Consolidate GRIB2 implementation [0.15 FTE, MCH resources]
    > further adapt internal code structure (e.g. level representation)
    > implement additional features (e.g. product template for radar)
    > coordinate usage within COSMO (short names, local tables, local section, local usage)
  • [NEW] Consolidate documentation [0.075 FTE, MCH resources; 0.05 FTE DWD resources]
    > finalize developer documentation
    > format in LaTeX

  13. Roadmap (1)
  December 2010 – Version 10.2.1 (MeteoSwiss internal)
  → Needs of MeteoSwiss / vorticity as showcase for use of metric terms
  • Kalman corrections
  • Support latest SLEVE coordinates
  • Consolidate interface COSMO/fieldextra (latest COSMO code, new ref. atmosphere, vorticity)

  14. Roadmap (2)
  March 2011 – Version 10.3
  → Consolidate code / NetCDF & GRIB2 / SRNWP interoperability
  • Internal code improvements
  • Finalize support for SRNWP interoperability (incl. support of additional grids if necessary)
    • use other SRNWP LAMs to generate MeteoSwiss products
    • generate SRNWP interoperability files from COSMO output
  • Consolidate GRIB2 implementation (treatment of levels, missing features, ...)
  • Coordinate usage of GRIB2 within COSMO
  • NetCDF output

  15. Roadmap (3)
  September 2011 – Version 11.0
  → COSMO priority task
  • Graceful handling of missing input files in a temporal series
  • Improve developer documentation
  • Consolidate interface COSMO/fieldextra (use COSMO common library when possible)

  16. High Performance Fieldextra
  [Diagram: inputs 1–3 feeding outputs 1–4 in parallel]
  • From sequential to parallel processing of outputs
    • output processing means computation of user-defined operations and production of the output file
    • potentially large performance gain when working in post-processing mode with many outputs; scalability with respect to the number of products
    • load balance achieved by distributing processing according to a pre-computed profiling (post-processing production is fairly static)
  • Direct in-memory communication between COSMO and fieldextra
    • i/o (disk access) will become more and more of a bottleneck, in particular for EPS systems
    • live visualization methods (VisIt) exist which use such an approach; CSCS is willing to help with this task

  17. Sharing development effort
  → Resources at MeteoSwiss for fieldextra development are limited!
  → Quality standard must be maintained (production tool)!
  • Implementing new operators or new types of output is reasonably easy, and should be done by the interested center (mail support from MeteoSwiss can be expected)
    • output format: fxtr_write_specific, fxtr_write_generic (general interest)
    • operator: fxtr_operator_specific, fxtr_operator_generic (general interest)
  • MeteoSwiss resources should be reserved for changes requiring a deep knowledge of the code (or for own needs).
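A center-specific operator added to fxtr_operator_specific might look like the skeleton below. This is purely illustrative: the actual interface expected by fieldextra is defined in its developer manual, and every name and argument here is an assumption.

```fortran
! Illustrative skeleton only; the real interface required by
! fxtr_operator_specific is given in the fieldextra developer manual.
SUBROUTINE my_center_operator(field_in, field_out, ierr, errmsg)
  REAL,             INTENT(IN)  :: field_in(:,:)   ! input field (hypothetical shape)
  REAL,             INTENT(OUT) :: field_out(:,:)  ! result field
  INTEGER,          INTENT(OUT) :: ierr            ! error status
  CHARACTER(LEN=*), INTENT(OUT) :: errmsg          ! error message

  ierr = 0 ; errmsg = ''
  field_out = field_in          ! center-specific transformation goes here
END SUBROUTINE my_center_operator
```

Operators of general interest would instead go into fxtr_operator_generic, as the slide suggests.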

  18. (code excerpt)

!+****************************************************************************
SUBROUTINE generate_output(multi_pass_mode, just_on_time, last_call,         &
             datacache, data_origin, tot_nbr_input,                          &
             out_paths, out_types, out_modes,                                &
             out_grib_keys, out_spatial_filters,                             &
             out_subset_size, out_subdomain, out_gplist, out_loclist,        &
             out_data_reduction, out_postproc_modules,                       &
             nbr_gfield_spec, gen_spec, ierr, errmsg )
!=============================================================================
!
! Root procedure to generate output files
!
!-----------------------------------------------------------------------------

! Dummy arguments
LOGICAL, INTENT(IN)                             :: multi_pass_mode      ! Multiple pass mode?
LOGICAL, DIMENSION(:), INTENT(IN)               :: just_on_time         ! True if prod. now
LOGICAL, INTENT(IN)                             :: last_call            ! True if last call
CHARACTER(LEN=*), INTENT(IN)                    :: datacache            ! Data cache file
TYPE(ty_fld_orig), INTENT(IN)                   :: data_origin          ! Data origin
INTEGER, DIMENSION(:), INTENT(IN)               :: tot_nbr_input        ! Expected nbr. input
CHARACTER(LEN=*), DIMENSION(:), INTENT(IN)      :: out_paths            ! Output files names
TYPE(ty_out_spec), DIMENSION(:), INTENT(IN)     :: out_types            !   types
TYPE(ty_out_mode), DIMENSION(:), INTENT(IN)     :: out_modes            !   modes
INTEGER, DIMENSION(:,:), INTENT(IN)             :: out_grib_keys        !   grib specs
INTEGER, DIMENSION(:), INTENT(IN)               :: out_subset_size      !   subset size
INTEGER, DIMENSION(:,:), INTENT(IN)             :: out_subdomain        !   subdomain definition
INTEGER, DIMENSION(:,:,:), INTENT(IN)           :: out_gplist           !   gp definition
CHARACTER(LEN=*), DIMENSION(:,:), INTENT(IN)    :: out_loclist          !   locations definition
CHARACTER(LEN=*), DIMENSION(:,:), INTENT(IN)    :: out_spatial_filters  ! Condition defining filter
TYPE(ty_out_dred), DIMENSION(:), INTENT(IN)     :: out_data_reduction   ! Data reduction spec
CHARACTER(LEN=*), DIMENSION(:), INTENT(IN)      :: out_postproc_modules ! Specific postprocessing
INTEGER, DIMENSION(:,:), INTENT(IN)             :: nbr_gfield_spec      !+ Specifications of
TYPE(ty_fld_spec_root), DIMENSION(:), INTENT(IN) :: gen_spec            !+ fields to generate
INTEGER, INTENT(OUT)                            :: ierr                 ! Error status
CHARACTER(LEN=*), INTENT(OUT)                   :: errmsg               ! Error message

! Local parameters
CHARACTER(LEN=*), PARAMETER :: nm='generate_output: '   ! Tag

! Local variables
LOGICAL :: exception_detected, exception, use_postfix
LOGICAL :: unique_ftype, multiple_grid, exist
LOGICAL, DIMENSION(3*mx_iteration+1) :: tmp_fddata_alloc, tmp_gpdata_alloc
LOGICAL, DIMENSION(3*mx_iteration+1) :: tmp_value_alloc, tmp_flag_alloc
INTEGER :: i1, i2, i3, i_fd, i_vd
INTEGER :: nbr_input
INTEGER :: out_idx, ios, idx_vd_defined
CHARACTER(LEN=strlen) :: messg, temporal_res, out_path
TYPE(ty_fld_type) :: out_ftype

! Initialize variables
!---------------------
ierr = 0 ; errmsg = ''
exception_detected = .FALSE.
tmp_fddata_alloc(:) = .FALSE. ; tmp_gpdata_alloc(:) = .FALSE.
tmp_value_alloc(:)  = .FALSE. ; tmp_flag_alloc(:)   = .FALSE.

! Create/update data cache file
!------------------------------------------------------------------------
! The cache file must reflect the state of data(:) after the last call to
! collect_output (i.e. before any field manipulation done in prepare_pout)

! Loop over each output file
!---------------------------
output_file_loop: &
DO i1 = 1, nbr_ofile
  out_idx = data(i1)%ofile_idx
  nbr_input = COUNT( data(i1)%ifile_used )

  ! Skip bogus output
  IF ( data(i1)%ofile_bogus ) CYCLE output_file_loop
  ! Skip completed output
  IF ( data(i1)%ofile_complete ) CYCLE output_file_loop
  ! Skip empty data array
  IF ( ALL(.NOT. data(i1)%defined) ) CYCLE output_file_loop

  ! Only prepare output when all possible associated data have been collected
  ! or when 'just on time' production is active
  IF ( .NOT. last_call .AND.                     &
       nbr_input < tot_nbr_input(out_idx) .AND.  &
       .NOT. just_on_time(out_idx) ) CYCLE output_file_loop

  ! At this point the corresponding output file will be produced

  ! Keep track of completed output file
  IF ( nbr_input >= tot_nbr_input(out_idx) ) data(i1)%ofile_complete = .TRUE.

  ! Build name of output, considering a possible temporary postfix
  use_postfix = .FALSE.
  IF ( LEN_TRIM(out_postfix) /= 0 .AND. data(i1)%ofile_usepostfix .AND.   &
       .NOT. (data(i1)%ofile_firstwrite .AND. data(i1)%ofile_complete) )  &
    use_postfix = .TRUE.
  out_path = out_paths(out_idx)
  IF ( use_postfix ) out_path = TRIM(out_path) // out_postfix

  ! Release memory allocated in previous call to prepare_pout (if any)
  DO i2 = 1, 3*mx_iteration+1
    IF ( tmp_value_alloc(i2) ) DEALLOCATE(data_tmp(i2)%values, data_tmp(i2)%defined)
    IF ( tmp_flag_alloc(i2) )  DEALLOCATE(data_tmp(i2)%flag)
    IF ( tmp_fddata_alloc(i2) ) THEN
      DEALLOCATE(data_tmp(i2)%field_type, data_tmp(i2)%field_origin,      &
                 data_tmp(i2)%field_name, data_tmp(i2)%field_grbkey,      &
                 data_tmp(i2)%field_trange,                               &
                 data_tmp(i2)%field_level, data_tmp(i2)%field_ltype,      &
                 data_tmp(i2)%field_prob, data_tmp(i2)%field_epsid,       &
                 data_tmp(i2)%field_vref, data_tmp(i2)%field_ngrid,       &
                 data_tmp(i2)%field_scale, data_tmp(i2)%field_offset,     &
                 data_tmp(i2)%field_vop, data_tmp(i2)%field_vop_usetag,   &
                 data_tmp(i2)%field_vop_nlev, data_tmp(i2)%field_vop_lev, &
                 data_tmp(i2)%field_pop, data_tmp(i2)%field_hop,          &
                 data_tmp(i2)%field_top, data_tmp(i2)%nbr_level,          &
                 data_tmp(i2)%level_idx, data_tmp(i2)%nbr_eps_member,     &
                 data_tmp(i2)%eps_member_idx, data_tmp(i2)%field_idx )
    ENDIF
    IF ( tmp_gpdata_alloc(i2) ) THEN
      DEALLOCATE(data_tmp(i2)%gp_coord, data_tmp(i2)%gp_idx,              &
                 data_tmp(i2)%gp_lat, data_tmp(i2)%gp_lon, data_tmp(i2)%gp_h)
    ENDIF
  END DO

  ! Prepare data for print out (calculate new fields, ... ; populate data_pout)
  ! * Info message
  IF ( just_on_time(out_idx) ) THEN
    messg = ' (just on time output)'
  ELSE IF ( nbr_input >= tot_nbr_input(out_idx) ) THEN
    messg = ' (all associated input collected)'
  ELSE
    messg = ''
  ENDIF

Thank you for your attention!
