Overview of the WP5.3 Activities

Partners: ECMWF, METO/HC, MeteoSchweiz, KNMI, IfM, CNRM, UREAD/CGAM, CNRS/IPSL, BMRC, CERFACS


Forecast quality assessment

Forecast quality assessment is a basic component of the prediction process.

Information about the quality and the uncertainty of a prediction is as important as the prediction itself.

WP5.3 activities

WP5.3: Assessment of seasonal-to-decadal (s2d) forecast quality

  • Target
    • “assessment of the actual and potential skill of the models and the different versions of the multi-model ensemble system”
  • Main tasks during the first 18 months:
    • Assessment of the actual and potential skill of the different ensemble prediction systems and sensitivity experiments, including a comparison with reference models (link WP4.4).
    • Estimate useful skill for end users in seasonal-to-decadal hindcasts to assess their potential economic value (link WP5.5).
    • Develop web-based verification technology (link WP2A.4).
    • Assessment of the skill in predicting rare events (link WP4.3 and WP5.4).
    • Other links: RT1, RT2A
WP5.3 activities

WP5.3: Assessment of s2d forecast quality

  • Deliverables for the first 18 months:
    • 5.3 (UREAD/CGAM): Optimal statistical methods for combining multi-model simulations to make probabilistic forecasts of rare extreme events
    • 5.4 (UREAD/CGAM): Best methods for verifying probability forecasts of rare events
    • 5.7 (ECMWF): Skill of seasonal NAO and PNA using multi-model seasonal integrations from DEMETER
  • Milestone for the first 18 months:
    • M5.2 (KNMI): Prototype of an automatic system for forecast quality assessment of seasonal-to-decadal hindcasts
  • Activity during the first 18 months: ECMWF (3), MeteoSchweiz (1), UREAD/CGAM (0), CNRS/IPSL (6), KNMI (0), METO/HC (0)
WP5.3 action plan

WP5.3: Assessment of s2d forecast quality

  • Two different types of verification activities:
    • Automatic quality control
    • Research on verification
  • Research verification requires efficient data dissemination:
    • MARS, public server at ECMWF
    • Climate Explorer
  • A probabilistic model is needed before probabilistic verification can be carried out
  • Broad range of research studies, in close link with the validation work in RT4 and RT5
  • Verification based on the end-to-end approach

Three-tier verification

  • Forecast quality also needs to be assessed thoroughly for end-user predictions, although there is no direct relationship between forecast quality and usefulness.
  • Use an end-to-end approach: end users develop prediction models that take the limitations of the predictions into account.
  • Forecast reliability becomes a major issue.
  • A three tier scheme can then be considered:
    • Tier 1: single meteorological variables are assessed against a reference prediction (climatology, persistence, …)
    • Tier 2: application model hindcasts driven by weather / climate predictions are assessed against an application model reference (e.g., driven by ERA-40); no reference to real world application
    • Tier 3: as in tier 2, but the application model hindcasts are assessed against observed data
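The tier-1 idea, scoring single meteorological variables against a reference prediction such as climatology, can be sketched as follows. This is a minimal Python illustration with made-up numbers; `msss` and the mean-squared-error skill score are one common choice of score, not a WP5.3 deliverable:

```python
import numpy as np

def msss(hindcast, obs, reference=None):
    """Mean-squared-error skill score of a hindcast against a reference
    prediction (climatology of the observations by default).
    1 = perfect, 0 = no better than the reference, < 0 = worse."""
    hindcast = np.asarray(hindcast, float)
    obs = np.asarray(obs, float)
    if reference is None:
        # Tier-1 default reference: the observed climatological mean
        reference = np.full_like(obs, obs.mean())
    mse_fc = np.mean((hindcast - obs) ** 2)
    mse_ref = np.mean((reference - obs) ** 2)
    return 1.0 - mse_fc / mse_ref

# Illustrative yearly T2m anomalies (hypothetical numbers)
obs = np.array([0.3, -0.1, 0.5, -0.4, 0.2])
fc = np.array([0.2, 0.0, 0.4, -0.3, 0.1])
print(msss(fc, obs))  # positive: the hindcast beats climatology
```

A persistence forecast could be passed as `reference` instead, matching the reference predictions listed for tier 1.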
Automatic quality control
  • Most of the s2d simulations run at ECMWF and have a common output format
  • The quality of the hindcasts produced (units, missing files, wrong data, …) needs to be checked as soon as possible
  • A verification suite runs periodically, with graphical output made available on the web
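A minimal sketch of what such automatic checks could look like, in Python. The variable names, plausibility bounds and data layout are hypothetical, and a real suite would read the model output files rather than in-memory arrays:

```python
import numpy as np

# Hypothetical plausibility bounds per variable (units assumed: K, m)
BOUNDS = {"t2m": (180.0, 340.0), "z500": (4000.0, 6500.0)}

def qc_field(name, field, expected_shape):
    """Return a list of quality-control problems found in one hindcast
    field: missing data, wrong shape, or values outside plausible
    physical bounds (often a symptom of wrong units)."""
    problems = []
    if field is None:
        problems.append(f"{name}: file/field missing")
        return problems
    field = np.asarray(field, float)
    if field.shape != expected_shape:
        problems.append(f"{name}: wrong shape {field.shape}")
    if np.isnan(field).any():
        problems.append(f"{name}: contains missing values")
    lo, hi = BOUNDS[name]
    finite = field[np.isfinite(field)]
    if finite.size and (finite.min() < lo or finite.max() > hi):
        problems.append(f"{name}: values outside [{lo}, {hi}] - check units")
    return problems

# A t2m field accidentally stored in Celsius trips the bounds check
print(qc_field("t2m", np.full((2, 2), 15.0), (2, 2)))
```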
KNMI Climate Explorer
  • An OPeNDAP server allows the Climate Explorer to access the ENSEMBLES data automatically, without a local copy of the whole data set.
  • The Climate Explorer performs correlations, basic probabilistic estimates, EOFs, plotting, etc.
  • The capabilities of the Climate Explorer will be expanded to allow for more tier-1 skill measures, including verification of probability forecasts and rare events (end 2006).
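As an illustration of one of the listed capabilities, EOFs of an anomaly field can be computed with a singular value decomposition. This is a generic sketch on synthetic data, not the Climate Explorer implementation:

```python
import numpy as np

def eofs(anomalies, n_modes=2):
    """Leading EOFs of an (n_time, n_space) anomaly matrix via SVD.
    Returns (spatial patterns, principal components, explained variance)."""
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)
    pcs = u[:, :n_modes] * s[:n_modes]  # time series of each mode
    patterns = vt[:n_modes]             # spatial patterns
    return patterns, pcs, var_frac[:n_modes]

# Toy field: one dominant standing pattern plus weak noise
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 40)
pattern = np.array([1.0, 0.5, -0.5, -1.0])
field = np.outer(np.sin(t), pattern) + 0.01 * rng.standard_normal((40, 4))
field -= field.mean(axis=0)             # anomalies
_, _, var = eofs(field)
print(var[0])  # near 1: the first EOF captures the imposed pattern
```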
Climate Explorer

T2m point correlation for DEMETER 1-month lead multi-model seasonal hindcasts (1959-2001)

From G. J. van Oldenborgh, KNMI


RPSS for unskilled (wrt climatology) forecasts

Müller, Appenzeller, Doblas-Reyes and Liniger, J. Clim., in press

From M. Liniger, MeteoSwiss

Tier-1 verification

Example: MeteoSwiss will work on the de-biased ranked probability skill score RPSSd

  • Conventional probabilistic skill scores based on the Brier score have a negative bias due to the finite ensemble size
  • How can forecasts from systems with small or even different ensemble sizes be compared?
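A minimal Python sketch of the idea behind a debiased RPSS. For K equiprobable categories, the expected excess RPS of a random M-member ensemble drawn from climatology works out to D = (K^2 - 1)/(6*K*M), which is added to the reference score so that such an unskilled ensemble scores zero on average. This follows the spirit of the RPSSd referenced above but is not necessarily its exact formulation:

```python
import numpy as np

def rps(prob_fcst, obs_cat, K):
    """Ranked probability score for one forecast: squared differences of
    cumulative forecast and observed probabilities over K categories."""
    cum_f = np.cumsum(prob_fcst)
    cum_o = np.cumsum(np.eye(K)[obs_cat])
    return np.sum((cum_f - cum_o) ** 2)

def rpss_debiased(prob_fcsts, obs_cats, M, K=3):
    """Debiased RPSS: the intrinsic negative bias of an M-member ensemble
    sampling climatology is added to the climatological reference score.
    For K equiprobable categories, D = (K^2 - 1) / (6 * K * M)."""
    rps_fc = np.mean([rps(p, o, K) for p, o in zip(prob_fcsts, obs_cats)])
    p_clim = np.full(K, 1.0 / K)
    rps_cl = np.mean([rps(p_clim, o, K) for o in obs_cats])
    D = (K**2 - 1) / (6.0 * K * M)
    return 1.0 - rps_fc / (rps_cl + D)

# Tercile forecasts from a hypothetical 9-member ensemble
fcsts = [np.array([4, 3, 2]) / 9, np.array([2, 3, 4]) / 9]
print(rpss_debiased(fcsts, [0, 2], M=9))
```

Because D shrinks as M grows, the correction matters most for small ensembles, which is exactly the comparison problem raised in the second bullet.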
Tier-1 verification

Example: CNRS/IPSL will develop a tool based on the “local mode analysis” to test the skill of the intraseasonal oscillation (ISO) in seasonal predictions (beg. 2006)

Inter-annual correlation between simulated and observed OLR intraseasonal variance (90-day time section, one correlation every 5 days, 22 years) over the tropical Indian Ocean; panels for starts on 1st May (monsoon breaks) and 1st November (MJO)

From J.-Ph. Duvel, CNRS/IPSL
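The full local mode analysis is beyond a short sketch, but the diagnostic in the caption, an inter-annual correlation of intraseasonal variance, can be illustrated in simplified form. This Python sketch uses synthetic data and a single 90-day section instead of one correlation every 5 days:

```python
import numpy as np

def intraseasonal_variance(daily, window=90):
    """Variance of the daily anomalies over one window-day section,
    one value per year; daily has shape (n_years, n_days)."""
    return daily[:, :window].var(axis=1)

def interannual_skill(sim_daily, obs_daily, window=90):
    """Correlate simulated and observed intraseasonal variance
    across years."""
    v_sim = intraseasonal_variance(sim_daily, window)
    v_obs = intraseasonal_variance(obs_daily, window)
    return np.corrcoef(v_sim, v_obs)[0, 1]

# Toy check: simulated variance tracks the observed one year by year
rng = np.random.default_rng(1)
amp = rng.uniform(0.5, 2.0, size=22)   # 22 "years" of activity levels
obs = amp[:, None] * rng.standard_normal((22, 90))
sim = amp[:, None] * rng.standard_normal((22, 90))
print(interannual_skill(sim, obs))     # high when amplitudes are shared
```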

Tier-3 verification

DEMETER multi-model predictions (7 models, 63 members, Feb starts) of average wheat yield for four European countries (box-and-whiskers) compared to Eurostat official yields (black horizontal lines) and crop results from a simulation forced with downscaled ERA-40 data (red dots).





From P. Cantelaube and J.-M. Terres, JRC

Data dissemination

A service that offers immediate and free access to the data, with monthly and daily fields, area selection and plotting facilities, in GRIB or NetCDF formats

Access routes differ depending on the access granted to ECMWF systems:

  • with access: MARS
  • without access: public data server and OPeNDAP (DODS) server