
Forecast Verification Research


Presentation Transcript


  1. Forecast Verification Research Laurie Wilson, Environment Canada Beth Ebert, Bureau of Meteorology WWRP-JSC, Geneva, 17-19 July, 2013

  2. Verification working group members • Beth Ebert (BOM, Australia) • Laurie Wilson (CMC, Canada) • Barb Brown (NCAR, USA) • Barbara Casati (Ouranos, Canada) • Caio Coelho (CPTEC, Brazil) • Anna Ghelli (ECMWF, UK) • Martin Göber (DWD, Germany) • Simon Mason (IRI, USA) • Marion Mittermaier (Met Office, UK) • Pertti Nurmi (FMI, Finland) • Joel Stein (Météo-France) • Yuejian Zhu (NCEP, USA)

  3. Aims Verification component of WWRP, in collaboration with WGNE, WCRP, CBS • Develop and promote new verification methods • Training on verification methodologies • Ensure forecast verification is relevant to users • Encourage sharing of observational data • Promote importance of verification as a vital part of experiments • Promote collaboration among verification scientists, model developers and forecast providers

  4. Relationships / collaboration: WGCM, WGNE, TIGGE, SDS-WAS, CIMO-SPICE, Polar Prediction, SWFDP, HIW, CBS operational verification, S2S, SRNWP, COST-731

  5. Front Page HEADLINES • Final draft of “Verification of Tropical Cyclone Forecasts” has been released for comment and feedback. • Comments to be received until the end of August, then document to be published by WMO. • The science of verification methods has advanced with the publication of a special issue of Meteorological Applications, June, 2013, containing 12 papers based on presentations at the Melbourne workshop 2011.

  6. Promotion of best practice: Verification of tropical cyclone forecasts • Introduction • Observations and analyses • Forecasts • Current practice in TC verification – deterministic forecasts • Current verification practice – probabilistic forecasts and ensembles • Verification of monthly and seasonal tropical cyclone forecasts • Experimental verification methods • Comparing forecasts • Presentation of verification results

  7. Verification of deterministic TC forecasts

  8. Beyond track and intensity… • Track error distribution • TC genesis • Wind speed • Precipitation (MODE spatial method)

  9. Verification of probabilistic TC forecasts: TIGGE ensemble intensity error before and after bias correction. Courtesy Yu Hui (STI)

  10. Verification of TC seasonal frequencies

  11. Met Apps special issue
  1. Progress and challenges in forecast verification – E. Ebert, L. Wilson, A. Weigel, M. Mittermaier, P. Nurmi, P. Gill, M. Göber, S. Joslyn, B. Brown, T. Fowler and A. Watkins
  2. A unified verification system for operational models from Regional Meteorological Centres of China Meteorological Administration – Jing Chen, Yu Wang, Li Li, Bin Zhao, Fajing Chen, Yinglin Li and Yingjie Cui
  3. Factors affecting the quality of QPF: a multi-method verification of multi-configuration BOLAM reforecasts against MAP D-PHASE observations – Marco Casaioli, Stefano Mariani, Piero Malguzzi and Antonio Speranza
  4. An assessment of the SEEPS and SEDI metrics for the verification of 6 h forecast precipitation accumulations – Rachel North, Matthew Trueman, Marion Mittermaier and Mark J. Rodwell
  5. A long-term assessment of precipitation forecast skill using the Fractions Skill Score – Marion Mittermaier, Nigel Roberts and Simon A. Thompson
  6. Using MODE to explore the spatial and temporal characteristics of cloud cover forecasts from high-resolution NWP models – M. P. Mittermaier and R. Bullock
  7. Exploratory use of a satellite cloud mask to verify NWP models – Ric Crocker and Marion Mittermaier
  8. A new index for the verification of accuracy and timeliness of weather warnings – Laurence J. Wilson and Andrew Giles
  9. Expected impacts and value of improvements in weather forecasting on the road transport sector – Pertti Nurmi, Adriaan Perrels and Väinö Nurmi
  10. Verification of marine forecasts using an objective area forecast verification system – Michael A. Sharpe
  11. Comparative skill assessment of consensus and physically based tercile probability seasonal precipitation forecasts for Brazil – Caio A. S. Coelho
  12. Three recommendations for evaluating climate predictions – Thomas E. Fricker, Christopher A. T. Ferro and David B. Stephenson

  12. SEDI for ECMWF vs UKMet 6 h precipitation forecasts (from North et al., 2013)
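  For context on slide 12, below is a minimal sketch of how SEDI (the Symmetric Extremal Dependence Index of Ferro and Stephenson, 2011) is computed from a 2×2 contingency table for a precipitation-exceedance event; the counts in the example are hypothetical and not taken from North et al. (2013).

```python
import math

def sedi(hits, misses, false_alarms, correct_negatives):
    """Symmetric Extremal Dependence Index (Ferro and Stephenson, 2011).

    Computed from a 2x2 contingency table for a binary event such as
    '6 h precipitation accumulation exceeds a given threshold'.
    Undefined when the hit rate or false alarm rate is exactly 0 or 1.
    """
    H = hits / (hits + misses)                              # hit rate
    F = false_alarms / (false_alarms + correct_negatives)   # false alarm rate
    num = math.log(F) - math.log(H) - math.log(1 - F) + math.log(1 - H)
    den = math.log(F) + math.log(H) + math.log(1 - F) + math.log(1 - H)
    return num / den

# Hypothetical counts, for illustration only
print(sedi(hits=120, misses=30, false_alarms=40, correct_negatives=810))
```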

  13. 1-SEEPS for UKMet and ECMWF 6 h precipitation forecasts (from North et al., 2013)

  14. Comparison of physical and statistical tercile precipitation probability forecast accuracy for Brazil, using the "Generalized Discrimination" score (Mason and Weigel, 2009). In this case (spring and summer), the consensus forecasts "win". From: Coelho, 2013
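  To illustrate the score referenced on slide 14, here is a rough sketch of the two-alternative forced-choice (2AFC) idea behind the generalized discrimination score for tercile probability forecasts. This is a simplified reading of Mason and Weigel (2009), not the exact implementation used in Coelho (2013), and the forecast and observation values are hypothetical.

```python
import numpy as np

def pair_statistic(p, q):
    """Probability that a category drawn from forecast q is higher than a
    category drawn from forecast p, counting ties as one half (2AFC comparison)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    higher = sum(p[k] * q[l] for k in range(p.size) for l in range(q.size) if l > k)
    return higher + 0.5 * float(np.sum(p * q))

def generalized_discrimination(probs, obs_cat):
    """Simplified 2AFC / generalized discrimination score for tercile
    probability forecasts (after Mason and Weigel, 2009).

    probs   : (n, 3) category probabilities (below / near / above normal)
    obs_cat : length-n observed category indices 0, 1, 2
    Returns the fraction of observation pairs in different categories that the
    forecasts rank correctly; 0.5 indicates no discrimination ability.
    """
    probs, obs_cat = np.asarray(probs, float), np.asarray(obs_cat)
    score, n_pairs = 0.0, 0
    for i in range(obs_cat.size):
        for j in range(obs_cat.size):
            if obs_cat[i] < obs_cat[j]:   # only pairs with different observed categories
                f = pair_statistic(probs[i], probs[j])
                score += 1.0 if f > 0.5 else (0.5 if f == 0.5 else 0.0)
                n_pairs += 1
    return score / n_pairs

# Hypothetical example: four seasons of tercile forecasts and observed categories
probs = [[0.5, 0.3, 0.2], [0.2, 0.3, 0.5], [0.4, 0.4, 0.2], [0.1, 0.3, 0.6]]
obs = [0, 2, 1, 2]
print(generalized_discrimination(probs, obs))
```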

  15. FDPs and RDPs • Sydney 2000 FDP • Beijing 2008 FDP/RDP • FROST-14 FDP/RDP – participants to do own verification; JWGFVR to assist with special data and road verification; JWGFVR to establish compulsory measures for verification • SCMREX and INCE-CE – requests for advice, projects starting • MAP D-PHASE • SNOW-V10 RDP • Typhoon Landfall FDP – mainly training sessions at workshops so far • Severe Weather FDP

  16. Verification of model precipitation forecasts for E. Africa: all GTS data received by ECMWF and NCEP for the 2010-11 rainy season. The full set of results is being written up and will be published by WMO later this year as part of an SWFDP verification training document

  17. RSMC chart and Hydro-est

  18. Spatial Verification Method Intercomparison Project • Falls under “promotion of best practice in verification” • International comparison of many new spatial verification methods • Phase 2 in planning stage • Complex terrain • MAP D-PHASE / COPS dataset • Wind and precipitation, timing errors • Case selection underway • More information at EMS/ECAM Reading September • Led by Eric Gilleland (NCAR)

  19. Outreach and training http://www.cawcr.gov.au/projects/verification/ • Verification workshops and tutorials • On-site, travelling • Ensemble verification methods (EMS/ECAM) Sept 8, Reading • East Africa SWFDP • EUMETCAL training modules • SWFDP verification document • Verification web page • Sharing of tools • Proposal for 6th International Verification Methods Workshop

  20. Proposal for 6th International Verification Methods Workshop • Invited by the India Meteorological Department (IMD) and the National Centre for Medium Range Weather Forecasting (NCMRWF) • March 13-19, 2014 • Similar format to previous workshops (three-day tutorial – one day off – three-day science workshop) • Special emphasis on monsoon verification and tropical storm verification in both tutorial and workshop.

  21. Topics for 6IVMW • Verification of high impact weather forecasts and warnings, especially tropical cyclones and monsoon events. • Verification of ensembles and probability forecasts • Spatial forecast verification • Seasonal forecast verification • Climate projection evaluation • Propagation of uncertainty • User issues including communicating verification to decision makers • Verification tools

  22. Seamless verification [Schematic: forecast types from point-scale nowcasts and very short range forecasts through regional and global NWP, sub-seasonal, seasonal and decadal prediction to climate change, arranged by spatial scale (point to global) against forecast aggregation time (minutes to decades)] • Seamless forecasts – consistent across space/time scales • single modelling system or blended • likely to be probabilistic / ensemble

  23. More thoughts on seamless verification • Principles of all verification: • Why is it being done? What does the user want to know about the forecast? • “Attributes of forecast” – reliability, discrimination, accuracy, skill etc. • Verification doesn’t care about the source of the forecast, or its presentation – that is decided by the forecast user (hopefully) • It does care about the nature of the predictand (deterministic/continuous, categorical, probabilistic) • Attributes can be measured for any forecast projection, or averaging period; the meaning is similar. • Example: Generalized discrimination score

  24. Near Future Events • WMO publication of TC document following comments • EMS one-day ensemble verification training workshop • Advice on precipitation verification metrics for WGNE (by Sept 2013) • JWGFVR-WGTMR joint meeting (Oct 2013) • 6IVMW (March 2014)

  25. Final thoughts • “Good will” participation (beyond advice) in WWRP and THORPEX projects getting harder to provide • Videoconferencing • Capacity building of “local” scientists • Include verification component in funded projects • Tendency towards “Verification within” • May be fine for research users of verification only • Not consistent with “best verification practice” when other users are considered, e.g. SERA • SOCHI • HyMEX • A change from the original intent of JWGFVR – the INDEPENDENT verification of products from an RDP/FDP

  26. Thank you
