

1. Monitoring the Quality of Operational and Semi-Operational Satellite Precipitation Estimates – The IPWG Validation / Intercomparison Study
Beth Ebert, Bureau of Meteorology Research Center, Melbourne, Australia
2nd IPWG Meeting, Monterey, 25-28 October 2004

2. Motivation – provide information to...
• Me...! (fill the blank spot)
• Algorithm developers
  • How well is my algorithm performing?
  • Where/when is it having difficulties?
  • How does it compare to the other guys?
• Climate researchers
  • Do the satellite rainfall products give the correct rain amount by region, season, etc.?
• Hydrologists
  • Are the estimated rain volumes correct?
• NWP modelers
  • Do the satellite products put the precipitation in the right place?
  • Is it the right type of precipitation?
• Forecasters and emergency managers
  • Are the timing, location, and maximum intensities correct?

3. Web page for Australia – home
http://www.bom.gov.au/bmrc/wefor/staff/eee/SatRainVal/sat_val_aus.html

4. Earlier studies
GPCP Algorithm Intercomparison Programs (AIPs) and WetNet Precipitation Intercomparison Programs (PIPs) found:
• Performance varied with sensor
  • Passive microwave estimates more accurate than IR and VIS/IR estimates for instantaneous rain rates
  • IR and VIS/IR slightly more accurate for daily and monthly rainfall due to better space/time sampling
• Performance varied with region and season
  • Tropics better than mid- and high latitudes
  • Summer better than winter (convective better than stratiform)
• Model reanalyses performed worse than satellite algorithms for monthly rainfall in the tropics, but competitively in mid-latitudes (PIP-3)

5. More recent studies
• Combination of microwave and IR gives further improvement at all time scales
  • Good accuracy of microwave rain rates
  • Good space/time sampling from IR (geostationary)
• Strategies (a sketch of the correction-factor approach follows below)
  • Weighted combination of estimates
  • Using match-ups of microwave and geostationary estimates to:
    • Get a field of multiplicative correction factors
    • Tune parameters of the IR algorithm
    • Map IR TB onto microwave rain rates
  • Morphing of successive microwave estimates using the time evolution from geostationary imagery
• Paradigm for GPM?
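To make the match-up strategy concrete, here is a minimal Python sketch of the correction-factor approach: matched microwave and IR rain rates yield a field of multiplicative factors that can then be applied to subsequent IR estimates. The function names and the `eps` floor are illustrative assumptions, not any particular operational algorithm.

```python
import numpy as np

def correction_factor_field(ir_rain, mw_rain, eps=0.1):
    """Derive a field of multiplicative correction factors from
    matched IR and passive-microwave rain-rate grids (mm/h).
    `eps` floors both fields to avoid division by zero; its value
    is an illustrative choice.
    """
    return (mw_rain + eps) / (ir_rain + eps)

def apply_correction(ir_rain, factors):
    """Apply the correction factors to a (possibly later) IR field."""
    return ir_rain * factors

# Toy usage: IR overestimates rain rates by roughly a factor of two
ir = np.array([[2.0, 4.0], [0.0, 8.0]])
mw = np.array([[1.0, 2.0], [0.0, 4.0]])
factors = correction_factor_field(ir, mw)
print(apply_correction(ir, factors))  # corrected field, close to mw
```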

6. Focus of IPWG validation / intercomparison study
• Updated evaluation of satellite rainfall algorithms

7. Quantitative Precipitation Forecasts (QPFs) from Numerical Weather Prediction (NWP)
• WCRP Working Group on Numerical Experimentation (WGNE) has been validating / intercomparing model QPFs since 1995
• Results
  • Performance varies with region and season
  • Mid-latitudes better than tropics
  • Winter better than summer (stratiform better than convective)
• NWP performance is complementary to satellite performance!
[Figure: NWP performance over Germany]

8. Foci of IPWG validation / intercomparison study
• Updated evaluation of satellite rainfall algorithms
• Where, when, and under which circumstances is NWP rainfall better than satellite rainfall, and vice versa?

9. Related studies
http://rain.atmos.colostate.edu/CRDC/

10. Related studies
http://ldas.gsfc.nasa.gov/GLDAS/DATA/precip_valid.shtml (Observed Precipitation Validation)

11. Parameters of study
• Evaluate estimates for at least one year to capture seasonal variations in performance
• As many different regions (climate regimes) as possible
  • So far: Australia, United States, Western Europe
  • Any volunteers for Asia? Elsewhere?
• Focus on daily rainfall
• Rain gauge and radar rainfall analyses used as reference data
• Focus on relative accuracy
• Global estimates archived at U. Maryland

12. Algorithms
• Operational and semi-operational algorithms
  • Run every day
  • Available to the public via web or FTP
• Experimental algorithms OK
• Sorted by sensor type
  • Microwave
  • IR or VIS/IR
  • Microwave + IR
  • Blending strategy
NWP models
• Global models (ECMWF, US)
  • Lower spatial resolution, global coverage
• Regional models
  • Higher spatial resolution, limited coverage

13. Evaluation methodology
• Daily rainfall estimates of
  • Rain occurrence
  • Rain amount
• Spatial resolution (a simple aggregation sketch follows this list)
  • Finest possible resolution (typically 0.25° lat/lon)
  • Coarser resolution (1° lat/lon) for comparison with NWP
• Stratify by
  • Season
  • Region
  • Algorithm type
  • Algorithm
  • Rain amount threshold
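As an illustration of the comparison at coarser resolution, the sketch below block-averages a fine field onto a coarser grid, assuming the coarse cells tile the fine grid exactly. The `aggregate_to_coarse` name and the gamma-distributed toy data are illustrative; operational regridding would also handle missing data and irregular domain edges.

```python
import numpy as np

def aggregate_to_coarse(field, factor=4):
    """Block-average a fine grid onto a coarser one, e.g. a 0.25-degree
    field onto a 1-degree grid when factor=4. Missing data and partial
    grid boxes are ignored in this sketch.
    """
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0
    return field.reshape(ny // factor, factor,
                         nx // factor, factor).mean(axis=(1, 3))

# Toy usage: an 8x8 block of 0.25-degree daily totals -> 2x2 at 1 degree
fine = np.random.gamma(shape=0.5, scale=10.0, size=(8, 8))
print(aggregate_to_coarse(fine).shape)  # (2, 2)
```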

14. Verification methods
• Rain occurrence (see the contingency-table sketch after this list)
  • Frequency bias
  • Probability of detection and false alarm ratio
  • Equitable threat score
• Rain amount
  • Multiplicative bias
  • RMS error
  • Correlation coefficient
  • Probability of exceedance
• Properties of rain systems
  • Contiguous Rain Area (CRA) validation method (Ebert and McBride, 2000)
  • Rain area, volume, maximum amount
  • Spatial correlation
  • Error decomposition into volume vs. pattern
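For reference, the sketch below computes the listed occurrence scores from the standard 2x2 contingency table (hits, misses, false alarms, correct negatives), along with the amount statistics, for matched daily estimate and observation arrays. The `daily_rain_scores` helper and its default threshold are illustrative choices, not the study's actual code.

```python
import numpy as np

def daily_rain_scores(est, obs, threshold=1.0):
    """Occurrence and amount scores for matched daily rainfall arrays.
    The 1 mm/day default threshold is an illustrative choice; the
    study stratifies scores over a range of thresholds.
    """
    yes_est = est >= threshold
    yes_obs = obs >= threshold
    hits = np.sum(yes_est & yes_obs)
    false_alarms = np.sum(yes_est & ~yes_obs)
    misses = np.sum(~yes_est & yes_obs)
    correct_negs = np.sum(~yes_est & ~yes_obs)
    n = hits + false_alarms + misses + correct_negs

    # Hits expected by chance, used by the equitable threat score
    hits_random = (hits + false_alarms) * (hits + misses) / n
    return {
        "frequency_bias": (hits + false_alarms) / (hits + misses),
        "POD": hits / (hits + misses),
        "FAR": false_alarms / (hits + false_alarms),
        "ETS": (hits - hits_random)
               / (hits + misses + false_alarms - hits_random),
        "multiplicative_bias": est.mean() / obs.mean(),
        "RMSE": np.sqrt(np.mean((est - obs) ** 2)),
        "correlation": np.corrcoef(est.ravel(), obs.ravel())[0, 1],
    }

# Toy usage with synthetic daily totals
est = np.random.gamma(0.5, 10.0, size=1000)
obs = np.random.gamma(0.5, 10.0, size=1000)
print(daily_rain_scores(est, obs))
```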

  15. Some results for Australia...

16. User page
• Targeted to external users of satellite rainfall products

17. Developer page
• Targeted to algorithm developers – contains more algorithms, some of which aren't publicly available (at least not easily)

18. Multi-algorithm maps
• All algorithms and NWP models for 30 September 2004 over Australia

19. Basic daily validation product
• Maps and statistics

20. Daily CRA validation
• Properties of rain system
  • Area
  • Mean and maximum rain accumulation
  • Rain volume
  • Spatial correlation
• Error decomposition into volume and pattern error components (a sketch follows below)
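A minimal sketch of this decomposition, following the CRA decomposition described in Ebert and McBride (2000): once the estimated rain system has been translated to best match the observed one, the remaining error separates into a volume (mean bias) term and a residual pattern term. The optimal-shift search itself is omitted here and assumed done elsewhere.

```python
import numpy as np

def cra_error_decomposition(est, obs, est_shifted):
    """Split the total MSE within a Contiguous Rain Area into
    displacement, volume, and pattern components, following the
    decomposition in Ebert and McBride (2000). `est_shifted` is the
    estimate after translation to the position of best match with
    `obs`; finding that optimal shift is not shown in this sketch.
    """
    mse_total = np.mean((est - obs) ** 2)
    mse_shifted = np.mean((est_shifted - obs) ** 2)
    mse_displacement = mse_total - mse_shifted           # error removed by the shift
    mse_volume = (est_shifted.mean() - obs.mean()) ** 2  # mean (volume) bias
    mse_pattern = mse_shifted - mse_volume               # residual pattern error
    return {"displacement": mse_displacement,
            "volume": mse_volume,
            "pattern": mse_pattern}
```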

21. Monthly and seasonal summaries
• Variety of statistical plots
  • Time series
  • Scatter plots
• Table of statistics
• Binary (categorical) scores as a function of rain threshold
• Error as a function of estimated (observed) rain rate

22. Intercomparison of algorithm types
[Figure: Multiplicative bias by season (summer, autumn, winter, spring) for the Australian Tropics and Australian Mid-latitudes, December 2002-September 2004, 1° grid]

23. Intercomparison of algorithms
[Figure: Probability of detection (POD) for the Australian Tropics and Australian Mid-latitudes, December 2002-September 2004, 1° grid]

24. Caveats
• Reference data (gauge and radar analyses) are not as accurate as targeted ground validation sites
  • Performance results are more meaningful in a relative sense than in an absolute sense
• No ocean validation
  • Microwave algorithms are expected to perform better over ocean because the emission signal is used
  • Therefore microwave+IR algorithms should also perform better over ocean
  • NWP QPFs perform better over land than over ocean since more observations are used in model initialization
• Not all algorithms cover the same period (some missing data)

25. Future of this study
• Results so far will be examined closely and written up for publication
• Satellite precipitation validation / intercomparison will continue into the future...
• Algorithm developers
  • Keep making your results available
  • Good opportunity to check new or updated algorithms
• Reference data providers
  • Thanks for the data currently provided
  • More is better!
  • Can you assist in the validation itself?
• Users of validation results
  • Are we giving you the information you need?
  • Please provide feedback and suggestions for improvement
