
Precipitation Validation




Presentation Transcript


  1. Precipitation Validation Hydrology Training Workshop University of Hamburg Chris Kidd …and many others…

  2. Overview Precipitation characteristics Surface measurements: Gauges, Radar Validation: Case study: the European IPWG site Experiences – other analysis Results – statistical dependency Conclusions Hydrology Training Workshop: University of Hamburg, 12-14 October 2010

  3. 2008 floods; 2009 floods. Why validate? Essentially, to improve the estimates.

  4. UK Midlands: 20 July 2007

  5. Precipitation Characteristics • The ‘modal’ instantaneous precipitation value is zero • Rain intensities are skewed towards zero: at middle to high latitudes, heavily so! • Spatial/temporal accumulations will ‘normalise’ the data • 1 mm of rain ≡ 1 l m⁻² or 1 kg m⁻² (or 1000 t km⁻²) [Charts: occurrence and accumulation distributions]
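The unit equivalence on this slide can be checked directly. A minimal sketch (function name hypothetical; water density assumed 1000 kg/m³):

```python
# Why 1 mm of rain over 1 m^2 is 1 litre, i.e. ~1 kg of water,
# and 1 mm over 1 km^2 is ~1000 t.
WATER_DENSITY = 1000.0  # kg per m^3 (assumed)

def rain_mass_kg(depth_mm, area_m2):
    """Mass of rain (kg) for a given depth (mm) falling over an area (m^2)."""
    depth_m = depth_mm / 1000.0
    return depth_m * area_m2 * WATER_DENSITY

print(rain_mass_kg(1.0, 1.0))    # 1 mm over 1 m^2: about 1 kg (1 litre)
print(rain_mass_kg(1.0, 1.0e6))  # 1 mm over 1 km^2: about 1e6 kg = 1000 t
```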

  6. Surface measurement: Clee Hill radars (C-band vs ATC); micro rain radar; ARG100 gauge (0.2 mm/tip); Young’s gauge (0.1 mm/tip)

  7. Conventional measurements. Gauge data (rain/snow): • Simple measurements of accumulations • Quantitative sampling (tipping-bucket gauges etc.) • But: point measurements, under-catch errors, etc. Radar systems: • Backscatter from hydrometeors (rain/snow/hail) • Spatial measurements • Potential to discriminate between precipitation types • But: range effects, anomalous-propagation errors, Z-R relationships… Precipitation is highly variable both temporally and spatially: measurements need to be representative.
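The imprecise backscatter-to-rainfall step mentioned above can be illustrated with the classic Marshall-Palmer Z-R relationship, Z = 200 R^1.6 (a textbook relationship used here as an example, not necessarily the one applied by these radar networks):

```python
import math

# Invert Z = a * R^b for rain rate R (mm/h), with Z the linear reflectivity
# factor (mm^6 m^-3) recovered from the radar's dBZ value.
def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Marshall-Palmer-style rain rate (mm/h) from reflectivity in dBZ."""
    z = 10.0 ** (dbz / 10.0)      # dBZ -> linear reflectivity factor
    return (z / a) ** (1.0 / b)

# The coefficients vary with rain type (e.g. a=300, b=1.4 is a common
# convective relationship), which is one source of radar-rainfall error.
print(rain_rate_from_dbz(30.0), rain_rate_from_dbz(30.0, a=300.0, b=1.4))
```

Running the same reflectivity through two accepted coefficient pairs gives different rain rates, which is exactly the Z-R ambiguity the slide flags.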

  8. Conventional observations: ~20,000 rain gauges; radar duplicates rain-gauge coverage. Precipitation is highly variable both temporally and spatially: measurements need to be representative.

  9. Variance explained by nearest station (Jürgen Grieser). Variance based upon monthly data: shorter periods = lower explained variance.

  10. What is truth? Co-located 8 gauges / 4 MRRs

  11. 1st gauge…

  12. 2nd gauge…

  13. 2 more gauges

  14. All gauges

  15. plus the MRR…

  16. Radar vs gauge measurements. Cumulative rainfall: radar vs gauge agreement is reasonable, but not quite 1:1. 10 June 2009: 40 mm in 30 min (MRR 24.1 GHz; tipping-bucket gauge). Tipping-bucket gauges provide quantised measurements (0.1 or 0.2 mm/tip); the MRR is critical for light rainfall.
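The quantisation effect of a tipping-bucket gauge can be sketched as follows (hypothetical interval data; 0.2 mm per tip as for the ARG100):

```python
# Sketch of tipping-bucket quantisation: rainfall is recorded only in whole
# tips (here 0.2 mm per tip), so light rain within an interval can register
# as zero, with the water carried over until the bucket finally tips.
TIP_SIZE_MM = 0.2

def tips_to_accumulation(true_depths_mm):
    """Quantise a series of interval rain depths the way a tipping bucket does."""
    recorded = []
    carry = 0.0                        # water in the bucket, not yet tipped
    for depth in true_depths_mm:
        carry += depth
        tips = int(carry // TIP_SIZE_MM)
        carry -= tips * TIP_SIZE_MM
        recorded.append(tips * TIP_SIZE_MM)
    return recorded

# Drizzle at 0.05 mm per interval records nothing for three intervals,
# then a whole 0.2 mm tip at once.
print(tips_to_accumulation([0.05, 0.05, 0.05, 0.05]))
```

This is why the MRR, which measures continuously, matters for light rainfall: the gauge cannot resolve anything below one tip per interval.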

  17. Clee Hill ATC radar and C-band; University of Helsinki C-band; Chilbolton C-band

  18. National network radars: Doppler, dual-polarised; 100/210 km range

  19. Radar vs gauge data Radar (daily integrated) Gauge data

  20. Helsinki Testbed • FMI Helsinki • Cold season – surface issues & mixed-phase precipitation to surface • Circles: 4 operational Doppler weather radars (FMI & EMHI), 1 dual-pol radar + 1 vertically pointing C-band radar for research (Vaisala & UH) • 2 vertically pointing POSS radars • Dots: 80 gauges • Big diamonds: FD12P optical scatterometers • Triangles: ultrasonic snow depth • Squares: weighing gauges

  21. Ground validation - IPWG synergies GV=Ground Validation After Turk & Arkin, BAMS 2008 Both approaches are complementary

  22. Summary: surface measurements. Representativeness of surface measurements: • Over land: generally good, but variable • Over oceans: virtually non-existent. Measurement issues: • Physical collection interferes with the measurement (e.g. wind effects, frozen precipitation, etc.) • Radar: imprecise backscatter-to-rainfall relationship (also clutter, range effects, bright band, etc.). Satellites offer consistent, regular measurements, global coverage and real-time delivery of data.

  23. Satellite Data sets

  24. Observation availability. * Resolutions vary greatly with scan angle, frequency, sensor, etc.

  25. Satellite observational scales. Observations are made nominally at 1 km / 15 min; estimates are possible at 1 km / 1 min, but inaccurate. Precipitation products are generally available at 0.25°, daily, or 0.25°, 3-hourly. Accuracy of satellite precipitation estimates improves with temporal/spatial averaging. [Figure: scales from Earth-resources satellites (Ikonos, SPOT, Landsat, MODIS) to precipitation systems, 1-25 km; GEO Vis/IR at 15 min (rapid scan), LEO MW at ~3 hours]

  26. LEO vs GEO satellite observations. Low-Earth-orbit sensors: SSM/I and TRMM. Geostationary sensors: Meteosat / MSG.

  27. Observations to products. Resolutions (time/space): from instantaneous, full-resolution observations to monthly/seasonal climate-resolution products. Data inputs: visible, infrared, passive MW and active MW observations, plus model outputs. Retrievals feed products for climatology, agriculture/crops, meteorology and hydrology.

  28. Global precipitation data sets Many different products at different spatial/temporal resolutions … and formats!

  29. Setting up a validation site

  30. IPWG European validation • Radar used as 'ground truth' • Composite of radars over UK, France, Germany, Belgium and Netherlands • Nominal 5 km resolution • Equal-area polar-stereographic projection • Data and product ingest • Near real-time • Statistical and graphical output (SGI/Irix; f77/netpbm)

  31. Processing setup Perceived requirements: • Daily inter-comparison→ 00Z-24Z (also -06, -09, -12Z) • 0.25 degree resolution→ 25 km resolution • Real-time→ near real-time dependent upon product • Validation data → radar data (gauge being added later) • Automatic → quasi-automatic (not ‘operational’) • Many products → limited number of products

  32. Processing schedule: 01Z Global IR; 02Z SSM/I data, GPI, FDA, ECMWF; 03Z European radar data, PMIR; 04Z 3B4x; 05Z cics data; statistics at 20 km; EUMETSAT MPE; web pages by 22Z.

  33. Processing system. Initial setup: setting of dates; cleaning out old/decayed data. Acquiring data: searching existing data; listing missing data; creation of .netrc file; ftp data sources. Remapping of data: to regional grid or 5 km PSG projection. Results generation: statistical analysis; graphical output. Web pages: generate HTML files; copy to server.
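The daily cycle just described can be sketched as a driver loop. All names here are hypothetical stand-ins for the real steps (the actual system is f77/shell on SGI/Irix, not Python):

```python
# Hypothetical sketch of the quasi-automatic daily cycle: look back over a
# rolling 31-day window, fetch only the product/day files that are missing,
# remap them to the common grid, then regenerate statistics and web pages.
from datetime import date, timedelta

PRODUCTS = ["GPI", "PMIR", "3B42RT"]   # example products, not the full list
WINDOW_DAYS = 31                       # rolling archive, as in the flowchart

def run_daily_cycle(today, have_file, fetch, remap, make_stats, make_html):
    """Drive one validation day; the callables stand in for the real steps."""
    days = [today - timedelta(days=n) for n in range(WINDOW_DAYS)]
    fetched = []
    for product in PRODUCTS:
        for day in days:
            if not have_file(product, day):   # only fetch what is missing
                fetch(product, day)
                fetched.append((product, day))
    for product, day in fetched:
        remap(product, day)                   # to the common 5 km PSG grid
    make_stats(days)
    make_html(days)
    return fetched

# Dry run: pretend only yesterday's PMIR file is missing.
today = date(2010, 10, 12)
missing = {("PMIR", today - timedelta(days=1))}
log = []
run_daily_cycle(today,
                have_file=lambda p, d: (p, d) not in missing,
                fetch=lambda p, d: log.append(("fetch", p, d)),
                remap=lambda p, d: log.append(("remap", p, d)),
                make_stats=lambda days: log.append(("stats", len(days))),
                make_html=lambda days: log.append(("html", len(days))))
```

The look-back window is what lets the system recover from late-arriving products: a file missed today is fetched on a later pass, as long as it appears within the 31 days.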

  34. Processing checks (flowchart). Set d0 = today; foreach day (d0 … d0-31), foreach datasource (s0-sn), foreach product (p1-pn): if the product for that day does not exist, add it to the .netrc file and ftp the datasource (4k buffer). Then, foreach product & day: remap to PSG using LUTs, standardise format and filename. Then, foreach product & day: generate statistics and plots. Finally, foreach product & day: generate HTML files.

  35. Processing checks, annotated (same loop structure as slide 34). Setting up the list of past dates/days: usually okay, sometimes needs tweaking. Preparing products into a common format: usually okay… Checking for a product's results: okay if no results, but not if bad data. Generating outputs: okay if there is rain… Generating raw HTML: occasional issues with the server. FTP runs several times: 4K buffer limit on macros. Automated systems they are NOT!

  36. [image slide]

  37. [image slide]

  38. “Standard” layout: validation data; precipitation product; occurrence comparison; accumulation comparison; contingency tables (PoD/FAR/HSS); scatter-plot; descriptive statistics; cumulative distribution.
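The categorical scores in the standard layout follow from a 2×2 rain/no-rain contingency table. A sketch using the usual definitions (hits, false alarms, misses, correct negatives):

```python
# Categorical scores from a rain/no-rain contingency table:
#   PoD = hits / (hits + misses)               (probability of detection)
#   FAR = false alarms / (hits + false alarms) (false alarm ratio)
#   HSS = skill relative to random chance      (Heidke skill score)

def categorical_scores(hits, false_alarms, misses, correct_negs):
    n = hits + false_alarms + misses + correct_negs
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    # expected number correct by chance, as used by the Heidke skill score
    expected = ((hits + misses) * (hits + false_alarms) +
                (correct_negs + misses) * (correct_negs + false_alarms)) / n
    hss = (hits + correct_negs - expected) / (n - expected)
    return pod, far, hss

# A perfect product gives PoD=1, FAR=0, HSS=1; a no-skill one gives HSS=0.
print(categorical_scores(50, 0, 0, 50))
```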

  39. PMIR results: Europe 2009-01-11

  40. PMIR results: Australia 2008-12-25

  41. Results: Snow problems

  42. Results: rain extent

  43. IPWG Inter-comparison regions Near real-time intercomparison of model & satellite estimates vs radar/gauge IPWG – International Precipitation Working Group (WMO/CGMS)

  44. Monthly and seasonal validation Monthly and seasonal diagnostic validation summaries

  45. Validation resolution (month / 5-day / day / 3-hour). At full resolution the correlation of estimated rain is low; averaging over time and space improves the picture. Fine-scale data is generated so users get to decide on the averaging strategy. [Figure: VAR vs. HQ (mm/hr), Feb. 2002, 30°N-S; Huffman, 2/10]

  46. Resolution vs Statistical Performance Performance can be improved just by smoothing the data!
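The point that smoothing alone improves the statistics can be demonstrated on synthetic data. An illustrative sketch (invented signal and noise, not the workshop's actual fields): two noisy estimates of the same smooth rain signal become better correlated after block-averaging, even though no new information was added.

```python
import math
import random

# Averaging in blocks suppresses the independent noise in each estimate
# but keeps the shared (smooth) signal, so the correlation between the
# two estimates rises with coarser resolution.

def correlation(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def block_average(series, block):
    return [sum(series[i:i + block]) / block
            for i in range(0, len(series), block)]

random.seed(1)
signal = [2.0 + 2.0 * math.sin(i / 100.0) for i in range(4000)]
est_a = [s + random.gauss(0.0, 2.0) for s in signal]   # noisy "radar"
est_b = [s + random.gauss(0.0, 2.0) for s in signal]   # noisy "satellite"

r_fine = correlation(est_a, est_b)
r_coarse = correlation(block_average(est_a, 20), block_average(est_b, 20))
print(r_fine, r_coarse)   # the block-averaged correlation is higher
```

This is why comparing products at coarse resolution can flatter them: part of the apparent skill comes from the averaging, not the retrieval.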

  47. Validation through hydrology (Anagnostou & Hossain): applications are resolution-critical. Test basins: Bacchiglione (1200 km²) and Posina (116 km²). Products: PMIR at 4 km/30 min; 3B42RT at 1 deg/3 hr. [Figure: basin maps at 0.5, 1, 2, 4, 8 and 16 km grid resolutions; colour scale high 57.9, low 1.6]

  48. Instantaneous analysis • AMSR precipitation product (v10) • Instantaneous radar (3×5 scans averaged to 15 min) • 5 km resolution averaged to 50×50 km • Regions of interest: North Sea, Atlantic, France, Germany, UK • January 2005 - September 2009

  49. Mean rainfall (mm/d), 2005-2009: Radar vs AMSR. [Maps with colour scale 0, 0.1, 0.3, 0.5, 1, 2, 4, 8 mm/d]
