
NWS Hydrology Forecast Verification Team: 17th Meeting

Presentation Transcript


1. NWS Hydrology Forecast Verification Team: 17th Meeting, 06/29/2009, 2 pm EDT

2. Outline
   • Verification in CHPS:
     • Demo
     • Next steps
   • Final team report: consensus on:
     • Recommended verification metrics and products
     • RFC verification case studies
     • Impact of QPF horizon
     • Impact of run-time mods made on the fly
   • New Verification Team charter

3. Verification in CHPS
   • 2 classes of verification with different user requirements:
     • Diagnostic verification, to assess and improve model performance; done off-line with archived forecasts or hindcasts to compute verification statistics under different conditions (e.g., time periods, above a threshold); see the sketch after this slide
     • Real-time verification, to help forecasters make decisions in real time; done in real time by:
       • querying and displaying historical analogs using multiple criteria
       • displaying summaries of past diagnostic verification statistics
       • checking for forecast anomalies
       • if necessary, running the bias-correction program and displaying the result
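To make the diagnostic class concrete, here is a minimal sketch of an off-line verification computation over an archive of (lead time, forecast, observation) records. The record layout, field names, and threshold are illustrative assumptions, not the CHPS Verification Service API.

```python
# Minimal sketch of diagnostic verification: RMSE per lead time,
# conditioned on an observation threshold. All data are synthetic.
import math
from collections import defaultdict

archive = [
    # (lead_hours, forecast_cfs, observed_cfs)
    (6, 1200.0, 1100.0), (6, 900.0, 950.0),
    (12, 1500.0, 1250.0), (12, 700.0, 820.0),
]

def rmse_by_lead(records, obs_threshold=0.0):
    """RMSE per lead time, restricted to cases above an observation threshold."""
    sq_errors = defaultdict(list)
    for lead, fcst, obs in records:
        if obs >= obs_threshold:          # conditional verification sample
            sq_errors[lead].append((fcst - obs) ** 2)
    return {lead: math.sqrt(sum(e) / len(e)) for lead, e in sq_errors.items()}

print(rmse_by_lead(archive, obs_threshold=800.0))
```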

4. Verification in CHPS: demo
   • Demonstrate analog capability display
     • Select analogs from a pre-defined set of historical events and display with 'live' forecast in the FEWS Time Series Display
   • Demonstrate summary diagnostic verification displays
     • Display summary diagnostic verification products in the FEWS Map Display: map with 1 metric (e.g., Relative RMSE)

5. Verification in CHPS: analog demo
   • Demonstrate analog capability display w/ FEWS Time Series Display
   • Select analogs from a pre-defined set of historical events and display with 'live' forecast
   [Time series plot: live forecast and observations]

6. Verification in CHPS: analog demo
   • Demonstrate analog capability display w/ FEWS Time Series Display
   • Select analogs from a pre-defined set of historical events and display with 'live' forecast
   [Time series plot: analogs 1, 2, and 3]

7. Verification in CHPS: analog demo
   • Demonstrate analog capability display w/ FEWS Time Series Display
   • Select analogs from a pre-defined set of historical events and display with 'live' forecast
   [Time series plot: analog 3, live forecast, and what happened]

8. Verification in CHPS: analogs
   • Build analog query prototype using multiple criteria
     • Queries from forecast and/or observed attributes (e.g., forecast amount) on different time and/or space coordinates (e.g., same forecast lead time and location). Could include auxiliary variables (e.g., select flow analogs based on precipitation forcing).
     • Need to identify appropriate auxiliary information for selecting and interpreting the analogs (e.g., initial soil moisture conditions).
     • Should we focus only on a restricted range of flows, such as high flows? This raises potential problems with sample size and the uniqueness of major events.

9. Examples of queries for analogs
   • Seeking analogs for precipitation: "Give me past forecasts for the 10 largest events related to hurricanes for this basin."
   • Seeking analogs for flow: "Give me all past forecasts for which the forecast value at lead hour 6 was within an interval [current 6-hr forecast value ± 20%] and the forecast peak occurred within the next 48 hours."

10. More complex queries for analogs
   • Seeking analogs for temperature: "Give me all past forecasts with lead time 12 hours whose ensemble mean was within 5% of the live ensemble mean."
   • Seeking analogs for precipitation: "Give me all past forecasts with lead time 6 hours whose PoP was >= 0.95, whose ensemble mean was >= 0.5 inches, and for which lightning strikes were observed within a 50 km radius of the forecast point at the forecast issue time."
   • Seeking analogs for flow: "Give me all past forecasts with lead times of 12-48 hours whose probability of flooding was >= 0.95, where the basin-averaged soil moisture was > x and the immediately prior observed flow exceeded y at the forecast issue time."
   • A sketch of the slide-9 flow query appears below.
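As an illustration of the query logic involved, here is a hypothetical sketch of the flow-analog query from slide 9: lead-hour-6 value within ±20% of the live 6-hr forecast, with the forecast peak inside the next 48 hours. The dict-of-lead-hours representation and function names are assumptions for illustration only.

```python
# Hypothetical analog query over an archive of past forecasts, where each
# forecast is a dict mapping lead hour -> flow. Names are illustrative.
def find_flow_analogs(archive, current_6hr_value, tolerance=0.20, peak_window_hr=48):
    """Return past forecasts whose lead-hour-6 value is within +/- tolerance
    of the live 6-hr forecast and whose peak occurs within peak_window_hr."""
    lo = current_6hr_value * (1.0 - tolerance)
    hi = current_6hr_value * (1.0 + tolerance)
    analogs = []
    for fcst in archive:
        value_at_6 = fcst.get(6)
        if value_at_6 is None or not (lo <= value_at_6 <= hi):
            continue
        peak_lead = max(fcst, key=fcst.get)    # lead hour of the forecast peak
        if peak_lead <= peak_window_hr:
            analogs.append(fcst)
    return analogs

past = [
    {6: 1000.0, 24: 1400.0, 48: 900.0},   # peak at 24 h -> candidate analog
    {6: 2000.0, 24: 2500.0, 48: 1800.0},  # outside +/- 20% of the live value
]
print(find_flow_analogs(past, current_6hr_value=1050.0))
```

A production prototype would run such criteria as database queries against the forecast archive rather than scanning in memory, as noted in the timeline on slide 14.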

11. Verification in CHPS: map demo
   • Demonstrate summary diagnostic verification displays w/ FEWS Spatial Display
   • Display verification map with monthly results of Relative RMSE
   [Maps for January, April, and October]

12. Verification in CHPS: map demo
   • Demonstrate summary diagnostic verification displays w/ FEWS Spatial Display
   • Display verification map with monthly results of Relative RMSE
   [Maps for Lead Day 1, Lead Day 2, and Lead Day 3]
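The map displays above need one statistic per map point. Below is a minimal sketch of how Relative RMSE (here taken as RMSE divided by the mean observation) might be computed per location, month, and lead day; the records and grouping keys are synthetic assumptions, not the FEWS data model.

```python
# Illustrative computation of the values behind the Relative RMSE maps:
# one number per (location, month, lead day). Data are synthetic.
import math
from collections import defaultdict

records = [
    # (location_id, month, lead_day, forecast, observed)
    ("BASN1", 1, 1, 10.0, 9.0), ("BASN1", 1, 1, 12.0, 11.5),
    ("BASN1", 4, 2, 8.0, 10.0), ("BASN2", 10, 1, 5.0, 4.0),
]

def relative_rmse(records):
    grouped = defaultdict(list)
    for loc, month, lead, fcst, obs in records:
        grouped[(loc, month, lead)].append((fcst, obs))
    out = {}
    for key, pairs in grouped.items():
        mse = sum((f - o) ** 2 for f, o in pairs) / len(pairs)
        mean_obs = sum(o for _, o in pairs) / len(pairs)
        out[key] = math.sqrt(mse) / mean_obs   # one value per map point
    return out

print(relative_rmse(records))
```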

13. Verification in CHPS: next steps
   • Perform user analysis to identify functional requirements and meaningful verification products
     • To be done by the NWS Verification Team, CAT/CAT2 Teams, Graphics Generator Requirements Team, and SCHs
   • New RFC verification case studies to work with proposed standards

14. Verification in CHPS: time line
   • Summer '09 - Winter '10
     • Define user requirements for CHPS VS (HEP w/ RFCs & teams)
     • Develop analog query prototype (w/o database querying) (HEP w/ some RFCs)
   • Winter '10 - '11
     • Prepare functional/technical software design (HSEB w/ HEP)
     • Continue to enhance verification science for CHPS VS (HEP)

15. Final team report: consensus?
   • Recommendations on:
     • sets of verification metrics and products to be used at all RFCs
     • sensitivity analyses on the impact of QPF horizon and of run-time mods made on the fly
   • New future team activities
   • Team work to be presented at the HIC meeting on 07/10
     • approval of the second team charter?

16. Sensitivity analysis: QPF horizon
   • Goal: what is the optimal QPF horizon for hydrologic forecasts? (see the sketch after this slide)
   • QPF horizons to test:
     • 0 (no QPF), 6-hr, 12-hr, 18-hr, 24-hr, 30-hr, 36-hr, 48-hr, 72-hr, 96-hr
   • Model states to use:
     • Similar to operational mods, except mods that impact future states
     • Metadata to store which mods were used in these runs
   • What forecasts to verify:
     • 6-hr stage forecasts over a 7-day window (longer for slow-response basins)
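A hedged sketch of the experiment loop: rerun the same forecast with QPF zeroed beyond each candidate horizon and verify each run. `run_hydrologic_model` and `verify_stage` are placeholders standing in for RFC tooling, not real APIs.

```python
# Sketch of the QPF-horizon sensitivity experiment. The model and
# verification functions are injected placeholders, not actual NWS software.

HORIZONS_HR = [0, 6, 12, 18, 24, 30, 36, 48, 72, 96]

def truncate_qpf(qpf_series, horizon_hr):
    """Zero out QPF beyond the horizon; qpf_series maps lead hour -> precip."""
    return {lead: (amt if lead <= horizon_hr else 0.0)
            for lead, amt in qpf_series.items()}

def qpf_horizon_experiment(qpf_series, states, run_hydrologic_model, verify_stage):
    """Run the model once per horizon from identical initial states and
    return a verification score (e.g., RMSE over 7 days) per horizon."""
    results = {}
    for horizon in HORIZONS_HR:
        forcing = truncate_qpf(qpf_series, horizon)
        stage_fcst = run_hydrologic_model(states, forcing)
        results[horizon] = verify_stage(stage_fcst)
    return results
```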

17. Sensitivity analysis: run-time mods
   • Goal: do run-time mods made on the fly improve forecasts?
   • 4 scenarios:
     • Operational forecasts (w/ all mods)
     • Forecasts w/ best available obs. and fcst. inputs, w/o on-the-fly mods
     • Forecasts w/ best available obs. inputs (no fcst.), w/ all mods
     • Forecasts w/ best available obs. inputs (no fcst.), w/o on-the-fly mods
   • What forecasts to verify:
     • 6-hr stage forecasts over the same window as in operations
   • Model states:
     • Carryover from 5 days ago (w/ past mods) + a priori mods (known before producing any forecast)
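The four scenarios amount to toggling two switches: forecast inputs on/off and on-the-fly mods on/off. An illustrative encoding (the scenario names and keys are assumptions for this sketch):

```python
# The 2x2 scenario matrix behind the run-time mods experiment; a driver
# would run each configuration against the same forecast archive.
SCENARIOS = {
    "operational":         {"fcst_inputs": True,  "on_the_fly_mods": True},
    "best_inputs_no_mods": {"fcst_inputs": True,  "on_the_fly_mods": False},
    "obs_only_all_mods":   {"fcst_inputs": False, "on_the_fly_mods": True},
    "obs_only_no_mods":    {"fcst_inputs": False, "on_the_fly_mods": False},
}

for name, cfg in SCENARIOS.items():
    print(f"{name}: forecast inputs={cfg['fcst_inputs']}, "
          f"on-the-fly mods={cfg['on_the_fly_mods']}")
```

Comparing scores across the two "mods" columns, with inputs held fixed, isolates the effect of on-the-fly mods from the effect of forecast forcings.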

18. Next meeting
   • 18th meeting: early September
   • Update on comments for the team report
   • Discussion on:
     • RFC verification case studies
     • CHPS Verification Service

19. Thank you! Questions?

20. Final team report: Metrics
   • 4 different levels of information

21. Final team report: Analyses
   • Use different temporal aggregations
     • e.g., 6-hr instantaneous flow vs. weekly minimum flow
   • Avoid pooling data across different lead times
     • Forecast quality strongly depends on lead time
     • Plot each verification statistic as a function of lead time (see the sketch below)
   • Perform spatial aggregation carefully
     • Aggregate verification results only across basins with similar hydrologic processes
     • Analyze results for each individual basin and analyze results plotted on verification maps
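A small sketch of the lead-time stratification recommended above: compute the statistic separately for each lead time and report it as a curve, rather than one value pooled across lead times (synthetic data, illustrative names):

```python
# Lead-time stratification: one RMSE value per lead time, never pooled.
import math
from collections import defaultdict

pairs_by_lead = defaultdict(list)   # lead_hours -> [(forecast, observed), ...]
pairs_by_lead[6] += [(10.0, 9.0), (11.0, 12.0)]
pairs_by_lead[12] += [(9.0, 12.0), (14.0, 10.0)]

curve = sorted(
    (lead, math.sqrt(sum((f - o) ** 2 for f, o in p) / len(p)))
    for lead, p in pairs_by_lead.items()
)
for lead, rmse in curve:
    print(f"lead {lead:>3} h: RMSE = {rmse:.2f}")   # one point per lead time
```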

22. Final team report: Analyses
   • Analyze forecast performance:
     • w/ time conditioning: by month, by season
     • w/ atmospheric/hydrologic conditioning:
       • low/high probability thresholds
       • absolute thresholds (e.g., PoP, Flood Stage)
     • Also plot sample size (in the future, confidence intervals)
   • Analyze sources of uncertainty and error:
     • Verify forcing input forecasts as well as output forecasts
     • For extreme events, verify both stage and flow
   • Sensitivity analyses to be set up at all RFCs: 1) impact of QPF horizon; 2) impact of run-time mods made on the fly
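A sketch of the conditional verification described above: stratify by month and by an absolute threshold, and report the sample size next to each score so small samples are visible (synthetic rows; the flood stage value is an assumption):

```python
# Conditional verification: MAE by month and flood-stage regime, with
# sample size reported alongside each score. All data are synthetic.
from collections import defaultdict

rows = [(1, 12.0, 11.0), (1, 22.0, 19.0), (6, 25.0, 27.0), (6, 8.0, 9.0)]
FLOOD_STAGE = 20.0   # assumed threshold for this example

strata = defaultdict(list)
for month, obs, fcst in rows:   # (month, observed_stage, forecast_stage)
    regime = "above flood stage" if obs >= FLOOD_STAGE else "below flood stage"
    strata[(month, regime)].append(abs(fcst - obs))

for (month, regime), errs in sorted(strata.items()):
    mae = sum(errs) / len(errs)
    print(f"month {month:>2}, {regime}: MAE = {mae:.2f} (n = {len(errs)})")
```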
