CONVECTIVE FORECAST PERFORMANCE OF AN OPERATIONAL MESOSCALE MODELLING SYSTEM

Presentation Transcript


1. CONVECTIVE FORECAST PERFORMANCE OF AN OPERATIONAL MESOSCALE MODELLING SYSTEM
Anthony P. Praino, Lloyd A. Treinish
IBM Thomas J. Watson Research Center, Yorktown Heights, NY

2. Forecast Study Details
• Four Geographic Regions Examined: Baltimore-Washington, Chicago, Kansas City, New York
• Seven Convective Events Studied
  • Baltimore-Washington: 1 case
  • Chicago: 2 cases
  • Kansas City: 1 case
  • New York: 3 cases

3. Forecast Model Description
• Deep Thunder: Highly Customized Version of RAMS
• 24-Hour Forecast Period
• Fully Three-Dimensional, Non-Hydrostatic
• 31 Vertical Levels
• Triple-Nested Configuration (see the sketch below)
  • 16 km, 4 km, 1 km – New York
  • 32 km, 8 km, 2 km – Baltimore-Washington, Chicago, Kansas City
• Two-Way Interactive Domains
• Explicit Cloud Microphysics
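
The nesting ratios are implicit in the grid spacings above. As a minimal sketch (Python; the dictionary layout and the ratio check are purely illustrative and not taken from the operational Deep Thunder/RAMS configuration), the snippet below encodes both configurations and confirms that each child grid refines its parent by an integer factor, as two-way interactive nesting requires.

    # Minimal sketch of the two triple-nest grid configurations listed above.
    # Region keys and the ratio check are illustrative only.
    NESTS = {
        "New York":             [16.0, 4.0, 1.0],   # grid spacing in km, outer to inner
        "Baltimore-Washington": [32.0, 8.0, 2.0],
        "Chicago":              [32.0, 8.0, 2.0],
        "Kansas City":          [32.0, 8.0, 2.0],
    }

    for region, spacings in NESTS.items():
        for parent, child in zip(spacings, spacings[1:]):
            ratio = parent / child
            # Two-way interactive nesting requires an integer parent:child spacing ratio.
            assert ratio.is_integer(), f"{region}: non-integer nest ratio {ratio}"
        print(region + ":", " -> ".join(f"{s:g} km" for s in spacings))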

4. Model Domain Configurations
• Study Focused on the Inner Domains

5. Quantitative Study
• Specific Airport or METAR Locations Used (see the pairing sketch below)
  • NYC, EWR, JFK, LGA in New York Domain
  • ORD, MDW in Chicago Domain
  • FDK in Baltimore-Washington Domain
  • MKC, MCI in Kansas City Domain
• Datasets
  • METAR Reports, Daily Climate Summaries
  • Radar Precipitation Estimates
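
The slides do not say how model output was matched to the station reports. One common approach is nearest-grid-point extraction; the sketch below assumes a hypothetical regular lat/lon forecast grid and approximate station coordinates for the New York domain, so that a model value can be paired with each METAR observation.

    import numpy as np

    # Hypothetical forecast grid and approximate station coordinates;
    # values are illustrative only, not the study's actual grid.
    lats = np.linspace(40.2, 41.4, 121)      # 1-D grid latitudes (deg N)
    lons = np.linspace(-74.8, -73.2, 161)    # 1-D grid longitudes (deg E)
    stations = {"JFK": (40.64, -73.78), "LGA": (40.78, -73.87),
                "EWR": (40.69, -74.17), "NYC": (40.78, -73.97)}

    def nearest_index(coords, value):
        """Index of the grid coordinate closest to a station coordinate."""
        return int(np.abs(coords - value).argmin())

    # Stand-in for the model's storm-total precipitation field (ny, nx).
    forecast_precip = np.zeros((lats.size, lons.size))

    for name, (slat, slon) in stations.items():
        j, i = nearest_index(lats, slat), nearest_index(lons, slon)
        print(f"{name}: model value at nearest grid point = {forecast_precip[j, i]:.2f}")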

6. Quantitative Summary
• Precipitation Onset (see the sketch after this slide)
  • Model Was Late in 13 of 16 Cases
  • Mean Error: 1.8 Hours
  • Mean Observation Uncertainty: 39 min
• Precipitation Cessation
  • Model Was Late in 13 of 16 Cases
  • Mean Error: 3 Hours
  • Mean Observation Uncertainty: 38 min
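
As a worked illustration of how the onset and cessation statistics above can be tallied, the sketch below uses made-up onset times (not the study's data) to count late cases and compute a mean absolute timing error.

    # Hypothetical forecast vs. observed precipitation-onset times
    # (decimal hours UTC) for three illustrative cases.
    cases = [
        # (forecast onset, observed onset)
        (20.50, 18.75),
        ( 3.00,  1.50),
        (14.00, 14.25),
    ]

    errors = [fcst - obs for fcst, obs in cases]          # positive = model late
    late_count = sum(e > 0 for e in errors)
    mean_abs_error = sum(abs(e) for e in errors) / len(errors)
    print(f"model late in {late_count} of {len(cases)} cases; "
          f"mean absolute error {mean_abs_error:.1f} h")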

7. Quantitative Summary (continued)
• Precipitation Accumulation (see the tally sketch after this slide)
  • Model Underpredicted in 9 of 16 Cases
  • Model Overpredicted in 7 of 16 Cases
  • Mean Error: 0.6 inches
• Wind Speed Maxima
  • Model Underpredicted in 9 of 16 Cases
  • Model Overpredicted in 7 of 16 Cases
  • Mean Error: 9 mph
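
The accumulation and wind-speed comparisons reduce to the same kind of tally; a small generic sketch with hypothetical values (not the study's verification data) is shown below.

    def verify(pairs):
        """Tally over-/under-predictions and the mean absolute error for a
        list of (forecast, observed) value pairs in consistent units."""
        errors = [f - o for f, o in pairs]
        over = sum(e > 0 for e in errors)
        under = sum(e < 0 for e in errors)
        mae = sum(abs(e) for e in errors) / len(errors)
        return over, under, mae

    # Hypothetical storm-total precipitation (inches) and wind maxima (mph) pairs.
    precip_pairs = [(0.8, 1.2), (1.5, 0.9), (0.4, 1.1), (2.0, 1.6)]
    wind_pairs   = [(28, 35), (42, 33), (18, 25), (31, 30)]
    print("precip: over %d, under %d, MAE %.2f in" % verify(precip_pairs))
    print("wind:   over %d, under %d, MAE %.1f mph" % verify(wind_pairs))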

8. Qualitative Study
• Radar Composite Reflectivity Compared to Model Predictions for Overall Storm Structure, Intensity, and Timing (a threshold-score sketch follows this slide)
• Radar Total Precipitation Compared to Model for Spatial Distribution and Accumulation of Rainfall
• NWS Upton Radar – New York
• NWS Chicago Radar – Chicago
• NWS Kansas City Radar – Kansas City
• NWS Sterling, VA Radar – Baltimore/DC
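
The comparisons in this study were visual, but threshold-based scores are a common way to summarize radar-versus-model reflectivity agreement objectively. The sketch below is only an illustration of that idea, not the method used here: it assumes both fields are already on a common grid, uses random stand-in fields, and applies an arbitrary 40 dBZ threshold to compute a critical success index.

    import numpy as np

    def csi(model_dbz, radar_dbz, threshold=40.0):
        """Critical success index for exceedance of a reflectivity threshold;
        both fields must already be on a common grid."""
        fcst = model_dbz >= threshold
        obs = radar_dbz >= threshold
        hits = np.logical_and(fcst, obs).sum()
        misses = np.logical_and(~fcst, obs).sum()
        false_alarms = np.logical_and(fcst, ~obs).sum()
        denom = hits + misses + false_alarms
        return hits / denom if denom else float("nan")

    # Hypothetical composite-reflectivity fields on a shared grid (dBZ).
    rng = np.random.default_rng(0)
    model_field = rng.uniform(0.0, 60.0, (100, 100))
    radar_field = rng.uniform(0.0, 60.0, (100, 100))
    print(f"CSI at 40 dBZ: {csi(model_field, radar_field):.2f}")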

9. New York Region: Radar Image, 28 Oct 0120 UTC vs. Deep Thunder Model Prediction, 28 Oct 0120 UTC

10. Chicago Region: Radar Image, 02 March 0130 UTC vs. Deep Thunder Model Prediction, 02 March 0130 UTC

11. Kansas City Region: Radar Image, 18 May 1230 UTC vs. Deep Thunder Model Prediction, 18 May 1230 UTC

12. Baltimore/Washington Region: Radar Image, 16 Oct 2038 UTC vs. Deep Thunder Model Prediction, 16 Oct 2030 UTC

13. Qualitative Summary
• Deep Thunder Exhibited Considerable Skill in Modelling the Structure and Spatial Distribution of Convective Events
• Model Predictions Were Available with Significant Lead Time (Mean 6.5 Hours) Before Storms Impacted the Area

14. Summary
• Deep Thunder Demonstrates Good Skill in Modelling Convective Events
• Negative Bias in Precipitation Timing
• Positive Bias in Precipitation Amount
• Model Predictions in Several Cases Had Considerable Lead Time When Compared to Other Forecast Data

15. Future Work
• Use of High-Resolution Eta 218 Data for Model Initial and Boundary Conditions
• Application of Additional Mesonet Data for Point-Specific Model Verification
• Application of Model Ensemble Methodology (a minimal illustration follows this slide)
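
Ensemble methodology is listed only as future work; as a minimal illustration of what it provides at a single point, the sketch below (synthetic member values, not real forecasts) reduces a small ensemble to a mean, a spread, and an exceedance probability.

    import numpy as np

    # Synthetic point forecasts of storm-total precipitation (inches) from
    # five hypothetical ensemble members; purely illustrative.
    members = np.array([0.9, 1.4, 0.6, 1.1, 1.8])

    print(f"ensemble mean  : {members.mean():.2f} in")
    print(f"ensemble spread: {members.std(ddof=1):.2f} in (sample std. dev.)")
    # Exceedance probability for a 1.0 in threshold, estimated from the members.
    print(f"P(total > 1.0 in) = {np.mean(members > 1.0):.2f}")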

  16. Model Predictions & Observed Results
