
Data Integration: Assessing the Value and Significance of New Observations and Products

This presentation discusses the goals and approaches of integrating new data into the NextGen 4-D data cube for SAS products and decision support. It also covers heuristic forecast generation and automated evaluation of data importance using random forests.


Presentation Transcript


  1. Data Integration: Assessing the Value and Significance of New Observations and Products John Williams, NCAR Haig Iskenderian, MIT LL NASA Applied Sciences Weather Program Review Boulder, CO November 19, 2008

  2. Data Integration
  • Goals
    • Integrate NASA-funded research into the NextGen 4-D data cube for SAS products and decision support
    • Evaluate the potential of new data to contribute to NextGen product skill, in the context of other data sources
    • Provide feedback on temporal/spatial scales and operationally significant scenarios where new data may contribute
  • Approaches
    • Perform physically-informed transformations and forecast system integration, e.g., into a fuzzy logic algorithm
    • Use nonlinear statistical analysis to evaluate new data importance in conjunction with other predictor fields
    • Implement, evaluate, and tune the system

  3. Example of Forecast System Integration: SATCAST integration into CoSPA

  4. CoSPA 0-2 hour Forecasts [system diagram: data sources (NEXRAD, TDWR, LLWAS, ASOS surface weather, lightning, Canadian weather radar, satellite, numerical forecast models), the CoSPA Weather Product Generator, the CoSPA Situation Display, decision support tools, and users (air traffic managers, airline dispatch)]

  5. Overview of Heuristic Forecast

  6. Generation of Interest Images
  • Interest images:
    • Are VIL-like (0-255) images that have a high impact upon the evolution and pattern of future VIL
    • Result from combining individual predictor fields using expert meteorological knowledge and image processing for feature extraction

  7. Creating Interest Images: Convective Initiation [forecast engine diagram: predictor fields (lower-tropospheric winds/speed, cumulus number, CI indicators, visible imagery) pass through image processing and feature extraction, with the orientation and elongation of an elliptical kernel prescribed by the winds and locations prescribed by CI scores (favorable/unfavorable for CI); the results are combined with a stability mask and regional CI weights to form the CI interest image]
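A minimal sketch of how the pieces in this diagram might combine into a CI interest image, assuming hypothetical field names and a simple stability threshold (the real engine uses expert-tuned image processing, e.g. the wind-oriented elliptical kernels noted above, which are not shown here):

```python
import numpy as np

def ci_interest_image(ci_scores, stability, regional_weights, stable_threshold=0.0):
    """Turn satellite CI scores into a VIL-like 0-255 CI interest image.

    ci_scores:        2-D array of convective initiation scores in [0, 1]
    stability:        2-D array of a stability index; values above
                      stable_threshold are treated as unfavorable for CI
    regional_weights: 2-D array of regional CI weights in [0, 1]
    """
    favorable = (stability <= stable_threshold).astype(float)  # stability mask: 1 where CI is favorable
    interest = ci_scores * favorable * regional_weights        # weight the scores by mask and region
    return np.clip(interest * 255.0, 0, 255).astype(np.uint8)

# Hypothetical inputs on a 256 x 256 grid
scores = np.random.rand(256, 256)                    # satellite CI scores
lifted_index = np.random.uniform(-6, 6, (256, 256))  # negative values are unstable (favorable for CI)
weights = np.ones((256, 256))                        # uniform regional weights
ci_interest = ci_interest_image(scores, lifted_index, weights)
```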

  8. Feature Extraction: Weather Classification [weather types: Embedded, Stratiform, Large Airmass, Line, Small Airmass]

  9. Overview of Heuristic Forecast

  10. Forecast Engine: Combine Interest Images [diagram: interest images (long-term trend, short-term trend, satellite interest, radar, boundary, VIL, ...) are blended, guided by the weather type image, into the combined forecast image] P(t, pixel, wxtype) = Σ(weight × pixel value) / Σ(weight)
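A minimal sketch of the weighted-average combination written on this slide, with illustrative weights (operationally, the weights are set heuristically per lead time and weather type):

```python
import numpy as np

def combine_interest_images(interest_images, weights):
    """P(t, pixel, wxtype) = sum(weight * pixel value) / sum(weight).

    interest_images: dict of name -> 2-D array of 0-255 interest values
    weights:         dict of name -> weight for the current lead time and weather type
    """
    numerator = sum(weights[name] * img.astype(float) for name, img in interest_images.items())
    denominator = sum(weights[name] for name in interest_images)
    return numerator / denominator

# Illustrative weights for one lead time and weather type
images = {
    "long_term_trend":    np.random.randint(0, 256, (256, 256)),
    "short_term_trend":   np.random.randint(0, 256, (256, 256)),
    "satellite_interest": np.random.randint(0, 256, (256, 256)),
}
forecast = combine_interest_images(images, {"long_term_trend": 0.2,
                                            "short_term_trend": 0.5,
                                            "satellite_interest": 0.3})
```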

  11. Example of VIL Interest Evolution

  12. Summary of Heuristic Approach and Limitations
  • Individual interest images are each 0-255 VIL-like images resulting from a combination of predictor fields and feature extraction
  • The forecast is a weighted average of all interest images, dependent on lead time and WxType, with weights determined heuristically
  • Combines a static set of interest images into 0-2 hour forecasts
  • Storm evolution is embedded in the weights, dependent on WxType
  • Limitations:
    • Integrating a candidate predictor is a manual, time-intensive process
    • The utility of a predictor or interest image to the forecast is known only qualitatively
    • Other predictor fields and interest images that would be helpful may not currently be used
    • Interest image weights and evolution functions may not be optimal
  • An objective method could help address these issues

  13. Automated Data Importance Evaluation: Random Forests

  14. Random Forest (RF)
  • A non-linear statistical analysis technique
  • Produces a collection of decision trees using a “training set” of predictor variables (e.g., observation and model data features) and associated “truth” values (e.g., future storm intensity)
    • Each decision tree’s forecast logic is based on a random subset of the data and predictor variables, making it independent of the others
    • During training, random forests produce estimates of predictor importance
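A minimal sketch of the technique using scikit-learn on synthetic data (the predictor names below are hypothetical stand-ins; the study's actual inputs are listed on the next slide):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic training set: rows are pixels, columns are predictor fields
rng = np.random.default_rng(0)
predictor_names = ["vil_40km_max", "satellite_ci", "ruc_cape", "metar_cloud_cover"]  # hypothetical
X = rng.normal(size=(5000, len(predictor_names)))
# "Truth": future storm intensity exceeds a threshold (synthetic relationship)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000)) > 1.0

# Each tree is grown from a bootstrap sample of the data and considers a random
# subset of predictors at every split, which keeps the trees largely independent.
rf = RandomForestClassifier(n_estimators=200, max_features="sqrt",
                            oob_score=True, random_state=0)
rf.fit(X, y)

# Training also yields an estimate of each predictor's importance
for name, imp in sorted(zip(predictor_names, rf.feature_importances_),
                        key=lambda pair: -pair[1]):
    print(f"{name}: {imp:.3f}")
print("out-of-bag accuracy:", rf.oob_score_)
```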

  15. Example: CoSPA combiner development (focus on 1-hour VIP level prediction)
  • Analyzed data collected in summer 2007
    • Radar, satellite, RUC model, METAR, MIT-LL feature fields, storm climatology, and satellite-based land use fields
  • Transformations
    • Distances to VIP thresholds; channel differences
    • Disc min, max, mean, and coverage over 5, 10, 20, 40 and 80-km radii
  • Used motion vectors to “pull back” the +1 hr VIP truth data to align with the analysis-time data fields
  • For each problem, randomly selected balanced sets of “true” and “false” pixels from the dataset and trained an RF
    • VIP ≥ 3 (operationally significant convection)
    • Initiation at varying distances from existing convection
  • Plotted the rank of each predictor (low rank is good) for various scenarios
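A minimal sketch of the disc-statistic transformations described above, computed with SciPy filters over a circular footprint (the grid spacing and 8-bit VIL threshold below are assumptions; the distance-to-threshold and motion-vector “pull back” steps are not shown):

```python
import numpy as np
from scipy import ndimage

def disc_features(field, radius_km, grid_km=1.0, threshold=None):
    """Min, max, mean, and fractional coverage of `field` within a disc of radius_km."""
    r = int(round(radius_km / grid_km))
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    footprint = (xx ** 2 + yy ** 2) <= r ** 2            # circular neighborhood
    kernel = footprint.astype(float) / footprint.sum()   # for disc means
    feats = {
        "min":  ndimage.minimum_filter(field, footprint=footprint),
        "max":  ndimage.maximum_filter(field, footprint=footprint),
        "mean": ndimage.convolve(field, kernel, mode="nearest"),
    }
    if threshold is not None:                             # fraction of the disc at/above threshold
        feats["coverage"] = ndimage.convolve((field >= threshold).astype(float),
                                             kernel, mode="nearest")
    return feats

vil = np.random.randint(0, 256, (128, 128)).astype(float)   # stand-in for an 8-bit VIL field
feats_40km = disc_features(vil, radius_km=40, grid_km=2.0, threshold=130)  # threshold is illustrative
```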

  16. VIL8bit 06/19/2007 23:30

  17. VIL8bit_40kmMax 06/19/2007 23:00

  18. Example fields: VIL8bit_40kmPctCov 06/19/2007 23:30

  19. VIL8bit_distVIPLevel6+ 06/19/2007 23:30

  20. Importance summary for VIP ≥ 3 (var. WxType) [chart of predictor importance ranks, ordered from more important to less important, including MITLL WxType]

  21. Importance summary for initiation 20 km from existing storm [chart of predictor importance ranks, ordered from more important to less important, including MITLL WxType]

  22. Importance summary for initiation 80 km from existing storm [chart of predictor importance ranks, ordered from more important to less important, including MITLL WxType]
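Rank summaries like those in slides 20-22 could be assembled along these lines, using made-up importance values purely for illustration:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical importances per scenario (e.g. rf.feature_importances_ from each trained forest)
predictors = ["vil_40km_max", "satellite_ci", "ruc_cape", "metar_cloud_cover"]
importances = {
    "VIP >= 3 (var. WxType)": [0.40, 0.25, 0.20, 0.15],
    "init 20 km":             [0.20, 0.45, 0.25, 0.10],
    "init 80 km":             [0.10, 0.50, 0.30, 0.10],
}

fig, ax = plt.subplots()
for scenario, imps in importances.items():
    order = np.argsort(imps)[::-1]              # indices from most to least important
    ranks = np.empty(len(imps), dtype=int)
    ranks[order] = np.arange(1, len(imps) + 1)  # rank 1 = most important
    ax.plot(predictors, ranks, marker="o", label=scenario)
ax.invert_yaxis()                               # put low (good) ranks at the top
ax.set_ylabel("Importance rank")
ax.legend()
plt.show()
```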

  23. RF empirical model provides a probabilistic forecast performance benchmark [charts: RF Empirical Model Performance for VIP ≥ 3; calibration plot of the fraction of instances with VIP ≥ 3 versus random forest votes for VIP ≥ 3, and ROC curve (blue)]
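A minimal sketch of how the calibration and ROC benchmark could be computed from the forest's class votes, again on synthetic data with scikit-learn:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_curve, roc_auc_score
from sklearn.calibration import calibration_curve

# Synthetic stand-in for the pixel dataset described on slide 15
rng = np.random.default_rng(1)
X = rng.normal(size=(6000, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=6000)) > 1.0

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# The fraction of trees voting for the event plays the role of "random forest votes for VIP >= 3"
votes = rf.predict_proba(X_test)[:, 1]

fpr, tpr, _ = roc_curve(y_test, votes)                             # ROC curve
frac_pos, mean_vote = calibration_curve(y_test, votes, n_bins=10)  # calibration (reliability) curve
print("area under ROC curve:", roc_auc_score(y_test, votes))
```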

  24. Summary and Conclusions
  • Developing satellite-based weather products may be only the first step toward their integration into an operational forecast system
  • Integration into an existing forecast system may require physically-informed transformations and heuristics
  • An RF statistical analysis can help evaluate new candidate predictors in the context of others
    • Relative importance
    • Feedback on the scales of contribution
    • Also supplies an empirical model benchmark
  • Successful operational implementation may require additional funding beyond initial R&D
