
JEFS Project Update And its Implications for the UW MURI Effort



Presentation Transcript


  1. JEFS Project Update And its Implications for the UW MURI Effort Cliff Mass Atmospheric Sciences University of Washington

  2. ENSEMBLES AHEAD JEFS

  3. Joint Ensemble Forecast System (JEFS) NCAR

  4. JEFS’ Goal: Prove the value, utility, and operational feasibility of ensemble forecasting to DoD operations.
     Deterministic Forecasting
     • Ignores forecast uncertainty
     • Potentially very misleading
     • Oversells forecast capability
     Ensemble Forecasting
     • Reveals forecast uncertainty
     • Yields probabilistic information
     • Enables optimal decision making

  5. JEFS Team & AFIT

  6. Joint Global Ensemble (JGE)
     • Description: Combination of current GFS and NOGAPS global, medium-range ensemble data. Possible expansion to include ensembles from CMC, UKMET, JMA, etc.
     • Initial Conditions: Breeding of Growing Modes [1]
     • Model Variations/Perturbations: Two unique models, but no model perturbations
     • Model Window: Global
     • Grid Spacing: 1.0 x 1.0 (~80 km)
     • Number of Members: 40 at 00Z, 30 at 12Z
     • Forecast Length/Interval: 10 days/12 hours
     • Timing: Cycle times 00Z and 12Z; products by 07Z and 19Z
     [1] Toth, Zoltan, and Eugenia Kalnay, 1997: Ensemble Forecasting at NCEP and the Breeding Method. Monthly Weather Review, Vol. 125, No. 12, pp. 3297–3319.
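Since the JGE initial conditions rely on the breeding method cited above, the following is a minimal, illustrative sketch of a breeding cycle in Python. The toy model, state size, rescaling norm, and amplitude are assumptions for illustration only, not the operational GFS/NOGAPS configuration.

```python
# Minimal sketch of the breeding-of-growing-modes cycle (Toth and Kalnay 1997).
# The toy "model" below is a stand-in, not GFS or NOGAPS; the state size and
# the RMS rescaling norm are arbitrary choices for illustration.
import numpy as np

rng = np.random.default_rng(0)

def toy_model(state, steps=10):
    """Stand-in nonlinear model: a damped, weakly nonlinear map on a 1-D state."""
    for _ in range(steps):
        state = state + 0.05 * np.sin(3.0 * state) - 0.01 * state
    return state

def breeding_cycle(analysis, perturbation, n_cycles=5, target_amplitude=0.1):
    """Run a control and a perturbed twin, then rescale their difference."""
    for _ in range(n_cycles):
        control = toy_model(analysis)
        perturbed = toy_model(analysis + perturbation)
        bred = perturbed - control                       # fast-growing structure
        norm = np.sqrt(np.mean(bred ** 2))
        perturbation = bred * (target_amplitude / norm)  # rescale to fixed size
        analysis = control                               # simplified next-cycle state
    return perturbation

analysis = rng.normal(size=100)
seed_perturbation = 0.1 * rng.normal(size=100)
bred_mode = breeding_cycle(analysis, seed_perturbation)
print("bred perturbation RMS:", np.sqrt(np.mean(bred_mode ** 2)))
```

In the operational method each cycle restarts from the latest analysis; the simplification here just reuses the control forecast as the next cycle's state.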

  7. Joint Mesoscale Ensemble (JME)
     • Description: Multiple high-resolution, mesoscale model runs generated at FNMOC and AFWA
     • Initial Conditions: Ensemble Transform Filter [2] run on a short-range (6-h), mesoscale data assimilation cycle driven by GFS and NOGAPS ensemble members
     • Model Variations/Perturbations:
       Multi-model: WRF-ARW, COAMPS
       Varied-model: various configurations of physics packages
       Perturbed-model: randomly perturbed sfc boundary conditions (e.g., SST)
     • Model Window: East Asia
     • Grid Spacing: 15 km for baseline JME; 5 km nest later in project
     • Number of Members: 30 (15 run at each DC site)
     • Forecast Length/Interval: 60 hours/3 hours
     • Timing: Cycle times 06Z and 18Z; products by 14Z and 02Z (~7 h production/cycle)
     [2] Wang, Xuguang, and Craig H. Bishop, 2003: A Comparison of Breeding and Ensemble Transform Kalman Filter Ensemble Forecast Schemes. Journal of the Atmospheric Sciences, Vol. 60, No. 9, pp. 1140–1158.
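To make the three perturbation strategies above concrete, here is a hedged sketch of how a 30-member JME configuration list might be assembled. The physics scheme labels, site assignment order, and seed convention are placeholders, not the actual JME settings.

```python
# Rough sketch of assembling a 30-member configuration list that combines the
# multi-model, varied-model, and perturbed-model components described above.
# The scheme names are placeholder combinations, not the operational choices.
cores = ["WRF-ARW", "COAMPS"]
physics_sets = [
    {"cumulus": "KF", "microphysics": "WSM5", "pbl": "YSU"},
    {"cumulus": "BMJ", "microphysics": "Thompson", "pbl": "MYJ"},
    {"cumulus": "Grell", "microphysics": "Lin", "pbl": "ACM2"},
]

members = []
for m in range(30):
    members.append({
        "member_id": m + 1,
        "site": "FNMOC" if m < 15 else "AFWA",          # 15 run at each DC site
        "core": cores[m % len(cores)],                   # multi-model component
        "physics": physics_sets[m % len(physics_sets)],  # varied-model component
        "sst_perturbation_seed": 1000 + m,               # perturbed-model component
    })

for member in members[:3]:
    print(member)
```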

  8. UW MURI Contributions: The UW team is making major contributions to the JEFS mesoscale system, including:
     • Observation-based bias correction on a grid
     • Localized BMA
     • Work on a variety of output products

  9. NCAR Contributions: Ensemble Model Perturbations
     a. Improvement of the multi-model approach (0.5 FTE). The current method to account for model uncertainty in the JME, developed by NCAR in FY06, includes a multi-model component (i.e., each ensemble member represents a unique model configuration or combination of physics schemes) and perturbations to the surface boundary conditions (SST, albedo, roughness length, moisture availability). This method will be further improved by the following additions:
     1) Incorporation of additional physics schemes.
     2) Tuning of the sea surface temperature (SST) perturbation.
     3) Addition of soil condition perturbation (0.25 FTE).
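As one illustration of the surface boundary condition perturbations described above, the sketch below generates a spatially correlated SST perturbation by smoothing white noise. The grid size, correlation scale, and 0.5 K amplitude are assumed values, not the tuned JME parameters.

```python
# Hedged sketch of one way to build a spatially correlated SST perturbation:
# smooth white noise to a chosen length scale, then rescale to a target
# amplitude. Grid size, smoothing scale, and amplitude are illustrative only.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)

ny, nx = 200, 300                 # assumed grid of the SST field
length_scale_gridpoints = 10      # horizontal correlation scale of the noise
target_amplitude_k = 0.5          # desired perturbation standard deviation (K)

noise = rng.normal(size=(ny, nx))
smooth = gaussian_filter(noise, sigma=length_scale_gridpoints)
perturbation = smooth * (target_amplitude_k / smooth.std())

sst_control = 290.0 + np.zeros((ny, nx))   # stand-in SST analysis (K)
sst_member = sst_control + perturbation

print("perturbation std (K):", perturbation.std())
```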

  10. NCAR Contributions: Development of new approaches
     1) Multiple-parameter (single-model) approach. NCAR shall examine the representation of model uncertainty through the use of a single, fixed set of model physics schemes in which various internal parameters and "constants" of each scheme are varied among the ensemble members.
     2) Stochastic-model approach. NCAR shall adapt to WRF a stochastic modeling approach (stochastic physics or stochastic kinetic energy backscatter).
     3) Hybrid approach. As the most straightforward hybrid method, NCAR shall apply the developed stochastic-model approach on top of the multi-model approach.
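The multiple-parameter (single-model) approach in item 1 can be illustrated with a short sketch that draws each member's internal physics "constants" from plausible ranges. The parameter names and ranges below are hypothetical placeholders, not values from any specific scheme.

```python
# Sketch of the multiple-parameter idea: one fixed physics suite, but each
# member gets its own draw of the internal "constants". Names and ranges here
# are illustrative placeholders only.
import numpy as np

rng = np.random.default_rng(7)

parameter_ranges = {
    "entrainment_rate": (0.5e-4, 2.0e-4),          # hypothetical cumulus parameter
    "autoconversion_threshold": (0.5e-3, 1.5e-3),  # hypothetical microphysics parameter
    "surface_roughness_factor": (0.8, 1.2),        # hypothetical surface-layer scaling
}

n_members = 30
member_parameters = []
for m in range(n_members):
    draw = {name: float(rng.uniform(lo, hi))
            for name, (lo, hi) in parameter_ranges.items()}
    member_parameters.append(draw)

print(member_parameters[0])
```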

  11. NCAR: Evaluation of approaches (0.4 FTE). MMM shall evaluate the different approaches for the diversity needed to properly represent model uncertainty, determine the best approach, and assist with its implementation.
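Two standard diagnostics for the evaluation described above are the spread-versus-RMSE comparison and the rank histogram; the sketch below applies both to synthetic data standing in for JME forecasts and verifying observations.

```python
# Sketch of two common checks on whether ensemble diversity matches forecast
# error: spread vs. ensemble-mean RMSE, and a rank histogram. All data are toy.
import numpy as np

rng = np.random.default_rng(1)
n_members, n_cases = 30, 500

truth = rng.normal(size=n_cases)
forecasts = truth + rng.normal(scale=1.0, size=(n_members, n_cases))  # toy ensemble

ens_mean = forecasts.mean(axis=0)
rmse = np.sqrt(np.mean((ens_mean - truth) ** 2))
mean_spread = forecasts.std(axis=0, ddof=1).mean()
print(f"ensemble-mean RMSE: {rmse:.2f}, mean spread: {mean_spread:.2f}")

# Rank histogram: where does the observation fall among the members?
ranks = (forecasts < truth).sum(axis=0)            # 0 .. n_members
counts = np.bincount(ranks, minlength=n_members + 1)
print("rank histogram counts:", counts)
```

A roughly flat rank histogram and spread comparable to the ensemble-mean RMSE would indicate diversity consistent with the actual forecast uncertainty.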

  12. UW Contributions 2007: Ensemble Post-processing Calibration. NCAR/MMM shall continue subcontract work with the University of Washington Atmospheric Sciences Department (UW) on developing algorithms for post-processing calibration of mesoscale ensembles. This development effort is crucial for optimizing the skill of ensemble products and maximizing JME utility. The UW shall:
     a. Expand model bias correction. The observation-based, grid bias correction developed in FY06 for 2-m temperature will be extended to additional variables of interest, including but not limited to 2-m humidity, 10-m winds, and cumulative precipitation (rain and snow).
     b. Develop ensemble spread correction. The prototype Bayesian Model Averaging (BMA) post-processing system developed in FY06 shall be fully developed for the same variables as noted for bias correction.
     c. Evaluate developments. The UW shall evaluate these calibration techniques to determine the gain in ensemble forecast skill.
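A minimal sketch of the two calibration steps named above, under simplified assumptions: an additive, observation-based bias correction estimated from a trailing window of forecast-minus-observation pairs, and a BMA-style Gaussian mixture for the predictive distribution. The training window length, equal weights, and fixed kernel standard deviation are placeholders; the operational BMA estimates weights and variances from training data.

```python
# (a) additive bias correction from recent errors, then (b) a BMA-style
# Gaussian mixture centred on the corrected members. All data are synthetic;
# the 14-day window, equal weights, and sigma = 1.5 are assumed values.
import numpy as np

rng = np.random.default_rng(3)
n_members, n_training = 8, 14

# (a) per-member additive bias from a trailing window of forecast - observation
training_fcst = rng.normal(loc=1.0, size=(n_members, n_training))   # toy forecasts
training_obs = rng.normal(size=n_training)                          # toy observations
bias = (training_fcst - training_obs[None, :]).mean(axis=1)

todays_fcst = rng.normal(loc=1.0, size=n_members)
corrected = todays_fcst - bias

# (b) BMA predictive density: weighted mixture of Gaussians on corrected members
weights = np.full(n_members, 1.0 / n_members)
sigma = 1.5

def bma_pdf(x):
    kernels = np.exp(-0.5 * ((x - corrected) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return float(np.sum(weights * kernels))

print("predictive density at 0.0:", bma_pdf(0.0))
```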

  13. UW JEFS: 3.3 Ensemble Products and Applications. For FY07, NCAR/MMM shall continue subcontract work with UW on developing JME products and applications. The UW, under direction of NCAR, shall develop the following prototypes. These deliverables are initial efforts that do not require delivery of finalized software and documentation.
     a. Extreme forecast index. The UW shall research state-of-the-art methods for calculating an ensemble-based extreme forecast index and develop a prototype capability for the JME. This is essentially the process of comparing the current ensemble forecast with the ensemble model’s “climatology” to determine the likelihood of an extreme event, one that might not even be represented within the ensemble.
     b. General user interface. The UW shall build a web-based, interactive JME interface for the general DoD user, designed to provide basic stochastic weather forecast information. This will be similar in nature to the current Probcast interface (http://www.probcast.com/) but geared to address the specific interests of military operations (e.g., probability of low ceiling and visibility).
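For the extreme forecast index in item (a), the sketch below follows one published formulation (similar to the ECMWF EFI) that integrates the difference between the model-climate probability levels and the ensemble CDF evaluated at the corresponding climate quantiles. The synthetic climatology, synthetic ensemble, and integration grid are illustrative assumptions, not the method the UW ultimately selected.

```python
# Hedged sketch of an ensemble-based extreme forecast index: compare today's
# ensemble with the model's own climatology. Formulation resembles the ECMWF
# EFI; climatology, ensemble, and probability grid are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(5)

model_climate = rng.normal(loc=0.0, scale=1.0, size=5000)    # stand-in model climate
ensemble = rng.normal(loc=1.5, scale=1.0, size=30)           # unusually warm forecast

p = np.linspace(0.01, 0.99, 99)                              # probability levels
climate_quantiles = np.quantile(model_climate, p)
# forecast CDF evaluated at each climate quantile = fraction of members below it
forecast_cdf = (ensemble[:, None] <= climate_quantiles[None, :]).mean(axis=0)

integrand = (p - forecast_cdf) / np.sqrt(p * (1.0 - p))
efi = (2.0 / np.pi) * np.trapz(integrand, p)
print(f"EFI = {efi:+.2f}")
```

Values near +1 (or -1) indicate an ensemble lying far into the upper (or lower) tail of the model's own climate, flagging a potentially extreme event.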

  14. UW Contributions: The UW team will expand in 2007 to include several members of the UW Statistics Department, with potential for further expansion in FY 2008.

  15. Product Strategy: Tailor products to customers’ needs and weather sensitivities.
     • Forecaster Products/Applications: designed to help the transition from deterministic to stochastic thinking
     • Warfighter Products/Applications: designed to aid critical decision making (Operational Risk Management)
     UW will aid in developing some of these products.

  16. Operational Testing & Evaluation
     PACIFIC AIR FORCES
     • Forecasters: 20th Operational Weather Squadron, 17th Operational Weather Squadron, 607 Weather Squadron
     • Warfighters: PACAF, 5th Air Force
     Naval Pacific Meteorological and Oceanographic Center
     • Forecasters: Yokosuka Navy Base
     • Warfighters: 7th Fleet

  17. Forecaster Products/Applications

  18. Consensus & Confidence Plot [shading legend: Maximum Potential Error (mb, +/-), <1 to 6]
     • Consensus (isopleths): shows the “best guess” forecast (ensemble mean or median)
     • Model Confidence (shaded): increased spread among the multiple forecasts indicates less predictability and decreased confidence in the forecast
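A brief sketch of the two fields behind this product: the ensemble mean or median for the consensus isopleths and the member spread for the confidence shading. The random sea-level-pressure-like array is a stand-in for real JME output.

```python
# Consensus (median) and confidence (spread) fields from a toy ensemble of
# sea-level pressure. Grid size and member count are arbitrary choices.
import numpy as np

rng = np.random.default_rng(11)
n_members, ny, nx = 30, 60, 80
mslp = 1012.0 + rng.normal(scale=4.0, size=(n_members, ny, nx))   # toy fields (mb)

consensus = np.median(mslp, axis=0)        # "best guess" isopleths
spread = mslp.std(axis=0, ddof=1)          # shading: larger spread = less confidence

print("domain-mean consensus (mb):", consensus.mean().round(1))
print("max spread (mb):", spread.max().round(1))
```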

  19. Probability Plot [shading: probability, %]
     • Probability of occurrence of any weather phenomenon/threshold (e.g., sfc winds > 25 kt)
     • Clearly shows where uncertainty can be exploited in decision making
     • Can be tailored to critical sensitivities, or interactive (as in IGRADS on JAAWIN)
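The probability field itself is straightforward to compute: at each grid point, the fraction of members exceeding the chosen threshold. The synthetic wind ensemble and the 25 kt threshold below are illustrative.

```python
# Exceedance-probability field for surface wind > 25 kt from a toy ensemble.
import numpy as np

rng = np.random.default_rng(13)
n_members, ny, nx = 30, 60, 80
sfc_wind_kt = np.abs(rng.normal(loc=18.0, scale=8.0, size=(n_members, ny, nx)))

threshold_kt = 25.0
prob_exceed = (sfc_wind_kt > threshold_kt).mean(axis=0) * 100.0   # percent

print("max probability of winds > 25 kt:", prob_exceed.max().round(0), "%")
```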

  20. Multimeteogram (vs. the current deterministic meteogram)
     • Shows the range of possibilities for all meteogram-type variables
     • A box-and-whisker or confidence interval plot is more appropriate for large ensembles
     • Excellent tool for point forecasting (deterministic or stochastic)
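A sketch of how a multimeteogram could be rendered from ensemble output at a single point, using box-and-whisker summaries per forecast hour. The synthetic 2-m temperatures, the growing-spread assumption, and the matplotlib rendering are illustrative choices, not the JEFS product code.

```python
# Box-and-whisker multimeteogram of 2-m temperature at one point, from a toy
# 30-member, 60-hour ensemble with spread that grows with lead time.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(17)
lead_hours = np.arange(0, 63, 3)            # 60-h forecast, 3-h interval
n_members = 30
t2m = 15.0 + rng.normal(size=(len(lead_hours), n_members)) * (1.0 + 0.05 * lead_hours[:, None])

fig, ax = plt.subplots(figsize=(9, 3))
ax.boxplot(t2m.T, positions=lead_hours, widths=2.0)   # one box per forecast hour
ax.set_xlabel("forecast hour")
ax.set_ylabel("2-m temperature (C)")
ax.set_title("Multimeteogram (toy data)")
fig.tight_layout()
fig.savefig("multimeteogram.png")
```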

  21. Sample JME Products
     • Probability of Warning Criteria at Osan AB: When is a warning required? What is the potential risk to the mission?
     • Surface Wind Speed at Misawa AB: extreme max, mean, 90% CI, and extreme min plotted against valid time (Z)
     Requires a paradigm shift into “stochastic thinking”

  22. Warfighter Products/Applications

  23. Bridging the Gap: Integrated Weather Effects Decision Aid (IWEDA)
     • Deterministic forecast + weapon system weather thresholds (AFI 13-217) drive binary decisions/actions: AR route clear & 7, go/no-go, T-storm within 5, IFR/VFR, GPS scintillation, bombs on target, crosswinds in/out of limits, flight hazards
     • Stochastic forecast enables a Probabilistic IWEDA for Operational Risk Management (ORM), e.g., probabilities of drop zone surface winds falling in the 0-9 kt, 10-13 kt, and >13 kt categories
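The probabilistic IWEDA idea above can be illustrated with a short sketch that converts an ensemble of drop zone surface wind forecasts into category probabilities. The wind categories echo the slide, but the synthetic ensemble and any implied risk cut-offs are assumptions, not AFI 13-217 values.

```python
# Convert a toy 30-member drop zone wind forecast into the category
# probabilities a probabilistic IWEDA would display. Thresholds mirror the
# slide's categories; the ensemble itself is synthetic.
import numpy as np

rng = np.random.default_rng(19)
dz_wind_kt = np.abs(rng.normal(loc=9.0, scale=4.0, size=30))   # toy member winds

p_over_13 = float((dz_wind_kt > 13.0).mean())
p_10_to_13 = float(((dz_wind_kt >= 10.0) & (dz_wind_kt <= 13.0)).mean())
p_under_10 = 1.0 - p_over_13 - p_10_to_13

print(f"Drop zone surface winds: P(>13 kt)={p_over_13:.0%}, "
      f"P(10-13 kt)={p_10_to_13:.0%}, P(0-9 kt)={p_under_10:.0%}")
```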

  24. Method #2: Weather Risk Analysis and Portrayal (WRAP)
