
Joint Ensemble Forecast System (JEFS) for DoD Operations

The Joint Ensemble Forecast System (JEFS) uses ensemble forecasting to provide probabilistic forecast information and enable optimal decision-making for DoD operations. This overview covers the motivation, requirements, resources, system design, roadmap, and products/applications of JEFS.



  1. Joint Ensemble Forecast System (JEFS), Sep 2005, NCAR

  2. Overview • Motivation/Goal • Requirements • Resources • System Design • Roadmap • Products/Applications

  3. JEFS’ Goal: Prove the value, utility, and operational feasibility of ensemble forecasting to DoD operations.
  • Deterministic forecasting: ignores forecast uncertainty, is potentially very misleading, and oversells forecast capability.
  • Ensemble forecasting: reveals forecast uncertainty, yields probabilistic information, and enables optimal decision making.

  4. Ensemble Forecast Requirements: Air Force (and Army)
  • AFW Strategic Plan and Vision, FY2008-2032, Issue #3/4-3: Use of multi-scale (kilometer to meter resolution), ensemble, and consensus model forecasts, combined with automation of local techniques, to support planning and execution of military operations. “Ensembles have the potential to help quantify the certainty of a prediction, which is something that users have been interested in for years. The military applications of ensemble forecasting are only at their beginnings; there are years’ worth of research waiting to be done.”
  • Operational Requirements Document, USAF 003-94-I/II/III-D, Centralized Aerospace Weather Capability (CAWC ORD): …will support ensemble forecasting with the following capabilities: 1) the creation of sets of perturbed initial conditions of the fine-scale model initialized fields in selected regional windows; 2) assembly of ensemble forecasts either from model output sets derived from the multiple sets of perturbed initial conditions or from sets assembled from the output from different models; 3) evaluation of forecasting skill of ensemble forecasts compared to single forecast model outputs.
  • Air Force Weather, FY 06-30, Mission Area Plan (AFW MAP), Deficiency: Mesoscale Ensemble Forecasting: “The key to successful ensemble forecasting is many different realizations of the same forecast events. Studies using different models - or the same model with different configurations - consistently yield better overall forecasts. This demonstrates a definite need for multiple model runs.”
  • R&D Portfolio MSA Shortfall D-08-07K: Insufficient ensemble forecasting capability for AFWA’s theater-scale model.

  5. Ensemble Forecast Requirements: Navy
  • No documented requirement or supporting Fleet request for ensemble prediction.
  • Navy ‘requirements’ are written in terms of warfighting capabilities. The current (draft) METOC ICD (old MNS) only specifies parameters required for support. However, ensembles present a solution for the following specified warfighter requirements:
  • Long-range prediction for mission planning, optimum track ship routing, severe weather avoidance
  • Tropical cyclone prediction for safety of operations, personnel safety
  • Winds, turbulence, boundary layer structure for chem/bio/nuclear dispersion (WMD support)
  • Cloud base, fog, aerosol for slant-range visibility (aerial recon, flight operations, targeting)
  • Boundary layer structure/atmospheric refractivity (T, q) for EM propagation (detection, tracking, communications)
  • Surface winds (ASW, mine drift, SAR, flight operations in enclosed/narrow waterways)
  • Surf and sea heights (SOF, small boat ops, logistics)
  • Turbulence, cloud base/tops (OPARS, safety of flight)
  • Whenever the uncertainty of the wx phenomena exceeds operational sensitivity, either a reliable probabilistic or a range-of-variability prediction is required.

  6. JEFS Team & AFIT

  7. FY04 HPCMP Distributed Center (DC) Award
  • Apr 03: FNMOC and AFWA proposed a split distributed center to the DoD High Performance Computing Modernization Program (HPCMP) as a DoD Joint Operational Test Bed for the Weather Research and Forecast (WRF) modeling framework
  • Apr 04: Installation began of $4.2M in IBM HPC hardware, split equally between FNMOC and AFWA (two 96-processor IBM Cluster 1600 p655+ systems)
  • Fosters significant Navy/Air Force collaboration in NWP for 1) testing and optimizing WRF configurations to meet unique Navy and Air Force NWP requirements, 2) developing and testing mesoscale ensembles based on multiple WRF configurations to meet DoD needs, and 3) testing of Grid Computing concepts and tools for NWP
  • Apr 08: Project completion

  8. Joint Global Ensemble (JGE)
  • Description: Combination of current GFS and NOGAPS global, medium-range ensemble data. Possible expansion to include ensembles from CMC, UKMET, JMA, etc.
  • Initial Conditions: Breeding of Growing Modes¹
  • Model Variations/Perturbations: Two unique models, but no model perturbations
  • Model Window: Global
  • Grid Spacing: 1.0° x 1.0° (~80 km)
  • Number of Members: 40 at 00Z, 30 at 12Z
  • Forecast Length/Interval: 10 days / 12 hours
  • Timing: Cycle times 00Z and 12Z; products by 07Z and 19Z
  ¹ Toth, Zoltan, and Eugenia Kalnay, 1997: Ensemble Forecasting at NCEP and the Breeding Method. Monthly Weather Review, Vol. 125, No. 12, pp. 3297–3319.
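As a rough illustration of how a joint global ensemble can be assembled, the sketch below simply pools GFS and NOGAPS members interpolated to a common grid into one set of equally weighted members. The member split, array shapes, and variable names are hypothetical and are not the operational JGE code.

```python
import numpy as np

# Hypothetical 00Z inputs: GFS and NOGAPS ensemble members already interpolated
# to a common 1.0 x 1.0 degree grid (shape: member, lat, lon). Placeholder data.
gfs_members = np.zeros((22, 181, 360))     # assumed GFS contribution at 00Z
nogaps_members = np.zeros((18, 181, 360))  # assumed NOGAPS contribution at 00Z

# A joint ensemble treats every member, regardless of source model,
# as an equally likely realization of the forecast.
jge_members = np.concatenate([gfs_members, nogaps_members], axis=0)
assert jge_members.shape[0] == 40          # 40 members at 00Z, per the slide
```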

  9. Joint Mesoscale Ensemble (JME)
  • Description: Multiple high-resolution, mesoscale model runs generated at FNMOC and AFWA
  • Initial Conditions: Ensemble Transform Filter² run on a short-range (6-h), mesoscale data assimilation cycle driven by GFS and NOGAPS ensemble members
  • Model Variations/Perturbations: Multimodel: WRF-ARW, COAMPS; Varied-model: various configurations of physics packages; Perturbed-model: randomly perturbed surface boundary conditions (e.g., SST)
  • Model Window: East Asia (COPC directive, Apr ’04)
  • Grid Spacing: 15 km for baseline JME (summer ’06); 5 km nest later in project
  • Number of Members: 30 (15 run at each DC site)
  • Forecast Length/Interval: 60 hours / 3 hours
  • Timing: Cycle times 06Z and 18Z; products by 14Z and 02Z (~7 h production per cycle)
  ² Wang, Xuguang, and Craig H. Bishop, 2003: A Comparison of Breeding and Ensemble Transform Kalman Filter Ensemble Forecast Schemes. Journal of the Atmospheric Sciences, Vol. 60, No. 9, pp. 1140–1158.
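The member mix above (multimodel, varied physics, perturbed surface boundary conditions) can be pictured as a simple member manifest. The sketch below is illustrative only; the physics package names and SST perturbation seeds are invented to show how 2 models x 5 physics variants x 3 perturbation seeds could yield 30 members split across the two DC sites.

```python
from itertools import product

models = ["WRF-ARW", "COAMPS"]                               # multimodel
physics = ["phys_1", "phys_2", "phys_3", "phys_4", "phys_5"]  # varied-model (hypothetical names)
sst_seeds = [0, 1, 2]                                        # perturbed-model (hypothetical seeds)

members = [
    {"id": i, "model": m, "physics": p, "sst_seed": s,
     "site": "FNMOC" if i < 15 else "AFWA"}                  # 15 members run at each DC site
    for i, (m, p, s) in enumerate(product(models, physics, sst_seeds))
]
assert len(members) == 30
```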

  10. JEFS System Design (FNMOC and AFWA)
  • NCEP Medium-Range Ensemble: 44 staggered GFS runs, T126, 15 d; analysis perturbations: bred modes; model perturbations: in design (lateral boundary conditions, multiple first guesses)
  • FNMOC Medium-Range Ensemble: 18 members at 00Z, 8 at 12Z; NOGAPS, T119, 10 d; analysis perturbations: bred modes; model perturbations: none
  • Joint Global Ensemble (JGE) products: apply postprocessing calibration; long-range products tailored to support warfighter planning; storage of principal fields
  • Mesoscale initialization: observations and analyses feed data assimilation (3DVAR / NAVDAS) as a “warm start”; the Ensemble Transform generates initial condition perturbations
  • Joint Mesoscale Ensemble (JME): 30 members, 15/5 km, 60 h, 2/day; one “demonstration” theater; multi-model (WRF, COAMPS); perturbed model: varied physics and surface boundary conditions; storage of principal fields
  • JME products: apply postprocessing calibration; short-range products tailored to support warfighter operations

  11. JEFS Production Schedule (06Z and 18Z production cycles, spanning 00-24Z)
  • Inputs: 00Z, 06Z, 12Z, and 18Z cycle data; GFS ensemble grids to AFWA and FNMOC; NOGAPS ensemble grids to AFWA; global analysis
  • JGE tasks: interpolate and calibrate JGE; make/distribute JGE products; update JGE calibration
  • JME tasks: data assimilation; run 6-h forecasts and do the Ensemble Transform (ET); run JME models; exchange output; make/distribute JME products; update JME calibration

  12. Notional Roadmap for JEFS and Beyond (FY04-FY11)
  • Funding and support: AFWA/FNMOC awarded HPCMPO DC hardware (Nov 03); AFWA awarded PET Climate Weather Ocean (CWO) on-site; NRL awarded mesoscale ensemble research (probabilistic prediction of high-impact weather); DTRA-AFWA ensemble investment; ARL SBIR Phase I & II with AFWA UFR; NCAR & UW contract, funded by AFWA
  • Milestones (Phase I and Phase II): JEFS design; JGE RDT&E; JGE IOC; JME RDT&E; 1st mesoscale EPS hardware procurement*; mesoscale EPS IOC; 2nd and 3rd mesoscale EPS hardware procurements*; mesoscale EPS FOC
  * Funded via PEC 35111F Weather Forecasting (3080M)

  13. Product Strategy: Tailor products to customers’ needs and weather sensitivities. Forecaster products/applications: designed to help transition from deterministic to stochastic thinking. Warfighter products/applications: designed to aid critical decision making (Operational Risk Management).

  14. Operational Testing & Evaluation
  • Pacific Air Forces: Forecasters (20th Operational Weather Squadron, 17th Operational Weather Squadron, 607th Weather Squadron); Warfighters (PACAF, 5th Air Force)
  • Naval Pacific Meteorological and Oceanographic Center (Yokosuka Navy Base): Forecasters; Warfighters (7th Fleet)

  15. Forecaster Products/Applications

  16. Consensus & Confidence Plot
  • Consensus (isopleths): shows the “best guess” forecast (ensemble mean or median)
  • Model confidence (shaded): maximum potential error (mb, +/-), from <1 to 6
  • Increased spread in the multiple forecasts means less predictability and decreased confidence in the forecast
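A minimal sketch of the computation behind this plot, assuming the ensemble fields are available as a NumPy array: the consensus is the ensemble mean and the shaded confidence field is the member spread. The array names and shapes are hypothetical.

```python
import numpy as np

def consensus_and_confidence(members):
    """members: hypothetical array of shape (n_members, ny, nx), e.g. MSLP in mb.
    Returns the consensus field (contoured) and the spread (shaded):
    larger spread means less predictability and lower confidence."""
    consensus = members.mean(axis=0)
    spread = members.std(axis=0, ddof=1)
    return consensus, spread
```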

  17. Probability Plot
  • Probability of occurrence of any weather phenomenon/threshold (e.g., sfc winds > 25 kt)
  • Clearly shows where uncertainty can be exploited in decision making
  • Can be tailored to critical sensitivities, or interactive (as in IGRADS on JAAWIN)
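The probability field itself can be estimated as the fraction of ensemble members exceeding the threshold at each grid point. A minimal sketch, with hypothetical array names:

```python
import numpy as np

def exceedance_probability(members, threshold):
    """members: hypothetical array (n_members, ny, nx), e.g. surface wind speed in kt.
    Returns the percentage of members exceeding `threshold` at each grid point,
    e.g. P(sfc wind > 25 kt)."""
    return 100.0 * (members > threshold).mean(axis=0)
```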

  18. Multimeteogram (compare: current deterministic meteogram)
  • Shows the range of possibilities for all meteogram-type variables
  • Box-and-whisker or confidence-interval plots are more appropriate for large ensembles
  • Excellent tool for point forecasting (deterministic or stochastic)
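For a box-and-whisker style multimeteogram, each forecast hour is summarized by order statistics across the members. A minimal sketch under assumed inputs (the array shape, station variable, and percentile set are illustrative):

```python
import numpy as np

def multimeteogram_stats(members, percentiles=(10, 25, 50, 75, 90)):
    """members: hypothetical array (n_members, n_lead_times) for one station
    variable, e.g. 2-m temperature by forecast hour. Returns the extreme
    min/max and the requested percentiles at every lead time."""
    stats = {"min": members.min(axis=0), "max": members.max(axis=0)}
    for p in percentiles:
        stats[f"p{p:02d}"] = np.percentile(members, p, axis=0)
    return stats
```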

  19. Sample JME Products
  • Probability of Warning Criteria at Osan AB: When is a warning required? What is the potential risk to the mission?
  • Surface Wind Speed at Misawa AB: ensemble plume vs. valid time (Z), showing extreme max, 90% CI, mean, and extreme min
  • Requires a paradigm shift into “stochastic thinking”

  20. Sample JGE Product (Forecaster): Probability of Severe Turbulence @ FL300 (probability contours from 10% to 90%)

  21. Sample JGE Product? (Warfighter): Upper Level Turbulence (flight-level labels 280, 350)

  22. Sample JGE Product (Warfighter): Chance of Upper Level Turbulence, Intensity: Severe. Regions labeled with base/top flight levels (e.g., 250/370, 280/370, 300/330); legend: Negligible Chance, Low, Med, High.

  23. Warfighter Products/Applications

  24. Bridging the Gap: Integrated Weather Effects Decision Aid (IWEDA)
  • Today: a deterministic forecast is compared against weapon system weather thresholds* (e.g., drop zone surface winds of 6 kt against categories 0-9 kt, 10-13 kt, >13 kt) to drive IWEDA’s stoplight display
  • Probabilistic IWEDA, for Operational Risk Management (ORM): a stochastic forecast yields the probability of each threshold category (e.g., drop zone surface winds: 0-9 kt at 70%, 10-13 kt at 20%, >13 kt at 10%)
  • A stochastic forecast also supports binary decisions/actions: Go / No Go, IFR / VFR, in / out of limits (examples: AR route clear & 7, T-storm within 5, GPS scintillation, bombs on target, crosswinds, flight hazards)
  * AFI 13-217
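One way to picture a probabilistic IWEDA rule is to roll the category probabilities up into a stoplight color against a chosen risk tolerance. This is a toy sketch of the idea, not the IWEDA algorithm; the function name, category labels, and tolerance value are assumptions.

```python
def stoplight(prob_amber, prob_red, risk_tolerance=0.30):
    """prob_amber / prob_red: ensemble probabilities that the forecast falls in
    the marginal / unfavorable threshold category (e.g., drop zone surface
    winds 10-13 kt / >13 kt). risk_tolerance: acceptable probability of an
    amber-or-worse outcome, chosen by the warfighter (assumed value)."""
    if prob_red >= risk_tolerance:
        return "RED"
    if prob_red + prob_amber >= risk_tolerance:
        return "AMBER"
    return "GREEN"

# With the slide's example distribution (70% green, 20% amber, 10% red) and a
# 30% tolerance, the amber-or-worse probability is exactly 30%, so
# stoplight(prob_amber=0.20, prob_red=0.10) returns "AMBER".
```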

  25. Method #1: Decision Theory
  • Minimize operating cost (or maximize effectiveness) in the long run by taking action based on an optimal threshold of probability, rather than an event threshold.
  • What is the cost of taking action? What is the loss if the event occurs without protection, or if an opportunity was missed because action was not taken?
  • The optimal probability threshold is the cost/loss ratio.
  • Good for well-defined, commonly occurring events.
  • Example (hypothetical): Event: damage to parked aircraft; Threshold: sfc wind > 50 kt; Cost (of protecting): $150K; Loss (if damaged): $1M; Optimal threshold = $150K / $1M = 15%.
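A minimal sketch of the cost/loss rule described above: take protective action whenever the forecast probability meets or exceeds the cost/loss ratio. The function names are illustrative, not an operational tool.

```python
def optimal_probability_threshold(cost, loss):
    """Cost of protecting divided by loss if the event occurs unprotected."""
    return cost / loss

def should_protect(event_probability, cost, loss):
    """Act when the ensemble-based probability meets or exceeds the threshold."""
    return event_probability >= optimal_probability_threshold(cost, loss)

# Slide's hypothetical example: protecting parked aircraft costs $150K,
# damage would cost $1M, so act whenever P(sfc wind > 50 kt) >= 0.15.
print(optimal_probability_threshold(150_000, 1_000_000))  # 0.15
print(should_protect(0.22, 150_000, 1_000_000))           # True
```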

  26. Method #2: Weather Risk Analysis and Portrayal (WRAP)
  • Army Research Lab’s stochastic decision aid, in development by Next Century Corporation
  • Stoplight color based on 1) the ensemble forecast probability distribution, 2) weapon systems’ operating thresholds, and 3) the warfighter-determined level of acceptable risk
  • The greater the confidence required (i.e., the less acceptable risk), the less certain we can be of the desired outcome
  • (Figure: cumulative probability of drop zone surface winds (kt) against 9 kt and 13 kt thresholds at 90%, 80%, and 70% confidence)
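A rough sketch of the WRAP idea (a hypothetical implementation, not the ARL code): map the acceptable-risk level to an ensemble percentile, then compare that percentile value against the operating threshold. The Low/Med/High-to-percentile mapping follows the next slide.

```python
import numpy as np

def wrap_color(member_values, threshold, acceptable_risk="Med"):
    """member_values: hypothetical 1-D array of ensemble values at a point,
    e.g. drop zone surface winds (kt). Lower acceptable risk demands higher
    confidence, so it uses a higher percentile (Low -> 90th, Med -> 60th,
    High -> 30th)."""
    percentile = {"Low": 90, "Med": 60, "High": 30}[acceptable_risk]
    decision_value = np.percentile(member_values, percentile)
    return "GREEN" if decision_value <= threshold else "RED"

# e.g., wrap_color(dz_wind_members, threshold=13.0, acceptable_risk="Low")
```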

  27. Method #2: Weather Risk Analysis and Portrayal (WRAP)
  • Decision input: acceptable risk level, Low (90th percentile), Med (60th percentile), or High (30th percentile), evaluated against drop zone surface wind thresholds of 9 kt, 13 kt, and 18 kt(?)
  • Drop Zone #1: 99% / 1% / 0%
  • Drop Zone #2: 1% / 31% / 68%
  • Drop Zone #3: 37% / 52% / 11%
  (Surface winds, kt)

  28. Method #2: Weather Risk Analysis and Portrayal (WRAP)

  29. ENSEMBLES AHEAD JEFS

  30. Backup Slides

  31. The Atmosphere is a Chaotic, Dynamic System
  • Describable state: the system is specified by a set of variables that evolve in “phase space”
  • Deterministic: the system appears random, but the process is governed by rules
  • Sensitive to initial conditions: nearby solutions diverge (analogy: two adjacent drops in a waterfall end up very far apart)
  • Solution attractor: limited region in phase space where solutions occur
  • Aperiodic: solutions never repeat exactly, but may appear similar
  Predictability is primarily limited by errors in the analysis. To account for this effect, we can make an ensemble of predictions (each forecast being a likely outcome) to encompass the truth.

  32. Encompassing Forecast Uncertainty (phase space)
  • A point in phase space completely describes an instantaneous state of the atmosphere (pressure, temperature, etc. at all points at one time).
  • The true state of the atmosphere exists as a single point in phase space that we never know exactly.
  • An analysis produced to run a model is somewhere in a cloud of likely states; any point in the cloud is equally likely to be the truth.
  • Nonlinearities drive the forecast trajectory and the true trajectory apart (i.e., chaos theory), as illustrated by the growing gap between the 12-, 24-, 36-, and 48-h forecasts and their verifications.

  33. Encompassing Forecast Uncertainty (phase space, continued)
  • An ensemble of likely analyses (the analysis region) leads to an ensemble of likely forecasts (the 48-h forecast region).
  • Ensemble forecasting: encompasses truth, reveals uncertainty, and yields probabilistic information.

  34. The Wind Storm That Wasn’t (Thanksgiving Day 2001): Eta-MM5 forecast vs. verification. Mean sea level pressure (mb) and shaded surface wind speed (m s-1).

  35. eta-MM5 Forecast cent-MM5 Forecast ngps-MM5 Forecast ngps-MM5 Forecast cmcg-MM5 Forecast ukmo-MM5 Forecast cmcg-MM5 Forecast ukmo-MM5 Forecast tcwb-MM5 Forecast avn-MM5 Forecast avn-MM5 Forecast tcwb-MM5 Forecast The Wind Storm That Wasn’t (Thanksgiving Day 2001) eta-MM5 Forecast Verification

  36. Deterministic vs. Ensemble Forecasting
  • Deterministic forecasting: single solution; variable and unknown risk; attempts to minimize uncertainty; utility reliant on 1) accuracy of the analysis, 2) accuracy of the model, 3) flow of the day, 4) forecaster experience, and 5) random chance; cost/return: moderate/moderate.
  • Ensemble forecasting: multiple solutions; variable and known risk; attempts to define uncertainty; utility reliant on 1) accounting of analysis error, 2) accounting of model error, 3) flow of the day, 4) machine-to-machine, and 5) random sampling (number of model runs); cost/return: high/high+.

  37. The Deterministic Pitfall (notion vs. reality)
  • Notion: The deterministic atmosphere should be modeled deterministically. Reality: The need for stochastic forecasting is a result of the sensitivity to initial conditions.
  • Notion: A high resolution forecast is better. Reality: A better looking simulation is not necessarily a better forecast (precision ≠ accuracy).
  • Notion: A single solution is easier for interpretation and forecasting. Reality: It is a misleading and incomplete view of the future state of the atmosphere.
  • Notion: The customer needs a single forecast to make a decision. Reality: It is poor support to the customer since, in many cases, a reliable Y/N forecast is not possible.
  • Notion: A single solution is more affordable to process. Reality: A good argument in the past, but not anymore; how can you afford not to do ensembles?
  • Notion: NWP was designed deterministically. Reality: Yes and no; NWP founders designed models for deterministic use, but knew the limitation.
  • Notion: There are many spectacular success stories of deterministic forecasting. Reality: The result of forecast situations with low uncertainty, or dumb luck of random sampling.

  38. Method #1: Decision Theory
  • Minimize operating cost (or maximize effectiveness) in the long run by taking action based on an optimal threshold of probability, rather than an event threshold.
  • What is the cost of taking action? What is the loss if the event occurs without protection, or if an opportunity was missed because action was not taken?
  • Good for well-defined, commonly occurring events.
  • Example (hypothetical): Event: satellite drag alters LEO orbits; Threshold: Ap > 100; Cost (of preparing): $4.5K; Loss (of reacting): $10K; Optimal threshold = $4.5K / $10K = 45%.
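The same cost/loss arithmetic as in Method #1, shown with this slide's hypothetical space-weather numbers:

```python
cost, loss = 4_500, 10_000   # cost of preparing vs. loss of reacting (hypothetical)
optimal_threshold = cost / loss
print(optimal_threshold)     # 0.45 -> prepare whenever P(Ap > 100) >= 45%
```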

  39. EF Vision 2020
  • AFWA: microscale ensembles (runs/cycle O(10), resolution O(100 m), length 24 hours) and a global mesoscale ensemble (runs/cycle O(10), resolution O(10 km), length 10 days)
  • FNMOC: a microscale ensemble (runs/cycle O(10), resolution O(100 m), length 2 days) and a global mesoscale ensemble (runs/cycle O(10), resolution O(10 km), length 15 days)
  • Coalition weather centers (MSC, JMA, ABM, etc.): their own microscale ensembles (runs/cycle O(10), resolution O(100 m), length 24 hours) and global mesoscale ensembles (runs/cycle O(10), resolution O(10 km), length 10 days)
  • United global mesoscale ensemble: runs/cycle O(100), resolution O(10 km), length 10 days
