Global Systems Division contributions to Warn-on-Forecast


Steve Koch

Director, ESRL Global Systems Division

February 18, 2010

Topics and Tasks
  • Best approaches to radar data assimilation
  • Storm-scale ensemble predictability studies
  • MADIS Metadata and QC improvements
  • Use of WoF information in NWS operations
Current Operational Model: 13-km Rapid Update Cycle (RUC)
  • Provides hourly mesoscale analyses using available observations and short-range model forecasts at Δx=13 km
  • Focus is on aviation and surface weather:
    • Thunderstorms, severe weather
    • Icing, ceiling and visibility, turbulence
    • Detailed surface temperature, dewpoint, winds
    • Upper-level winds
  • Users:
    • Aviation/transportation
    • Severe weather forecasting
    • General public forecasting





Function of DFI (Digital Filter Initialization): remove high-frequency oscillations (particularly gravity waves) from the initial state for the forecast
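The filtering idea can be illustrated with a generic windowed low-pass filter applied over a short span of model states centered on the initial time. This is a textbook-style sketch (Lanczos-windowed sinc weights), not the RUC/HRRR implementation; the window length and cutoff period are illustrative choices:

```python
import numpy as np

def dfi_weights(n_steps, dt, tau_c):
    """Low-pass filter weights spanning model states at t = -n_steps..+n_steps.

    dt    : model time step (s)
    tau_c : cutoff period (s); oscillations with periods shorter than
            tau_c (e.g. gravity waves) are damped
    """
    k = np.arange(-n_steps, n_steps + 1)
    theta_c = 2.0 * np.pi * dt / tau_c                  # cutoff (rad per step)
    # Ideal low-pass response, with the k = 0 limit handled separately
    h = np.where(k == 0, theta_c / np.pi,
                 np.sin(theta_c * k) / (np.pi * np.where(k == 0, 1, k)))
    sigma = np.sinc(k / (n_steps + 1))                  # Lanczos window (reduces ringing)
    w = h * sigma
    return w / w.sum()                                  # normalize: preserve the mean state

def dfi_filter(states, weights):
    """Filtered initial state: weighted sum of model states over the DFI window."""
    return np.tensordot(weights, np.asarray(states), axes=(0, 0))
```

With a 1-hour cutoff and a 1-minute time step, a 10-minute oscillation superimposed on a slowly varying state is almost entirely removed from the filtered initial state, while the slow component is retained.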


The Future: 3-km High-Resolution Rapid Refresh (HRRR)

Why EnKF development at ESRL?
  • Flow-dependent background error covariance information – particularly important for non-uniform flows (supercell)
  • Complementary path to NCEP's 4DVar development, with much less complexity (easy to code, very portable, no need for a tangent linear/adjoint model)
  • Provides automatic ensemble of initial conditions for ensemble forecast applications
  • ESRL PSD & GSD collaborated in 2009 in developing and testing EnKF for the GSD Finite Volume Icosahedral Model (FIM) for use in HFIP hurricane modeling exercise. This success forms the basis for our WoF EnKF work.
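The flow-dependent covariance highlighted above comes from the ensemble itself. A minimal perturbed-observation EnKF analysis step, written as a generic textbook sketch (not the ESRL/FIM implementation; all names are illustrative), looks like:

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """One perturbed-observation EnKF analysis step.

    X : (n, N) ensemble of background states (n variables, N members)
    y : (p,) observations; H : (p, n) linear obs operator; R : (p, p) obs-error cov
    """
    n, N = X.shape
    Xp = X - X.mean(axis=1, keepdims=True)            # ensemble perturbations
    Pb = Xp @ Xp.T / (N - 1)                          # flow-dependent background covariance
    K = Pb @ H.T @ np.linalg.inv(H @ Pb @ H.T + R)    # Kalman gain
    Xa = np.empty_like(X)
    for i in range(N):
        # Update each member against a perturbed copy of the observations
        y_i = y + rng.multivariate_normal(np.zeros(len(y)), R)
        Xa[:, i] = X[:, i] + K @ (y_i - H @ X[:, i])
    return Xa    # analysis ensemble doubles as initial conditions for ensemble forecasts
```

Note how the analysis ensemble is produced automatically, which is the third bullet above: the N updated members are immediately usable as perturbed initial conditions.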
Modeling and Data Assimilation Tasks
  • HRRR initial and boundary conditions to form the “backbone” for experimental convective (~1-km) model ensemble. HRRR fields will be produced at 15-min output frequency in 0–3 h windows.
  • Direct application of the radar DFI technique to the 3-km HRRR, replacing its current use at 13-km resolution, and at multiple radar times
  • Complete initial design for WoF HRRR with Ensemble Kalman Filter data assimilation at storm-scale – research to be topic for new hire
  • Replace current method, whereby DFI is performed outside of and after the GSI 3DVar analysis (which includes clouds and hydrometeors) by one where DFI is fully coupled to the GSI
  • Longer term: work towards merged NAM/Rapid Refresh system – a 6-member ensemble system to be run over the large NAM domain at 13 km with hourly updates to 24h – the NARRE (North American Rapid Refresh Ensemble). Goal is 2013 implementation at NCEP.
Ensemble Test Bed and Evaluation Activities at the DTC supporting WoF
  • DTC Ensemble Testbed (DET) – led by GSD:
    • Will provide an environment in which extensive testing and evaluation of ensemble techniques can be conducted, with the results made relevant to NCEP and AFWA
    • Develop modular infrastructure
      • Optimized ensemble configuration design
      • Provide ability to represent uncertainty in initial conditions and models
      • Statistical post-processing (calibration, debiasing, downscaling)
      • Verification (see below) and product generation
  • Objective evaluation of the experimental forecasts (MET):
    • DTC to continue to provide real-time MET evaluation support
    • Add new ensemble evaluation methods to MET for 4-km, 20-member CAPS Storm-Scale Ensemble Forecast
    • Include interactive analysis and plotting (METview)
    • Perform retrospective forecast evaluations
Attributes diagrams

[Figure: attributes diagrams over the American River Basin for four IOPs (IOP1, 4, 10, 12). Models: 2 WRF-ARW (Thompson and Ferrier), MM5 (Schultz), RAMS. Reliability curves and the Brier skill score improved; internal frequency histograms changed. Error bars: 90% confidence intervals.]
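The verification quantities shown in an attributes diagram (reliability curve, Brier skill score) are standard; a minimal sketch, assuming probability forecasts `p` and binary outcomes `o` (function names are illustrative, not from MET):

```python
import numpy as np

def brier_score(p, o):
    """Mean squared error of probability forecasts p against binary outcomes o."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return float(np.mean((p - o) ** 2))

def brier_skill_score(p, o):
    """Skill relative to always forecasting the climatological base rate of o."""
    clim = np.full(len(np.asarray(o)), float(np.mean(o)))
    return 1.0 - brier_score(p, o) / brier_score(clim, o)

def reliability_curve(p, o, bins=10):
    """Mean forecast probability vs. observed frequency per probability bin -
    the pair of curves plotted against each other in an attributes diagram."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    edges = np.linspace(0.0, 1.0, bins + 1)
    idx = np.clip(np.digitize(p, edges) - 1, 0, bins - 1)
    mean_fcst = np.array([p[idx == b].mean() if (idx == b).any() else np.nan
                          for b in range(bins)])
    obs_freq = np.array([o[idx == b].mean() if (idx == b).any() else np.nan
                         for b in range(bins)])
    return mean_fcst, obs_freq
```

A perfectly reliable forecast lies on the diagonal of the attributes diagram and has a Brier skill score of 1; the climatological forecast scores 0.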

Computational Reality Check

NOAA Must Make HPC a Top Priority Investment!

Innovate or become obsolete …

NOAA’s ability to meet its mission via HPC is falling further behind by any measure. The science will go where there is computing capability to advance it.

GSD is researching Graphics Processing Units (GPUs)


Meteorological Assimilation and Data Ingest System (MADIS)

Surface Data Density Before MADIS

Surface Data Density After MADIS

  • Data Portfolio:
    • 50,007 Surface stations producing over 11,600,000 observations/day
    • 134 Profiler Sites (> 200,000 observations/day)
    • Over 450,000 aircraft observations/day
    • Plus global radiosonde and satellite observations
  • Access type:
    • How: Local Data Monitor (LDM) Pull, FTP (push and pull), XML (pull)
    • Who: NOAA, Federal Agencies, Private Sector, Universities
  • Integration:
    • MADIS generates products in standardized text-based format to facilitate inter-comparisons and access
  • Quality control:
    • What is provided: Automated QC Checks, e.g., Gross Error Checks, Temporal Consistency and Spatial Consistency (e.g. buddy-checks); Station Monitoring Statistics such as Frequency of Failure, Bias and Standard Deviation Statistics
    • What are its limitations/scope: Statistics are based on an analysis field built from elevation-corrected station buddy-checks; therefore, the analysis is only as good as the stations that comprise it (generally good). The model (HRRR) background is not used in MADIS QC!
  • Where does MADIS run?
    • Pre-IOC: Primary @ GSD ; Physical Backup: NA
    • Post-IOC: Primary @ Gateway/NCO, Physical Backup: GSD
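As an illustration of the elevation-corrected spatial buddy check described above, here is a simplified sketch (not the actual MADIS QC code; the search radius and tolerance are hypothetical values):

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def _dist_km(lat1, lon1, lat2, lon2):
    """Great-circle distance (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def buddy_check(lat, lon, elev_m, temp_k, radius_km=100.0, max_dev_k=8.0):
    """Return a boolean array: True where a temperature observation fails.

    Temperatures are reduced to a common (sea) level with a 6.5 K/km
    standard lapse rate before comparison, mimicking an
    elevation-corrected buddy check; each station is compared with the
    mean of its neighbors within radius_km.
    """
    lat, lon = np.asarray(lat, float), np.asarray(lon, float)
    t_sl = np.asarray(temp_k, float) + 0.0065 * np.asarray(elev_m, float)
    n = len(t_sl)
    flagged = np.zeros(n, dtype=bool)
    for i in range(n):
        d = _dist_km(lat[i], lon[i], lat, lon)
        buddies = (d <= radius_km) & (np.arange(n) != i)
        if buddies.any():
            flagged[i] = abs(t_sl[i] - t_sl[buddies].mean()) > max_dev_k
    return flagged
```

In practice the operational checks are more elaborate (temporal consistency, per-station failure statistics), but the core idea is this neighbor comparison.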
MADIS Computing Environment and Quality Control

IT Architecture
Port the existing GSD MADIS software to an integrated NWS TOC and NCO distributed environment, with a supporting backup and research-to-operation test environment at GSD.

  • Initial Operating Capability (IOC): June 2010
  • Final Operating Capability (FOC): June 2011



[Map: current stations + UrbaNet + ASOS/AWOS + APRSWXNET + AWS with 5-minute data by 2011 – 14,574 sites. Blue – current; Red – UrbaNet; Brown – ASOS/AWOS; Black – APRSWXNET and AWS.]


Despite the tremendous number of sites being ingested, QC’d, and distributed through MADIS, the data are still largely distributed like “oases and deserts”. Adaptive multi-scale analysis techniques that utilize the temporal information (GSD STMAS multi-grid 3Dvar) are required.

Challenge: Non-uniform data distribution

MADIS Tasks (2010 only)
  • Problem: currently the RUC GSI 3DVar uses ~8,000 temperature and dew point observations and ~4,000 wind observations from hourly mesonet data. Why not 50,000? Unacceptable bias errors.
  • Solution: With WoF funding, establish a comprehensive metadata database to enable effective utilization and integration of the data in model data assimilation.
  • Provide information about the fit of the observations to the Rapid Refresh and HRRR model background fields (needed for DA).
  • Incorporate these statistics into the NWS National Mesonet metadata database along with station and instrumentation information.
  • HRRR plans to use these metadata in 2011 to form “observation use lists” for use in the HRRR DA.
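The per-station fit statistics proposed above (observation-minus-background bias and standard deviation, feeding an "observation use list") might be computed along these lines. This is a hypothetical sketch; the function names, data layout, and acceptance thresholds are illustrative, not the MADIS or HRRR design:

```python
import numpy as np

def station_fit_stats(obs_by_station):
    """Per-station bias and standard deviation of O - B departures.

    obs_by_station : dict mapping station id -> list of (obs, background) pairs
    Returns dict: station id -> {"n": count, "bias": ..., "std": ...}
    """
    stats = {}
    for sid, pairs in obs_by_station.items():
        omb = np.array([o - b for o, b in pairs], dtype=float)  # O - B departures
        stats[sid] = {"n": int(omb.size),
                      "bias": float(omb.mean()),
                      "std": float(omb.std(ddof=1)) if omb.size > 1 else 0.0}
    return stats

def observation_use_list(stats, max_abs_bias=1.5, max_std=3.0):
    """Stations whose O - B statistics fall inside acceptance thresholds
    (threshold values are hypothetical) - a simple 'observation use list'."""
    return sorted(sid for sid, s in stats.items()
                  if abs(s["bias"]) <= max_abs_bias and s["std"] <= max_std)
```

A station with a small, stable departure from the background would be accepted, while one with a persistent warm or cold bias would be excluded from assimilation.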
Service Proving Ground Motivation
  • WoF/HWT needs to:
    • Understand how best to utilize probabilistic hazard information
    • Understand how the public uses and responds to warnings
    • Educate people about the new warning guidance to be provided
  • GSD approach:
    • Iteratively explore, develop, and evaluate new functionality that shows promise of significantly benefiting operational weather forecast offices, SPC, and other users of weather information – i.e., a Services Proving Ground (SPG)
  • NWS constraints:
    • Must be built on AWIPS II and NextGen architecture for collaboration, data sharing, common software development
    • Thereby extends AWIPS into a “system of systems”
Service Proving Ground Approach
  • The SPG is a holistic, end-to-end approach that will require new ways of doing Research To Operations:
    • Involve stakeholders at concept development
    • During prototyping
    • Engaged in systematic testing & evaluation
    • Through implementation into operations
  • Our SPG motto: “Build a little, test a little, rework it a little”
  • Bring developers, researchers, weather forecasters, emergency responders, media, and social scientists together on a level playing field via stakeholder workshops (e.g., the Next Generation Warnings Workshop conducted at GSD with a wide variety of stakeholders)
Service Proving Ground Tasks - 1
  • The key to success: early documentation of the current warning process (as FSL did in early stages of AWIPS). Collect warning and verification information from selected WFOs. Identify gaps in current methods and applications.
  • Prototype a framework at GSD with components sufficient to perform a Displaced Real-Time (DRT) evaluation, jointly with NWSFO-Norman and SPC:
    • Two groups of forecasters are presented with (a) conventional current datasets and products via N-AWIPS and (b) additional information from experimental ensemble forecasts using enhanced products and displays
    • Each group swaps roles during the DRT exercise.
    • FY 2010: Prepare DRT datasets, procure needed HW/SW components, and stand up the test environment
    • FY 2011: Conduct a formalized evaluation.
Service Proving Ground Tasks - 2
  • AWIPS Integrated Hazards Information System (IHIS) – the future replacement for WarnGen – also to be evaluated. Needs not currently met by WarnGen:
    • Integrated with other warning tools on AWIPS II
    • Users downstream of warning polygon may get no information
    • Forecaster ability to provide storm motion uncertainty and nonlinear storm motion (e.g., sudden right-moving behavior)
    • Lacking TOA/TOD information
    • Adaptable weather information (not just products) tailor-made to action response by user and with adaptable warning thresholds