
Forecast Process: Data Collection, Quality Control, Assimilation, Model Integration & Post-Processing

Learn about the major steps in the forecast process, including data collection, quality control, data assimilation, model integration, and post-processing. Discover how weather observations are collected from various sources, how quality control systems ensure accurate data, and how models and observations are combined to create forecasts. Explore the different data assimilation techniques used, including objective analysis, 3DVAR, 4DVAR, and ensemble-based approaches. Gain insights into the challenges and advancements in forecast modeling.


Presentation Transcript


  1. 452 NWP2018

  2. Major Steps in the Forecast Process • Data Collection • Quality Control • Data Assimilation • Model Integration • Post Processing of Model Forecasts • Human Interpretation (sometimes) • Product and graphics generation

  3. Data Collection • Weather is observed throughout the world and the data is distributed in real time. • Many types of data and networks, including: • Surface observations from many sources • Radiosondes and radar profilers • Fixed and drifting buoys • Ship observations • Aircraft observations • Satellite soundings • Cloud and water vapor track winds • Radar and satellite imagery

  4. Observation and Data Collection

  5. Atmospheric Moisture Vectors

  6. Weather Satellites Are Now 99% of the Data Assets Used for NWP • Geostationary Satellites: Imagery, soundings, cloud and water vapor winds • Polar Orbiter Satellites: Imagery, soundings, many wavelengths • RO (GPS) satellites • Scatterometers • Active radars in space (GPM)

  7. Quality Control • Automated algorithms and manual intervention to detect, correct, and remove errors in observed data. • Examples: • Range check • Buddy check • Comparison to first guess fields from previous model run • Hydrostatic and vertical consistency checks for soundings. • A very important issue for a forecaster--sometimes good data is rejected and vice versa.
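The range and buddy checks above can be sketched in a few lines of NumPy. This is an illustrative toy, not an operational QC package; the temperature values, thresholds, and function names are hypothetical, chosen only to show how a gross error gets flagged.

```python
import numpy as np

def range_check(values, lo, hi):
    """Flag observations outside a physically plausible range."""
    return (values < lo) | (values > hi)

def buddy_check(values, neighbor_means, tol):
    """Flag observations that differ too much from nearby stations."""
    return np.abs(values - neighbor_means) > tol

# Hypothetical surface temperatures (deg C); 58.0 is a gross error.
t = np.array([12.5, 13.1, 58.0, 12.8])
gross = range_check(t, -60.0, 55.0)
buddies = np.array([12.9, 12.8, 12.9, 13.0])  # mean of neighboring stations
suspect = buddy_check(t, buddies, tol=5.0)
print(gross)    # [False False  True False]
print(suspect)  # [False False  True False]
```

Operational systems combine several such tests (plus the first-guess and consistency checks listed above) before deciding whether to reject an observation.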

  8. Eta 48-hr SLP Forecast valid 00 UTC 3 March 1999: Forecast a snowstorm … got a windstorm instead

  9. Pacific Analysis at 4 PM 18 November 2003 (note the bad observation)

  10. Forecaster Involvement • A good forecaster is on the lookout for NWP quality-control systems rejecting good data or accepting bad data, particularly in data-sparse areas. • Quality-control failures can allow models to go off to never-never land. • Less of a problem today because satellite data are available nearly everywhere.

  11. Objective Analysis/Data Assimilation • Observations are scattered in three dimensions • Numerical weather models are generally solved on a three-dimensional grid • Need to interpolate observations to grid points and to ensure that the various fields are consistent and physically plausible (e.g., most of the atmosphere is in hydrostatic and gradient wind balance).

  12. Objective Analysis • Interpolation of observational data to either a grid (most often!) or some basis function (e.g., spectral components) • Typically iterative (done in several passes)

  13. Objective Analysis/Data Assimilation • Often starts with a “first guess”, usually the gridded forecast from an earlier run (frequently a run starting 6 hr earlier) • This first guess is then modified by the observations. • Adjustments are made to ensure proper balance. • Often iterative

  14. An early objective analysis scheme is the Cressman scheme
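The Cressman scheme corrects each grid point by a distance-weighted mean of the observation increments (observation minus first guess) within an influence radius R, with R shrinking on successive passes. A minimal single-pass sketch, with illustrative function names rather than any operational package:

```python
import numpy as np

def cressman_weight(r, R):
    """Cressman weight: (R^2 - r^2) / (R^2 + r^2) for r < R, else 0."""
    return np.where(r < R, (R**2 - r**2) / (R**2 + r**2), 0.0)

def cressman_pass(grid_xy, obs_xy, obs_increment, R):
    """One correction pass: weighted mean of observation increments
    (obs minus first guess) at each grid point within radius R."""
    correction = np.zeros(len(grid_xy))
    for i, point in enumerate(grid_xy):
        r = np.linalg.norm(obs_xy - point, axis=1)
        w = cressman_weight(r, R)
        if w.sum() > 0.0:
            correction[i] = np.sum(w * obs_increment) / w.sum()
    return correction

# One observation increment of +2 at the origin, influence radius 1.5:
grid = np.array([[0.0, 0.0], [3.0, 0.0]])
obs = np.array([[0.0, 0.0]])
print(cressman_pass(grid, obs, np.array([2.0]), R=1.5))  # [2. 0.]
```

Grid points beyond the influence radius are left untouched, which is why the scheme is run iteratively with progressively smaller R to capture smaller scales.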

  15. 3DVAR: 3D Variational Data Assimilation • Used by the National Weather Service today for the GFS and NAM (called GSI) • Tries to create an analysis that minimizes a cost function dependent on the difference between the analysis and (1) first guess and (2) observations • Does this at a single time.
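The 3DVAR cost function can be written down compactly. The toy sketch below assumes a linear observation operator H, for which the minimizer has a closed form; operational systems like GSI instead minimize the cost iteratively in a vastly higher-dimensional space. All variable names here are illustrative.

```python
import numpy as np

def cost(x, xb, Binv, y, H, Rinv):
    """J(x) = 1/2 (x-xb)^T B^-1 (x-xb) + 1/2 (y-Hx)^T R^-1 (y-Hx)."""
    db = x - xb          # departure from the first guess
    do = y - H @ x       # departure from the observations
    return 0.5 * db @ Binv @ db + 0.5 * do @ Rinv @ do

def analysis(xb, B, y, H, R):
    """Closed-form minimizer of J for a linear H (optimal-gain form)."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
    return xb + K @ (y - H @ xb)

# Two-variable state, only the first variable observed, equal error variances:
xb, B = np.zeros(2), np.eye(2)
H, R, y = np.array([[1.0, 0.0]]), np.eye(1), np.array([2.0])
xa = analysis(xb, B, y, H, R)
print(xa)  # [1. 0.] -- splits the difference between first guess and obs
```

With equal background and observation error variances the analysis lands halfway between first guess and observation; the second (unobserved) variable is untouched only because B here is diagonal. Off-diagonal B terms are exactly how 3DVAR spreads an observation's influence in space (next slide).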

  16. 3DVAR Covariances: Spreads Error in Space

  17. 4DVAR: Four-Dimensional Variational Data Assimilation • Tries to optimize analyses at MULTIPLE TIMES • Tries to duplicate the observed evolution over time as well as the situation at initialization time. • Uses the model itself as a data assimilation tool.

  18. 4DVAR Components • Full non-linear model • Tangent linear version of the full model (linearized version of the forecast model) • Adjoint of the tangent linear model, which allows one to integrate the model backwards and gives the sensitivity of the final state to the initial state.

  19. 4DVAR • Typically runs the model back and forth during an initialization period (6-12 hr), roughly ten times. • Substantial computational cost. • Need to have adjoint and TL versions of the model. • Currently used by ECMWF, CMC, UKMET, and the US Navy, but NOT by NCEP.

  20. Many of the next generation data assimilation approaches are ensemble based • Example: the Ensemble Kalman Filter (EnKF)
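A toy stochastic EnKF update is shown below. The key contrast with 3DVAR is that the background error covariance is estimated from the ensemble itself (and is therefore flow-dependent) rather than prescribed. This is an illustrative sketch of the perturbed-observation variant, not the serial or square-root filters used in practice; all names and numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, y, H, R):
    """Stochastic EnKF: the gain uses a covariance estimated from the
    ensemble itself, and each member assimilates a perturbed observation."""
    n_ens = ensemble.shape[1]
    mean = ensemble.mean(axis=1, keepdims=True)
    X = ensemble - mean                        # perturbations about the mean
    B = X @ X.T / (n_ens - 1)                  # flow-dependent covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    y_pert = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return ensemble + K @ (y_pert - H @ ensemble)

# 100-member prior around 0; observe the first state variable as 3.0:
prior = rng.normal(0.0, 1.0, size=(2, 100))
post = enkf_update(prior, np.array([3.0]),
                   np.array([[1.0, 0.0]]), 0.1 * np.eye(1))
print(post[0].mean())  # posterior mean pulled strongly toward the observation
```

Because B comes from the members, any covariance between the observed and unobserved variables present in the ensemble automatically spreads the observation's influence, which is what the mesoscale covariance slides that follow illustrate.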

  21. Mesoscale Covariances (12 Z January 24, 2004; Camano Island radar; |V950|–qr covariance)

  22. Surface Pressure Covariance (land vs. ocean)

  23. An Attractive Option: EnKF (figure: response to a single temperature observation, 3DVAR vs. EnKF)

  24. Hybrid Data Assimilation: Now Used in GFS • Uses both 3DVAR and EnKF • Uses EnKF covariances from the GFS ensemble in 3DVAR.
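At its core, the hybrid approach amounts to blending the static 3DVAR covariance with the flow-dependent ensemble covariance before computing the analysis. A minimal sketch; the blending weight here is a hypothetical tuning choice, not the operational GFS value:

```python
import numpy as np

def hybrid_covariance(B_static, B_ens, beta=0.5):
    """Blend the static (climatological) 3DVAR covariance with the
    flow-dependent ensemble covariance; beta is a tunable weight."""
    return beta * B_static + (1.0 - beta) * B_ens

# Toy 2x2 example: equal weighting of static and ensemble covariances.
B_hybrid = hybrid_covariance(np.eye(2), 2.0 * np.eye(2))
print(B_hybrid[0, 0])  # 1.5
```

The static part guards against sampling noise in a small ensemble, while the ensemble part contributes "errors of the day."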

  25. Next Advance: EnVar • Uses temporal covariances to spread the impact of observations over TIME. • Now operational. • Has some of the properties of 4DVAR (adjusts model evolution).

  26. Grid Point Models • Horizontal (and vertical) variations described on a 3-D grid • Computer resources needed increase by roughly 8 times for a doubling of resolution • Can have computational instability, particularly when the time step is too long for the grid spacing used. • CFL stability criterion: • C·Δt/Δx ≤ 1
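The CFL criterion gives the largest stable time step directly: Δt ≤ Δx / C. The grid spacing and wave speed below are hypothetical round numbers for illustration.

```python
def max_stable_timestep(dx_m, c_ms):
    """CFL criterion C * dt / dx <= 1 implies dt <= dx / C."""
    return dx_m / c_ms

# Hypothetical numbers: 12 km grid spacing, ~300 m/s fastest resolved wave.
print(max_stable_timestep(12_000.0, 300.0))  # 40.0 seconds
# Halving dx to 6 km halves the allowed dt to 20 s; combined with twice the
# points in each horizontal direction, that is the rough factor-of-8 cost
# of doubling resolution noted above.
```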

  27. Galerkin Approach (e.g., spectral) • Represents dependent variables (e.g., u, v, T) as a sum of basis functions. • Fourier analysis is an example: • http://www.falstad.com/fourier/index.html
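The Fourier case can be demonstrated with NumPy's FFT: project a periodic field onto the basis functions (analysis), truncate at some wavenumber, and synthesize back to grid-point space. This is the same analysis/truncation/synthesis cycle spectral models perform with spherical harmonics on the sphere. The field and truncation limit here are arbitrary illustrations:

```python
import numpy as np

n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
field = np.sin(x) + 0.5 * np.cos(3.0 * x)   # wavenumbers 1 and 3 only

coeffs = np.fft.rfft(field)       # project onto the Fourier basis (analysis)
coeffs[8:] = 0.0                  # spectral truncation above wavenumber 7
back = np.fft.irfft(coeffs, n)    # synthesize back to grid-point space

print(np.allclose(back, field))   # True: nothing above wavenumber 7 to lose
```

Any field variance above the truncation wavenumber would be discarded, which is exactly what the resolution (e.g., T-number) of a spectral model controls.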

  28. GFS and ECMWF Use Spherical Harmonics to Represent Variation in the Horizontal (figures: spherical harmonics, Legendre polynomials)

  29. Vertical Coordinate Systems • Originally p and z: but they had a problem…boundary conditions (BC) when the grid hit terrain! • Then sigma-p, sigma-z, and theta • Increasing use of hybrids (e.g., sigma-theta, sigma-p)
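The sigma coordinate itself is just pressure normalized between the surface and the model top, sigma = (p − p_top)/(p_sfc − p_top), so the lowest coordinate surface follows the terrain and the boundary-condition problem disappears. A one-line sketch (values are hypothetical):

```python
def sigma(p, p_sfc, p_top=0.0):
    """Terrain-following sigma coordinate: 1 at the surface,
    0 at the model top, so the lowest level follows the terrain."""
    return (p - p_top) / (p_sfc - p_top)

# The surface is sigma = 1 whether the column sits at sea level (1013 hPa)
# or over a mountain (850 hPa):
print(sigma(1013.0, 1013.0), sigma(850.0, 850.0))  # 1.0 1.0
```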

  30. Sigma

  31. Sigma-Theta

  32. New Sigma-P in WRF

  33. Nesting

  34. Why Nesting? • Could run a model over the whole globe, but that would require large amounts of computational resources, particularly if done at high resolution. • The alternative is to use high resolution only where you need it…nesting is one approach. • In nesting, a smaller, higher-resolution domain is embedded within a larger, lower-resolution domain.

  35. Nesting • Can be one-way or two-way. • In the future, there will be adaptive nests that put more resolution where it is needed. • And instead of rectangular grids, other shapes can be used.

  36. Next Generation Global Models • Will use different geometries

  37. MPAS: Hexagonal Shapes

  38. MPAS

  39. FV3-Replacement for GFS
