
Performance Characteristics of a Pseudo-operational Ensemble Kalman Filter



  1. Performance Characteristics of a Pseudo-operational Ensemble Kalman Filter Greg Hakim & Ryan Torn University of Washington http://www.atmos.washington.edu/~hakim April 2006, EnKF Wildflower Meeting

  2. Outline • Issues for limited-area EnKFs. • Boundary conditions. • Nesting. • [Multiscale prior covariance.] • UW pseudo-operational system. • Performance characteristics. • Analysis of Record (AOR) test. • Experiments using the UW RT data. • Sensitivity & targeting. • Observation impact & thinning.

  3. Boundary Conditions • Obvious choice: global ensemble, but… • Often ensembles too small. • Undesirable ensemble population techniques. • Different resolution, grids, etc. • Flexible alternatives (Torn et al. 2006). • Mean + random draws from N(0,B). • Mean + scaled random draws from climatology. • “error boundary layer” shallow due to obs.
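
A minimal sketch of the "mean + random draws from N(0,B)" boundary-condition idea above (Torn et al. 2006), assuming a precomputed square root of a fixed covariance B (e.g. from climatology or 3D-Var); the function name and array shapes are illustrative, not the UW system's code:

```python
import numpy as np

def perturbed_boundary_members(mean_bc, B_sqrt, n_members, scale=1.0, seed=0):
    """Boundary conditions for each member: mean + scaled draws from N(0, B).

    mean_bc : (n,) deterministic lateral boundary state
    B_sqrt  : (n, k) matrix with B = B_sqrt @ B_sqrt.T (fixed covariance)
    scale   : optional scaling of the perturbation amplitude
    """
    rng = np.random.default_rng(seed)
    # k-dimensional white noise per member, mapped through B^(1/2)
    noise = rng.standard_normal((n_members, B_sqrt.shape[1]))
    return mean_bc[None, :] + scale * (noise @ B_sqrt.T)
```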

  4. Nesting • Grid 1: global ensemble BCs. • E.g. draws from N(0,B) or similar. • Grid 2: ensemble BCs from grid 1. • One-way nesting: straightforward. • Cycle on grid 1, then on grid 2. • Two-way: many choices; little experience. • Note: Hxb differs on grids 1 and 2. • Issues at grid boundaries.
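
A schematic of the one-way nested cycling described above; every function argument is a hypothetical placeholder for the forecast, EnKF update, and boundary-interpolation steps:

```python
def one_way_nested_cycle(ens1, ens2, obs1, obs2,
                         forecast, enkf_update, bcs_from_parent, global_bcs):
    """One assimilation cycle: grid 1 (coarse) first, then grid 2 (nest).

    ens1, ens2 : ensembles on grids 1 and 2
    obs1, obs2 : observations assimilated on each grid (note Hxb differs)
    """
    # Grid 1: lateral BCs from a global ensemble, or draws from N(0, B).
    ens1 = forecast(ens1, bcs=global_bcs)
    ens1 = enkf_update(ens1, obs1)

    # Grid 2: each member takes its lateral BCs from the matching grid-1 member.
    ens2 = forecast(ens2, bcs=bcs_from_parent(ens1))
    ens2 = enkf_update(ens2, obs2)
    return ens1, ens2
```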

  5. The Multiscale Problem • Sampling error: noise in observation estimates & the prior covariance. • Ad hoc remedies: "localization"; confidence intervals. • Multiscale problem: noise on the smallest scales may dominate. • Need for a scale-selective update?
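
One common form of the "localization" remedy mentioned above is to taper sample covariances with distance. A sketch using the Gaspari–Cohn (1999) correlation function, taken from the literature rather than from these slides:

```python
import numpy as np

def gaspari_cohn(dist, c):
    """Gaspari-Cohn localization weights for distances `dist`.

    c : localization half-width; weights reach exactly zero at distance 2c.
    """
    r = np.abs(np.asarray(dist, dtype=float)) / c
    w = np.zeros_like(r)
    near = r <= 1.0
    far = (r > 1.0) & (r < 2.0)
    w[near] = (-0.25 * r[near]**5 + 0.5 * r[near]**4 + 0.625 * r[near]**3
               - 5.0/3.0 * r[near]**2 + 1.0)
    w[far] = (r[far]**5 / 12.0 - 0.5 * r[far]**4 + 0.625 * r[far]**3
              + 5.0/3.0 * r[far]**2 - 5.0 * r[far] + 4.0 - 2.0/(3.0 * r[far]))
    return w

# Example: taper a row of a sample covariance matrix
# cov_localized = gaspari_cohn(distances, c=2000e3) * cov_sample
```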

  6. Surface Temperature Covariance

  7. Mesoscale Example: cov(|V|, qrain)

  8. Real Time Data Assimilation at the University of Washington

  9. Objectives of System • Evaluate EnKF in a region of sparse in-situ observations and complex topography. • Estimate analysis & forecast error. • Sensitivity: targeting & thinning.

  10. Model Specifics • WRF Model, 45 km resolution, 33 vertical levels • 90 ensemble members • 6 hour analysis cycle • ensemble forecasts to t+24 hrs at 00 and 12 UTC • perturbed boundaries using fixed covariance perturbations from WRF 3D-VAR
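
For concreteness, a hedged sketch of the cycling schedule these specifics imply (a 6-hour analysis cycle, with 24-hour ensemble forecasts launched at 00 and 12 UTC); all names are placeholders, not the UW system's code:

```python
CONFIG = {"model": "WRF", "dx_km": 45, "levels": 33, "members": 90,
          "cycle_hours": 6, "forecast_hours": 24}

def run_cycle(ensemble, analysis_hour_utc, assimilate, advance, get_obs):
    """One analysis cycle; also launch a 24-h ensemble forecast at 00/12 UTC."""
    analysis = assimilate(ensemble, get_obs(analysis_hour_utc))
    long_forecast = (advance(analysis, hours=CONFIG["forecast_hours"])
                     if analysis_hour_utc in (0, 12) else None)
    background = advance(analysis, hours=CONFIG["cycle_hours"])  # next cycle's prior
    return background, long_forecast
```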

  11. Observations

  12. Probabilistic Analyses (panels: 500 hPa height; sea-level pressure). Large uncertainty is associated with a shortwave approaching in NW flow.

  13. Microphysical Analyses, 00 UTC 20 February 2005 (panels: model analysis; composite radar).

  14. Ensemble Forecasts (panels: analysis; 24-hour forecast).

  15. Verification

  16. Temperature Verification (panels: 12-hour forecast; 24-hour forecast. Curves: UW EnKF, GFS, CMC, UKMO, NOGAPS, ECMWF).

  17. U-Wind Verification (panels: 12-hour forecast; 24-hour forecast. Curves: UW EnKF, GFS, CMC, UKMO, NOGAPS, ECMWF).

  18. Moisture (Td) Verification (panels: 12-hour forecast; 24-hour forecast. Curves: UW EnKF, GFS, CMC, UKMO, NOGAPS, ECMWF).

  19. No-Assimilation Verification (panels: winds; temperature. Curves: UW EnKF; no observations assimilated).

  20. Moving Toward the Mesoscale

  21. Analysis of Record • Hourly surface analyses. • EnKF covariances. • Available at t+30 minutes. • 15 km resolution.

  22. Sensitivity Analysis • Basic premise: • how do forecasts respond to changes in initial & boundary conditions, & the model? • Applications: • “targeted observations” & network design. • “targeted state estimation” (thinning). • basic dynamics research.

  23. Adjoint Approach Given J, a scalar forecast metric, one can show that ∂J/∂x0 = M^T ∂J/∂xt, where M^T is the adjoint of the resolvent (the tangent-linear model carrying perturbations from the initial time to the forecast time). • Need to run an adjoint model backward in time. • Complex code & lots of approximations. • Does not account for state estimation or errors.
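
A toy numerical illustration of that relation, with a small made-up linear model standing in for the resolvent:

```python
import numpy as np

# Toy linear forecast x_t = M @ x_0; J is a scalar function of the forecast state.
M = np.array([[1.0, 0.5],
              [0.0, 0.8]])            # hypothetical resolvent (tangent-linear model)
dJ_dxt = np.array([1.0, 0.0])         # gradient of the metric J at forecast time
dJ_dx0 = M.T @ dJ_dxt                 # adjoint of the resolvent applied "backward"
print(dJ_dx0)                         # sensitivity of J to each initial-state variable
```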

  24. Ensemble Approach • Adjoint sensitivity weighted by initial-time error covariance. • Can evaluate rapidly without an adjoint model! • Can show: this gives response in J, including state estimation. With Brian Ancell (UW)
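
A minimal sketch of the ensemble-sensitivity calculation described above: regress the scalar metric J onto each analysis variable using the ensemble, which implicitly weights the adjoint sensitivity by the initial-time error covariance. Shapes and names are assumptions, not the UW code:

```python
import numpy as np

def ensemble_sensitivity(X0, J):
    """Sensitivity of forecast metric J to each initial-state variable.

    X0 : (n_members, n_state) analysis ensemble
    J  : (n_members,) forecast metric evaluated for each member

    Returns cov(J, x_i) / var(x_i), i.e. a linear regression of J onto each
    state variable; no adjoint model is required.
    """
    dX = X0 - X0.mean(axis=0)
    dJ = J - J.mean()
    cov_Jx = dJ @ dX / (len(J) - 1)
    var_x = dX.var(axis=0, ddof=1)
    return cov_Jx / var_x
```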

  25. Sensitivity from the UW Real-Time System • Case study removing one observation. • Metric: average MSL pressure over western WA.

  26. Sensitivity Demonstration How would a forecast change if buoy 46036 were removed?

  27.–31. Overview of Case (sequence of figure slides).

  32. 12 UTC 5 Feb Sensitivity (panels: sea-level pressure; 850 hPa temperature).

  33. 12 UTC 5 Feb Analysis Change (panels: analysis change; forecast sensitivity).

  34. Forecast Differences • Assimilating the surface pressure observation at buoy 46036 leads to a stronger cyclone. • Predicted Response: 0.63 hPa • Actual Response: 0.60 hPa
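
The "predicted response" quoted above can be estimated from prior ensemble statistics before the observation is ever assimilated. A hedged sketch of that scalar calculation (the standard ensemble form, not necessarily the exact code behind these numbers):

```python
import numpy as np

def predicted_metric_response(J, hx, y, obs_error_var):
    """Predicted change in forecast metric J from assimilating one observation.

    J   : (n_members,) forecast metric per member
    hx  : (n_members,) observation-space prior (H x_b) per member
    y   : observed value; obs_error_var : observation error variance R

    delta_J = cov(J, Hx) / (var(Hx) + R) * (y - mean(Hx))
    """
    dJ = J - J.mean()
    dhx = hx - hx.mean()
    cov_Jh = dJ @ dhx / (len(J) - 1)
    return cov_Jh / (dhx.var(ddof=1) + obs_error_var) * (y - hx.mean())
```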

  35. Summary of 10 Cases

  36. Observation Impact • Adaptively sample the obs data stream. • Thin by assimilating only high-impact obs.
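
Given per-observation predicted impacts like the one sketched earlier, thinning reduces to ranking; a minimal illustration with assumed inputs:

```python
import numpy as np

def thin_by_impact(predicted_delta_J, keep_fraction=0.1):
    """Indices of the highest-impact observations, largest |delta J| first.

    predicted_delta_J : (n_obs,) predicted change in the forecast metric
                        if each observation were assimilated on its own
    """
    order = np.argsort(-np.abs(np.asarray(predicted_delta_J)))
    n_keep = max(1, int(keep_fraction * len(order)))
    return order[:n_keep]
```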

  37. Observations Ranked by Impact

  38. Ob-Type Contributions to Metric

  39. Metric Prediction Verification

  40. Summary • BCs: flexibility & weak influence. • UW real-time system ~gov. center quality. • Moisture field better than most. • Surface AOR ~10 km. • Sensitivity analysis. • Ensemble targeting easy & flexible. • Adaptive DA (“thinning”).

  41. AOR Opportunities • "No propagate" update: nested high-resolution single member; assimilate using coarse-grid stats; can be done "now." • Deterministic propagation: as above, but evolve the high-res state. • Full filter: evolve & assimilate the entire ensemble. • 4DVAR with EnKF statistics: at least 3–5 years out.
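
One possible reading of the "no propagate" option, sketched with assumed inputs: update a single high-resolution state using a Kalman gain built entirely from coarse-grid ensemble statistics interpolated to the fine grid.

```python
import numpy as np

def no_propagate_update(x_hires, hx_hires, coarse_dX, coarse_dHx, y, R):
    """EnKF-style update of one high-res member with coarse-grid covariances.

    x_hires    : (n,) high-resolution background state (single member)
    hx_hires   : (p,) that state mapped to observation space
    coarse_dX  : (m, n) coarse-ensemble perturbations interpolated to the fine grid
    coarse_dHx : (m, p) coarse-ensemble perturbations in observation space
    y, R       : (p,) observations and (p, p) observation-error covariance
    """
    m = coarse_dX.shape[0]
    PHt = coarse_dX.T @ coarse_dHx / (m - 1)      # cross covariance, (n, p)
    HPHt = coarse_dHx.T @ coarse_dHx / (m - 1)    # obs-space covariance, (p, p)
    K = PHt @ np.linalg.inv(HPHt + R)             # Kalman gain from coarse stats
    return x_hires + K @ (y - hx_hires)
```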

  42. AOR Challenges • True multiscale conditions (<15 km). • Scale-dependent sampling errors? • Bias estimation and removal. • EnKF allows state-dependent bias estimation. • Model error estimation & removal. • Parameter estimation; model calibration. • Satellite radiance assimilation. • Kalman smoothing.

  43. Surface Obs. and Rawinsondes

  44. Observation Densities (panels: aircraft obs; cloud winds).

  45. Ensemble Inliers/Outliers (panels: inlier; outlier).
