Presentation Transcript


1. Applications of optimal control and EnKF to Flow Simulation and Modeling
Florida State University, 23-24 February 2005, Tallahassee, Florida
The Maximum Likelihood Ensemble Filter (MLEF): An ensemble analysis/prediction system based on Control Theory
Milija Zupanski
Cooperative Institute for Research in the Atmosphere, Colorado State University, Fort Collins, CO 80523-1375
ZupanskiM@CIRA.colostate.edu
In collaboration with: D. Zupanski, S. Fletcher, I.M. Navon, B. Uzunoglu, D. Randall, R. Heikes, D. Daescu, A. Hou, S. Zhang

2. Outline
• General problem
• Maximum Likelihood Ensemble Filter
• What's next?

3. General Problem
• Theoretical issues
• Single assimilation/prediction system
  - complete feedback between the assimilation and the prediction
• Universal mathematical form
  - one algorithm applicable to any model, with minimal or no dependence on the modeled physical phenomenon
• Nonlinearity
  - real-life problems are nonlinear; we need to know how to solve nonlinear problems
• Non-differentiability
  - a methodology that works without a differentiability requirement
• Imperfect models
  - we need to account for the errors of the prediction model and the observation operators: nothing is perfect!

4. General Problem
• Practical issues
• High-dimensional systems, ~O(10^6-10^8)
  - real-life applications are high-dimensional
  - the ultimate goal of a probabilistic analysis/prediction system is to solve practical problems
  - re-evaluate the feasibility of theoretical ideas
• Algorithm development and maintenance (upgrades)
  - models and data are constantly being developed and upgraded
  - need a simple and effective system, capable of quick adjustment to user needs
• Numerical stability and robustness
  - the system has to work, even if only limited information is available!
• Computational issues
  - disk storage, I/O, matrix-matrix and matrix-vector operations
  - parallel computing: exploit developments in computer science and technology

5. General problem: Solution options
• Major concern when looking for the solution: nonlinearity of the prediction model and of the observation operator
• (Option 1) Introduce nonlinearities into the closed-form (linear) KF solution
  - Ensemble Kalman Filters (EnKF)
• (Option 2) Directly solve the nonlinear problem using numerical solution methods (i.e., iterative minimization)
  - variational data assimilation, Maximum Likelihood Ensemble Filter
• Issues
  - both approaches have well-developed theory and practice for weakly nonlinear, differentiable problems
  - (1) may oversimplify the general problem and neglect higher-order moments
  - (2) needs good Hessian preconditioning and robust minimization

6. Maximum Likelihood Ensemble Filter (MLEF)
• A control theory application to ensemble data assimilation
• Estimate the conditional mode of the posterior Probability Density Function (PDF)
• Use minimization algorithms (C-G, LBFGS) to minimize the cost function (sketched below)
• Augmented control variable: initial conditions, model error and bias, empirical parameters, boundary conditions
• Ensembles used to estimate the uncertainty of the conditional mode
• Posterior error covariance calculated from the minimization algorithm
• Under linear and Gaussian assumptions, identical to EnKF square-root filters (e.g., the Ensemble Transform Kalman Filter, ETKF; Bishop et al. 2001)
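The cost function on this slide appeared only as an image and was lost in transcription. The following is a standard statement of the function the MLEF minimizes, following Zupanski (2005); the notation (x_b background state, y observations, h observation operator, P_f forecast and R observation error covariances) is assumed, not taken from the slide:

    % Cost function minimized by the MLEF (sketch, notation assumed):
    J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\mathbf{P}_f^{-1}(\mathbf{x}-\mathbf{x}_b)
                  + \tfrac{1}{2}\bigl[\mathbf{y}-h(\mathbf{x})\bigr]^{\mathrm{T}}\mathbf{R}^{-1}\bigl[\mathbf{y}-h(\mathbf{x})\bigr]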

7. Maximum Likelihood Ensemble Filter (MLEF)
[Equations: forecast error covariance; cost function minimization; analysis error covariance. Reconstructed below.]
• If h is linear: perfect Hessian preconditioning
• If h is nonlinear: need x_a ≈ true x_min to have a reliable estimate of P_a
• No sample error covariances: ensemble perturbations are used to define the analysis increment subspace, not as random samples
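The three formulas named on this slide were images and are missing from the transcript. The reconstruction below follows the published MLEF formulation (Zupanski 2005); an N-member ensemble, prediction model M, and analysis perturbations p_i^a are assumed:

    % Forecast error covariance square root; columns are ensemble forecast perturbations:
    \mathbf{P}_f^{1/2} = \bigl[\,\mathbf{p}_1^f \,\cdots\, \mathbf{p}_N^f\,\bigr], \qquad
    \mathbf{p}_i^f = M(\mathbf{x}_a + \mathbf{p}_i^a) - M(\mathbf{x}_a)

    % Observation-space perturbations (no derivative of h required):
    \mathbf{z}_i = \mathbf{R}^{-1/2}\bigl[h(\mathbf{x} + \mathbf{p}_i^f) - h(\mathbf{x})\bigr], \qquad
    \mathbf{Z} = \bigl[\,\mathbf{z}_1 \,\cdots\, \mathbf{z}_N\,\bigr]

    % Analysis error covariance square root, from the inverse Hessian at the minimum:
    \mathbf{P}_a^{1/2} = \mathbf{P}_f^{1/2}\bigl(\mathbf{I} + \mathbf{Z}^{\mathrm{T}}\mathbf{Z}\bigr)^{-1/2}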

8. MLEF with the Korteweg-de Vries-Burgers (KdVB) model
[Figure: analysis error covariance (i, j) at cycles 1, 4, 7, and 10]
• The initial error covariance is noisy, but quickly becomes spatially localized
• No need to force error covariance localization
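For readers unfamiliar with the test model, here is a minimal Python sketch of the Korteweg-de Vries-Burgers equation, u_t + u u_x = nu u_xx - mu u_xxx, integrated with centered differences on a periodic grid. The grid size, coefficients, time step, and initial condition are illustrative assumptions, not the configuration used in these experiments.

    import numpy as np

    def kdvb_step(u, dx, dt, nu=0.07, mu=0.001):
        """One explicit Euler step of u_t + u*u_x = nu*u_xx - mu*u_xxx (periodic)."""
        ux   = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)           # u_x
        uxx  = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2      # u_xx
        uxxx = (np.roll(u, -2) - 2 * np.roll(u, -1)
                + 2 * np.roll(u, 1) - np.roll(u, 2)) / (2 * dx**3)   # u_xxx
        return u + dt * (-u * ux + nu * uxx - mu * uxxx)

    # Illustrative run: 101 grid points, soliton-like initial condition.
    n, dx, dt = 101, 0.5, 0.001
    x = np.arange(n) * dx
    u = 0.5 / np.cosh(0.25 * (x - 0.5 * n * dx))**2
    for _ in range(1000):
        u = kdvb_step(u, dx, dt)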

9. Model error in MLEF
• State augmentation approach
  x_0 : initial conditions; b : model bias; g : empirical parameters
• Augmented control variable and augmented error covariance: see the sketch below
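The augmentation formulas on this slide were not captured. A standard state-augmentation sketch consistent with the slide's notation (x_0 initial conditions, b model bias, g empirical parameters) is:

    % Augmented control variable:
    \mathbf{z} = \begin{pmatrix} \mathbf{x}_0 \\ \mathbf{b} \\ \mathbf{g} \end{pmatrix}

    % Augmented error covariance, including cross-covariance blocks:
    \mathbf{P} =
    \begin{pmatrix}
      \mathbf{P}_{xx} & \mathbf{P}_{xb} & \mathbf{P}_{xg} \\
      \mathbf{P}_{bx} & \mathbf{P}_{bb} & \mathbf{P}_{bg} \\
      \mathbf{P}_{gx} & \mathbf{P}_{gb} & \mathbf{P}_{gg}
    \end{pmatrix}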

10. MLEF with the KdVB model: parameter estimation (diffusion coefficient)
[Figure: error covariance block matrices IC-IC, ME-ME, and IC-ME]
• Significant cross-correlation between initial conditions and model error

11. MLEF with NASA's GEOS column model: assimilation of PSAS analyses
• Work in progress under NASA's TRMM project
• D. Zupanski (CSU/CIRA) with A. Hou and Sara Zhang (NASA/GMAO)
[Figure: innovation statistics for R^{1/2} = (1/2)e vs. R^{1/2} = e]
• The choice of observation errors directly impacts innovation statistics; the observation error covariance R is the only given input to the system!
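As a concrete illustration of the innovation-statistics point (not code from this project): under linear-Gaussian assumptions the innovations d = y - h(x_b) should have covariance H P_f H^T + R, so a misspecified R shows up directly in the normalized innovation variance. All array names and sizes below are made up for the example.

    import numpy as np

    def innovation_stats(d, HPfHT, R):
        """Normalized innovation variance; close to 1 if R (and P_f) are consistent.

        d     : (n_obs, n_cases) innovations y - h(x_b)
        HPfHT : (n_obs, n_obs) forecast error covariance in observation space
        R     : (n_obs, n_obs) observation error covariance
        """
        S = HPfHT + R                  # expected innovation covariance
        L = np.linalg.cholesky(S)
        dn = np.linalg.solve(L, d)     # whitened innovations
        return np.mean(dn**2)          # should be ~1 if S is right

    # Shrinking R^{1/2} by half (R -> R/4) inflates the normalized variance:
    rng = np.random.default_rng(0)
    R, HPfHT = np.eye(5), 0.5 * np.eye(5)
    d = np.linalg.cholesky(R + HPfHT) @ rng.standard_normal((5, 10000))
    print(innovation_stats(d, HPfHT, R))      # ~1.0: consistent
    print(innovation_stats(d, HPfHT, R / 4))  # ~2.0: R too small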

12. MLEF with the CSU global shallow-water model
[Figures: height analysis increment (x_a - x_b); height RMS error (x_a - x_t)]
• Impact of ensemble initialization: correlated initial ensemble perturbations can significantly improve algorithm performance (an illustrative recipe is sketched below)
• Impact of error covariance localization: the dynamics has a positive impact on the smoothness and spatial localization of the error covariance
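The slide does not say how its correlated initial perturbations were generated. One common recipe, shown purely as an illustration, is to smooth spatially white noise with a Gaussian kernel so that neighboring grid points become correlated; the length scale and sizes below are arbitrary.

    import numpy as np

    def correlated_perturbation(n, length_scale, rng):
        """White noise convolved with a Gaussian kernel on a periodic domain."""
        noise = rng.standard_normal(n)
        lag = np.minimum(np.arange(n), n - np.arange(n))    # periodic distance
        kernel = np.exp(-0.5 * (lag / length_scale) ** 2)
        kernel /= kernel.sum()
        # Circular convolution via FFT:
        return np.real(np.fft.ifft(np.fft.fft(noise) * np.fft.fft(kernel)))

    rng = np.random.default_rng(42)
    ensemble = np.array([correlated_perturbation(256, 8.0, rng)
                         for _ in range(50)])   # 50 members, 256 grid points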

  13. Information content of observations

14. Use the Bayes formula for multiple evidence
[Y_i : evidence (observation type); X : hypothesis (analysis)]
• How can the information content of specific observation types be exploited using ensembles?
• Maximum likelihood approach with multiple evidence (formula reconstructed below)
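The formula on the original slide was an image. A standard statement of the Bayes formula for multiple evidence, in the slide's notation and assuming the observation types Y_i are conditionally independent given X, is:

    % Posterior for hypothesis X (the analysis) given K evidence sources
    % Y_1, ..., Y_K (observation types), conditionally independent given X:
    P(X \mid Y_1, \ldots, Y_K)
      = \frac{P(X)\,\prod_{i=1}^{K} P(Y_i \mid X)}{P(Y_1, \ldots, Y_K)}

    % The maximum likelihood approach selects the X maximizing this posterior,
    % i.e., minimizing  -\ln P(X) - \sum_{i} \ln P(Y_i \mid X).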

15. MLEF application to calculate information content
RAMS model example (GOES-R project)
• Use the Bayes formula for conditional probabilities with multiple observations
• Separate observations into sub-groups
• Calculate the information content of each group
[Figures: observation categories within the same cycle; multiple assimilation cycles]
• Initial cycles carry more information
• The model still has the capability to learn from observations in later cycles
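One standard ensemble measure of information content is the degrees of freedom for signal (DFS), computable from the singular values of the observation-space perturbation matrix Z reconstructed for slide 7. The sketch below assumes that splitting the rows of Z by observation type approximates per-group information content; all inputs are illustrative.

    import numpy as np

    def degrees_of_freedom_for_signal(Z):
        """DFS = sum_i lam_i^2 / (1 + lam_i^2), lam_i = singular values of
        Z, whose columns are R^{-1/2}[h(x + p_i) - h(x)]."""
        lam = np.linalg.svd(Z, compute_uv=False)
        return np.sum(lam**2 / (1.0 + lam**2))

    # Illustrative: split Z by observation type and compare information content.
    rng = np.random.default_rng(1)
    Z = rng.standard_normal((40, 10))               # 40 obs, 10 ensemble members
    print(degrees_of_freedom_for_signal(Z[:25]))    # e.g., first obs sub-group
    print(degrees_of_freedom_for_signal(Z[25:]))    # e.g., second obs sub-group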

16.-18. Hessian Preconditioning in MLEF
[Equations not captured in the transcript; a reconstruction is sketched below.]
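These three slides presented the Hessian preconditioning equations, which were lost in transcription. The sketch below follows the published MLEF formulation (Zupanski 2005), reusing the Z and P_f^{1/2} notation reconstructed for slide 7; it is a reconstruction, not the slides' verbatim content.

    % Change of variable onto the ensemble-spanned subspace:
    %   w  : ensemble-space control vector,  xi : preconditioned control vector
    \mathbf{x} = \mathbf{x}_b + \mathbf{P}_f^{1/2}\,\mathbf{w}, \qquad
    \mathbf{w} = \bigl(\mathbf{I} + \mathbf{Z}^{\mathrm{T}}\mathbf{Z}\bigr)^{-1/2}\,\boldsymbol{\xi}

    % For a linear observation operator the Hessian in w is I + Z^T Z, so in xi:
    \frac{\partial^2 J}{\partial\boldsymbol{\xi}\,\partial\boldsymbol{\xi}^{\mathrm{T}}}
      = \bigl(\mathbf{I} + \mathbf{Z}^{\mathrm{T}}\mathbf{Z}\bigr)^{-1/2}
        \bigl(\mathbf{I} + \mathbf{Z}^{\mathrm{T}}\mathbf{Z}\bigr)
        \bigl(\mathbf{I} + \mathbf{Z}^{\mathrm{T}}\mathbf{Z}\bigr)^{-1/2}
      = \mathbf{I}

    % i.e., "perfect Hessian preconditioning" in the linear case (cf. slide 7).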
