
Towards a Decision-Centric Framework for Uncertainty Propagation and Data Assimilation


Presentation Transcript


  1. Towards a Decision-Centric Framework for Uncertainty Propagation and Data Assimilation. Gabriel A. Terejanu. Doctoral Dissertation Defense, Department of Computer Science & Engineering, University at Buffalo. Committee: Dr. Peter D. Scott (Chair), Dr. Jason J. Corso, Dr. Puneet Singla.

  2. Introduction and Motivation for a Decision-Centric Framework for Uncertainty Propagation • Aleatory Uncertainty Propagation based on an Adaptive Gaussian Mixture Model • Decision-Centric Resource Allocation for Gaussian Mixture Models • Hybrid Aleatory-Epistemic Uncertainty Propagation • Conclusions and Future Work

  3. Motivation: Chernobyl disaster – Ukraine, April 26, 1986. A highly radioactive fallout plume drifted over large parts of Europe and the Soviet Union. Chemical, Biological, Radiological, Nuclear (CBRN) incidents. • Decision makers require accurate and relevant predictions of toxic cloud evolution in order to • deploy emergency responders • evacuate cities • shelter the population • cache medical gear. Source: Wikipedia.org

  4. Example Mathematical Model: puff dispersion model • Process Model • Initial Conditions • Measurement Model. Source: SCIPUFF + Dipole Pride 26 [Terejanu08]. INTEGRITY (intrinsic) - producer; RELEVANCE (extrinsic) - consumer. Inaccurate and unreliable information is largely irrelevant to the decision maker: “like beauty, what is truly information is largely in the eyes of the beholder” [Endsley01]

  5. PROPOSAL: Decision-Centric Framework. (figure: FORECAST PRODUCER - general-use algorithms, INTEGRITY - versus FORECAST CONSUMER/USER - specific use, RELEVANCE.) Reconcile the two views into a decision-centric framework which provides a more accurate and more relevant approximation to the uncertainty propagation.

  6. Types of Uncertainty • Stochastic difference equations (for discrete-time dynamics) • Stochastic differential equations (for continuous-time dynamics) • Discrete-time measurement model • Uncertain initial condition (example). Aleatory Uncertainty: characterized by randomness with known probability distributions. Epistemic Uncertainty: used to model ignorance. Is the parameter in question random or not? If it is random, what is the underlying probability distribution? [Ferson96] Hybrid Behavior = Aleatory Uncertainty + Epistemic Uncertainty
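As a reading aid (the slide's own equations are not reproduced in this transcript), the model classes listed above can be written in a standard form; the notation below is an assumption:

  dx_t = f(x_t, t)\,dt + g(x_t, t)\,d\beta_t    (stochastic differential equation, continuous-time dynamics)
  x_{k+1} = f_k(x_k) + w_k, \quad w_k \sim \mathcal{N}(0, Q_k)    (stochastic difference equation, discrete-time dynamics)
  y_k = h_k(x_k) + v_k, \quad v_k \sim \mathcal{N}(0, R_k)    (discrete-time measurement model)
  x_0 \sim \mathcal{N}(\mu_0, P_0)    (example of an uncertain initial condition)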

  7. Example Uncertainty Propagation • Model parameters are all known – only aleatory uncertainty • Model parameters are uncertain (epistemic uncertainty) – an infinite set of candidate pdfs

  8. Global View (shown on the slide as a four-stage producer/consumer block diagram): (1) Model Conceptualization and Verification - from reality and fundamental laws to a qualitative model based on the sciences (no equations), a mathematical model (no values for parameters), and a numerical solution. (2) Calibration, Verification and Validation - noisy observations from a laboratory system and its sensors yield a calibrated computational model (estimated parameters and distributions). (3) Uncertainty Propagation and Data Assimilation (the focus of the present dissertation) - the validated model, expert-supplied parameters and initial conditions, and noisy sensor observations from reality feed approximate uncertainty propagation and data assimilation. (4) Decision Making - the approximate probability density function is passed to the decision maker, who selects actions via expected utility theory and a utility function. The diagram also marks the split between producer and consumer.

  9. Proposed View: Decision-Centric. (figure: the uncertainty propagation and data assimilation stage becomes decision-sensitive - the validated model, a surrogate model (e.g., a linearized model), aleatory uncertainty (e.g., process noise, measurement noise), epistemic uncertainty (parameters and initial conditions), noisy observations, and a resource-allocation module feed a decision-centric uncertainty propagation and data assimilation block; the resulting mixed aleatory-epistemic approximate uncertainty, a constructed probability density function together with an ignorance function, is passed through an interaction level and an ignorance level to the decision maker, who acts via an ignorance-averse expected utility theory and a utility function.) Contributions: • Novel aleatory uncertainty propagation algorithm using Gaussian Mixture Models • Implementation of a parallel Particle Filter for Data Assimilation • Application in a CBRN incident with real data • Novel adaptive Gaussian Sum Filters for both discrete-time and continuous-time dynamical systems • Derivations of the Unscented Kalman Smoother, in both the two-filter form and the RTS form, using Weighted Statistical Linearization • Novel decision-centric computational resource allocation algorithm for uncertainty propagation • Novel epistemic uncertainty propagation using Polynomial Chaos expansion and the Bernstein Form • Novel propagation of both epistemic and aleatory uncertainty through dynamical systems with stochastic forcing

  10. Introduction and Motivation for a Decision-Centric Framework for Uncertainty Propagation • Aleatory Uncertainty Propagation based on an Adaptive Gaussian Mixture Model • Decision-Centric Resource Allocation for Gaussian Mixture Models • Hybrid Aleatory-Epistemic Uncertainty Propagation • Conclusions and Future Work

  11. Exact Solution • For continuous-time dynamical systems: the Fokker-Planck-Kolmogorov equation (FPKE) • Analytical solutions exist only for stationary pdfs and are restricted to a limited class of dynamical systems • For discrete-time dynamical systems: the Chapman-Kolmogorov equation (CKE) • Difficult to solve or approximate in the general nonlinear case • Positivity constraint • Normalization constraint • No fixed solution domain
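For reference, the two exact propagation equations named above have the standard forms (generic notation, not copied from the slide):

  FPKE:  \frac{\partial p(x,t)}{\partial t} = -\sum_i \frac{\partial}{\partial x_i}\big[f_i(x,t)\,p(x,t)\big] + \frac{1}{2}\sum_{i,j} \frac{\partial^2}{\partial x_i \partial x_j}\big[(g Q g^T)_{ij}\,p(x,t)\big]
  CKE:   p(x_{k+1}) = \int p(x_{k+1} \mid x_k)\,p(x_k)\,dx_k

Any approximate solution must additionally remain nonnegative and integrate to one over an unbounded domain, which is what makes direct numerical solution difficult.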

  12. Uncertainty Propagation Conventional Methods • Gaussian Closure Approach • Assume the underlying pdf to be Gaussian • Statistical linearization, Stochastic averaging • Monte Carlo approximation • Sample initial distribution • Propagate individual points through the exact nonlinear dynamics • Study the statistics of the propagated points
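A minimal sketch of the Monte Carlo approach listed above, for a generic discrete-time map with additive Gaussian process noise; the function names and the toy dynamics are illustrative assumptions, not the dissertation's code:

```python
import numpy as np

def monte_carlo_propagation(f, x0_mean, x0_cov, q_cov, n_steps, n_samples=10000, rng=None):
    """Sample the initial distribution, push each sample through the exact
    nonlinear map x_{k+1} = f(x_k) + w_k, and report the sample statistics."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.multivariate_normal(x0_mean, x0_cov, size=n_samples)
    for _ in range(n_steps):
        w = rng.multivariate_normal(np.zeros(len(x0_mean)), q_cov, size=n_samples)
        x = np.apply_along_axis(f, 1, x) + w
    return x.mean(axis=0), np.cov(x, rowvar=False)

# Toy example: a mildly nonlinear 2-D map (purely illustrative)
f = lambda x: np.array([x[0] + 0.1 * x[1], x[1] - 0.1 * np.sin(x[0])])
mean, cov = monte_carlo_propagation(f, np.zeros(2), 0.1 * np.eye(2), 1e-3 * np.eye(2), n_steps=50)
```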

  13. Gaussian Mixture Models • With a sufficient number of Gaussian components, any continuous pdf may be approximated as closely as desired. (figure: the true pdf, governed by the FPKE, is approximated by a weighted sum of components propagated with linearized dynamics - Component 1 (w1) + Component 2 (w2) + Component 3 (w3) = Approximation, with w1 + w2 + w3 = 1.) • Assumption: the covariances are small enough that the linearizations are representative of the dynamics around the means [Sorenson71] • Easily violated: strong nonlinearities, computational constraints
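In symbols (standard Gaussian-mixture notation, assumed here rather than taken from the slide), the approximation and the linearized component propagation are:

  p(x, t) \approx \hat{p}(x, t) = \sum_{i=1}^{N} w_i\,\mathcal{N}\big(x;\,\mu_i(t),\,P_i(t)\big), \qquad \sum_{i=1}^{N} w_i = 1,\ \ w_i \ge 0
  \dot{\mu}_i = f(\mu_i, t), \qquad \dot{P}_i = A_i P_i + P_i A_i^T + g Q g^T, \qquad A_i = \partial f / \partial x\,\big|_{\mu_i}

Each component is propagated with EKF-like linearized equations, and in the classic scheme the weights w_i stay fixed between measurements, which is exactly where the small-covariance assumption bites.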

  14. Adaptive Gaussian Mixture Model • The pdf is approximated by a finite sum of Gaussian densities • The exact solution is given by the FPKE • We want the Gaussian mixture to satisfy the FPKE • Compute the FPKE error
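The FPKE error referred to above is the residual obtained by substituting the Gaussian-sum approximation \hat{p} into the Fokker-Planck operator (generic form, written here for reference):

  e(x, t) = \frac{\partial \hat{p}(x,t)}{\partial t} + \sum_i \frac{\partial}{\partial x_i}\big[f_i(x,t)\,\hat{p}(x,t)\big] - \frac{1}{2}\sum_{i,j} \frac{\partial^2}{\partial x_i \partial x_j}\big[(g Q g^T)_{ij}\,\hat{p}(x,t)\big]

Because \hat{p} is linear in the weights, the residual e(x, t) is linear in them as well, which the weight update on the next slide exploits.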

  15. Adaptive Gaussian Mixture Model cont. • Approximation: the time derivative is replaced by a first forward difference (FPKE update) • The error is linear in the new weights • The new weights are obtained by minimizing the integrated squared FPKE error, subject to the weights remaining a valid mixture
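Minimizing the integrated squared residual over weights that form a valid mixture is a small convex quadratic program. A hedged SciPy sketch follows; the matrix M of pairwise-integrated residual terms is assumed to be precomputed, and this is not the dissertation's own solver:

```python
import numpy as np
from scipy.optimize import minimize

def update_weights(M, w0):
    """Solve  min_w 0.5 * w^T M w   s.t.  sum(w) = 1,  w >= 0,
    where M[i, j] integrates the product of the residual terms contributed
    by components i and j (assumed precomputed and positive semidefinite)."""
    n = len(w0)
    res = minimize(
        fun=lambda w: 0.5 * w @ M @ w,
        x0=w0,
        jac=lambda w: M @ w,
        bounds=[(0.0, 1.0)] * n,
        constraints=({"type": "eq", "fun": lambda w: np.sum(w) - 1.0},),
        method="SLSQP",
    )
    return res.x

# Tiny illustrative call with a made-up 3x3 positive-definite matrix
M = np.array([[2.0, 0.5, 0.1], [0.5, 1.5, 0.2], [0.1, 0.2, 1.0]])
w_new = update_weights(M, w0=np.full(3, 1.0 / 3.0))
```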

  16. Numerical Example • Quintic oscillator • 11 Gaussian components • Coordinates of the means linearly spaced between (-5,-5) and (5,5) • Equal covariance matrices • Equally weighted • Propagation time: 1000 sec • Stationary pdf:

  17. Numerical Example cont. (figure: side-by-side comparison of the conventional approach, the exact solution, and the adaptive GMM)

  18. Nonlinear Filtering for Data Assimilation • Filtering is the problem of finding the probability density function (pdf) of the states of a system at the current time, given all the observations available up to the current time. (figure, reality vs. abstraction: the true state evolves through the physical process driven by process noise and is observed by a physical sensor corrupted by measurement noise; in the abstraction, an initial guess of the state is propagated through the process model and corrected with the measurement model via the Bayes-rule corrector.)
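Written out, the filtering recursion alternates the two standard steps (generic notation):

  Prediction (Chapman-Kolmogorov):  p(x_k \mid y_{1:k-1}) = \int p(x_k \mid x_{k-1})\,p(x_{k-1} \mid y_{1:k-1})\,dx_{k-1}
  Correction (Bayes' rule):  p(x_k \mid y_{1:k}) = \dfrac{p(y_k \mid x_k)\,p(x_k \mid y_{1:k-1})}{\int p(y_k \mid x_k)\,p(x_k \mid y_{1:k-1})\,dx_k}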

  19. Extended Kalman Filter (figure: initial pdf → EKF propagation step → forecast pdf → EKF measurement update → posterior pdf)
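For reference, the two EKF steps named on the slide are, in standard form (F and H are the Jacobians of the process and measurement models evaluated at the current estimate):

  Propagation:  \hat{x}_k^- = f(\hat{x}_{k-1}), \qquad P_k^- = F_{k-1} P_{k-1} F_{k-1}^T + Q_{k-1}
  Measurement update:  K_k = P_k^- H_k^T (H_k P_k^- H_k^T + R_k)^{-1}, \qquad \hat{x}_k = \hat{x}_k^- + K_k\big(y_k - h(\hat{x}_k^-)\big), \qquad P_k = (I - K_k H_k) P_k^-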

  20. Classic Gaussian Sum Filter (figure: the component weights are carried unchanged through the propagation step between measurements)

  21. Adaptive Gaussian Sum Filter (figure: unlike the classic filter, the forecast weights change during propagation) • Update I, for continuous-time dynamical systems: the weights are updated by constraining the Gaussian sum approximation to satisfy the Fokker-Planck equation • Update II, for discrete-time nonlinear systems: the weights are chosen to minimize the integral square difference between the true forecast pdf and its Gaussian sum approximation
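For Update II, the integral square difference is a quadratic function of the weights (notation assumed here), so the update is again a convex quadratic program:

  J(w) = \int \Big[\tilde{p}(x) - \sum_i w_i\,\mathcal{N}(x;\,\mu_i^-, P_i^-)\Big]^2 dx = w^T M w - 2\,w^T b + c
  M_{ij} = \int \mathcal{N}(x;\,\mu_i^-, P_i^-)\,\mathcal{N}(x;\,\mu_j^-, P_j^-)\,dx, \qquad b_i = \int \tilde{p}(x)\,\mathcal{N}(x;\,\mu_i^-, P_i^-)\,dx

where \tilde{p} is the true forecast pdf from the Chapman-Kolmogorov equation. M is available in closed form (products of Gaussians), while b must be approximated, e.g., by quadrature; J(w) is then minimized subject to \sum_i w_i = 1 and w_i \ge 0.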

  22. Numerical Example • Lorenz system [Lorenz63] • Performance Measures

  23. Numerical Results

  24. Part I: Concluding Remarks • Two update schemes for the forecast weights are presented in order to obtain a better Gaussian sum approximation to the forecast pdf. • Continuous dynamical systems: minimize the FPKE error • Discrete dynamical systems: minimize the integral square difference between the true pdf and its approximation • Both methods result in strictly convex quadratic programming problems, guaranteed to have a unique solution. • Functional representation of the pdf • Computational complexity: • Useful in pure propagation, low SNR, low measurement frequency, and when the measurement model provides limited information (e.g., a quadratic measurement model).

  25. But … even with an improved propagation method, we may still obtain a poor approximation to the forecast probability distribution where it matters most in the decision-making process, e.g., in the tails of the distribution. Why? Because we still have a finite representation.

  26. Introduction and Motivation for a Decision-Centric Framework for Uncertainty Propagation • Aleatory Uncertainty Propagation based on an Adaptive Gaussian Mixture Model • Decision-Centric Resource Allocation for Gaussian Mixture Models • Hybrid Aleatory-Epistemic Uncertainty Propagation • Conclusions and Future Work

  27. Conventional Methodology… (figure: on the producer side, model parameters drive an approximate uncertainty propagation; the resulting forecast is handed to the consumer, a decision maker responsible for 10,000 residents, who takes NO evacuation action)

  28. Decision-Centric Approach (figure: an interaction level (PSGC) between the decision maker and the producer's uncertainty propagation adds decision-sensitive components to the approximation; with the refined forecast the decision maker chooses the EVACUATE action for the 10,000 residents)

  29. Decision-Centric vs. Conventional (figure: the exact FPKE solution, the conventional approximation, and the decision-centric approximation - both approximations use linearized propagation with weight updates - are compared in terms of the loss function and the expected loss)
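The quantity being compared above is the expected loss at the decision time; under a Gaussian-sum representation it decomposes component by component (standard manipulation, notation assumed):

  \mathbb{E}[L] = \int L(x)\,p(x, T_d)\,dx \;\approx\; \sum_i w_i \int L(x)\,\mathcal{N}\big(x;\,\mu_i(T_d),\,P_i(T_d)\big)\,dx

so an approximation that misplaces probability mass where L(x) is large (for instance in the tails) directly distorts the expected loss, which is what the decision-centric placement of components targets.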

  30. Progressive Selection of Gaussian Components • Start from the initial pdf at time Ti • Sample the means of the new Gaussian components and set their covariance • Propagate the moments until the decision time Td • Compute an inflation coefficient based on the most distant component and inflate the loss function • Obtain the new weights by regression • Copy the weights back to the components at Ti

  37. Progressive Selection of Gaussian Components cont. • Convergence: evolution of the proposal pdf used to sample the means of the new Gaussian components (the remaining steps repeat those on slide 30)
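A hedged skeleton of the steps listed on slides 30-37, in Python. Every function argument, the inflation formula, and the regression step below are placeholders chosen for illustration only; the dissertation's actual expressions are on the slides, not in this transcript.

```python
import numpy as np

def progressive_selection(sample_mean_from_initial_pdf, propagate_moments, loss,
                          n_new, base_cov, t_decision, rng=None):
    """Placeholder skeleton of the Progressive Selection of Gaussian Components.
    sample_mean_from_initial_pdf(rng): draw a candidate component mean at time Ti.
    propagate_moments(mean, cov, t):   return (mean, cov) at the decision time Td.
    loss(x):                           loss function evaluated at Td.
    All three are hypothetical interfaces supplied by the caller."""
    rng = np.random.default_rng() if rng is None else rng

    # 1. Sample the means of the new Gaussian components and set their covariance.
    means_0 = [sample_mean_from_initial_pdf(rng) for _ in range(n_new)]

    # 2. Propagate the moments until the decision time Td.
    propagated = [propagate_moments(m, base_cov, t_decision) for m in means_0]

    # 3. Inflate the loss function; the coefficient used here (based on the most
    #    distant propagated component) is a stand-in, not the slide's formula.
    center = np.mean([m for m, _ in propagated], axis=0)
    inflation = 1.0 + max(np.linalg.norm(m - center) for m, _ in propagated)
    inflated_loss = [inflation * loss(m) for m, _ in propagated]

    # 4. New weights by regression against the inflated loss (placeholder:
    #    nonnegative values renormalized to sum to one), then copied back to Ti.
    w = np.clip(np.asarray(inflated_loss, dtype=float), 0.0, None)
    w = w / w.sum() if w.sum() > 0 else np.full(n_new, 1.0 / n_new)
    return means_0, propagated, w
```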

  38. Example (figure: the truth compared with EKF-based uncertainty propagation; the loss function and the expected loss are evaluated at the decision time Td = 8 sec)

  39. Results

  40. CBRN scenario

  41. (figure: CBRN-scenario comparison of the classic, reference, and decision-centric approaches)

  42. Part II: Concluding Remarks • An interaction level between the decision maker and the prediction module incorporates contextual information held by the decision maker. • Progressive Selection of Gaussian Components supplements the initial uncertainty with new Gaussian components that are sensitive to the loss function at the decision time. • The cost of the overall improvement is an increase in the number of Gaussian components. • Significantly enhanced accuracy within the decision maker's region of interest. • The new probability density function addresses the region of interest and also provides a better approximation overall whenever probability mass naturally moves toward that region.

  43. But … the methods presented so far increase the accuracy and the relevance of the propagated probability density function only when the prior, the process noise, and the model parameters are precisely known. This assumption is known as the Bayesian dogma of precision [Walley91]. Such precise values are difficult to obtain in practice, owing to the limited amount of information available, incomplete knowledge of the system, or the systematic underestimation of uncertainty that arises in the elicitation process.

  44. Introduction and Motivation for a Decision-Centric Framework for Uncertainty Propagation • Aleatory Uncertainty Propagation based on an Adaptive Gaussian Mixture Model • Decision-Centric Resource Allocation for Gaussian Mixture Models • Hybrid Aleatory-Epistemic Uncertainty Propagation • Conclusions and Future Work

  45. Interval information [Ferson07] • Plus-or-minus uncertainty reports • Significant digits • Intermittent measurements • Non-detects / right censoring • Data binning and rounding • Missing data • Gross ignorance

  46. Theory of Evidence • Primitive function: the basic probability assignment (bpa) [Shafer76] • Frame of discernment • Power set • Focal element • Normalized belief structures
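For reference, the primitives listed above are defined as follows (standard Dempster-Shafer notation):

  m : 2^{\Theta} \to [0, 1], \qquad m(\emptyset) = 0, \qquad \sum_{A \subseteq \Theta} m(A) = 1

where \Theta is the frame of discernment and 2^{\Theta} its power set; any A \subseteq \Theta with m(A) > 0 is a focal element, and a bpa satisfying these conditions defines a normalized belief structure.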

  47. Theory of Evidence • Belief function, or the lower bound (total amount of support) • Plausibility function, or the upper bound (total amount of potential) • The precise probability • Degree of ignorance
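In standard notation, for any A \subseteq \Theta:

  Bel(A) = \sum_{B \subseteq A} m(B), \qquad Pl(A) = \sum_{B \cap A \neq \emptyset} m(B), \qquad Bel(A) \le P(A) \le Pl(A)

with the degree of ignorance about A measured by the gap Pl(A) - Bel(A); when every focal element is a singleton, Bel = Pl and the belief structure reduces to a precise probability.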

  48. Theory of Evidence • Pignistic transformation, used once decisions have to be made [Smets90] (figure: a numerical illustration of the transformation) • Dempster's rule of combination of two independent arguments
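Both operations have standard closed forms, reproduced here for reference:

  BetP(\theta) = \sum_{A \ni \theta} \frac{m(A)}{|A|}    (pignistic transformation of a normalized bpa)
  (m_1 \oplus m_2)(A) = \frac{1}{1 - K} \sum_{B \cap C = A} m_1(B)\,m_2(C), \qquad K = \sum_{B \cap C = \emptyset} m_1(B)\,m_2(C), \qquad A \neq \emptyset

where K measures the conflict between the two independent bodies of evidence.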

  49. Objective: propagate epistemic and aleatory uncertainty separately, and combine them only when a decision has to be made. A two-level hierarchical framework is proposed: Level 2 (epistemic uncertainty) uses Dempster-Shafer structures; Level 1 (aleatory uncertainty) uses precise probability distributions. (figure: a spectrum from certainty to total ignorance - collapsing a Dempster-Shafer structure into a single precise distribution assumes more than is known, while coarsening precise distributions into Dempster-Shafer structures loses information.)

  50. DS structures on closed intervals • Probability boxes and DS structures on closed intervals [Ferson03] • Example: two focal intervals, m([1,4]) = 2/3 and m([3,6]) = 1/3 • (figure: a scale from precise probability (0%) to interval uncertainty (100%), with NIDI = 60% for this example)
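A small sketch of how a DS structure on closed intervals induces a probability box, using the slide's own example (m([1,4]) = 2/3, m([3,6]) = 1/3); the construction follows the [Ferson03]-style p-box bounds, but the function name and interface are assumptions:

```python
import numpy as np

def ds_to_pbox(intervals, masses, xs):
    """Lower/upper CDF bounds (a p-box) induced by a Dempster-Shafer structure
    on closed intervals. The upper CDF at x counts the mass of focal intervals
    that could lie at or below x (left endpoint <= x); the lower CDF counts the
    mass of intervals that must lie at or below x (right endpoint <= x)."""
    iv = np.asarray(intervals, dtype=float)
    m = np.asarray(masses, dtype=float)
    upper = np.array([m[iv[:, 0] <= x].sum() for x in xs])
    lower = np.array([m[iv[:, 1] <= x].sum() for x in xs])
    return lower, upper

# Slide example: focal intervals [1,4] with mass 2/3 and [3,6] with mass 1/3
xs = np.linspace(0.0, 7.0, 71)
cdf_lower, cdf_upper = ds_to_pbox([[1, 4], [3, 6]], [2 / 3, 1 / 3], xs)
```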
