
Regularization for Hypothesis Testing and Exploration of Predictive Uncertainty






Presentation Transcript


  1. Regularization for Hypothesis Testing and Exploration of Predictive Uncertainty

  2. Stability Achieved Through Parsimony
  • The World:
    • Variable
    • Unknown
    • Little information
  • The Modeler:
    • Tired
    • Stressed
    • Budget constrained
    • Would like some personal life……
  [Figure: hypothetical model]

  3. Stability Achieved Through Parsimony
  The Approach:
  • Small(ish) number of zones
  • Based on geology
  • Satisfies the conceptual model
  • Some zones are fixed or tied
  • Some have prior information (see the sketch below)
  The regression is stable and often quite rapid, since sensitivities are obtained quickly.
  [Figure: eight hydraulic-conductivity zones K1–K8, with example prior information:]
  K1 = 30; weight = 15.0
  K6 = 40; weight = 15.0
  log(K1) - log(K2) = 0
  log(K1) - log(K3) = 0
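As an illustration of the kind of prior information listed above, the following minimal sketch (Python, not from the presentation) shows how preferred-value and preferred-difference equations might contribute to a weighted least-squares objective function. The zone names, preferred values, and 15.0 weights mirror the slide's example; the unit weights on the log-difference equations are assumed.

```python
# A minimal sketch (not the PEST implementation) of how zone-based prior
# information enters a weighted least-squares objective function.
import numpy as np

def prior_objective(log_k):
    """Sum of squared, weighted prior-information residuals for 8 log-K zones."""
    k1, k2, k3, k4, k5, k6, k7, k8 = log_k
    residuals = np.array([
        15.0 * (k1 - np.log10(30.0)),   # preferred value: K1 = 30, weight 15.0
        15.0 * (k6 - np.log10(40.0)),   # preferred value: K6 = 40, weight 15.0
        1.0  * (k1 - k2),               # preferred difference: log(K1) - log(K2) = 0 (weight assumed)
        1.0  * (k1 - k3),               # preferred difference: log(K1) - log(K3) = 0 (weight assumed)
    ])
    return float(residuals @ residuals)

# Example: a parameter set that honors the prior information exactly
log_k = np.log10([30.0, 30.0, 30.0, 10.0, 10.0, 40.0, 10.0, 10.0])
print(prior_objective(log_k))   # 0.0
```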

  4. Stability Achieved Through Parsimony
  The Difficulty:
  • How much lumping is best?
  • What if the lumped zones contradict the true hydrogeologic framework?
  • What if the ‘mean’ in my zone is not what I need to know?
  • What if there is uncertainty in the knowledge of my parameters – my prior information?

  5. Pilot Points
  Estimate Values at Points:
  • Parameter values are defined at the pilot points
  • Can be aquifer parameters, recharge, etc.
  • Locations are selected based on a variety of information

  6. Pilot Points
  Interpolate from Points to Grid:
  • Interpolation using kriging or other methods (see the sketch below)
  • Kriging has some advantages – e.g., a Gaussian geostatistical basis; extreme values are always at the pilot points, which can be a good or a bad thing
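The following is a minimal ordinary-kriging sketch of the point-to-grid interpolation described above. The exponential covariance model, its sill and correlation length, and the pilot-point coordinates and values are illustrative assumptions, not values from the presentation.

```python
# A minimal ordinary-kriging sketch for interpolating pilot-point values onto grid cells.
import numpy as np

def exp_cov(h, sill=1.0, corr_len=500.0):
    """Exponential covariance model C(h) = sill * exp(-h / corr_len) (assumed)."""
    return sill * np.exp(-h / corr_len)

def krige(pp_xy, pp_vals, grid_xy):
    """Ordinary kriging of pilot-point values (e.g., log-K) onto grid points."""
    n = len(pp_xy)
    # Covariance among pilot points, augmented for the unbiasedness constraint
    d_pp = np.linalg.norm(pp_xy[:, None, :] - pp_xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_cov(d_pp)
    A[n, n] = 0.0
    # Covariance between pilot points and each grid point
    d_pg = np.linalg.norm(pp_xy[:, None, :] - grid_xy[None, :, :], axis=-1)
    b = np.ones((n + 1, d_pg.shape[1]))
    b[:n, :] = exp_cov(d_pg)
    weights = np.linalg.solve(A, b)[:n, :]        # kriging weights per grid point
    return weights.T @ pp_vals                    # interpolated field

pp_xy = np.array([[100.0, 100.0], [400.0, 250.0], [250.0, 450.0]])  # pilot points (assumed)
pp_vals = np.log10([30.0, 40.0, 35.0])                              # log-K at pilot points
grid_xy = np.array([[200.0, 200.0], [350.0, 400.0]])                # two grid cells
print(krige(pp_xy, pp_vals, grid_xy))
```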

  7. Pilot Points
  Spatially Varying Properties:
  • Can be used in conjunction with zones
  • Can have different interpolation and prior information in different zones

  8. Pilot Points and Prior Information
  Prior Information on Pilot Points:
  • Prior information on the value at a pilot point, or on the difference between pilot points
  • In the latter case, points are linked to each other by prior-information equations, used to tend toward a smooth distribution (see the sketch below)
  [Figure: three pilot points P1, P2, P3 linked by prior information:]
  log(P1) - log(P2) = 0
  log(P1) - log(P3) = 0
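A minimal sketch of how such preferred-difference equations between linked pilot points might be assembled into a regularization matrix. The links mirror the slide's P1–P2 and P1–P3 example; the pilot-point values and the implicit unit weights are assumed.

```python
# Each linked pair contributes an equation log(Pi) - log(Pj) = 0,
# which pulls the estimated field toward smoothness.
import numpy as np

links = [(0, 1), (0, 2)]          # (P1, P2) and (P1, P3), zero-based indices
n_points = 3

# Regularization matrix: one row per prior-information equation
R = np.zeros((len(links), n_points))
for row, (i, j) in enumerate(links):
    R[row, i], R[row, j] = 1.0, -1.0

log_p = np.log10([30.0, 25.0, 40.0])      # current pilot-point log values (assumed)
print(R @ log_p)                           # prior-information residuals
print(np.sum((R @ log_p) ** 2))            # contribution to the objective function
```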

  9. Pilot Points and Regularization
  Prior Information and Regularization:
  • Inter-parameter weights can be predetermined – equal, f(distance), etc. (see the sketch below)
  • Can be based on the same geostatistics as the kriging
  • Can be determined through constrained optimization – Tikhonov regularization
  [Figure: pilot points P1, P2, P3 with prior information:]
  log(P1) - log(P2) = 0; weight = ?
  log(P1) - log(P3) = 0; weight = ?
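One possible way to derive distance-dependent inter-parameter weights from the same geostatistics used for kriging is sketched below; the exponential covariance model, its parameters, and the direct use of covariance as the weight are assumptions for illustration, not the presentation's specific scheme.

```python
# Weight on each preferred-difference equation taken from an exponential
# covariance of the separation distance: closer pairs are tied more strongly.
import numpy as np

def diff_weight(xy_i, xy_j, sill=1.0, corr_len=500.0):
    """Weight for the equation log(Pi) - log(Pj) = 0, decaying with distance (assumed form)."""
    h = np.linalg.norm(np.asarray(xy_i) - np.asarray(xy_j))
    return sill * np.exp(-h / corr_len)

print(diff_weight([100.0, 100.0], [400.0, 250.0]))  # nearby pair: larger weight
print(diff_weight([100.0, 100.0], [900.0, 800.0]))  # distant pair: smaller weight
```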

  10. Regularized Inversion
  Pilot Points and Regularization:
  • Supports the representation of smoothly varying aquifer properties, which is consistent with many geologic contexts – for example, deltaic deposits
  • Enables the modeler to set a ‘target objective function’ – how well to fit the data; typically, greater departures from smoothness produce a better fit (see the sketch below)
  • Pilot points can be placed throughout the domain, focused where data (aquifer tests, water levels) are densest:
    • Leads to good fits
    • But does that suggest this is the only place where heterogeneity reigns?
  • When used in predictive analysis, can illustrate the mechanism that leads to a very ‘bad’ or very ‘good’ simulated outcome
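A minimal linear sketch of Tikhonov-regularized inversion with a target measurement objective function. The Jacobian, observations, smoothness operator, target value, and the bisection search over the regularization weight are illustrative placeholders, not the specific scheme used in the presentation.

```python
# Minimize ||d - Jp||^2 + mu^2 ||Rp||^2, then adjust mu so that the measurement
# objective function phi meets a user-set target ("how well to fit the data").
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_par = 20, 8
J = rng.normal(size=(n_obs, n_par))            # sensitivities (Jacobian), placeholder
p_true = rng.normal(size=n_par)
d = J @ p_true + 0.1 * rng.normal(size=n_obs)  # noisy synthetic observations
R = np.diff(np.eye(n_par), axis=0)             # first-difference (smoothness) operator

def solve(mu):
    """Regularized least-squares solution and its measurement objective function."""
    A = J.T @ J + mu**2 * (R.T @ R)
    p = np.linalg.solve(A, J.T @ d)
    return p, float(np.sum((d - J @ p) ** 2))

target_phi = 0.3                               # target objective function (assumed)
lo, hi = 1e-6, 1e6                             # bracket for the regularization weight
for _ in range(100):                           # bisection in log space
    mu = np.sqrt(lo * hi)
    _, phi = solve(mu)
    lo, hi = (lo, mu) if phi > target_phi else (mu, hi)

p_est, phi = solve(mu)
print(mu, phi)                                 # mu chosen so that phi is near target_phi
```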

  11. Regularized Inversion (Doherty, 2003, Ground Water)
  [Figures: study area; model domain and observed heads; model grid]

  12. Regularized Inversion
  [Figures: study area; results with RMS = 3.5 inches and RMS = 2 inches]

  13. Large Numbers of Parameters
  Can, however, lead to:
  • Numerical instability
  • Non-unique results
  • Long run times
  • Heightened anxiety, loss of personal life …

  14. Eigenvector Analysis (Tonkin and Doherty, 2005, WRR)
  • Rather than estimating every parameter individually, estimate combinations of parameters
  • This is related to what we do when we ‘tie’ parameters a priori, except that:
    • We define the parameter combinations by analyzing parameter sensitivities and observation weights
    • This is accomplished formally through eigenvector analysis (related to principal component analysis); see the sketch below
  • We call these combinations of parameters “super parameters”!
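A minimal sketch of forming super parameters by singular-value decomposition of a weighted Jacobian, in the spirit of the approach named above. The Jacobian, observation weights, and number of retained super parameters are placeholders, not values from the presentation.

```python
# Columns of V from the SVD of the weighted Jacobian define orthogonal
# parameter combinations; the best-constrained few become "super parameters".
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_par, n_super = 30, 200, 10
J = rng.normal(size=(n_obs, n_par))            # sensitivities of obs to base parameters (placeholder)
w = np.ones(n_obs)                             # observation weights (assumed uniform)

U, s, Vt = np.linalg.svd(np.diag(w) @ J, full_matrices=False)
V_super = Vt[:n_super, :].T                    # keep the best-constrained combinations

# The inverse problem is then solved for the few super parameters instead of
# the many base parameters:  base_parameters ≈ V_super @ super_parameters
super_params = rng.normal(size=n_super)        # illustrative values
base_params = V_super @ super_params
print(V_super.shape, base_params.shape)        # (200, 10) (200,)
```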
