

  1. Sensitivity Analysis: An Introduction. Andrea Saltelli, European Commission, Joint Research Centre. Copenhagen, October 2007

  2. Based on Global Sensitivity Analysis. Gauging the Worth of Scientific Models, John Wiley, 2007, by Andrea Saltelli, Marco Ratto, Terry Andres, Francesca Campolongo, Jessica Cariboni, Debora Gatelli, Michaela Saisana, Stefano Tarantola

  3. Outline: Models – an unprecedented critique; definition of UA, SA; strategies; model quality; Type I, II and III errors

  4. The critique of models and what sensitivity analysis has to do with it

  5. “They talk as if simulation were real-world data. They’re not. That’s a problem that has to be fixed. I favor a stamp: WARNING: COMPUTER SIMULATION – MAY BE ERRONEOUS and UNVERIFIABLE. Like on cigarettes […]”, p. 556 (M. Crichton, State of Fear)

  6. For sure, modelling is subject today to an unprecedented critique. Have models fallen out of grace? Is modelling just useless arithmetic, as claimed by Pilkey and Pilkey-Jarvis (2007)?

  7. Useless Arithmetic: Why Environmental Scientists Can't Predict the Future, by Orrin H. Pilkey and Linda Pilkey-Jarvis: quantitative mathematical models used by policy makers and government administrators to form environmental policies are seriously flawed.

  8. One of the examples discussed concerns the Yucca Mountain repository for radioactive waste disposal, where a very large model called TSPA (for total system performance assessment) is used to guarantee the safe containment of the waste. TSPA is composed of 286 sub-models.

  9. TSPA (like any other model) relies on assumptions – a crucial one being the low permeability of the geological formation, and hence the long time needed for water to percolate from the desert surface down to the level of the underground disposal. The confidence of the stakeholders in TSPA was not helped when evidence was produced which could lead to an upward revision of this parameter by four orders of magnitude.

  10. We just can’t predict, concludes N. N. Taleb, and we are victims of the ludic fallacy, of the delusion of uncertainty, and so on. Modelling is just another attempt to ‘Platonify’ reality… Nassim Nicholas Taleb, The Black Swan, Penguin, London 2007

  11. You may disagree with the Talebs and the Pilkeys… but the cat is out of the bag. Stakeholders and media will tend to expect, or suspect, instrumental use of computational models, or mistreatment of their uncertainty.

  12. The critique of models – The nature of models, after Rosen

  13. The critique of models. [Diagram: Rosen's modelling relation – natural system N and formal system F, connected by encoding and decoding arrows, with entailment internal to each.] After Robert Rosen, 1991, “world” (the natural system) and “model” (the formal system) are each internally entailed – driven by a causal structure [efficient, material and final causation for ‘world’; formal causation for ‘model’]. Nothing entails ‘world’ with ‘model’: the association between the two is hence the result of craftsmanship.

  14. The critique of models. Since Galileo's time, scientists have had to deal with the limited capacity of the human mind to create useful maps of ‘world’ into ‘model’. The emergence of ‘laws’ can be seen in this context as the painful process of simplification, separation and identification which leads to a model of uncharacteristic simplicity and beauty.

  15. The critique of models. <<Groundwater models cannot be validated [!]>> Konikow and Bredehoeft, 1992. Konikow and Bredehoeft's work was reviewed in Science in “Verification, Validation and Confirmation of Numerical Models in the Earth Sciences”, by Oreskes et al., 1994. Both papers focused on the impossibility of model validation.

  16. The (post-modern) critique of models. The post-modern French thinker Jean Baudrillard (1990) presents 'simulation models' as unverifiable artefacts which, used in the context of mass communication, produce a fictitious hyperreality that annihilates truth.

  17. Science for the post-normal age is discussed in Funtowicz and Ravetz (1990, 1993, 1999), mostly in relation to science for policy use. [Photos: Jerry Ravetz, Silvio Funtowicz]

  18. The critique of models <-> Uncertainty <<I have proposed a form of organised sensitivity analysis that I call “global sensitivity analysis” in which a neighborhood of alternative assumptions is selected and the corresponding interval of inferences is identified. Conclusions are judged to be sturdy only if the neighborhood of assumptions is wide enough to be credible and the corresponding interval of inferences is narrow enough to be useful.>> Edward E. Leamer, 1990

  19. GIGO (Garbage In, Garbage Out) Science – where uncertainties in inputs must be suppressed lest outputs become indeterminate. – Jerry Ravetz

  20. The critique of models <-> Uncertainty. Anticipating criticism by applying sensitivity analysis: this is one of the ten commandments of applied econometrics according to Peter Kennedy (A Guide to Econometrics): “Thou shall confess in the presence of sensitivity. Corollary: Thou shall anticipate criticism.”

  21. The critique of models <-> Uncertainty. When reporting a sensitivity analysis, researchers should explain fully their specification search so that readers can judge for themselves how the results may have been affected. This is basically an `honesty is the best policy' approach, advocated by Leamer.

  22. The critique of models – LAST! George Box, the industrial statistician, is credited with the quote ‘all models are wrong, some are useful’. Probably the first to say that was W. Edwards Deming. Box, G.E.P., Robustness in the strategy of scientific model building, in Robustness in Statistics, R.L. Launer and G.N. Wilkinson, Editors, 1979, Academic Press: New York. [Photos: G. Box, W. E. Deming]

  23. The critique of models – Back to Rosen. [Diagram: Rosen's modelling relation, as above.] If modelling is a craftsmanship, then it helps the craftsman if the uncertainty in the inference (the substance of use for the decoding exercise) can be apportioned to the uncertainty in the assumptions (encoding).

  24. Models – Uncertainty. [Diagram: Rosen's modelling relation, as above.] • ASSUMPTIONS <-> INFERENCES • ENCODING <-> DECODING • Apportioning inferences to assumptions is an ingredient of decoding – how can this be done?

  25. Definition. A possible definition of sensitivity analysis is the following: The study of how uncertainty in the output of a model (numerical or otherwise) can be apportioned to different sources of uncertainty in the model input. A related practice is `uncertainty analysis', which focuses rather on quantifying uncertainty in model output. Ideally, uncertainty and sensitivity analyses should be run in tandem, with uncertainty analysis preceding in current practice.
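
As an illustration of this definition (not part of the original deck), here is a minimal Monte Carlo sketch: the uncertainty analysis step quantifies the spread of the output, while the correlation screen is only a crude, linear stand-in for the sensitivity measures discussed later. The three-factor model and its uniform ranges are hypothetical.

```python
# A minimal sketch of uncertainty analysis (UA) and a crude sensitivity
# screen, assuming a hypothetical model y = f(x1, x2, x3) with uniform
# input uncertainties. Model, ranges and sample size are illustrative.
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    # Hypothetical model: a nonlinear function of three uncertain inputs.
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + x[:, 0] * x[:, 2]

n = 10_000
x = rng.uniform(-1.0, 1.0, size=(n, 3))  # sample the input space
y = model(x)

# Uncertainty analysis: quantify the uncertainty in the model output.
print(f"mean = {y.mean():.3f}, std = {y.std():.3f}")

# Crude sensitivity screen: squared linear correlation of each input with
# the output (linear effects only; a variance-based sketch that also
# captures interactions appears later in the deck).
for i in range(3):
    r = np.corrcoef(x[:, i], y)[0, 1]
    print(f"x{i + 1}: R^2 = {r ** 2:.3f}")
```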

  26. Uncertainty and sensitivity analysis – parametric bootstrap version of UA/SA. [Diagram: input data → (estimation) → estimated parameters → model → inference.] A model maps assumptions onto inferences… In the parametric bootstrap we sample from the posterior parameter probability.

  27. About models. A model can be: • 1. Diagnostic or prognostic – models used to understand a law versus models used to predict the behaviour of a system given a supposedly understood law. Models can thus range from wild speculations used to play what-if games (e.g. models for the existence of extraterrestrial intelligence) to models which can be considered accurate and trusted predictors of a system (e.g. a control system for a chemical plant).

  28. About models. A model can be: • 2. Data-driven or law-driven. • A law-driven model tries to put together accepted laws which have been attributed to the system, in order to predict its behaviour. For example, we use Darcy's and Fick's laws to understand the motion of a solute in water flowing through a porous medium. • A data-driven model tries to treat the solute as a signal and to derive its properties statistically.

  29. More about law-driven versus data-driven Advocates of data-driven models like to point out that these can be built so as to be parsimonious, i.e. to describe reality with a minimum of adjustable parameters (Young, Parkinson 1996). Law-driven models, by contrast, are customarily over-parametrized, as they may include more relevant laws than the amount of available data would support.

  30. More about law-driven versus data-driven For the same reason, law-driven models may have a greater capacity to describe the system under unobserved circumstances, while data-driven models tend to adhere to the behaviour associated with the data used in their estimation. Statistical models (such as hierarchical or multilevel models) are another example of data-driven models.

  31. More categorizations of models are possible, e.g. Bell D., Raiffa H., Tversky A. (eds.) (1988) – Decision Making: Descriptive, Normative and Prescriptive Interactions, Cambridge University Press, Cambridge. • Formal models: axiomatic systems characterized by internal consistency; no need to have relations with the real world. • Descriptive models: these models are factual in the sense that their basic assumptions should be as close as possible to the real world. • Normative models: they propose a series of rules that an agent should follow to reach specific objectives.

  32. Wikipedia’s entry for mathematical model A mathematical model is an abstract model that uses mathematical language to describe the behaviour of a system. Mathematical models are used particularly in the natural sciences and engineering disciplines (such as physics, biology, and electrical engineering) but also in the social sciences (such as economics, sociology and political science); physicists, engineers, computer scientists, and economists use mathematical models most extensively. Eykhoff (1974) defined a mathematical model as 'a representation of the essential aspects of an existing system (or a system to be constructed) which presents knowledge of that system in usable form'.

  33. Sample matrix for parametric bootstrap (ignoring the covariance structure). Each row is a sample trial for one model run. Each column is a sample of size N from the marginal distribution of the parameters as generated by the estimation procedure.

  34. Each row of the corresponding output vector is the error-free result of one model run.
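
A sketch of slides 33–34 in code: the Gaussian marginals, means and standard errors below are hypothetical stand-ins for the output of the estimation step, and the covariance structure is ignored, as on the slide.

```python
# A minimal sketch of the sample matrix for the parametric bootstrap.
# Parameter means and standard errors are illustrative placeholders for
# what the estimation procedure would actually produce.
import numpy as np

rng = np.random.default_rng(0)

# Marginal distributions of k = 3 estimated parameters (hypothetical values).
means = np.array([1.2, 0.5, -0.8])
ses = np.array([0.1, 0.05, 0.2])

N, k = 1000, 3
# Each column j is a sample of size N from the marginal of parameter j;
# each row is one sample trial, i.e. one model run.
M = rng.normal(means, ses, size=(N, k))

def run_model(theta):
    # Hypothetical model: replace with the actual simulator.
    return theta[0] * np.exp(theta[1]) + theta[2]

# Each entry of y is the error-free result of one model run.
y = np.array([run_model(row) for row in M])
print(f"output mean = {y.mean():.3f}, std = {y.std():.3f}")
```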

  35. Bootstrapping-of-the-modelling-process version of UA/SA, after Chatfield, 1995. [Diagram: loop on bootstrap replicas of the input data → (model identification) → (estimation) of parameters → model → inference.]
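
A minimal sketch of this scheme, under stated assumptions: a toy data set, a straight-line fit standing in for the whole identification-plus-estimation step, and a prediction at a hypothetical point as the 'inference'.

```python
# A minimal sketch of bootstrapping the modelling process (after
# Chatfield, 1995): resample the data, redo estimation on each replica,
# and collect the resulting inferences. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data set; in practice this is the observed record.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.5 + rng.normal(0.0, 0.1, size=x.size)

B = 500
inferences = []
for _ in range(B):
    # Bootstrap replica of the input data.
    idx = rng.integers(0, x.size, size=x.size)
    # (Re-)identification and estimation on the replica; the linear fit
    # stands in for the whole model-building step.
    slope, intercept = np.polyfit(x[idx], y[idx], 1)
    # Inference: prediction at a hypothetical point x = 2.
    inferences.append(slope * 2.0 + intercept)

# The spread of the inferences reflects uncertainty in the modelling
# process itself, not just in the parameters of a single fitted model.
print(np.percentile(inferences, [2.5, 50.0, 97.5]))
```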

  36. Bayesian model averaging. [Diagram: data + prior of model(s) + prior of parameters → posterior of model(s) + posterior of parameters → (sampling) → inference.]
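
A minimal sketch of the scheme in the diagram, under loudly-stated assumptions: equal model priors, polynomial candidate models, and BIC-based approximate posterior model weights (one common shortcut for the posterior of the models, not the only way to compute it).

```python
# A minimal sketch of Bayesian model averaging with BIC-approximated
# posterior model probabilities. Data, candidate models and the point of
# inference are all hypothetical.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data set; in practice this is the observed record.
x = np.linspace(0.0, 1.0, 40)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.2, size=x.size)

def bic_and_inference(degree):
    # One candidate model: a polynomial of the given degree.
    coef = np.polyfit(x, y, degree)
    resid = y - np.polyval(coef, x)
    n, k = x.size, degree + 1
    bic = n * np.log(resid @ resid / n) + k * np.log(n)  # Gaussian BIC, up to a constant
    return bic, np.polyval(coef, 2.0)  # inference: prediction at a hypothetical x = 2

bics, preds = zip(*(bic_and_inference(d) for d in (1, 2, 3)))
bics, preds = np.array(bics), np.array(preds)

# Approximate posterior model probabilities (equal model priors assumed).
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()
print("model weights:", np.round(w, 3))
print("BMA inference:", w @ preds)
```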

  37. The analysis can thus be set up in various ways – what are the implications for model quality? What constitutes an input for the analysis depends upon how the analysis is set up, and the analysis will only instruct the modeller about those factors which have been included. A consequence of this is that the modeller will remain ignorant of the importance of those variables which have been kept fixed.

  38. The spectre of type III errors: “Assessing the wrong problem by incorrectly accepting the false meta-hypothesis that there is no difference between the boundaries of a problem, as defined by the analyst, and the actual boundaries of the problem” (Dunn, 1997). = answering the wrong question = framing issue (Peter Kennedy’s II commandment of applied econometrics: ‘Thou shall answer the right question’, Kennedy 2007) Dunn, W. N.: 1997, Cognitive Impairment and Social Problem Solving: Some Tests for Type III Errors in Policy Analysis, Graduate School of Public and International Affairs, University of Pittsburgh, Pittsburgh.

  39. The spectre of type III errors: Donald Rumsfeld version: "Reports that say that something hasn't happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns -- the ones we don't know we don't know."

  40. In sensitivity analysis: Type I error: assessing as important a factor that is unimportant. Type II: assessing as unimportant a factor that is important. Type III: analysing the wrong problem.

  41. Type III in sensitivity analysis – examples: • In the case of TSPA (Yucca Mountain), a range of 0.02 to 1 millimetre per year was used for the percolation flux rate. Applying sensitivity analysis to TSPA might or might not identify this as a crucial factor, but this would be of little use if the value of the percolation flux were later found to be of the order of 3,000 millimetres per year. • Another example: the result of a model is found to depend on the mesh size employed.

  42. Our suggestions on useful requirements for a sensitivity analysis. Requirement 1: Focus. About the output Y of interest: the target of interest should not be the model output per se, but the question that the model has been called to answer.

  43. Requirement 1 – Focus. Another implication: models must change as the question put to them changes. The optimality of a model must be weighed with respect to the task. According to Beck et al. 1997, a model is relevant when its input factors cause variation in the ‘answer’.

  44. Requirements Requirement 2. Multidimensional averaging. In a sensitivity analysis all known sources of uncertainty should be explored simultaneously, to ensure that the space of the input uncertainties is thoroughly explored and that possible interactions are captured by the analysis.
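
Requirement 2 can be made concrete with a variance-based sketch (not from the slides): the pick-freeze scheme below explores all inputs simultaneously and detects an interaction that a one-at-a-time analysis would miss. The test function, sample size and estimator variants (Saltelli- and Jansen-type) are illustrative assumptions.

```python
# A minimal sketch of variance-based first-order and total-effect
# sensitivity indices via a pick-freeze scheme. The model is hypothetical
# and contains a pure interaction between x1 and x3.
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    # Hypothetical model: x3 matters only through its interaction with x1.
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + x[:, 0] * x[:, 2]

N, k = 100_000, 3
A = rng.uniform(-1.0, 1.0, size=(N, k))   # two independent sample matrices
B = rng.uniform(-1.0, 1.0, size=(N, k))
fA, fB = f(A), f(B)
var_y = np.var(np.concatenate([fA, fB]))  # total output variance

for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]  # all factors 'frozen' except factor i
    fABi = f(ABi)
    Si = np.mean(fB * (fABi - fA)) / var_y          # first-order index
    STi = 0.5 * np.mean((fA - fABi) ** 2) / var_y   # total-effect index
    # A gap between STi and Si flags interactions involving factor i.
    print(f"x{i + 1}: S = {Si:.3f}, ST = {STi:.3f}")
```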

  45. Requirements. Requirement 3: Important how? Define unambiguously what you mean by ‘importance’ in relation to input factors / assumptions.

  46. Requirements Requirement 4. Pareto. Be quantitative. Quantify relative importance by exploiting factors’ unequal influence on the output. … Requirement N. Look at uncertainties before going public with findings.

  47. All models have use for sensitivity analysis … Atmospheric chemistry, transport emission modelling, fish population dynamics, composite indicators, risk of portfolios, oil basins models, macroeconomic modelling, radioactive waste management ...

  48. Prescriptions have been issued for the sensitivity analysis of models when these are used for policy analysis. In Europe, the European Commission recommends sensitivity analysis in the context of the extended impact assessment guidelines and handbook (European Commission SEC(2005) 791, IMPACT ASSESSMENT GUIDELINES, 15 June 2005) http://ec.europa.eu/governance/docs/index_en.htm

  49. European Commission IMPACT ASSESSMENT GUIDELINES (2005), 13.5 Sensitivity analysis: Sensitivity analysis explores how the outcomes or impacts of a course of action would change in response to variations in key parameters and their interactions. Useful techniques are presented in a book published by the JRC(*) […] Advantages: it is often the best way to handle the analysis of uncertainties.

  50. Sources: a multi-author book published in 2000; methodology and applications by several practitioners. Chapters 1 (Introduction) and 2 (Hitchhiker's guide to sensitivity analysis) offer a useful introduction to the topic.
