

Presentation Transcript


  1. An Info-gap Approach to Modelling Risk and Uncertainty in Bio-surveillance having Imperfect Detection rates Prof. David R. Fox

  2. Acknowledgements: • Prof. Yakov Ben-Haim (Technion, Israel) • Prof. Colin Thompson (University of Melbourne)

  3. Risk versus Uncertainty Risk • risk = hazard x exposure or • risk = likelihood x consequence • Duckworth (1998): • is a qualitative term • cannot be measured • is not synonymous with probability • “to ‘take a risk’ is to allow or cause exposure to the danger” • is the chance, within a specified time frame, of an adverse event with specific (negative) consequences

  4. Risk versus Uncertainty • The AS4360:1999 Risk Matrix (axes: Consequence × Likelihood; matrix figure not reproduced in transcript)

  5. Risk • Development and adoption of a ‘standard’ risk metric seems a long way off (never?); • Bayesian methods are becoming increasingly popular, although acceptance may be hampered by biases and lack of understanding; • More attention needs to be given to appropriate statistical modelling. In particular: • model choice • parameter estimation • distributional assumptions • ‘outlier’ detection and treatment • robust alternatives (GLMs, GAMs, smoothers, etc.).

  6. Uncertainty • Severe uncertainty → almost no knowledge about likelihood • Arises from: • Ignorance • Incomplete understanding • Changing conditions • Surprises • Is ignorance probabilistic? • Ignorance is not probabilistic – it is an info-gap

  7. Shackle-Popper Indeterminism • Intelligence • What people know influences how they behave • Discovery • What will be discovered tomorrow cannot be known today • Indeterminism • Tomorrow’s behaviour cannot be modelled completely today

  8. Knightian Uncertainty • Frank Knight • Nov 7 1885 – Apr 15 1972 • Economist • Author (Risk, Uncertainty and Profit) • Knightian Uncertainty • Differentiates between risk and uncertainty • Risk → outcomes unknown but their probability distribution known • Uncertainty → different from situations where the pdf of a random outcome is known; even the distribution is unknown

  9. Dealing with Uncertainties • Strategies • Worst-case • Max-Min (utility) • Min-Max (loss) • Maximize expected utility • Pareto optimization • “Expert” opinion • Bayesian approaches • Info-Gap
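The max-min and min-max strategies listed above can be written compactly. A minimal sketch, where the decision q, uncertain state u, utility U, loss L and uncertainty set 𝒰 are notation assumed for this sketch rather than taken from the slides:

```latex
% Max-min (utility): pick the decision whose worst-case utility is largest
q^{*} = \arg\max_{q} \; \min_{u \in \mathcal{U}} U(q, u)

% Min-max (loss): pick the decision whose worst-case loss is smallest
q^{*} = \arg\min_{q} \; \max_{u \in \mathcal{U}} L(q, u)
```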

  10. Info-Gap Theory (Ben-Haim 2006) • Is a quantitative, non-probabilistic approach to modelling true Knightian uncertainty; • Seeks to optimize robustness / immunity to failure or opportunity of windfall; • Contrasts with classical decision theory which typically seeks to maximize expected utility; • An info-gap is the difference between what is known and what needs to be known in order to make a reliable and responsible decision.

  11. Components of an Info-Gap Model • Uncertainty Model • Consists of nominal values of the unknowns and a horizon of uncertainty • Performance requirement • Inequalities expressed in terms of the unknowns • Robustness Criterion • Is the largest horizon of uncertainty for which the performance requirements in (2) are met for all realisations of the unknowns in the uncertainty model (1) • ‘Unknowns’ can be probabilities of adverse outcome (the three components are sketched formally below)
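A minimal formal sketch of the three components, broadly following Ben-Haim's (2006) notation; the specific symbols (decision q, uncertain quantity u with estimate ũ, performance measure R, critical level R_c, horizon α) are assumptions of this sketch, not taken from the slides:

```latex
% 1. Uncertainty model: an unbounded family of nested sets around the
%    nominal estimate \tilde{u}, indexed by the horizon of uncertainty \alpha
\mathcal{U}(\alpha, \tilde{u}), \qquad \alpha \ge 0

% 2. Performance requirement: the performance measure of decision q must
%    satisfy an inequality such as
R(q, u) \le R_c

% 3. Robustness: the largest horizon of uncertainty at which the requirement
%    holds for every realisation allowed by the uncertainty model
\hat{\alpha}(q, R_c) = \max \Bigl\{ \alpha : \max_{u \in \mathcal{U}(\alpha, \tilde{u})} R(q, u) \le R_c \Bigr\}
```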

  12. Robustness and Opportuneness • Pernicious uncertainty vs. propitious uncertainty

  13. Robustness and Opportuneness • Robustness (immunity to failure) • is the greatest horizon of uncertainty at which failure cannot occur • Opportuneness (immunity to windfall gain) • is the least horizon of uncertainty at which sweeping success becomes possible (sketched below) Note: robustness/opportuneness requires optimisation, but not of the performance criterion.
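The robustness function α̂ is the one sketched after slide 11. Its opportuneness counterpart, in the same assumed notation and with R_w denoting a windfall (sweeping-success) level better than R_c, would be:

```latex
% Opportuneness: the least horizon of uncertainty at which some allowed
% realisation already achieves the windfall level R_w; a small \hat{\beta}
% means windfall is possible even under little uncertainty (desirable)
\hat{\beta}(q, R_w) = \min \Bigl\{ \alpha : \min_{u \in \mathcal{U}(\alpha, \tilde{u})} R(q, u) \le R_w \Bigr\}
```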

  14. Robust satisficing vs direct optimization • Alternatives to optimization: • Pareto improvement – an alternative ‘solution’ which leaves one individual better off without making anyone else worse off • Pareto optimal – when no further Pareto improvements can be made • Principle of good enough – where a quick and simple solution is preferred to an elaborate one • Satisficing (Herbert Simon, 1955) – achieving some minimum level of performance without necessarily optimizing it.

  15. Robust satisficing

  16. Robust satisficing

  17. Fractional Error Models • Best estimate of the uncertain function U(x) is Ũ(x) • Although the fractional error of this estimate is unknown • The unbounded family of nested sets of functions is a fractional-error info-gap model (sketched below):
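The fractional-error model named on this slide has a standard form in Ben-Haim (2006); a sketch in that form, with the notation assumed here rather than copied from the slide:

```latex
% At horizon of uncertainty \alpha, the unknown function U(x) may deviate
% from the estimate \tilde{U}(x) by a fraction of at most \alpha
\mathcal{U}\bigl(\alpha, \tilde{U}\bigr) =
  \Bigl\{ U(x) : \bigl| U(x) - \tilde{U}(x) \bigr| \le \alpha \, \bigl| \tilde{U}(x) \bigr| \Bigr\},
  \qquad \alpha \ge 0
```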

  18. IG Models: Basic Axioms All IG models obey 2 basic axioms: • Nesting • Contraction, i.e. when the horizon of uncertainty is zero, the estimate is correct (both are sketched below)
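The two axioms, written for the fractional-error model sketched above (same assumed notation):

```latex
% Nesting: enlarging the horizon of uncertainty can only enlarge the set
\alpha < \alpha' \;\Longrightarrow\; \mathcal{U}(\alpha, \tilde{U}) \subseteq \mathcal{U}(\alpha', \tilde{U})

% Contraction: at zero horizon of uncertainty only the estimate itself remains
\mathcal{U}(0, \tilde{U}) = \{\, \tilde{U} \,\}
```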

  19. An IG application to bio-surveillance • Thompson (unpublished) examined the general sampling problem associated with inspecting a random sample of n items (containers, flights, people, etc.) from a finite population of N using an info-gap approach. • The info-gap formulation of the problem permitted the identification of a sample size n such that the probability of an adverse outcome did not exceed a nominal threshold, even when severe uncertainty about this probability existed. • Implicit in this formulation was the assumption that the detection probability (i.e. the probability of detecting a weapon, adverse event, anomalous behaviour, etc., once the relevant item / event / behaviour had been observed or inspected) was unity. An illustrative reconstruction of this perfect-detection case is sketched below.
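Thompson's formulation is unpublished, so the following is only an illustrative reconstruction of the perfect-detection case under assumed notation: N items in the population, a simple random sample of n inspected, each item adverse with (severely uncertain) probability p, and detection certain once an adverse item is inspected:

```latex
% A given adverse item escapes only by not being inspected
P(\text{escape} \mid \text{adverse item}) = 1 - \frac{n}{N}

% Probability of at least one undetected adverse item among the N
% (treating items as approximately independent)
P_{\text{miss}}(n, p) \approx 1 - \Bigl( 1 - p\bigl(1 - \tfrac{n}{N}\bigr) \Bigr)^{N}

% Sample-size requirement: keep the miss probability below a nominal threshold
P_{\text{miss}}(n, p) \le P_c
```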

  20. Surveillance with Imperfect Detection

  21. Surveillance with Imperfect Detection • Arguably, the more important probability is … and not … • Define: [symbols and definitions shown on the slide are not reproduced in this transcript]

  22. Surveillance with Imperfect Detection • Can show (see paper) that: [formula not reproduced] • For 100% inspections: [formula not reproduced] • Furthermore: [formula not reproduced] • (an illustrative sketch follows below)
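The slide's own expressions are not reproduced in the transcript. As an illustrative extension of the sketch after slide 19, suppose each inspected item is detected (if adverse) with probability δ < 1, a symbol assumed for this sketch:

```latex
% An adverse item now escapes either by not being inspected,
% or by being inspected and missed
P(\text{escape} \mid \text{adverse item})
  = \Bigl(1 - \frac{n}{N}\Bigr) + \frac{n}{N}\,(1 - \delta)
  = 1 - \frac{n\,\delta}{N}

% With 100% inspection (n = N) the escape probability reduces to 1 - \delta:
% imperfect detection leaves residual risk even when everything is inspected
P(\text{escape} \mid \text{adverse item},\; n = N) = 1 - \delta
```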

  23. Surveillance with Imperfect Detection • Performance criterion: [inequality not reproduced] • i.e. [equivalent form not reproduced]

  24. Surveillance with Imperfect Detection • Fractional error model: [not reproduced] • Robustness function: [not reproduced] • (illustrative forms are sketched below)
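Again only as an illustrative sketch, not the paper's own expressions: treating the adverse-event probability p as the severely uncertain quantity with nominal estimate p̃, a fractional-error model and the corresponding robustness function for a sampling design n would take the form:

```latex
% Fractional-error info-gap model for the uncertain probability p
\mathcal{U}(\alpha, \tilde{p}) =
  \bigl\{ p : |p - \tilde{p}| \le \alpha\,\tilde{p},\; 0 \le p \le 1 \bigr\}, \qquad \alpha \ge 0

% Robustness of the design n: the largest horizon of uncertainty at which
% the performance criterion still holds for every p in the uncertainty set
\hat{\alpha}(n, P_c) = \max \Bigl\{ \alpha : \max_{p \in \mathcal{U}(\alpha, \tilde{p})} P_{\text{miss}}(n, p) \le P_c \Bigr\}
```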

  25. Surveillance with Imperfect Detection Example • Dept. of Homeland Security intelligence → attack on aircraft imminent • Nature / mode of attack unknown • All estimates (detection prob., prob. of attack etc.) subject to extreme uncertainty.
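A short numerical sketch of how the robustness curve for the illustrative model above could be evaluated. The model and all numbers (N, n, δ, p̃, P_c) are hypothetical assumptions of this sketch, not figures from the presentation or the paper:

```python
import numpy as np

def p_miss(n, N, p, delta):
    """Illustrative probability of at least one undetected adverse item:
    each of N items is adverse with probability p; n are inspected, and an
    inspected adverse item is detected with probability delta."""
    escape = p * (1.0 - n * delta / N)      # per-item undetected-adverse probability
    return 1.0 - (1.0 - escape) ** N        # at least one escapes (independence approx.)

def robustness(n, N, delta, p_tilde, p_crit, alpha_max=5.0):
    """Largest fractional horizon alpha such that p_miss stays below p_crit for
    every p in {p : |p - p_tilde| <= alpha * p_tilde}, searched on a bounded grid."""
    alphas = np.linspace(0.0, alpha_max, 2001)
    # p_miss is increasing in p, so the worst case is the upper endpoint of the set
    worst = np.array([p_miss(n, N, min(1.0, p_tilde * (1.0 + a)), delta) for a in alphas])
    ok = worst <= p_crit
    return alphas[ok][-1] if ok.any() else 0.0

if __name__ == "__main__":
    # Hypothetical numbers only: 500 passengers, detection probability 0.9,
    # nominal attack probability 1e-4, tolerable miss probability 1%
    for n in (100, 250, 500):
        print(n, robustness(n, N=500, delta=0.9, p_tilde=1e-4, p_crit=0.01))
```

Plotting α̂ against n (or against P_c) in this way gives the kind of robustness trade-off curve that info-gap analyses are usually read from.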

  26. Surveillance with Imperfect Detection

  27. Surveillance with Imperfect Detection Comparison with a Bayesian Approach

  28. Surveillance with Imperfect Detection
