
A Third Paradigm for Decision Analysis



  1. A Third Paradigm for Decision Analysis Love Ekenberg Dept. of Computer and Systems Sciences Stockholm University Royal Institute of Technology

  2. Our Research • Analysis and implementation of decision-theoretic aspects for artificial as well as for human decision makers. • Algorithms and methods for how decision-theoretic and stochastic features can be integrated into specifications. • Methods for handling estimates in centralised and decentralised architectures. • Simulation and evaluation models for catastrophic events.

  3. Computational Aspects • Conceptualisation of computationally meaningful decision rules for decision support. • Monte Carlo simulations, statistical modelling, and integration of methods for data extraction. • Integration techniques for analyses of properties of distributions. • Development of fast algorithms for non-linear optimization.

  4. Problems with Decision Analysis • PMEU (the principle of maximising expected utility) is not a God-given selection rule. • Value assignments are not easy even if they are mathematically nice to handle. • Our background information is often incomplete, i.e., precise probabilities are often impossible to assert. • The epistemological properties of weights, probabilities and values are unclear.

  5. Nevertheless… • An enormous problem is that precise value assignments and probability estimates are usually made anyway. • This is ABSURD and a cause of many bad decisions. • The results look nice in presentations, but they are meaningless. • Let us take decisions seriously and see what can be done.

  6. Alternative Approaches for Handling Uncertainty • Possibility theory (Dubois and Prade, Cooman) • Capacities of order 2 (Choquet, Huber and Strassen, Denneberg) • Evidence theory and belief functions (Dempster, Shafer, Yager, Smets) • Various kinds of logic (Nilsson, Wilson)

  7. Alternative Approaches for Handling Uncertainty • Upper and lower probabilities (Smith, Hodges and Lehmann, Hurwicz, Wald, Kyburg, Weichselberger and Pöhlman, Malmnäs, Danielson and Ekenberg) • Sets of probability measures (Good, Levi) • Second-order probability theory (Gärdenfors and Sahlin, Ekenberg and Thorbiörnson)

  8. Alternative Approaches for Handling Uncertainty • Fuzzy measures (capacities of order 1) (Bellman and Zadeh, Barrett, Chen and Hwang, Lai, Carlsson and Fuller, Triantaphyllou and Lin, Ribeiro, Cutello and Montero)

  9. Can We Relax? • Not yet! • Very few of these approaches address the computational problems. • This makes them useless for practical decision making.

  10. A Basic Approach to Real Life Decision Making Many decision problems can be seen as analysing multilinear expressions under linear constraints.

  11. A Computational Approach to Imprecise Decision Making • The probability of consequence c is greater than 25% • The value of consequence c is between 100 and 300 • Consequence c is preferred to consequence d

  12. Representation p11 ∈ [0, 0.6], p12 ∈ [0.3, 0.5], p13 ∈ [0.1, 0.5], and p11 + p12 + p13 = 1
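This representation can be checked mechanically. Below is a minimal sketch (my own illustration in Python with scipy's linprog, not the authors' tool or algorithms): each interval becomes a variable bound, the normalisation becomes a linear equality, and a pair of linear programs yields the range each probability can actually take once all constraints interact.

```python
# A sketch of the representation above, assuming scipy is available
# (my choice of solver, not the authors' own algorithms).
from scipy.optimize import linprog

bounds = [(0.0, 0.6), (0.3, 0.5), (0.1, 0.5)]   # p11, p12, p13
A_eq, b_eq = [[1.0, 1.0, 1.0]], [1.0]           # p11 + p12 + p13 = 1

for i, name in enumerate(["p11", "p12", "p13"]):
    c = [0.0, 0.0, 0.0]
    c[i] = 1.0                                   # minimise p_i
    lo = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
    c[i] = -1.0                                  # maximise p_i
    hi = -linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
    print(f"{name} feasible in [{lo:.2f}, {hi:.2f}]")
```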

  13. Evaluation Given a decision problem, analyse Σk pik·vik − Σk pjk·vjk over all consequences in the two alternatives. This is just bilinear for demonstration purposes, but already here the computational effort is severe using standard optimization techniques.
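To make the difficulty concrete, here is a naive Monte Carlo sketch of this bilinear evaluation (my illustration; all intervals are invented placeholders, and sampling only explores the interior of the feasible region, so the reported range underestimates the true extrema — which is exactly why fast dedicated algorithms matter):

```python
# Naive Monte Carlo sketch of the bilinear evaluation. All intervals are
# invented placeholders; sampling only explores the interior of the
# feasible region, so the reported range underestimates the true extrema.
import numpy as np

rng = np.random.default_rng(0)

def sample_probs(box, n):
    """Sample n probability vectors from the box that also lie on the
    simplex: draw the last two coordinates and solve for the first."""
    out = []
    while len(out) < n:
        p2, p3 = rng.uniform(*box[1]), rng.uniform(*box[2])
        p1 = 1.0 - p2 - p3
        if box[0][0] <= p1 <= box[0][1]:
            out.append((p1, p2, p3))
    return np.array(out)

p_box_i = [(0.0, 0.6), (0.3, 0.5), (0.1, 0.5)]   # alternative i
p_box_j = [(0.2, 0.5), (0.2, 0.4), (0.2, 0.4)]   # alternative j
v_box_i = [(100, 300), (0, 50), (40, 80)]        # values, alternative i
v_box_j = [(150, 250), (20, 60), (10, 90)]       # values, alternative j

n = 20000
pi, pj = sample_probs(p_box_i, n), sample_probs(p_box_j, n)
vi = np.column_stack([rng.uniform(lo, hi, n) for lo, hi in v_box_i])
vj = np.column_stack([rng.uniform(lo, hi, n) for lo, hi in v_box_j])
diff = (pi * vi).sum(axis=1) - (pj * vj).sum(axis=1)
print("sampled range of the difference:", diff.min(), diff.max())
```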

  14. A Model

  15. Evaluation

  16. Evaluation

  17. Evaluation Thus, in many cases, second-order effects must be taken into consideration, i.e., what are the strengths of belief in the various values.

  18. 2nd Order Information • By a belief distribution over B, we mean a positive distribution g defined on the unit cube B such that ∫B g(x) dx = 1.

  19. Depth in Decision Trees Let f1(x1), …, fm(xm) be uniform belief distributions over the intervals [0, 1]. The distribution hm(zm) of the product zm = x1···xm of these m factors is hm(zm) = (−ln zm)^(m−1)/(m−1)!
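This density can be verified numerically. A quick sketch (my own check, with m = 4 as an arbitrary choice): sample products of m independent uniforms and compare the empirical density in a few bins against (−ln z)^(m−1)/(m−1)!.

```python
# Numerical check of h_m(z) = (-ln z)^(m-1) / (m-1)! for a product of
# m independent uniform [0, 1] variables; m = 4 is an arbitrary choice.
import numpy as np
from math import factorial

rng = np.random.default_rng(1)
m = 4
z = rng.uniform(size=(200_000, m)).prod(axis=1)

edges = np.linspace(0.01, 0.99, 8)
for lo, hi in zip(edges[:-1], edges[1:]):
    empirical = ((z >= lo) & (z < hi)).mean() / (hi - lo)   # bin density
    mid = 0.5 * (lo + hi)
    theory = (-np.log(mid)) ** (m - 1) / factorial(m - 1)
    print(f"z ≈ {mid:.2f}: empirical {empirical:.3f} vs h_m {theory:.3f}")
```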

  20. Depth in Decision Trees

  21. 2nd Order Information

  22. 2nd Order Information • The range for A1 is 0 to 0.30 • The range for A2 is 0.02 to 0.28 • Traditional interval analysis cannot discriminate between them. • However, the main mass of A1 is between, say, 0.06 and 0.13 and the main mass of A2 is roughly between 0.16 and 0.24.
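The following sketch illustrates the same phenomenon with invented factor intervals (chosen only to make the two ranges overlap as on the slide; the numbers will not reproduce the slide's exactly): interval analysis sees nearly the same range for A1 and A2, while the bulk of the second-order mass separates them.

```python
# Two quantities with almost the same range but separated central mass.
# The factor intervals are invented for illustration; they will not
# reproduce the slide's exact numbers.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
a1 = rng.uniform(0.0, 0.6, n) * rng.uniform(0.0, 0.5, n)   # range [0, 0.30]
a2 = rng.uniform(0.1, 0.7, n) * rng.uniform(0.2, 0.4, n)   # range [0.02, 0.28]

for name, a in [("A1", a1), ("A2", a2)]:
    q5, q50, q95 = np.percentile(a, [5, 50, 95])
    print(f"{name}: range [{a.min():.2f}, {a.max():.2f}], "
          f"90% of mass in [{q5:.2f}, {q95:.2f}], median {q50:.2f}")
```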

  23. Evaluation

  24. Result • Thus, in this case, there was no need for precise data. • This is often the case in real-life decisions. • As usual, one just has to think a bit more.

  25. Some Real Life Cases • Storing spent nuclear waste in Sweden (SKN, Statens Kärnbränslenämnd) • Large purchasing decisions at the Swedish Rail Administration (Banverket, around 1 billion Euro) • Choice of effective orthopedic forms of treatment (The Swedish National Board of Health and Welfare)

  26. Some Real Life Cases • Model for risk management regarding evaluation of cost and accessibility in cases of unplanned traffic disturbances (Swedish Telecom) • Choice of supporting system (Insurance company) • Public-private flood insurance system for Hungary (IIASA and the Hungarian Academy of Sciences) • Land planning (Nacka Municipality)

  27. Conclusions • Human intuition is not an adequate tool for decision making, and decision support is necessary. • Traditional decision support is insufficient. • The time consumption in real-life analyses is often very high.

  28. Conclusions • We have completely solved this with our algorithms. • We can handle all explicit alternatives, criteria, consequences, weights, values and probabilities within the same framework. • The algorithms are implemented in a computer tool of commercial quality. • The experiences of using it are very good.

  29. Example: Vascular Necrosis • A 68-year-old patient must decide between amputation below the left knee and treatment with medicines only. • The risk of death when performing an amputation below the left knee is less than 1%. • When using medicines only, the probability of later being forced to perform an amputation above the knee is 20–30%. In such an operation, the death risk is 5–10%.

  30. Consequences Alternative 1 • C11: Amputation below the left knee, fine otherwise • C12: Dead Alternative 2 • C21: Completely fine • N22: Amputation above the left knee • C221: Dead • C222: Fine with above knee

  31. Probabilities • C11: (Amputation below knee) >99% • C12: (Dead) <1% • C21: (Fine) 70-80% • N22: (Amputation above knee) 20-30% • C221: (Dead) 5-10% • C222: (Fine) 90-95%

  32. The Problem of Valuation The probabilities are in a sense OK, since we have statistically valid information. But how could we possibly quantify the values? Of course, we can assert values, but will they really be adequate?

  33. Finding a Realistic Model The consequences are practically impossible to value exactly, but they can reasonably be ranked. • Fine is better than amputation below the knee. • Amputation below the knee is better than amputation above the knee. • Amputation above the knee is better than death. • The transitive properties follow.

  34. Finding a Realistic Model Furthermore, we can assert that u(Fine) = 1 and that u(Dead) = 0. This is just a matter of convention; the scale could be arbitrary. Compare to a temperature scale like Celsius.
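Putting slides 31, 33 and 34 together, the whole example can be evaluated without any precise values. The sketch below (my own simplification: uniform sampling over the feasible region, which is not necessarily the authors' evaluation method) draws probabilities from the stated intervals and utilities respecting the stated ranking, and checks how often each alternative comes out ahead.

```python
# Sketch: evaluate the two alternatives using only the slide's interval
# probabilities and the ranking u(Fine)=1 > u(below knee) > u(above knee)
# > u(Dead)=0. Uniform sampling over the feasible region is my
# simplification, not necessarily the authors' evaluation method.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Ranked utilities: two draws in (0, 1), sorted so that below > above.
u = np.sort(rng.uniform(0.0, 1.0, (n, 2)), axis=1)
u_above, u_below = u[:, 0], u[:, 1]

p11   = rng.uniform(0.99, 1.00, n)   # Alt 1: survive below-knee amputation
p21   = rng.uniform(0.70, 0.80, n)   # Alt 2: completely fine
p_die = rng.uniform(0.05, 0.10, n)   # death risk in the later operation

eu1 = p11 * u_below                                   # Dead contributes 0
eu2 = p21 * 1.0 + (1.0 - p21) * (1.0 - p_die) * u_above
diff = eu1 - eu2
print(f"Alt 1 ahead in {100 * (diff > 0).mean():.1f}% of sampled points")
print(f"difference ranges over [{diff.min():.2f}, {diff.max():.2f}]")
```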
