Judgment and Decision Making in Information Systems

Judgment and Decision Making in Information SystemsDecision Making, Sensitivity Analysis, and the Value of Information

Yuval Shahar, M.D., Ph.D.



Personal Decision Making:The Party Problem

  • Joseph K. invites his friends to a party, but needs to decide on the location:

    • Outdoors (O) on the grass (completely open)

    • On the Porch (P) (covered above, open on sides)

    • Inside (I) the living room

  • If the weather is sunny (S), outdoors is best, followed by Porch and Indoors; if it rains (R), the living room is best, followed by Porch and Outdoors



The Party Problem:An Ordered Preferences List

O-S (best)

P-S

I-R

I-S

P-R

O-R (worst)


Evaluating The Prospects

[Figure: each prospect is calibrated as an equivalent lottery between the best prospect B (O-S) and the worst W (O-R).]

Prospect       p(B)    p(W)
O-S (best)     1       0
P-S            0.95    0.05
I-R            0.67    0.33
I-S            0.57    0.43
P-R            0.32    0.68
O-R (worst)    0       1


The Party Problem: Adding Probabilities

[Figure: decision tree with a decision node over O, P, and I; each alternative leads to a chance node with branches S (p = 0.4) and R (p = 0.6).]


The Party Problem: Substituting Prospects

[Figure: each outcome in the tree is replaced by its equivalent B/W lottery.]

Alternative    S (0.4)               R (0.6)
O              B 1,    W 0           B 0,    W 1
P              B 0.95, W 0.05        B 0.32, W 0.68
I              B 0.57, W 0.43        B 0.67, W 0.33


The Party Problem: Simplifying The Relationship to Preference Probabilities

[Figure: each alternative collapses to a single B/W lottery: O gives B with probability 0.40, P with probability 0.57, and I with probability 0.63.]


The Party Problem: Using the Choice Rule

[Figure: comparing the collapsed lotteries: p(B) is 0.40 for O, 0.57 for P, and 0.63 for I. Since 0.40 <= 0.57 <= 0.63, the choice rule selects Indoors.]


Expected Preference-Probability Values

[Figure: decision tree annotated with preference probabilities.]

O: E-value = 0.4 × 1    + 0.6 × 0    = 0.40
P: E-value = 0.4 × 0.95 + 0.6 × 0.32 = 0.57
I: E-value = 0.4 × 0.57 + 0.6 × 0.67 = 0.63



Computation With Decision Trees: Making the Decision

  • Decision trees are “folded back” to the topmost (leftmost, i.e., initial) decision

  • Computation proceeds recursively over the tree branches from right to left (bottom up): expected utility (in this case, preference probability) is averaged at every chance node, utility is maximized at every decision node, and that maximum becomes the expected utility of the subtree that follows the decision
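The fold-back procedure above can be sketched in Python. This is a minimal illustration, not the lecture's own code; the node encoding and the `fold_back` helper are hypothetical, and the numbers are the preference probabilities from the party-problem slides.

```python
# Fold back a decision tree: chance nodes average expected utility over
# their branches; decision nodes take the maximum over their alternatives.

def fold_back(node):
    """Return (expected utility, chosen alternative or None) for a node."""
    kind = node[0]
    if kind == "leaf":                     # ("leaf", utility)
        return node[1], None
    if kind == "chance":                   # ("chance", [(prob, subtree), ...])
        ev = sum(p * fold_back(sub)[0] for p, sub in node[1])
        return ev, None
    # ("decision", {name: subtree, ...}): maximize over alternatives
    best = max(node[1], key=lambda name: fold_back(node[1][name])[0])
    return fold_back(node[1][best])[0], best

# The party problem, with the preference probabilities from the slides
party = ("decision", {
    "Outdoors": ("chance", [(0.4, ("leaf", 1.0)),  (0.6, ("leaf", 0.0))]),
    "Porch":    ("chance", [(0.4, ("leaf", 0.95)), (0.6, ("leaf", 0.32))]),
    "Indoors":  ("chance", [(0.4, ("leaf", 0.57)), (0.6, ("leaf", 0.67))]),
})

value, choice = fold_back(party)
print(choice, round(value, 2))   # Indoors 0.63
```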



The Value of Information (VI)

  • We often need to decide what the next best piece of information to gather would be (e.g., within a diagnostic process); that is, what is the best next question to ask (e.g., what would the result of a urine culture be?)

  • Alternatively, we might need to decide how much, if anything, a particular additional piece of information is worth

  • The Value of Information (VI) of feature f is the marginal expected utility of an optimal decision made knowing f, compared to making it without knowing f

  • The net value of information (NVI) of f = VI(f)-cost(f)

  • NVI is highly useful when deciding what would be the next information item, if any, to investigate



Computing the Value of Information: Requirements

  • Decision makers are often faced with the option of getting additional information for a certain cost

  • To assess the value of additional information to the decision maker, we need a continuous, real-valued utility measure, such as money

  • Thus, we need to map preference probabilities to a monetary scale or an equivalent one (such as time)



Introducing Monetary Values: Bringing in The Wizard

  • We ask the decision maker for her willingness to pay for a change from any state to any other state, assuming we have a wizard who can perform the change instantaneously

  • We can thus create a 1:1 correspondence between preference probabilities and $ values



The Party Problem:Adding Monetary Values

Prospect       Preference Probability    $ Value
O-S (best)     1                         100
P-S            0.95                       90
I-R            0.67                       50
I-S            0.57                       40
P-R            0.32                       20
O-R (worst)    0                           0



Joseph K.’s Utility Curve

[Figure: Joseph K.’s concave utility curve mapping money ($0 to $100K) onto utility (0 to 1), with $34 marked on the money axis.]

Note: Once we know JK’s utility curve, we can compute his certain equivalent for ANY deal, e.g., a <$100K, 50%; $0, 50%> deal, which happens to be $34, using graphical or other methods; or the $ certain equivalent of any outcome (e.g., the Outdoor location)



The Value of Clairvoyance

  • To compute the value of information, we assume a clairvoyant who knows all outcomes with certainty and always tells the truth

  • However, clairvoyance has a price!

  • Thus, the question is: how much should we pay the clairvoyant for her forecast (e.g., for telling Joseph K. whether or not it will rain)?



Computing the Value of Clairvoyance

  • We build a decision tree comparing the optimal decision without clairvoyance to the optimal decision given that the clairvoyant prophesies any of the possible outcomes (in this case, with 100% accuracy)

  • We need to deduct the cost of clairvoyance from the $ value of all outcomes and roll the decision tree back as we did before, to determine the expected value of the decision given clairvoyance (and paying for it!)

  • We compute e-values using u-values (which represent the decision maker’s utility for each $ value)

  • At the end (root node) we convert u-values to $ values

  • We then know whether the clairvoyance is worth its cost


Computing The $ Value of Clairvoyance

[Figure: two decision trees annotated with $ values and u-values, with and without clairvoyance.]

Without clairvoyance:
O: U = 0.40 (≈ $26)
P: U = 0.57 (≈ $40)
I: U = 0.63 (≈ $46)
Best decision: Indoors, U = 0.63 (≈ $46)

With clairvoyance for $15 (the $15 cost is deducted from every outcome: $100 → $85, $90 → $75, $50 → $35, $40 → $25, $20 → $5, $0 → −$15):
Prophecy “S” (p = 0.4): best alternative is O, $85, U = 0.92
Prophecy “R” (p = 0.6): best alternative is I, $35, U = 0.50
Overall: U = 0.4 × 0.92 + 0.6 × 0.50 = 0.67 (≈ $51)

Since $51 > $46, buying clairvoyance for $15 is worthwhile.
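The roll-back with clairvoyance can be reproduced in a few lines. This is a hedged sketch: the u-values for the post-fee outcomes, u($85) = 0.92 and u($35) = 0.50, are taken directly from the slide rather than computed from an explicit utility curve.

```python
# Compare the best decision without clairvoyance to the decision made
# after buying a perfect forecast for $15, in utility (u-value) terms.

p_sun, p_rain = 0.4, 0.6

u_no_clairvoyance = 0.63             # best alternative (Indoors), worth ~$46
u_with = p_sun * 0.92 + p_rain * 0.50  # "S" -> Outdoors at $85; "R" -> Indoors at $35
print(round(u_with, 2))              # 0.67, worth ~$51 on the slide's curve
print(u_with > u_no_clairvoyance)    # True: the $15 forecast is worth buying
```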



The Value of Partial Clairvoyance

  • Assume we have a rain detector which, when the weather is going to be sunny or rainy, correctly predicts Sun or Rain, respectively, 80% of the time; is it worth $10 to use it?

  • In order to compute the value of an uncertain piece of information, we first need to calculate the probability for rain or sun, given each corresponding prediction of the detector

  • This involves reversing the tree in which the detector information is usually given, using Bayes’ theorem



Computing The Value of Partial Clairvoyance (I): Representing the Detector’s Profile

[Figure: detector-profile tree, p(prediction | weather) with joint probabilities.]

Weather    Prediction    Joint
S (0.4)    “S” 0.8       0.32
S (0.4)    “R” 0.2       0.08
R (0.6)    “S” 0.2       0.12
R (0.6)    “R” 0.8       0.48


Computing The Value of Partial Clairvoyance (II): Calculating Accuracy by Reversing the Tree

[Figure: reversed tree, p(weather | prediction) with joint probabilities.]

Prediction    Weather     Joint
“S” (0.44)    S 0.727     0.32
“S” (0.44)    R 0.273     0.12
“R” (0.56)    S 0.143     0.08
“R” (0.56)    R 0.857     0.48
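The tree reversal above is a direct Bayes' theorem computation, sketched below with the detector profile from the slides (variable names are illustrative, not from the presentation):

```python
# Reverse the detector tree with Bayes' theorem: from p(prediction | weather)
# to p(weather | prediction), via the joint probabilities p(weather, prediction).

p_weather = {"S": 0.4, "R": 0.6}
# Detector profile: predicts the actual weather 80% of the time
p_pred_given_weather = {"S": {"S": 0.8, "R": 0.2},
                        "R": {"S": 0.2, "R": 0.8}}

preds = ["S", "R"]
# Joint probabilities p(weather = w, prediction = d)
joint = {(w, d): p_weather[w] * p_pred_given_weather[w][d]
         for w in p_weather for d in preds}

# Marginals p(prediction = d), then posteriors p(weather = w | prediction = d)
p_pred = {d: sum(joint[w, d] for w in p_weather) for d in preds}
posterior = {d: {w: joint[w, d] / p_pred[d] for w in p_weather} for d in preds}

print(round(p_pred["S"], 2))           # 0.44: p(detector says "S")
print(round(posterior["S"]["S"], 3))   # 0.727: p(Sun | "S")
print(round(posterior["R"]["R"], 3))   # 0.857: p(Rain | "R")
```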



Computing The Value of Partial Clairvoyance (III): Calculating the Optimal Decision With and Without Clairvoyance

  • To actually calculate the value of uncertain information, we compare the expected utility of the best decision without the information to the expected utility of the decision with the information, as we did before

  • We use the distribution of real world states given the semi-clairvoyant’s prediction (using our accuracy calculation)

  • In this particular case (80% accuracy), the value without detector is U = 0.627, or $45.83; the value with partial information is U = 0.615, or $44.65

  • Thus, for Joseph K., this particular detector is not worth the money



Sensitivity Analysis

  • The main insights into a decision are often given by an analysis of the influence of the given probabilities and utilities on the final decision and its value

  • We thus get a sense as to how sensitive the expected utility and optimal decisions are to each parameter, and can focus our attention (and perhaps, further elicitation efforts) on the most important aspects of the problem



The Party Problem: Sensitivity Analysis

[Figure: decision tree with p(Sun) = p as a free parameter; $ values and u-values at the leaves.]

Alternative    Outcomes ($ value, u-value)                  Expected utility
O              S (p): $100, 1;     R (1−p): $0,  0          U = p
P              S (p): $90,  0.950; R (1−p): $20, 0.323      U = 0.323 + 0.627p
I              S (p): $40,  0.568; R (1−p): $50, 0.667      U = 0.667 − 0.099p
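The three expected-utility lines, and the point where the optimal decision switches, can be checked numerically. A minimal sketch using the slide's u-values (the `expected_u` helper is illustrative):

```python
# Expected utility of each alternative as a linear function of p = p(Sun),
# using the u-values from the sensitivity-analysis slide.

def expected_u(p):
    return {"Outdoors": p * 1.0   + (1 - p) * 0.0,
            "Porch":    p * 0.950 + (1 - p) * 0.323,
            "Indoors":  p * 0.568 + (1 - p) * 0.667}

# Crossover between Porch and Indoors: 0.323 + 0.627p = 0.667 - 0.099p
p_cross = (0.667 - 0.323) / (0.627 + 0.099)
print(round(p_cross, 2))    # 0.47: Indoors is optimal below, Porch above

# At the nominal p = 0.4, the optimal alternative is still Indoors
us = expected_u(0.4)
print(max(us, key=us.get))  # Indoors
```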



Sensitivity Analysis of the Party Problem

[Figure: expected utility of each alternative as a function of p = p(Sun): O: U = p; P: U = 0.323 + 0.627p; I: U = 0.667 − 0.099p. The P and I lines cross at p ≈ 0.47; Indoors is optimal below that point, the Porch above it. The free-clairvoyance line, U = 0.667 + 0.333p, lies above all three; the gap between it and the best alternative is the value of information.]


Tornado Diagrams

  • We can calculate the lower and upper bounds on the expected utility E(U) of the optimal decision for the full range of changes in each parameter Xi while leaving all other parameters at their nominal (default) value

  • We get a quick sense of the potential influence of each parameter on the expected utility

  • But:

    • It ignores the change distribution

    • It ignores the dependencies between parameters
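The one-at-a-time sweep behind a tornado bar can be sketched as follows. This is an illustration, not the lecture's code: it sweeps the party problem's single parameter p = p(Sun) over [0, 1] and records the lower/upper bound of the optimal decision's expected utility (the endpoints of that parameter's bar).

```python
# One-at-a-time sweep for a tornado diagram: vary a single parameter over
# its range while all others stay at their nominal values, and record the
# range of the optimal decision's expected utility.

def best_utility(p):
    # Optimal decision's E(U) for a given p, using the slide's u-values
    return max(p,                     # Outdoors
               0.323 + 0.627 * p,     # Porch
               0.667 - 0.099 * p)     # Indoors

grid = [i / 400 for i in range(401)]  # sweep p over its full range [0, 1]
values = [best_utility(p) for p in grid]
lo, hi = min(values), max(values)
print(round(lo, 2), round(hi, 2))     # bar endpoints for this parameter
```

Ranking parameters by the width of these bars (hi − lo) produces the tornado shape.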

[Figure: tornado diagram; one horizontal bar per parameter (X4, X12, X3, X5, X7, X8), showing the range of E(U) as each parameter varies from its lower (−) to upper (+) bound, with bars sorted by width.]

