
CHAPTER 6 (handout)



  1. CHAPTER 6 (handout) Decision Trees

  2. 6.1. Introduction Sequential decision making • a sequence of chance-dependent decisions • presentation of the analysis can be complex Decision trees • a pictorial device to represent the problem & calculations • useful for problems with a small number of sequential decisions

  3. 6.3. Another Decision Tree Example 2 boxes, externally identical. Must decide which box it is • a1: guess box 1 (6 black balls, 4 white balls) • a2: guess box 2 (8 black balls, 2 white balls) • Correct guess → receive $100 • Wrong guess → receive $0 Prior probabilities • P(θ1) = 0.5 • P(θ2) = 0.5

  4. Decision Tree • A connected set of nodes & arcs • Nodes join arcs • Arcs have direction (left to right) • Branch: an arc & all elements that follow it • 2 branches from the same initial node cannot have elements in common • 2 nodes cannot be joined by more than 1 arc

  5. Example of a Decision Tree

  6. A diagram which is not a tree

  7. Types of nodes • Decision point: choosing the next action (branch) • Chance node: an uncontrollable probabilistic event • Terminal node: specifies the final payoff

  8. Example of a Sequential Decision Problem Car Exchange Problem A person must decide whether to keep his car or exchange it at a showroom. There are 2 decisions: a1: keep, cost = 1400 SR a2: exchange, which has 2 possibilities: • good buy, P(G) = 0.6, cost = 1200 SR • bad buy, P(B) = 0.4, cost = 1600 SR A good or bad buy can be identified only after buying and using the car. What should he do to minimize his expected cost?

  9. Car Exchange Problem (no information) Payoff (Cost) Matrix

      θ          P(θ)   a1: keep   a2: exchange
      θ1: Good    0.6     1400         1200
      θ2: Bad     0.4     1400         1600
      EV                  1400         1360
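A quick way to check the two expected values above. This Python sketch is an editorial addition, not part of the original handout; the names are illustrative:

```python
# Expected cost of each action, using the probabilities and costs
# from the payoff matrix above.
p_good, p_bad = 0.6, 0.4

cost_keep = {"good": 1400, "bad": 1400}
cost_exchange = {"good": 1200, "bad": 1600}

def expected_cost(cost):
    return p_good * cost["good"] + p_bad * cost["bad"]

print(expected_cost(cost_keep))      # 1400.0
print(expected_cost(cost_exchange))  # 1360.0 -> exchanging minimizes expected cost
```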

  10. Car exchange decision tree

      Keep
        G: 0.6 → $1400
        B: 0.4 → $1400
      Exchange
        G: 0.6 → $1200
        B: 0.4 → $1600

  11. Car exchange decision tree (with expected values)

      Keep: EV = $1400
        G: 0.6 → $1400
        B: 0.4 → $1400
      Exchange: EV = $1360
        G: 0.6 → $1200
        B: 0.4 → $1600

  12. 6.2. A Sequential Test Problem Car Exchange Problem Assume the person has 5 options for deciding whether to keep or exchange his car: • (i) decide without extra information • (ii) decide on the basis of a free road (driving) test • (iii) decide after an oil consumption test costing $25 • (iv) decide after a combined road/oil test costing $10 • (v) decide sequentially: road test, then possibly an oil test costing $10 In (iv) both tests must be taken; in (v) the oil test is optional, depending on the road test result.

  13. Car Exchange Problem (with information) • The decision tree is complicated • it cannot fit on one slide • 5 branches, one per option • Probabilities after extra information are conditional (posterior) • To illustrate, we follow the branch of option (v): road test, then, depending on the result, a possible oil test costing $10

  14. Car Exchange Problem (with information) Result of road test: • y1: fair, p(y1) = 0.5 • y2: poor, p(y2) = 0.5 Result of oil consumption test: • Z1: high, p(Z1|y) • Z2: medium, p(Z2|y) • Z3: low, p(Z3|y)

  15. Car exchange decision tree (with information)

      Road test
        y1: 0.5
          No test
          Oil test → Z1, Z2, Z3
        y2: 0.5
          No test
          Oil test → Z1, Z2, Z3

  16. Car exchange decision tree with information (y1 branch)

      y1: 0.5
        No test
          a1: G 0.6 → 1400, B 0.4 → 1400
          a2: G 0.6 → 1200, B 0.4 → 1600
        Oil test ($10 fee included in the payoffs)
          Z1: 0.28
            a1: G 0.43 → 1410, B 0.57 → 1410
            a2: G 0.43 → 1210, B 0.57 → 1610
          Z2: 0.24
            a1: G 0.50 → 1410, B 0.50 → 1410
            a2: G 0.50 → 1210, B 0.50 → 1610
          Z3: 0.48
            a1: G 0.75 → 1410, B 0.25 → 1410
            a2: G 0.75 → 1210, B 0.25 → 1610

  17. Car exchange decision tree with information (y1 branch, with expected values)

      y1: 0.5 → 1360 (skip the oil test)
        No test → 1360
          a1: EV 1400 (G 0.6 → 1400, B 0.4 → 1400)
          a2: EV 1360 (G 0.6 → 1200, B 0.4 → 1600) ← chosen
        Oil test → EV 1362
          Z1: 0.28 → 1410  (a1: 1410 ← chosen; a2: 1439)
          Z2: 0.24 → 1410  (a1: 1410; a2: 1410, a tie)
          Z3: 0.48 → 1310  (a1: 1410; a2: 1310 ← chosen)

  18. Car exchange decision tree with information (y2 branch)

      y2: 0.5
        No test
          a1: G 0.4 → 1400, B 0.6 → 1400
          a2: G 0.4 → 1200, B 0.6 → 1600
        Oil test ($10 fee included in the payoffs)
          Z1: 0.32
            a1: G 0.25 → 1410, B 0.75 → 1410
            a2: G 0.25 → 1210, B 0.75 → 1610
          Z2: 0.26
            a1: G 0.31 → 1410, B 0.69 → 1410
            a2: G 0.31 → 1210, B 0.69 → 1610
          Z3: 0.42
            a1: G 0.57 → 1410, B 0.43 → 1410
            a2: G 0.57 → 1210, B 0.43 → 1610

  19. Car exchange decision tree with information (y2 branch, with expected values)

      y2: 0.5 → 1398 (take the oil test)
        No test → 1400
          a1: EV 1400 ← chosen
          a2: EV 1440 (G 0.4 → 1200, B 0.6 → 1600)
        Oil test → EV 1398
          Z1: 0.32 → 1410  (a1: 1410 ← chosen; a2: 1510)
          Z2: 0.26 → 1410  (a1: 1410 ← chosen; a2: 1487)
          Z3: 0.42 → 1381  (a1: 1410; a2: 1381 ← chosen)
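Both branch values can be reproduced by folding the subtree back from the right, using only the probabilities and payoffs shown on slides 16-19. A minimal Python sketch (an editorial addition; `ev` is an illustrative helper, and the oil-test payoffs already include the $10 fee):

```python
def ev(p_good, cost_good, cost_bad):
    """Expected cost of exchanging, given the posterior P(G)."""
    return p_good * cost_good + (1 - p_good) * cost_bad

# y1 branch: compare "no oil test" with "oil test"
no_test_y1 = min(1400, ev(0.60, 1200, 1600))        # a2: 1360
oil_y1 = (0.28 * min(1410, ev(0.43, 1210, 1610))    # Z1 -> a1: 1410
        + 0.24 * min(1410, ev(0.50, 1210, 1610))    # Z2 -> tie: 1410
        + 0.48 * min(1410, ev(0.75, 1210, 1610)))   # Z3 -> a2: 1310
y1_value = min(no_test_y1, oil_y1)                  # 1360 vs 1362 -> 1360

# y2 branch
no_test_y2 = min(1400, ev(0.40, 1200, 1600))        # a1: 1400
oil_y2 = (0.32 * min(1410, ev(0.25, 1210, 1610))
        + 0.26 * min(1410, ev(0.31, 1210, 1610))
        + 0.42 * min(1410, ev(0.57, 1210, 1610)))   # ~1398
y2_value = min(no_test_y2, oil_y2)                  # oil test wins

# Value of option (v): road test, then an oil test only after a poor result
print(y1_value, y2_value, 0.5 * y1_value + 0.5 * y2_value)  # 1360 ~1398 ~1379
```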

  20. Decision Tree Calculations • The tree is developed from left to right • Calculations are made from right to left (see the sketch below) • Many calculations are redundant • they belong to inferior solutions • not needed in the final solution • Probabilities after extra information (road or oil tests) are conditional (posterior) • calculated by Bayes’ theorem
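To make the right-to-left procedure concrete, here is a generic fold-back routine. This is an editorial Python sketch; the node encoding and the name `fold_back` are assumptions, not from the handout:

```python
# A node is a terminal payoff, a chance node with probability-weighted
# branches, or a decision node where the best branch is selected.
def fold_back(node, maximize=True):
    kind = node["kind"]
    if kind == "terminal":
        return node["payoff"]
    if kind == "chance":
        return sum(p * fold_back(child, maximize)
                   for p, child in node["branches"])
    # decision node: pick the best option
    values = [fold_back(child, maximize) for child in node["options"]]
    return max(values) if maximize else min(values)

# The no-information car exchange tree (slide 11), minimizing cost:
keep = {"kind": "chance", "branches": [
    (0.6, {"kind": "terminal", "payoff": 1400}),
    (0.4, {"kind": "terminal", "payoff": 1400})]}
exchange = {"kind": "chance", "branches": [
    (0.6, {"kind": "terminal", "payoff": 1200}),
    (0.4, {"kind": "terminal", "payoff": 1600})]}
root = {"kind": "decision", "options": [keep, exchange]}
print(fold_back(root, maximize=False))   # 1360.0 -> exchange
```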

  21. Initial Payoff Data (no information) Payoff (Reward) Matrix

      θ           P(θ)   a1: Box 1   a2: Box 2
      θ1: Box 1    0.5      100          0
      θ2: Box 2    0.5        0        100
      EV                     50         50

  22. Initial Probability Data (no information) Prior Probability Matrix

      θ           P(θ)   B: Black   W: White
      θ1: Box 1    0.5      0.6        0.4
      θ2: Box 2    0.5      0.8        0.2

  23. Decision tree without information

      a1: Box 1: EV = $50
        θ1: 0.5 → $100
        θ2: 0.5 → $0
      a2: Box 2: EV = $50
        θ1: 0.5 → $0
        θ2: 0.5 → $100

  24. Decision Tree Example with information • Samples can be taken from the box • Each sampled ball is returned to the box (sampling with replacement) • Up to 2 samples are allowed • Cost = $3 per sample • What is the optimal plan?

  25. Posterior probabilities for sample 1 Probability Calculations

      θ      P(θ)   P(B|θ)  P(W|θ)   Joint B  Joint W   Post B  Post W
      θ1      0.5     0.6     0.4      0.3      0.2       0.43    0.67
      θ2      0.5     0.8     0.2      0.4      0.1       0.57    0.33
      Σ       1.0                      0.7      0.3       1.00    1.00
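The joint and posterior columns follow directly from Bayes' theorem. A small Python sketch of the same calculation (an editorial addition; names are illustrative):

```python
priors = {"box1": 0.5, "box2": 0.5}
p_black = {"box1": 0.6, "box2": 0.8}   # likelihood of drawing a black ball

def posterior(prior, likelihood):
    """Return the marginal probability of the observation and the posterior."""
    joint = {box: prior[box] * likelihood[box] for box in prior}
    marginal = sum(joint.values())
    return marginal, {box: joint[box] / marginal for box in joint}

p_b, post_b = posterior(priors, p_black)
p_w, post_w = posterior(priors, {b: 1 - p for b, p in p_black.items()})
print(p_b, post_b)   # 0.7, box1 ~ 0.43, box2 ~ 0.57
print(p_w, post_w)   # 0.3, box1 ~ 0.67, box2 ~ 0.33
```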

  26. Decision tree with information

      No sample → a1 or a2 (no information): $50
      Sample 1
        B: 0.7
          No sample → $…
          Sample 2  → $…
        W: 0.3
          No sample → $…
          Sample 2  → $…
      (the $… values are computed on the following slides)

  27. Posterior probabilities for sample 2 when sample 1 is Black Probability Calculations

      θ      P(θ)   P(B|θ)  P(W|θ)   Joint B  Joint W   Post B  Post W
      θ1      0.43    0.6     0.4      0.26     0.17      0.36    0.61
      θ2      0.57    0.8     0.2      0.46     0.11      0.64    0.39
      Σ       1.0                      0.72     0.28      1.00    1.00

  28. Sample 1 Black, No Sample 2

      Black sample 1 → 54
        No 2nd sample → 54
          a1: EV 40 (θ1: 0.43 → $97, θ2: 0.57 → $-3)
          a2: EV 54 (θ1: 0.43 → $-3, θ2: 0.57 → $97) ← chosen
        Sample 2
          B: 0.72 → $…
          W: 0.28 → $…

  29. Samples 1 & 2 Both Black $94 1: 0.36 30 a1 2: 0.64 $-6 58 Black sample 2 $-6 1: 0.36 58 a2 B: 0.72 2: 0.64 $94 Sample 2 Black sample 1 W: 0.28 $ No Sample $54

  30. Sample 1 Black, Sample 2 White

      Black sample 1
        No sample → $54
        Sample 2 → EV 57.16
          B: 0.72 → $58
          W: 0.28 → 55
            a1: EV 55 (θ1: 0.61 → $94, θ2: 0.39 → $-6) ← chosen
            a2: EV 33 (θ1: 0.61 → $-6, θ2: 0.39 → $94)

  31. Posterior probabilities for sample 2 when sample 1 is White Probability Calculations

      θ      P(θ)   P(B|θ)  P(W|θ)   Joint B  Joint W   Post B  Post W
      θ1      0.67    0.6     0.4      0.40     0.27      0.61    0.79
      θ2      0.33    0.8     0.2      0.26     0.07      0.39    0.21
      Σ       1.0                      0.66     0.34      1.00    1.00

  32. Sample 1 White, No Sample 2

      White sample 1 → 64
        No 2nd sample → 64
          a1: EV 64 (θ1: 0.67 → $97, θ2: 0.33 → $-3) ← chosen
          a2: EV 30 (θ1: 0.67 → $-3, θ2: 0.33 → $97)
        Sample 2
          B: 0.66 → $…
          W: 0.34 → $…

  33. Sample 1 White, Sample 2 Black

      White sample 1
        No sample → $64
        Sample 2
          B: 0.66 → 55
            a1: EV 55 (θ1: 0.61 → $94, θ2: 0.39 → $-6) ← chosen
            a2: EV 33 (θ1: 0.61 → $-6, θ2: 0.39 → $94)
          W: 0.34 → $…

  34. Samples 1 & 2 Both White $94 1: 0.79 73 a1 2: 0.21 $-6 73 White sample 2 $-6 1: 0.79 W: 0.34 15 a2 2: 0.21 61.12 $94 Sample 2 White sample 1 B: 0.66 $55 No Sample $64

  35. Decision tree summary of results (box 1: 6B, 4W; box 2: 8B, 2W)

      No samples → a1 or a2 (no information): $50
      Sample 1 → $59.2
        B: 0.7 → take sample 2: $57.2
          B: 0.72 → a2: $58
          W: 0.28 → a1: $55
          (no 2nd sample would give only $54)
        W: 0.3 → no 2nd sample, a1: $64
          (sample 2 would give only $61.1: B 0.66 → a1: $55; W 0.34 → a1: $73)
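The plan's value can be verified by rolling the subtree values from slides 28-34 back to the root (an editorial Python sketch, not part of the handout):

```python
# After a black first sample: stop ($54) vs. take sample 2 (57.16)
after_black = max(54, 0.72 * 58 + 0.28 * 55)     # 57.16 -> sample again
# After a white first sample: take sample 2 (61.12) vs. stop ($64)
after_white = max(0.66 * 55 + 0.34 * 73, 64)     # 64    -> stop, choose a1
print(0.7 * after_black + 0.3 * after_white)     # 59.21, i.e. ~$59.2 at the root
```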

  36. Decision Tree with Fixed Costs • Example of a fixed cost: sampling cost = $3 per sample in the previous example • If the objective is to maximize expected payoff, constant costs can be deducted either from: • terminal node payoffs, or • expected values

  37. Example: Including fixed costs Recall the Sample 1 = Black subtree (slide 28), sampling cost = $3

      Deduct the cost from the expected value:
        a1: EV = 43 - 3 = 40 (θ1: 0.43 → $100, θ2: 0.57 → $0)
      Deduct the cost from the terminal payoffs:
        a1: EV = 40 (θ1: 0.43 → $97, θ2: 0.57 → $-3)
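A one-line check that the two treatments of the $3 cost agree (an editorial Python sketch):

```python
p1, p2 = 0.43, 0.57                               # posteriors after a black ball
ev_then_deduct = (p1 * 100 + p2 * 0) - 3          # 43 - 3 = 40
deduct_then_ev = p1 * (100 - 3) + p2 * (0 - 3)    # 41.71 - 1.71 = 40
print(ev_then_deduct, deduct_then_ev)             # both 40 (equal by linearity)
```

The equality holds because expectation is linear, which is exactly why the shortcut fails once payoffs are passed through a nonlinear utility (next slides).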

  38. Fixed Costs & Utilities • Utilities can be used instead of payoffs • If the objective is to maximize expected utility: • constant costs must be deducted from the terminal node payoffs • net payoffs are converted to utilities • expected values are taken of the utilities of the net payoffs

  39. Including fixed costs

      Incorrect (cost deducted from the expected utility):
        a1: EU - U(3) (θ1: 0.43 → U(100), θ2: 0.57 → U(0))
      Correct (cost deducted from the terminal payoffs first):
        a1: EU (θ1: 0.43 → U(97), θ2: 0.57 → U(-3))
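A numerical illustration of why the shortcut fails under a nonlinear utility. The exponential utility below is an assumption chosen for illustration only; it is not from the handout:

```python
import math

def u(x, r=50.0):
    """A risk-averse exponential utility (illustrative choice, not from the slides)."""
    return 1.0 - math.exp(-x / r)

p1, p2 = 0.43, 0.57
correct = p1 * u(97) + p2 * u(-3)            # EU of the net payoffs, ~0.333
incorrect = p1 * u(100) + p2 * u(0) - u(3)   # "deduct from EU" shortcut, ~0.314
print(correct, incorrect)                    # the two values differ
```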

  40. Allowing an optional 3rd sample • Suppose now a 3rd sample is allowed • Sample cost = $3 • Assume the decision whether or not to take sample 3 depends on results of samples 1 and 2 • What is the optimal plan?

  41. Posterior probabilities for sample 3, after 2 blacks (cf. slide 27)

      θ      P(θ)   P(B|θ)  P(W|θ)   Joint B  Joint W   Post B  Post W
      θ1      0.36    0.6     0.4      0.22     0.14      0.30    0.52
      θ2      0.64    0.8     0.2      0.51     0.13      0.70    0.48
      Σ       1.0                      0.73     0.27      1.00    1.00

  42. Decision tree with optional sample 3

      No sample → $50
      Sample 1
        B: 0.7
          No 2nd sample → $54
          Sample 2
            No 3rd sample → $57.2
            Sample 3 → $…
        W: 0.3
          No 2nd sample → $64
          Sample 2
            No 3rd sample → $61.1
            Sample 3 → $…

  43. Fixing the number of samples • Suppose now a 3rd sample is allowed • Sample cost = $3 • Assume we must decide the number of samples in advance: 0, 1, 2, or 3 • What is the optimal plan?

  44. Zero samples

      No samples → $50
        a1: Box 1: EV = $50 (θ1: 0.5 → $100, θ2: 0.5 → $0)
        a2: Box 2: EV = $50 (θ1: 0.5 → $0, θ2: 0.5 → $100)

  45. One Sample → EV $57

      Sample once
        B: 0.7 → 54
          a1: EV 40 (θ1: 0.43 → $97, θ2: 0.57 → $-3)
          a2: EV 54 (θ1: 0.43 → $-3, θ2: 0.57 → $97) ← chosen
        W: 0.3 → 64
          a1: EV 64 (θ1: 0.67 → $97, θ2: 0.33 → $-3) ← chosen
          a2: EV 30 (θ1: 0.67 → $-3, θ2: 0.33 → $97)

  46. Posterior probabilities for 2 samples Examples: P(BB|θ1) = 0.6 × 0.6 = 0.36 P(BW or WB|θ1) = 0.6 × 0.4 + 0.4 × 0.6 = 0.48 P(WW|θ1) = 0.4 × 0.4 = 0.16

      θ        P(θ)    BB     BW     WW
      θ1        0.5    0.36   0.48   0.16
      θ2        0.5    0.64   0.32   0.04
      Joint θ1         0.18   0.24   0.08
      Joint θ2         0.32   0.16   0.02
      Σ                0.50   0.40   0.10
      Post θ1          0.36   0.60   0.80
      Post θ2          0.64   0.40   0.20
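Because the ball is replaced after each draw, only the number of black balls matters, so the sequence probabilities can be pooled with a binomial coefficient. An editorial Python sketch that reproduces the table above (and, with `n_samples=3`, the three-sample table on slide 48):

```python
from math import comb

priors = {"box1": 0.5, "box2": 0.5}
p_black = {"box1": 0.6, "box2": 0.8}

def posteriors_after(n_samples, k_black):
    """Marginal and posterior after drawing k_black black balls in n_samples."""
    like = {b: comb(n_samples, k_black) * p_black[b] ** k_black
               * (1 - p_black[b]) ** (n_samples - k_black)
            for b in priors}
    joint = {b: priors[b] * like[b] for b in priors}
    marginal = sum(joint.values())
    return marginal, {b: joint[b] / marginal for b in joint}

for k in (2, 1, 0):                  # BB, BW/WB, WW
    print(posteriors_after(2, k))    # marginals 0.50 / 0.40 / 0.10
```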

  47. Two Samples → EV $58

      Sample twice
        BB: 0.5 → 58
          a1: EV 30 (θ1: 0.36 → $94, θ2: 0.64 → $-6)
          a2: EV 58 (θ1: 0.36 → $-6, θ2: 0.64 → $94) ← chosen
        BW: 0.4 → 54
          a1: EV 54 (θ1: 0.60 → $94, θ2: 0.40 → $-6) ← chosen
          a2: EV 34 (θ1: 0.60 → $-6, θ2: 0.40 → $94)
        WW: 0.1 → 74
          a1: EV 74 (θ1: 0.80 → $94, θ2: 0.20 → $-6) ← chosen
          a2: EV 14 (θ1: 0.80 → $-6, θ2: 0.20 → $94)

  48. Posterior probabilities for 3 samples P(BBB|θ1) = 0.6 × 0.6 × 0.6 = 0.216 P(BBW, BWB, or WBB|θ1) = 3 × 0.6 × 0.6 × 0.4 = 0.432 P(BWW, WBW, or WWB|θ1) = 3 × 0.6 × 0.4 × 0.4 = 0.288 P(WWW|θ1) = 0.4 × 0.4 × 0.4 = 0.064

      θ        P(θ)    BBB     BBW     BWW     WWW
      θ1        0.5    0.216   0.432   0.288   0.064
      θ2        0.5    0.512   0.384   0.096   0.008
      Joint θ1         0.108   0.216   0.144   0.032
      Joint θ2         0.256   0.192   0.048   0.004
      Σ                0.364   0.408   0.192   0.036
      Post θ1          0.30    0.53    0.75    0.89
      Post θ2          0.70    0.47    0.25    0.11

  49. Three Samples → EV $55.7

      Sample 3 times
        BBB: 0.36 → 61
          a1: EV 21 (θ1: 0.30 → $91, θ2: 0.70 → $-9)
          a2: EV 61 (θ1: 0.30 → $-9, θ2: 0.70 → $91) ← chosen
        BBW: 0.41 → 44
          a1: EV 44 (θ1: 0.53 → $91, θ2: 0.47 → $-9) ← chosen
          a2: EV 38 (θ1: 0.53 → $-9, θ2: 0.47 → $91)
        BWW: 0.19 → 66
          a1: EV 66 (θ1: 0.75 → $91, θ2: 0.25 → $-9) ← chosen
          a2: EV 16 (θ1: 0.75 → $-9, θ2: 0.25 → $91)
        WWW: 0.04 → 80
          a1: EV 80 (θ1: 0.89 → $91, θ2: 0.11 → $-9) ← chosen
          a2: EV 2  (θ1: 0.89 → $-9, θ2: 0.11 → $91)

  50. Summary of results with a fixed number of samples

      0 samples: $50
      1 sample:  $57
      2 samples: $58  ← best plan
      3 samples: $55.7
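The whole summary table can be reproduced in a few lines (an editorial Python sketch; `value_of_n_samples` is an illustrative name). Note the exact three-sample value is 55.8; the slide's 55.7 comes from rounding the posteriors before folding back:

```python
from math import comb

priors = {"box1": 0.5, "box2": 0.5}
p_black = {"box1": 0.6, "box2": 0.8}

def value_of_n_samples(n):
    """Expected payoff of committing to n samples in advance:
    $100 if the final guess is correct, minus $3 per sample."""
    total = 0.0
    for k in range(n + 1):           # k = number of black balls drawn
        joint = {b: priors[b] * comb(n, k) * p_black[b] ** k
                    * (1 - p_black[b]) ** (n - k)
                 for b in priors}
        total += max(joint.values()) * 100   # guess the more likely box
    return total - 3 * n

for n in range(4):
    print(n, round(value_of_n_samples(n), 1))   # 50.0, 57.0, 58.0, 55.8
```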
