
Structuring Decision Trees



Presentation Transcript


  1. Structuring Decision Trees

  2. The probabilistic analysis phase begins by drawing the decision tree. [Figure: the decision analysis cycle (Initial Situation, Structure, Deterministic Analysis, Probabilistic Analysis, Appraisal, Decision, with iteration back to Structure). The probabilistic analysis phase comprises four steps: 1. Draw Decision Tree; 2. Assess Probabilities; 3. Evaluate Decision Tree (expected values, probability distributions); 4. Analyze Sensitivity to Probabilities (needs for refinement).]

  3. We will discuss guidelines for constructing a decision tree with sequential decisions. • Choosing Which Factors to Include • Constructing the Tree

  4. A decision tree organizes and displays important factors of a decision in chronological sequence. [Figure: a simple decision tree with time running left to right. Decision node: Invest in Development (–$200 MM) or Don't Invest. Uncertainty node after investing: Success or Failure. Endpoints: Success gives a commercial value of $1,000 MM and a net value of $800 MM; Failure gives $0 commercial value and –$200 MM net value; Don't Invest gives $0 and $0.]
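
For a sense of how such a tree is evaluated, suppose (purely for illustration, since no probability is given on this slide) that development succeeds with probability 0.6. The expected net value of Invest in Development would then be 0.6 × $800 MM + 0.4 × (–$200 MM) = $400 MM, versus $0 for Don't Invest.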

  5. Results from earlier phases provide the decisions and uncertainties for the tree. [Figure: the decision analysis cycle (Initial Situation, Structure, Deterministic Analysis, Probabilistic Analysis, Appraisal, Decision, Iteration). The strategy table from the Structure phase supplies the decision alternatives, and the deterministic sensitivity analysis supplies the critical uncertainties that feed the decision tree.]

  6. Sensitivity analysis provides insight into the key drivers of risk and value for each alternative; the top variables represent the key drivers of future value. [Tornado diagram: Commercial Value Given Development Success, base case = $1,690 MM. Low / base / high settings for each variable:]
  Market Share for Target Profile (%): 8 / 20 / 24 (low end: "Me Too", same efficacy, price war; high end: Blockbuster, 50% better efficacy, premium price, better competitive efficacy profile)
  Annual Real Price Change (%): –2.0 / –0.5 / 1.0 (low GDP, strong governmental price pressure vs. high GDP, weak governmental price pressure)
  Market Size (M Therapy Days): 1,600 / 1,800 / 2,000
  Price Drop on Product Patent Expiry (%): 40 / 25 / 10
  Variable COGS ($/Therapy Day): 0.16 / 0.13 / 0.08 (flat vs. steep manufacturing learning curve)
  Market Share Loss at Product Patent Expiry (%): 75 / 60 / 50
  Launch Date: 2005 / 2004 / 2003
  Safety Profile Relative to Major Competitor: Same / Better / Significantly Better
  Many additional uncertainties follow.
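
The tornado diagram above is produced by one-way sensitivity analysis: each variable is swung from its low to its high setting while the others are held at their base values, and the variables are ranked by the resulting swing in value. The sketch below illustrates the mechanics only; commercial_value is a made-up placeholder, not the deterministic model behind the figure, and just three of the slide's variables are carried over.

```python
# One-way (tornado) sensitivity sketch. The value model is a placeholder
# assumption; only the low/base/high settings echo the slide.

ranges = {                      # variable: (low, base, high)
    "market_share_pct": (8, 20, 24),
    "price_change_pct": (-2.0, -0.5, 1.0),
    "market_size_mtd":  (1600, 1800, 2000),
}

def commercial_value(v):
    # Hypothetical stand-in for the project's deterministic spreadsheet model.
    return (v["market_size_mtd"] * (v["market_share_pct"] / 100.0)
            * (1 + v["price_change_pct"] / 100.0) * 5)

base = {name: vals[1] for name, vals in ranges.items()}

swings = []
for name, (low, _, high) in ranges.items():
    lo_case = dict(base, **{name: low})     # swing one variable to its low setting
    hi_case = dict(base, **{name: high})    # ... and to its high setting
    swings.append((abs(commercial_value(hi_case) - commercial_value(lo_case)), name))

# Variables with the largest swing sit at the top of the tornado diagram.
for swing, name in sorted(swings, reverse=True):
    print(f"{name:18s} swing = {swing:8.1f}")
```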

  7. This schematic decision tree illustrates key decisions and uncertainties in a Life Sciences investment. [Schematic tree: decision node 1, Phase III Development, with alternatives Gold Standard Program and Bare Bones Program (development costs of $6M and $50M appear on the branches) and No Further Investment (endpoint value $0). The investment branches lead through three uncertainty nodes: Market Share (node 2: 8% / 18% / 24%), Market Size (node 3: 1,600 / 1,800 / 2,000), and Price Drop at LOE (node 4: –40% / –25% / –10%), with branch probabilities shown in the figure (.25 / .50 / .25 and .40 / .30 / .30); each endpoint value is taken from the spreadsheet.]

  8. We will discuss guidelines for constructing a decision tree with sequential decisions. • Choosing Which Factors to Include • Constructing the Tree

  9. Here is an easy five-step procedure to construct the tree. We will demonstrate using the Ursa Minor Movies case. 1. List the decisions and uncertainties. 2. Number the nodes in chronological order. 3. Draw a schematic tree. 4. Connect the tree branches. 5. Add probabilities, outcomes, and values.

  10. URSA MINOR MOVIES
  Ursa Minor Movies (UMM) seems to be able to produce only two kinds of films: box-office blockbusters or disasters. UMM conducts pre-release "sneak previews" for some of its films. These tests label a film as a "Hit" or a "Dud," but they are expensive ($100,000) and sometimes unreliable. After a sneak preview, UMM can decide whether or not to release the film. UMM has kept track of the ultimate box-office success or failure of 50 films that were previewed. UMM asserts that these data are representative of the current market for its films.

  Box Office Success After Release
  Sneak Preview Result | Blockbusters | Disasters
  "Hit" | 20 | 5
  "Dud" | 10 | 15

  UMM is considering whether or not to conduct a sneak preview for its most recent film. Because Ursa has the right to release, cancel, or modify the film after the preview, this is called a "real option." Ursa will realize a $20,000,000 net profit if this film is a blockbuster and suffer a $5,000,000 net loss if it is a disaster. If UMM doesn't release this film, it will have a net loss of $1,000,000. (None of these figures include the sneak preview cost, but they do include the costs for film production and, when applicable, the release costs and revenues.)

  Assignment: Draw and evaluate the decision tree for UMM's decision on whether to purchase the sneak-preview option. Begin by listing decisions and uncertainties and the measure for evaluating endpoints. Next, draw a "schematic" tree with nodes representing decisions and uncertainties placed in chronological order. Assign numbers in the tree and then evaluate the sneak-preview decision. Assume that the film's chances of success are no different from the 50 films that were previewed. If you have trouble computing probabilities for the tree, use these:
  P("Hit") = 0.5
  P(Blockbuster) = 0.6
  P(Blockbuster|"Hit") = 0.8*
  P(Disaster|"Dud") = 0.6
  * This conditional probability is read: "the probability of a blockbuster, given that the sneak preview said 'Hit.'"
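
The probabilities supplied with the case can be checked directly against the 50-film table; here is a minimal sketch in Python (the variable names are mine, not part of the case):

```python
# Counts from the 50-film sneak-preview table in the UMM case.
counts = {
    ("Hit", "Blockbuster"): 20,
    ("Hit", "Disaster"):     5,
    ("Dud", "Blockbuster"): 10,
    ("Dud", "Disaster"):    15,
}
total = sum(counts.values())                                     # 50 films

p_hit = (counts[("Hit", "Blockbuster")] + counts[("Hit", "Disaster")]) / total
p_blockbuster = (counts[("Hit", "Blockbuster")] + counts[("Dud", "Blockbuster")]) / total
p_bb_given_hit = counts[("Hit", "Blockbuster")] / (counts[("Hit", "Blockbuster")] + counts[("Hit", "Disaster")])
p_dis_given_dud = counts[("Dud", "Disaster")] / (counts[("Dud", "Blockbuster")] + counts[("Dud", "Disaster")])

print(p_hit, p_blockbuster, p_bb_given_hit, p_dis_given_dud)     # 0.5 0.6 0.8 0.6
```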

  11. Steps 1 and 2: List the decisions and uncertainties and number the nodes in chronological order. Decisions: Buy Sneak-Preview Option (node 1), Exercise Film-Release Option (node 3). Uncertainties: Test Result (node 2), Box Office (node 4). Decision criterion: Profit.
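
A minimal sketch of how steps 1 and 2 might be captured before any drawing, with the UMM nodes recorded as plain Python data (the layout and names are my own, not part of the course materials):

```python
# Steps 1 and 2 for the Ursa Minor Movies case: list the decisions and
# uncertainties, then number the nodes in chronological order.
nodes = [
    (1, "decision",    "Buy Sneak-Preview Option",     ["Test", "No test"]),
    (2, "uncertainty", "Test Result",                  ["Hit", "Dud"]),
    (3, "decision",    "Exercise Film-Release Option", ["Yes", "No"]),
    (4, "uncertainty", "Box Office",                   ["Blockbuster", "Disaster"]),
]
decision_criterion = "Profit"

for number, kind, name, branches in nodes:
    print(f"{number}. {kind:11s} {name}: {', '.join(branches)}")
```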

  12. Step 3: Draw a schematic tree to illustrate the order of decisions and uncertainties. [Schematic tree, left to right: Buy Sneak-Preview Option (decision: Test / No), Test Result (uncertainty: "Hit" / "Dud"), Exercise Film-Release Option (decision: Yes / No), Box Office (uncertainty: Blockbuster / Disaster).]

  13. Step 4: Connect the tree branches. [Connected tree: the Test branch of the Buy Sneak-Preview Option decision leads to the Preview Result node ("Hit" / "Dud"); each preview result leads to the Exercise Film-Release Option decision (Yes / No); each Yes branch leads to the Box Office node (Blockbuster / Disaster). The No-test branch goes straight to the film-release decision and, if released, to the Box Office node.]

  14. Step 5: Add the probabilities, outcomes, and values to complete the tree. [Completed tree, profits in $ millions. Buy Sneak-Preview Option: Test (at a cost of $0.1 million) or No. If tested, the Preview Result is "Hit" (probability .50) or "Dud" (.50). After a "Hit," releasing the film yields a Blockbuster with probability .8 (profit 20) or a Disaster with probability .2 (–5); not releasing yields –1. After a "Dud," releasing yields a Blockbuster with probability .4 or a Disaster with probability .6, with the same payoffs. Without a test, releasing yields a Blockbuster with probability .6 or a Disaster with probability .4; not releasing yields –1.]

  15. Decisions and uncertainties must be chronological in the tree, but uncertainties can be assessed in any order; "Bayesian revision" is used to reverse the order. [Figure: two probability trees for the same uncertainties. Assessed order (Box Office first, then Test Result): P(Blockbuster) = .6 and P(Disaster) = .4; given a Blockbuster, P("Hit") = .667 and P("Dud") = .333; given a Disaster, P("Hit") = .25 and P("Dud") = .75. Chronological order (Test Result first, then Box Office): P("Hit") = .5 and P("Dud") = .5; given a "Hit," P(Blockbuster) = .8 and P(Disaster) = .2; given a "Dud," P(Blockbuster) = .4 and P(Disaster) = .6. The chronological-order probabilities are the ones placed on the tree.]
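
A minimal sketch of the Bayesian revision step, assuming the assessed-order probabilities shown above; the function and variable names are my own:

```python
# Bayesian revision: convert assessed-order probabilities (Box Office first,
# then Test Result) into chronological-order probabilities (Test Result first).

p_blockbuster   = 0.6          # prior: P(Blockbuster)
p_hit_given_bb  = 20 / 30      # P("Hit" | Blockbuster) = .667
p_hit_given_dis = 5 / 20       # P("Hit" | Disaster)    = .25

# Pre-posterior: probability the preview says "Hit".
p_hit = p_hit_given_bb * p_blockbuster + p_hit_given_dis * (1 - p_blockbuster)

# Posteriors via Bayes' rule.
p_bb_given_hit = p_hit_given_bb * p_blockbuster / p_hit
p_bb_given_dud = (1 - p_hit_given_bb) * p_blockbuster / (1 - p_hit)

print(round(p_hit, 3), round(p_bb_given_hit, 3), round(p_bb_given_dud, 3))
# 0.5 0.8 0.4 -- matching the chronological-order tree
```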

  16. At "mid-tree" decisions, uncertainties to the left are resolved; those to the right are unresolved. [Figure: the UMM tree with the Exercise Film-Release Option node highlighted. When that decision is made, the Preview Result (to its left) is already known, while the Box Office outcome (to its right) is still uncertain.]

  17. Use the right-to-left rollback procedure to compute expected values and evaluate the tree. [Rollback of the completed tree, in $ millions. After a "Hit," releasing has an expected value of .8(20) + .2(–5) = 15, which beats –1 for not releasing. After a "Dud," releasing is worth .4(20) + .6(–5) = 5, again better than –1. The Test branch is therefore worth .5(15) + .5(5) = 10, less the $0.1 million preview cost, or 9.9. Without a test, releasing is worth .6(20) + .4(–5) = 10. Since 10 > 9.9, the expected value of the best strategy is $10 million.] The sneak-preview option is worthless.
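
For completeness, here is a minimal rollback sketch in Python for the UMM tree, using the probabilities and payoffs on the slide; the tree encoding and helper functions are my own, not part of the course materials:

```python
# Right-to-left rollback of the UMM tree ($ millions).
# A node is ("decision", [(label, cost, child), ...]),
#           ("chance",   [(label, prob, child), ...]), or a terminal payoff.

def rollback(node):
    if isinstance(node, (int, float)):
        return node                                   # endpoint value
    kind, branches = node
    if kind == "chance":
        return sum(p * rollback(child) for _, p, child in branches)
    # Decision: pick the branch with the highest value (costs are negative).
    return max(cost + rollback(child) for _, cost, child in branches)

def box_office(p_bb):
    return ("chance", [("Blockbuster", p_bb, 20), ("Disaster", 1 - p_bb, -5)])

def release(p_bb):
    return ("decision", [("Yes", 0, box_office(p_bb)), ("No", 0, -1)])

tree = ("decision", [
    ("Test", -0.1, ("chance", [("Hit", 0.5, release(0.8)),
                               ("Dud", 0.5, release(0.4))])),
    ("No test", 0, release(0.6)),
])

print(rollback(tree))   # 10.0 -> the no-test branch wins; the preview adds no value
```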

  18. We have discussed systematic techniques for structuring decision trees. • Choosing Which Factors to Include • Constructing the Tree

  19. Appendix

  20. What do you do when the tree is too big? • Choosing Which Factors to Include • Constructing the Tree • Pruning the Tree • Evaluating Subtrees

  21. "Overgrown" trees obscure insights and take too long to evaluate, even by computer.
  Number of 3-Branch Nodes | Number of Paths (endpoints) | Evaluation Time*
  2 | 9 | 0.2 sec
  4 | 81 | 1.6 sec
  6 | 729 | 14 sec
  8 | 6,561 | 2½ min
  10 | 59,049 | 19 min
  12 | 531,441 | 3 hrs
  * Based on a model recalculation time of 0.02 seconds.
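
These figures follow from simple combinatorics: a tree with n three-branch chance nodes has 3^n endpoint paths, and at the stated 0.02 seconds per recalculation the evaluation times above result. A quick check (a sketch, with names of my choosing):

```python
# Endpoint count and evaluation time for a tree of n three-branch nodes,
# assuming 0.02 seconds per path recalculation (as stated on the slide).
RECALC_SECONDS = 0.02

for n in (2, 4, 6, 8, 10, 12):
    paths = 3 ** n
    seconds = paths * RECALC_SECONDS
    print(f"{n:2d} nodes: {paths:7,d} paths, {seconds:8.1f} s (~{seconds / 3600:.2f} h)")
```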

  22. Therefore, pruning the tree is essential. • Use three branches, at most, for uncertainties. • Use a maximum of five or six nodes. • Eliminate unimportant uncertainties based on deterministic sensitivity analysis. • Combine uncertainties using “scenarios,” if necessary. • Simplify the deterministic model to reduce run time. • Create subtrees. A well-pruned tree often generates the greatest insight.

  23. Creating “subtrees” may be an effective way to prune the tree. • Choosing Which Factors to Include • Constructing the Tree • Pruning the Tree • Evaluating Subtrees

  24. Subtrees are often helpful in consolidating several issues into one variable in the final tree. [Examples of consolidated variables, each reduced to a single chance node in the main tree: EPA Ruling (Yes / No), R&D Target (Success / Failure), FDA Approval (Yes / No), Litigation (Sued / Not sued).]

  25. As an example, a plant may be required to install an emissions control device. [Subtree: President Makes Acid Rain a Priority (Yes .6 / No .4); Congress Legislates (Yes .9 / No .1 if acid rain is a priority, Yes .5 / No .5 if not); EPA* Rules Device Required. If Congress legislates, the device is required (path probabilities .6 × .9 = .54 and .4 × .5 = .20). If Congress does not legislate, the EPA requires the device with probability .67 when acid rain is a priority (paths .04 Yes / .02 No) and with probability .3 when it is not (paths .06 Yes / .14 No). Consolidated variable for the main tree: Device Required? Yes .84, No .16.]
  * Environmental Protection Agency.
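
A minimal sketch of how this subtree consolidates into the single probability used in the main tree, assuming the reconstruction described above (the names are mine):

```python
# Consolidate the acid-rain subtree into P(Device Required) for the main tree.
p_priority = 0.6                              # President makes acid rain a priority
p_legislate = {True: 0.9, False: 0.5}         # Congress legislates, given priority?
p_required_no_law = {True: 0.67, False: 0.3}  # EPA requires device without legislation

p_device_required = 0.0
for priority, p_p in ((True, p_priority), (False, 1 - p_priority)):
    p_law = p_legislate[priority]
    # If Congress legislates, the device is required for certain.
    p_device_required += p_p * p_law * 1.0
    # Otherwise the EPA may still rule that the device is required.
    p_device_required += p_p * (1 - p_law) * p_required_no_law[priority]

print(round(p_device_required, 2))            # 0.84 -> Device Required? Yes .84, No .16
```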
