
Week 9 Revision of Probabilities


  1. Week 9 Revision of Probabilities • Value of Information Concepts • Perfect Information • Additional Information Reliability • Bayes’ Theorem • Imperfect Information • Case Studies

  2. Value of Information Concepts
  • Perfect information: EVPI = EV(advance info) - EV(no add'l info)
  • Imperfect information: EVSI = EV(tree including test) - EV(no add'l info)
  Note: Gross EVSI is calculated using the EV of a costless test; Net EVSI is calculated using the EV of the test including its cost. EVSI >= 0; EVPI >= EVSI.

  3. Walter’s Dog & Pony Show I • Walter’s Dog & Pony Show is scheduled to appear in Greely. Profits from the event depend on the weather: if it rains, Walter expects to lose $15000; if it is sunny, the show is expected to earn $10000 in profit. • According to the weather channel, it has rained 25% of the time on the scheduled date for the show. • Use decision analysis to help Walter decide whether to go ahead with the show or cancel it at a guaranteed cost of $1000.

  4. Walter's Dog & Pony Show I
  [Decision tree: Cancel -1000; Go with Rain (.25) -15000 and Sun (.75) 10000, giving EV = 3750.]
  Optimal Solution: 1) EV of tree = $3750; 2) Strategy: “Go”
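The rollback on this slide can be sketched in a few lines of Python (payoffs and probabilities taken from the slide):

```python
# Expected-value rollback of Walter's basic decision tree (values from the slide).
p_rain, p_sun = 0.25, 0.75

ev_go = p_rain * (-15000) + p_sun * 10000  # EV of the "Go" branch: 3750
ev_cancel = -1000                          # guaranteed cancellation cost

# At a decision node, take the branch with the highest EV.
ev_tree = max(ev_go, ev_cancel)
print(ev_tree)  # 3750.0
```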

  5. Value of Perfect Information
  [Same tree as Slide 4: Cancel -1000; Go with Rain (.25) -15000 and Sun (.75) 10000, EV = 3750.]
  EVPI = EV(knowing weather in advance) - EV(no additional info)
       = .75 x $10000 + .25 x (-$1000) - $3750
       = $7250 - $3750 = $3500
       = Willingness to pay for knowing the weather in advance
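The EVPI arithmetic can likewise be checked in Python (a sketch using the slide's figures):

```python
# EVPI sketch: with perfect advance knowledge, Walter goes on sunny days
# and cancels on rainy days (all values from the slide).
p_rain, p_sun = 0.25, 0.75
ev_no_info = 3750  # EV of the best strategy without any forecast

ev_perfect = p_sun * 10000 + p_rain * (-1000)  # 7500 - 250 = 7250
evpi = ev_perfect - ev_no_info
print(evpi)  # 3500.0
```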

  6. Walter’s Dog & Pony Show II • Prior to deciding to go or cancel, Walter is considering obtaining a forecast from the Valley Rainman. The Rainman has a reputation around these parts as a “weather smeller”. Of the last 100 times that it actually rained, the Rainman had predicted rain 90 times. Likewise, he had correctly forecast sunny weather on 80 of the last 100 actually sunny days. • Given a forecast of either rain or sun from the Rainman, and based on the Rainman’s forecasting record to date, what should Walter decide about his show in Greely?

  7. Walter's Dog & Pony Show Decision Tree
  [Tree structure: Cancel, or Go with Rain (.25) / Sun (.75), or obtain the Forecast; the Forecast chance node branches to Rainy and Sunny, each followed by its own Cancel/Go decision over Rain/Sun outcomes.]

  8. Avalon Enterprises Decision Tree
  [Reconstructed from the flattened diagram:
  • No seismic test: Expire 130; Drill with Strike (.55) 430 and Dry (.45) 30, EV(Drill) = 250.
  • Seismic Test, EV = 259:
    - Favorable (.6): Expire 115; Drill with Strike (.85) 415 and Dry (.15) 15, EV(Drill) = 355.
    - Unfavorable (.4): Expire 115; Drill with Strike (.1) 415 and Dry (.9) 15, EV(Drill) = 55.]
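Assuming the tree as reconstructed above, the rollback can be verified numerically; the `ev` helper below is illustrative, not part of the case study:

```python
# Rollback of the Avalon Enterprises tree (payoffs/probabilities from the slide).
def ev(branches):
    """Expected value of a chance node given (probability, payoff) pairs."""
    return sum(p * v for p, v in branches)

no_test = max(130, ev([(0.55, 430), (0.45, 30)]))          # Drill: 250
favorable = max(115, ev([(0.85, 415), (0.15, 15)]))        # Drill: 355
unfavorable = max(115, ev([(0.10, 415), (0.90, 15)]))      # Expire: 115
seismic_test = ev([(0.6, favorable), (0.4, unfavorable)])  # 259
print(round(no_test), round(seismic_test))  # 250 259
```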

  9. Walter's Dog & Pony Show: Probability Assignments
  [Same tree as Slide 7, annotated with the probabilities to be assigned: Rain P{R} = .25 and Sun P{S} = .75 on the no-forecast branch; P{FR} and P{FS} on the Rainy/Sunny forecast branches; posteriors P{R|FR}, P{S|FR} after a rainy forecast and P{R|FS}, P{S|FS} after a sunny forecast.]

  10. Walter’s Probability Tree
  [Forecast first, then weather: Forecast branches to Rainy (P{FR}) and Sunny (P{FS}); Rainy leads to Rain (P{R|FR}) or Sun (P{S|FR}), and Sunny leads to Rain (P{R|FS}) or Sun (P{S|FS}).]

  11. Walter’s Flipped Probability Tree
  [Weather first, then forecast: Rain (P{R}) leads to Rainy (P{FR|R}) or Sunny (P{FS|R}); Sun (P{S}) leads to Rainy (P{FR|S}) or Sunny (P{FS|S}).]

  12. Walter’s Flipped Probability Tree: Probability Calculations
  [Rain (P{R}=.25): Rainy (P{FR|R}=.9) gives P{R and FR}; Sunny (P{FS|R}=.1) gives P{R and FS}. Sun (P{S}=.75): Rainy (P{FR|S}=.2) gives P{S and FR}; Sunny (P{FS|S}=.8) gives P{S and FS}.]

  13. Flipped Probability Tree
  1) Calculate probabilities for the “joint events”:
  • P{R and FR} = P{FR|R} * P{R} = 0.9 * 0.25 = 0.225
  • P{R and FS} = P{FS|R} * P{R} = 0.1 * 0.25 = 0.025
  • P{S and FR} = P{FR|S} * P{S} = 0.2 * 0.75 = 0.15
  • P{S and FS} = P{FS|S} * P{S} = 0.8 * 0.75 = 0.60
  2) Calculate probabilities for the missing “marginal events”:
  • P{FR} = P{R and FR} + P{S and FR} = 0.225 + 0.15 = 0.375
  • P{FS} = P{R and FS} + P{S and FS} = 0.025 + 0.60 = 0.625
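Steps 1 and 2 can be reproduced in Python (variable names such as `p_fr_r` are illustrative shorthand for the slide's P{FR|R} notation):

```python
# Step 1: joint probabilities; Step 2: marginals for the two forecasts.
p_r, p_s = 0.25, 0.75       # prior weather probabilities
p_fr_r, p_fs_r = 0.9, 0.1   # Rainman's record on rainy days
p_fr_s, p_fs_s = 0.2, 0.8   # Rainman's record on sunny days

joint = {
    ("R", "FR"): p_fr_r * p_r,  # 0.225
    ("R", "FS"): p_fs_r * p_r,  # 0.025
    ("S", "FR"): p_fr_s * p_s,  # 0.15
    ("S", "FS"): p_fs_s * p_s,  # 0.60
}
p_fr = joint[("R", "FR")] + joint[("S", "FR")]  # 0.375
p_fs = joint[("R", "FS")] + joint[("S", "FS")]  # 0.625
```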

  14. Flipped Probability Tree
  3) Calculate the posterior conditional probabilities needed to complete the decision tree formulation:
  • P{R|FR} = P{R and FR}/P{FR} = 0.225/0.375 = 0.60
  • P{S|FR} = P{S and FR}/P{FR} = 0.15/0.375 = 0.40
  • P{R|FS} = P{R and FS}/P{FS} = 0.025/0.625 = 0.04
  • P{S|FS} = P{S and FS}/P{FS} = 0.60/0.625 = 0.96
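Step 3 is just division of the joints by the marginals; a quick check in Python using the numbers from the previous slide:

```python
# Step 3: posteriors = joint probabilities divided by forecast marginals.
p_r_given_fr = 0.225 / 0.375  # 0.60
p_s_given_fr = 0.15 / 0.375   # 0.40
p_r_given_fs = 0.025 / 0.625  # 0.04
p_s_given_fs = 0.60 / 0.625   # 0.96
```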

  15. Bayes’ Theorem Derived
  Note that the calculation of the posterior conditional probabilities can be written out in “long form” as:
  P{R|FR} = P{R and FR}/P{FR} = 0.225/0.375 = 0.60
          = P{FR|R}*P{R} / (P{R and FR} + P{S and FR})
          = P{FR|R}*P{R} / (P{FR|R}*P{R} + P{FR|S}*P{S})
  This is Bayes’ Theorem for updating and revising probabilities!
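The theorem as stated here can be packaged as a small function; `bayes` and its parameter names are illustrative, not from the slides:

```python
# Bayes' Theorem: posterior P{H|E} from the likelihoods and the prior,
# using the law of total probability in the denominator.
def bayes(p_e_given_h, p_h, p_e_given_not_h):
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)  # marginal P{E}
    return p_e_given_h * p_h / p_e

# P{R|FR} for Walter: likelihood .9, prior .25, false-alarm rate .2
print(round(bayes(0.9, 0.25, 0.2), 4))  # 0.6
```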

  16. Walter's Dog & Pony Show: Decision Tree Solution
  [No-forecast branch: Cancel -1000; Go with Rain (.25) -15000 and Sun (.75) 10000 gives EV = 3750.
  Forecast Rainy (.375): Go with Rain (.6) -15000 and Sun (.4) 10000 gives EV = -5000, so Cancel (-1000).
  Forecast Sunny (.625): Go with Rain (.04) -15000 and Sun (.96) 10000 gives EV = 9000, so Go (9000).
  EV(Forecast branch) = .375 x (-1000) + .625 x 9000 = 5250.]
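Assuming the tree as reconstructed above, the full rollback can be sketched as follows (the `ev_go` helper is illustrative):

```python
# Rollback of Walter's tree with the Rainman's forecast (numbers from the slides).
def ev_go(p_rain):
    """EV of the 'Go' branch for a given rain probability."""
    return p_rain * (-15000) + (1 - p_rain) * 10000

ev_cancel = -1000

rainy = max(ev_cancel, ev_go(0.60))  # Go = -5000, so Cancel: -1000
sunny = max(ev_cancel, ev_go(0.04))  # Go = 9000, so Go: 9000
ev_forecast = 0.375 * rainy + 0.625 * sunny
print(round(ev_forecast, 6))  # 5250.0
```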

  17. Value of Information Concepts
  • Perfect information: EVPI = EV(advance info) - EV(no add'l info)
  • Imperfect information: EVSI = EV(tree including test) - EV(no add'l info)
  Note: Gross EVSI is calculated using the EV of a costless test; Net EVSI is calculated using the EV of the test including its cost. EVSI >= 0; EVPI >= EVSI.

  18. Walter's Dog & Pony Show: Decision Tree Solution
  [Same solved tree as Slide 16: forecast Rainy (.375) leads to Cancel (-1000), forecast Sunny (.625) leads to Go (9000), EV = 5250.]
  EVSI = EV(tree including Forecast) - EV(tree with no additional information)
       = 5250 - 3750 = 1500
       = Willingness to pay the Rainman
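The EVSI subtraction, using the two expected values computed earlier in the deck:

```python
# EVSI: value of the (imperfect) Rainman forecast.
ev_with_forecast = 5250  # EV of the tree including the forecast (Slide 16)
ev_no_info = 3750        # EV of the best strategy with no additional info (Slide 4)

evsi = ev_with_forecast - ev_no_info
print(evsi)  # 1500
```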

  19. Key Concepts • How to determine the value of information? • Value-added measures: EVPI and EVSI • How to calculate revised probabilities? • Probability trees and flipped trees • Application of Bayes’ Theorem • Additive and multiplicative laws of probability
