
Chapter 8 Revising Judgments in the Light of New Information


Presentation Transcript


  1. Chapter 8 Revising Judgments in the Light of New Information

  2. In this chapter we will look at the process of revising initial probability estimates in the light of new information.

  3. Bayes’ theorem: Prior probability + New information → Posterior probability
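
In symbols, Bayes' theorem combines these three components as follows, where E is the event of interest and I is the new information:

$$ p(E \mid I) = \frac{p(I \mid E)\, p(E)}{p(I \mid E)\, p(E) + p(I \mid \bar{E})\, p(\bar{E})} $$

The prior p(E) is revised by the likelihood of the new information to give the posterior p(E | I).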

  4. The components problem (Fig. 8.1)

  5. In total, we would expect 410 (i.e. 140 + 270) components to fail the test. Now the component you selected is one of these 410 components. Of these, only 140 are 'OK', so your posterior probability that the component is 'OK' should be 140/410, which is 0.341, i.e. P(component OK|failed test) = 140/410 = 0.341

  6. Applying Bayes’ theorem to the components problem (Fig. 8.2)

  7. The steps in the process which we have just applied are summarized below (a code sketch of the same procedure follows the list):
(1) Construct a tree with branches representing all the possible events which can occur and write the prior probabilities for these events on the branches.
(2) Extend the tree by attaching to each branch a new branch which represents the new information which you have obtained. On each branch write the conditional probability of obtaining this information given the circumstance represented by the preceding branch.
(3) Obtain the joint probabilities by multiplying each prior probability by the conditional probability which follows it on the tree.
(4) Sum the joint probabilities.
(5) Divide the 'appropriate' joint probability by the sum of the joint probabilities to obtain the required posterior probability.
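
Here is a minimal Python sketch of steps (1)-(5). The demonstration inputs are an assumption: priors of 0.7/0.3 and test-failure probabilities of 0.2/0.9, chosen because they reproduce the expected failure counts of 140 and 270 used in the components problem above.

```python
def posterior(priors, likelihoods):
    """Steps (1)-(5): multiply each prior by the conditional probability
    of the observed information, sum the joints, then divide through."""
    joints = [p * l for p, l in zip(priors, likelihoods)]   # step (3)
    total = sum(joints)                                     # step (4)
    return [j / total for j in joints]                      # step (5)

# Assumed inputs: priors 0.7 ('OK') / 0.3 ('not OK') and failure
# probabilities 0.2 / 0.9, which give 140 and 270 expected failures
# per 1000 components, matching the counts in the problem above.
print(posterior([0.7, 0.3], [0.2, 0.9]))  # [0.3414..., 0.6585...] i.e. 140/410 and 270/410
```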

  8. Example: An engineer makes a cursory inspection of a piece of equipment and estimates that there is a 75% chance that it is running at peak efficiency. He then receives a report that the operating temperature of the machine is exceeding 80° C. Past records of operating performance suggest that there is only a 0.3 probability of this temperature being exceeded when the machine is working at peak efficiency. The probability of the temperature being exceeded if the machine is not working at peak efficiency is 0.8. What should be the engineer's revised probability that the machine is operating at peak efficiency? Refer to Fig. 8.3
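
Applying the five-step procedure to the engineer's figures (prior 0.75; likelihoods 0.3 and 0.8 for the temperature report):

$$ p(\text{peak} \mid \text{temp.\ exceeded}) = \frac{0.75 \times 0.3}{0.75 \times 0.3 + 0.25 \times 0.8} = \frac{0.225}{0.425} \approx 0.529 $$

So the temperature report should lower the engineer's probability that the machine is operating at peak efficiency from 0.75 to about 0.53.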

  9. Another example (more than two events) • A company's sales manager estimates that there is a 0.2 probability that sales in the coming year will be high, a 0.7 probability that they will be medium and a 0.1 probability that they will be low. She then receives a sales forecast from her assistant and the forecast suggests that sales will be high. By examining the track record of the assistant's forecasts she is able to obtain the following probabilities:

  10. p(high sales forecast given that the market will generate high sales) = 0.9
p(high sales forecast given that the market will generate only medium sales) = 0.6
p(high sales forecast given that the market will generate only low sales) = 0.3
Refer to Fig. 8.4

  11. We obtain the following posterior probabilities:
p(high sales | high sales forecast) = 0.18/0.63 = 0.2857
p(medium sales | high sales forecast) = 0.42/0.63 = 0.6667
p(low sales | high sales forecast) = 0.03/0.63 = 0.0476
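
The same calculation in code, using only the figures quoted above:

```python
priors      = [0.2, 0.7, 0.1]   # high, medium, low sales
likelihoods = [0.9, 0.6, 0.3]   # p(high sales forecast | each state)

joints = [p * l for p, l in zip(priors, likelihoods)]   # 0.18, 0.42, 0.03
total  = sum(joints)                                    # 0.63 = p(high sales forecast)
print([round(j / total, 4) for j in joints])            # [0.2857, 0.6667, 0.0476]
```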

  12. The effect of new information on the revision of probability judgments • It is interesting to explore the relative influence which prior probabilities and new information have on the resulting posterior probabilities. • Consider a situation where the geologist is not very confident about his prior probabilities and where the test drilling is very reliable.

  13. Vague priors and very reliable information

  14. The posterior probabilities depend only upon the reliability of the new information. The 'vague' prior probabilities have had no influence on the result.
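
The geologist's exact figures are not reproduced in this transcript, so here is the effect with illustrative numbers: take maximally vague priors of 0.5/0.5 and suppose the test drilling gives a correct indication 95% of the time (both values assumed purely for illustration). The posterior then simply equals the test's reliability:

$$ p(E \mid I) = \frac{0.5 \times 0.95}{0.5 \times 0.95 + 0.5 \times 0.05} = 0.95 $$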

  15. A more general view of the relationship between the 'vagueness' of the prior probabilities and the reliability of the new information can be seen in Figure 8.6.

  16. The effect of the reliability of information on the modification of prior probabilities (Fig. 8.6)

  17. If the test drilling has only a 50% probability of giving a correct result, then its result will be of no value and the posterior probability will equal the prior, as shown by the diagonal line on the graph.
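
To see why, substitute a reliability of 0.5 into Bayes' theorem; the likelihoods cancel and the posterior equals the prior:

$$ p(E \mid I) = \frac{0.5\, p(E)}{0.5\, p(E) + 0.5\,(1 - p(E))} = p(E) $$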

  18. The more reliable the new information, the greater will be the modification of the prior probabilities. For any given level of reliability, however, this modification is relatively small either where the prior probability is high, or where the prior probability is very small.

  19. At the extreme, if your prior probability of an event occurring is zero then the posterior probability will also be zero. • In general, assigning prior probabilities of zero or one is unwise.
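
This follows directly from the tree procedure: a prior of zero makes the corresponding joint probability zero, and no amount of new information can revive it:

$$ p(E \mid I) = \frac{0 \times p(I \mid E)}{0 \times p(I \mid E) + 1 \times p(I \mid \bar{E})} = 0 $$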

  20. Applying Bayes’ theorem to a decision problem

Payoff table:
Decision            Low sales    High sales
Hold small stocks   $80 000      $140 000
Hold large stocks   $20 000      $220 000

The retailer's utility function:
Profit    $20 000   $80 000   $140 000   $220 000
Utility   0         0.5       0.8        1.0

  21. The retailer estimates that there is a 0.4 probability that sales will be low and a 0.6 probability that they will be high. What level of stocks should he hold? In Figure 8.7(a) it can be seen that his expected utility is maximized if he decides to hold a small stock of the commodity.
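
A quick check of the expected utilities under the prior probabilities (all figures from the table above):

```python
p_low, p_high = 0.4, 0.6
utility = {20000: 0.0, 80000: 0.5, 140000: 0.8, 220000: 1.0}

eu_small = p_low * utility[80000] + p_high * utility[140000]  # 0.4*0.5 + 0.6*0.8 = 0.68
eu_large = p_low * utility[20000] + p_high * utility[220000]  # 0.4*0.0 + 0.6*1.0 = 0.60
print(eu_small, eu_large)  # small stocks give the higher expected utility
```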

  22. The retailer’s problem with prior probabilities

  23. Before implementing his decision the retailer receives a sales forecast which suggests that sales will be high.
P(forecast of high sales|high sales) = 0.75
P(forecast of high sales|low sales) = 0.2

  24. Applying Bayes’ theorem to the retailer’s problem

  25. Applying posterior probabilities to the retailer’s problem
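
A minimal sketch combining the two steps just cited: the Bayes update for the 'high sales' forecast, followed by the revised expected utilities. Every input appears earlier in the chapter; the conclusion follows from the arithmetic:

```python
# Bayes update after a forecast of high sales
p_low, p_high = 0.4, 0.6                      # prior probabilities
lik_low, lik_high = 0.2, 0.75                 # p(forecast high | each state)
joint_low, joint_high = p_low * lik_low, p_high * lik_high   # 0.08, 0.45
total = joint_low + joint_high                               # p(forecast high) = 0.53
post_low, post_high = joint_low / total, joint_high / total  # ~0.151, ~0.849

# Re-evaluate the decision with the posterior probabilities
utility = {20000: 0.0, 80000: 0.5, 140000: 0.8, 220000: 1.0}
eu_small = post_low * utility[80000] + post_high * utility[140000]  # ~0.755
eu_large = post_low * utility[20000] + post_high * utility[220000]  # ~0.849
print(eu_small, eu_large)  # the forecast now favors holding large stocks
```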

  26. Assessing the value of new information
New information can remove or reduce the uncertainty involved in a decision and thereby increase the expected payoff. The question is whether it is worth obtaining the information in the first place or, if there are several potential sources of information, which one is to be preferred.

  27. The expected value of perfect information • The concept of the expected value of perfect information (EVPI) can still be useful. • A problem is used to show how the value of perfect information can be measured. • For simplicity, we will assume that the decision maker is neutral to risk so that the expected monetary value criterion can be applied. • Refer to the following figure (described on page 227).
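
In symbols (with the risk-neutrality assumption above):

$$ \text{EVPI} = \text{expected payoff with perfect information} - \text{expected payoff without the information} $$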

  28. Determining the EVPI (Fig. 8.8)

  29. Calculating the EVPI

  30. Even if the test were perfectly accurate, it would not be worth paying more than $15 000 for it. In practice the test is likely to be less than perfect, in which case the information it yields will be of less value. Nevertheless, the EVPI can be very useful in giving an upper bound to the value of new information.

  31. If the manager is risk averse or risk seeking, or if he also has non-monetary objectives, then it may be worth paying more or less than this amount.

  32. The expected value of imperfect information • Suppose that, after making further enquiries, the farm manager discovers that the Ceres test is not perfectly reliable. • If the virus is still present in the soil the test has only a 90% chance of detecting it, while if the virus has been eliminated there is a 20% chance that the test will incorrectly indicate its presence. • How much would it now be worth paying for the test?

  33. Deciding whether to buy imperfect information

  34. If test indicates virus is present

  35. If test indicates virus is absent

  36. Determining the EVII

  37. Expected profit with imperfect information = $62 155
Expected profit without the information = $57 000
Expected value of imperfect information (EVII) = $62 155 − $57 000 = $5 155
Refer to page 232.

  38. It would not, therefore, be worth paying Ceres more than $5 155 for the test. You will recall that the expected value of perfect information was $15 000, so the value of information from this test is much less than that from a perfectly reliable test. Of course, the more reliable the new information, the closer its expected value will be to the EVPI.

  39. A summary of the main stages
(1) Determine the course of action which would be chosen using only the prior probabilities and record the expected payoff of this course of action;
(2) Identify the possible indications which the new information can give;
(3) For each indication:
(a) Determine the probability that this indication will occur;
(b) Use Bayes' theorem to revise the probabilities in the light of this indication;
(c) Determine the best course of action in the light of this indication (i.e. using the posterior probabilities) and the expected payoff of this course of action;

  40. (4) Multiply the probability of each indication occurring by the expected payoff of the course of action which should be taken if that indication occurs and sum the resulting products. This will give the expected payoff with imperfect information;
(5) The expected value of the imperfect information is equal to the expected payoff with imperfect information (derived in stage 4) less the expected payoff of the course of action which would be selected using the prior probabilities (which was derived in stage 1).
A code sketch of these stages is given below.
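
Here is a minimal Python sketch of stages (1)-(5). The demonstration reuses the retailer's payoffs and forecast reliabilities from earlier in the chapter but assumes a risk-neutral decision maker; since the chapter's own retailer analysis used utilities, the $7200 figure below illustrates the method rather than reproducing a result from the text.

```python
def expected_value_of_imperfect_info(priors, payoffs, likelihoods):
    """Stages (1)-(5) of the summary above.

    priors:      {state: prior probability}
    payoffs:     {action: {state: payoff}}
    likelihoods: {indication: {state: p(indication | state)}}
    """
    def best_ev(probs):
        # Expected payoff of the best course of action under these probabilities.
        return max(sum(probs[s] * pay[s] for s in probs) for pay in payoffs.values())

    ev_prior = best_ev(priors)                           # stage (1)
    ev_with_info = 0.0
    for lik in likelihoods.values():                     # stage (2): each indication
        p_ind = sum(priors[s] * lik[s] for s in priors)  # stage (3a)
        posteriors = {s: priors[s] * lik[s] / p_ind for s in priors}  # stage (3b)
        ev_with_info += p_ind * best_ev(posteriors)      # stages (3c) and (4)
    return ev_with_info - ev_prior                       # stage (5)

# Illustration with the retailer's figures, assuming risk neutrality:
priors = {"low": 0.4, "high": 0.6}
payoffs = {
    "small stocks": {"low": 80_000, "high": 140_000},
    "large stocks": {"low": 20_000, "high": 220_000},
}
likelihoods = {
    "forecast high": {"low": 0.2, "high": 0.75},
    "forecast low":  {"low": 0.8, "high": 0.25},
}
print(expected_value_of_imperfect_info(priors, payoffs, likelihoods))  # ≈ 7200.0
```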
