
Chapter 3



  1. Chapter 3 “The most important questions of life are, for the most part, really only problems of probability.” --Marquis de Laplace Probability Concepts: Quantifying Uncertainty

  2. The Language of Probability Probability as Long-Run Frequency Pr[head] = .50 because .50 is the long-run frequency of getting a head in many tosses The Random Experiment and Its Elementary Events The elementary events of a random experiment are all of its possible outcomes. Examples: Coin toss: {head, tail} Number of arrivals: {0,1,2,...} Playing card: {♥K, ♦K, ♣K, ♠K,..., ♣A, ♠A} The complete collection of elementary events is the sample space.

  3. The Language of Probability Certain events: Pr[certain event] = 1 (occurs every time) “head or tail” “success or failure” “Dow’s next close is some value.” Impossible events: Pr[impossible event] = 0 (never occurs) “Coin toss has no outcome.” “Both success and failure.” “Dow hits minus 500.”

  4. Finding a Probability • Count-and-Divide • For a randomly selected playing card, Pr[♥Q]=1/52, Pr[queen]=4/52, Pr[♥]=13/52 • Works only when elementary events are equally likely. • Must be able to count possibilities. • May often be logically deduced. • Historical Frequencies • Pr[house burns down]=.003 because in any past year about three out of a thousand such homes did burn down.
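
To make count-and-divide concrete, here is a minimal Python sketch; the deck representation and the `pr` helper are illustrative names, not from the slides.

```python
# Count-and-divide over an explicit 52-card sample space of
# equally likely elementary events.
from fractions import Fraction

ranks = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = [(r, s) for r in ranks for s in suits]  # 52 equally likely cards

def pr(event):
    """Count the favorable elementary events and divide by the total."""
    return Fraction(sum(1 for card in deck if event(card)), len(deck))

print(pr(lambda c: c == ("Q", "hearts")))  # 1/52
print(pr(lambda c: c[0] == "Q"))           # 1/13, i.e. 4/52
print(pr(lambda c: c[1] == "hearts"))      # 1/4, i.e. 13/52
```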

  5. Finding a Probability • Application of a Probability Law • Multiplication law • Addition law • These apply to composite events; the separate individual probabilities must be known first. • Pull Out of “Thin Air” Using Judgment • Pr[Dow rises over 100 tomorrow]=.17 The above is a subjective probability. It measures one person’s “strength of conviction” that the event will occur.

  6. Types of Random Experiments and Probabilities • Repeatable Random Experiments • Producing a microcircuit (satisfactory or defective). • Random arrival of a customer in any minute. Pr[satisfactory] is a long-run frequency and an objective probability. • Non-repeatable Random Experiments • A product is launched and might be a success. • Completing the course (possibly with an “A”). Pr[A] is a judgmental assessment and is a subjective probability.

  7. Joint Probability Table I • Start with a Cross Tabulation (counts for 72 students):

            U     G   Total
   M       24    16      40
   F       18    14      32
   Total   42    30      72

  8. Joint Probability Table I • Count and Divide: divide every cell count by 72. The respective joint probabilities appear in the interior cells, and the marginal probabilities appear in the margin for the respective row or column.

            U       G     Total
   M      24/72   16/72   40/72
   F      18/72   14/72   32/72
   Total  42/72   30/72   72/72 = 1

  9. Joint and Marginal Probabilities • The joint probabilities involve “and” for two or more events: • Pr[M and U] = 24/72 • Pr[G and F] = 14/72 • The marginal probabilities involve single events: • Pr[M] = 40/72 Pr[F] = 32/72 • Pr[U] = 42/72 Pr[G] = 30/72 • The term “marginal probability” derives from the position of that particular value: It lies in the margin of the joint probability table.
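
The joint and marginal probabilities above can be recovered mechanically from the cross-tabulation counts; a minimal sketch (the dictionary layout is illustrative):

```python
# Joint probabilities: count and divide each interior cell; marginal
# probabilities: sum the joint probabilities over the other event.
from fractions import Fraction

counts = {("M", "U"): 24, ("M", "G"): 16,
          ("F", "U"): 18, ("F", "G"): 14}
total = sum(counts.values())  # 72 students

joint = {cell: Fraction(n, total) for cell, n in counts.items()}
pr_M = joint[("M", "U")] + joint[("M", "G")]
pr_U = joint[("M", "U")] + joint[("F", "U")]
print(joint[("M", "U")], pr_M, pr_U)  # 1/3 (=24/72), 5/9 (=40/72), 7/12 (=42/72)
```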

  10. The Multiplication Law for Finding “And” Probabilities • Multiplication Law for Independent Events Pr[A and B] = Pr[A] × Pr[B] • Example: Two lop-sided coins are tossed. Coin 1 is concave, so that Pr[H1]=.15 (estimated from repeated tosses) Coin 2 is altered, so that Pr[H2]=.60 Pr[both heads] = Pr[H1 and H2] = Pr[H1] × Pr[H2] = .15 × .60 = .09
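
A quick simulation sketch of the two lop-sided coins (probabilities taken from the slide): the long-run frequency of “both heads” should settle near .15 × .60 = .09, illustrating probability as long-run frequency.

```python
import random

random.seed(1)  # reproducible runs
trials = 100_000
both_heads = sum(1 for _ in range(trials)
                 if random.random() < 0.15 and random.random() < 0.60)
print(both_heads / trials)  # roughly .09
```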

  11. Need for Multiplication Law • Use the multiplication law when counting and dividing won’t work, because: • Elementary events are not equally likely. • Sample space is impossible to enumerate. • Only component event probabilities are known. This multiplication law only works for independent events. That would be the case only if H2 were unaffected by the occurrence of H1. That would not be the case, for example, if the altered coin were tossed only if the first coin was a tail.

  12. The Addition Law for Finding “Or” Probabilities • The following addition law applies to mutually exclusive events: Pr[A or B or C] = Pr[A] + Pr[B] + Pr[C] • For example, suppose the 72 students on a previous slide involved 16 accounting majors, 24 finance majors, and 20 marketing majors. One is chosen randomly. Then, Pr[accounting or finance or marketing] = Pr[accounting] + Pr[finance] + Pr[marketing] = 16/72 + 24/72 + 20/72 = 60/72
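
A one-line arithmetic check of the slide’s addition-law example, using exact fractions:

```python
from fractions import Fraction

# Mutually exclusive majors: the "or" probability is the sum.
print(Fraction(16, 72) + Fraction(24, 72) + Fraction(20, 72))  # 5/6, i.e. 60/72
```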

  13. Why Have the Addition Law? • The addition law is needed when: • Only the component probabilities are known. • It is faster or simpler than counting and dividing. • There is insufficient information for counting and dividing. (It was not essential for the preceding example, since we could have instead used the fact that 60 out of 72 students had the majors in question.) The preceding addition law requires mutually exclusive events. (Joint majors invalidate it.)

  14. Must We Use a Law? • Use a probability law only when it is helpful. • Counting and dividing may be easier to do. • But, a law may not work. For example: Pr[Queen and Face card] ≠ Pr[Queen] × Pr[Face] (because Queen and Face are not independent) Pr[Queen or ♥] ≠ Pr[Queen] + Pr[♥] (because Queen and ♥ overlap at the ♥Q, making them not mutually exclusive)

  15. Some Important Properties Resulting from the Addition Law • The cells of a joint probability table sum to the respective marginal probabilities. Thus, • Pr[M] = Pr[M and U] + Pr[M and G] • Pr[G] = Pr[M and G] + Pr[F and G] • When events A, B, C are both collectively exhaustive and mutually exclusive: • Pr[A or B or C] = 1 (due to certainty) • Pr[A] + Pr[B] + Pr[C] = 1 (by addition law) • Complementary Events: Pr[A or not A] = 1 • Pr[A] + Pr[not A] = 1 and Pr[A] = 1 – Pr[not A]

  16. Independence Defined • A and B are independent whenever Pr[A and B] = Pr[A] × Pr[B] and otherwise not independent. • A and B are independent if Pr[A] is always the same regardless of whether: • B occurs. • B does not occur. • Nothing is known about the occurrence or non-occurrence of B.

  17. Establishing Independence • Yes, if multiplication gives the correct result. • 2 fair coin tosses: Pr[H1] × Pr[H2] = Pr[H1 and H2] H1 and H2 are independent if the above holds. • No, otherwise. Consider Queen and Face card: Pr[Face card] = 12/52 Pr[Queen] = 4/52 The product 12/52 × 4/52 = .018 is not equal to Pr[Face card and Queen] = 4/52 = .077 • Can be self-evident (assumed): • Sex of two randomly chosen people. • Person’s height and political affiliation.
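
The Queen/Face check can be carried out by counting over an explicit deck; a minimal sketch (the event helpers are illustrative names):

```python
from fractions import Fraction

ranks = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"]
suits = ["H", "D", "C", "S"]
deck = [(r, s) for r in ranks for s in suits]

def pr(event):
    return Fraction(sum(1 for c in deck if event(c)), len(deck))

queen = lambda c: c[0] == "Q"
face = lambda c: c[0] in ("J", "Q", "K")

# Independent only if the multiplication law reproduces the joint probability.
print(pr(lambda c: queen(c) and face(c)) == pr(queen) * pr(face))  # False
```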

  18. Conditional Probability • The conditional probability of event A given event B is denoted as Pr[A | B]. • Pr[head | tail] = 0 (single toss) • Pr[cloudy | rain] = 1 (always) • Pr[rain | cloudy] = .15 (could be) • Pr[Queen | Face card] = 4/12 (regular cards) • Pr[get A for course | 100% on final] = .85 • Pr[pass screen test | good performance] = .65 • Pr[good performance | pass screen test] = .90

  19. Independence and Conditional Probability • Two events A, B are independent if Pr[A | B] = Pr[A] and when that holds, it must also be true that Pr[B | A] = Pr[B] • Independence can be tested by comparing the conditional and unconditional probabilities. Thus, since Pr[Queen] = 4/52 ≠ 4/12 = Pr[Queen | Face] the events Queen and Face are not independent.

  20. General Multiplication Law • The general multiplication law is: Pr[A and B] = Pr[A] × Pr[B | A] or, equivalently, Pr[A and B] = Pr[B] × Pr[A | B] This is the most important multiplication law, because it always works, whether or not A and B are independent.

  21. Applying the General Multiplication Law Two out of 10 recording heads are tested and destroyed; of the 10, 2 are defective (D) and 8 are satisfactory (S). With 1 and 2 denoting the successive selections, Pr[D1] = 2/10 and Pr[D2 | D1] = 1/9 Pr[S1] = 8/10 and Pr[D2 | S1] = 2/9 Thus, Pr[D1 and D2] = Pr[D1] × Pr[D2 | D1] = 2/10 × 1/9 = 2/90 Pr[S1 and D2] = Pr[S1] × Pr[D2 | S1] = 8/10 × 2/9 = 16/90 And, Pr[D2] = Pr[D1 and D2] + Pr[S1 and D2] = 2/90 + 16/90 = 18/90 = .2
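
The arithmetic can be verified by brute force, enumerating every equally likely ordered draw of 2 heads without replacement; a minimal sketch:

```python
from fractions import Fraction
from itertools import permutations

heads = ["D"] * 2 + ["S"] * 8
draws = list(permutations(heads, 2))  # 10 × 9 = 90 equally likely ordered pairs

pr_D1_and_D2 = Fraction(sum(1 for a, b in draws if a == "D" and b == "D"), len(draws))
pr_D2 = Fraction(sum(1 for _, b in draws if b == "D"), len(draws))
print(pr_D1_and_D2)  # 1/45, i.e. 2/90
print(pr_D2)         # 1/5, i.e. 18/90 = .2
```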

  22. Conditional Probability Identity Dividing both sides of the multiplication law by Pr[A] and rearranging terms establishes the conditional probability identity: Pr[B | A] = Pr[A and B] / Pr[A] The identity might not provide the answer. We couldn’t use it to find Pr[D2 | D1], because that value was itself used to compute Pr[D1 and D2]; rather, Pr[D2 | D1] = 1/9 followed from 1 out of the 9 items available for the final selection being defective.

  23. Joint Probability Table II • Use Probability Laws to Construct: • Knowing that • 5% of Gotham City adults drive drunk (D) • 12% are alcoholics (A) • 40% of all drunk drivers are alcoholics • We have for a randomly selected adult • Pr[D] = .05 Pr[A] = .12 • Pr[A | D] = .40 • The multiplication law provides: • Pr[A and D] = Pr[D] × Pr[A | D] = .05(.40) = .02 • And the following joint probability table is constructed.

  24. Joint Probability Table II • The given values are Pr[A and D] = .02, Pr[D] = .05, and Pr[A] = .12 (shown in black on the original slide); the remaining values (blue on the original slide) follow from the addition-law properties, since each margin is the sum of its interior cells:

            D     not D   Total
   A       .02     .10     .12
   not A   .03     .85     .88
   Total   .05     .95    1.00
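
A sketch that rebuilds the table from the three given values, using the general multiplication law and the addition-law properties (exact fractions avoid floating-point noise):

```python
from fractions import Fraction

pr_D = Fraction(5, 100)           # Pr[drunk driver]
pr_A = Fraction(12, 100)          # Pr[alcoholic]
pr_A_given_D = Fraction(40, 100)  # Pr[A | D]

pr_A_and_D = pr_D * pr_A_given_D           # multiplication law: .02
pr_A_and_notD = pr_A - pr_A_and_D          # row A must sum to its margin: .10
pr_notA_and_D = pr_D - pr_A_and_D          # column D must sum to its margin: .03
pr_notA_and_notD = 1 - pr_A_and_D - pr_A_and_notD - pr_notA_and_D  # .85
print([float(p) for p in (pr_A_and_D, pr_A_and_notD, pr_notA_and_D, pr_notA_and_notD)])
```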

  25. Probability Trees • A probability tree displays each stage of a random experiment as branches; multiplying the conditional probabilities along a path gives that path’s joint probability.
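
A probability tree for the slide-21 recording-heads example can be sketched in code (the nested-dictionary layout is illustrative): branch probabilities multiply along each path, and paths ending in the same event add.

```python
from fractions import Fraction

# Each first-stage branch carries its probability and its second-stage
# branches (conditional probabilities given the first-stage outcome).
tree = {
    "D1": (Fraction(2, 10), {"D2": Fraction(1, 9), "S2": Fraction(8, 9)}),
    "S1": (Fraction(8, 10), {"D2": Fraction(2, 9), "S2": Fraction(7, 9)}),
}

paths = {(first, second): p1 * p2
         for first, (p1, branches) in tree.items()
         for second, p2 in branches.items()}

print(paths[("D1", "D2")])                        # 1/45 (= 2/90)
print(paths[("D1", "D2")] + paths[("S1", "D2")])  # Pr[D2] = 1/5
```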

  26. Prior and Posterior Probabilities • A prior probability pertains to a main event and may be found: • By judgment (subjective). • Or by logic or history (objective). • Example: Pr[oil] = .10 (subjective) • A posterior probability is a conditional probability for the main event given a particular experimental result. For example, • Pr[oil | favorable seismic] = .4878 (greater) • Pr[oil | unfavorable seismic] = .0456 (smaller) A favorable result raises the main event’s probability.

  27. Conditional Result Probabilities • Posterior probabilities are computed. That requires a conditional result probability: • Pr[result | event] (like a past “batting average”) • Example: 60% of all known oil fields have yielded favorable seismics, so that Pr[favorable | oil] = .60 (historical and objective) and 7% of all dry holes have yielded favorable seismics, so that Pr[favorable | not oil] = .07 (objective)

  28. Bayes’ Theorem: Computing Posterior Probability • Several laws of probability combine to merge the components to compute the posterior probability: Pr[O | F] = Pr[O] × Pr[F | O] / (Pr[O] × Pr[F | O] + Pr[not O] × Pr[F | not O]) For example, Pr[oil | favorable] = (.10)(.60) / ((.10)(.60) + (.90)(.07)) = .06/.123 = .4878

  29. Computing Posterior Probability • Example, continued: Pr[oil | unfavorable] = (.10)(.40) / ((.10)(.40) + (.90)(.93)) = .04/.877 = .0456 • Posterior probabilities might also be computed from the conditional probability identity: Pr[O | F] = Pr[O and F] / Pr[F]
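
The whole Bayes’-theorem computation fits in a few lines; a sketch with the oil example’s numbers (the `posterior` helper is an illustrative name):

```python
def posterior(prior, pr_result_given_event, pr_result_given_not_event):
    """Bayes' theorem for the two-event partition {event, not event}."""
    joint = prior * pr_result_given_event
    return joint / (joint + (1 - prior) * pr_result_given_not_event)

pr_oil = 0.10
print(posterior(pr_oil, 0.60, 0.07))          # favorable seismic: ~.4878
print(posterior(pr_oil, 1 - 0.60, 1 - 0.07))  # unfavorable seismic: ~.0456
```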

  30. Computing Posterior Probability from Joint Probability Table • Fill in a joint probability table. The given values are Pr[O] = .10, Pr[F | O] = .60, and Pr[F | not O] = .07 (shown in black on the original slide); apply the multiplication and addition laws for the rest (blue):

            F      not F   Total
   O       .060    .040    .100
   not O   .063    .837    .900
   Total   .123    .877   1.000

  31. Computing Posterior Probability from Joint Probability Table • Apply the conditional probability identity, lifting the needed values from the table: • Pr[O | F] = .06/.123 = .4878 • Pr[O | not F] = .04/.877 = .0456 • The following are known values: • Prior probabilities (from earlier judgment). • Conditional result probabilities (from testing the tester).
