
Conditional Probability


Presentation Transcript


  1. Conditional Probability Notes from Stat 391

  2. Conditional Events • We want to know P(A) for an experiment • Can event B influence P(A)? • Definitely! • Assume B is an experimental condition • We convey the dependence by P(A|B) • Means the probability of A given B • Defined (for P(B) > 0) by P(A|B) = P(A, B)/P(B)
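The definition P(A|B) = P(A, B)/P(B) can be checked exactly on a tiny finite sample space. A minimal Python sketch (the die, the events, and the `prob` helper are illustrative assumptions, not part of the notes):

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die.
omega = set(range(1, 7))

def prob(event):
    # Uniform probability of an event (a subset of omega).
    return Fraction(len(event & omega), len(omega))

A = {2, 4, 6}   # "roll is even"
B = {4, 5, 6}   # "roll is greater than 3"

# P(A|B) = P(A, B) / P(B)
p_A = prob(A)
p_A_given_B = prob(A & B) / prob(B)
# Conditioning on B changes the probability: P(A) = 1/2 but P(A|B) = 2/3
```

Using `Fraction` keeps every probability exact, so the identities on the following slides can be verified with equality rather than floating-point tolerance.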

  3. Examples of Dependence • Rain example: the probability that it rains on a given day in a US city, P(rain) = 0.3 • P(rain|Seattle) = 0.5 • P(rain|Phoenix) = 0.01 • P(rain|Seattle, summer) = 0.25 • Image example (contrived): P(structure) = 0.4 • P(structure|organized lines) = 0.85

  4. Properties of Conditional Probability • For disjoint sets A and C, • P(A ∨ C|B) = P(A ∨ C, B)/P(B) = [P(A, B) + P(C, B)]/P(B) = P(A|B) + P(C|B) • Note: under P(·|B), every outcome outside B has probability 0 • If A ⊆ B, then • P(A|B) = P(A, B)/P(B) = P(A)/P(B) ≥ P(A) • (i.e., if A implies B, then B occurring can only increase the probability of A)
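Both properties can be verified exactly on a single die. A sketch (the particular events A, B, C are hypothetical choices, picked only so that A and C are disjoint and A ⊆ B):

```python
from fractions import Fraction

omega = set(range(1, 7))          # one fair six-sided die

def prob(e):
    return Fraction(len(e & omega), len(omega))

def cond(a, b):
    # P(a|b) = P(a, b) / P(b)
    return prob(a & b) / prob(b)

B = {2, 4, 6}                     # "roll is even"
A, C = {2}, {4}                   # disjoint events, both subsets of B

# Additivity under P(.|B): P(A v C|B) = P(A|B) + P(C|B)
additive = cond(A | C, B) == cond(A, B) + cond(C, B)

# A is a subset of B, so conditioning on B can only raise P(A)
raised = cond(A, B) >= prob(A)
```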

  5. More Properties of Conditional Probability • If B ⊆ A, then • P(A|B) = P(A, B)/P(B) = P(B)/P(B) = 1 • If B implies A, then B occurring makes A certain • If A ∩ B = ∅, then • P(A|B) = P(A, B)/P(B) = 0/P(B) = 0 • If A and B can never occur together, then the probability of A conditioned on B must be 0
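The two boundary cases above can be checked the same way. A short sketch (events chosen so that B ⊆ A in the first case and A ∩ B = ∅ in the second; these choices are illustrative):

```python
from fractions import Fraction

omega = set(range(1, 7))          # one fair six-sided die

def cond(a, b):
    # P(a|b) = P(a, b)/P(b); under the uniform measure this is |a ∩ b| / |b|
    return Fraction(len(a & b), len(b & omega))

A = {2, 4, 6}                     # "roll is even"

p_certain = cond(A, {2})          # B = {2} is a subset of A
p_impossible = cond(A, {1, 3})    # B = {1, 3} is disjoint from A
```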

  6. Conditioning on Several Events • P(A|B, C) = P(A, B, C)/P(B, C) = P(A, B|C)P(C)/[P(B|C)P(C)] = P(A, B|C)/P(B|C) • Same idea as before, but with C as the context
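The identity P(A|B, C) = P(A, B|C)/P(B|C) can be confirmed on two dice. A sketch (the three events are arbitrary illustrative choices):

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))   # two fair dice

def prob(e):
    return Fraction(len(e), len(omega))

A = {w for w in omega if w[0] % 2 == 0}       # "first die is even"
B = {w for w in omega if w[0] + w[1] >= 9}    # "sum is at least 9"
C = {w for w in omega if w[1] == 6}           # "second die shows 6"

# P(A|B, C) = P(A, B, C) / P(B, C)
p = prob(A & B & C) / prob(B & C)

# Same value via the identity, with C as the context:
# P(A, B|C) / P(B|C)
p_ctx = (prob(A & B & C) / prob(C)) / (prob(B & C) / prob(C))
```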

  7. Law of Total Probability • Joint probability of A and B • P(A, B) = P(B)P(A|B) • Marginal probability of A • P(A) = P(A, B) + P(A, B^c) • P(A) = P(B)P(A|B) + P(B^c)P(A|B^c) • Example: Alice at a party (ex. 6, pg. 5)
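The law of total probability holds with exact equality on a finite sample space. A sketch with two dice (the events "sum is 7" and "first die is at most 3" are illustrative, not the Alice example from the notes):

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))   # two fair dice

def prob(e):
    return Fraction(len(e), len(omega))

def cond(a, b):
    return prob(a & b) / prob(b)

A = {w for w in omega if w[0] + w[1] == 7}    # "sum is 7"
B = {w for w in omega if w[0] <= 3}           # "first die is at most 3"
Bc = omega - B                                # complement of B

# P(A) = P(B)P(A|B) + P(B^c)P(A|B^c)
total = prob(B) * cond(A, B) + prob(Bc) * cond(A, Bc)
```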

  8. Bayes’ Rule • From P(A, B) = P(B)P(A|B) = P(A)P(B|A), we get Bayes’ Rule • P(A|B) = P(A)P(B|A)/P(B) • See court example (pg. 6) for applicability
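Bayes' Rule can likewise be verified exactly: a sketch with two dice, where we "invert" the conditioning between two hypothetical events:

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))   # two fair dice

def prob(e):
    return Fraction(len(e), len(omega))

def cond(a, b):
    return prob(a & b) / prob(b)

A = {w for w in omega if w[0] == 6}           # "first die is 6"
B = {w for w in omega if w[0] + w[1] >= 10}   # "sum is at least 10"

# Bayes' Rule: P(A|B) = P(A) P(B|A) / P(B)
lhs = cond(A, B)
rhs = prob(A) * cond(B, A) / prob(B)
```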

  9. Independence • Independence means that knowing whether one event occurred has no effect on the probability of the other • Formally, P(A, B) = P(A)P(B) • Equivalently (for P(B) > 0), P(A|B) = P(A) • Often written A ⊥ B, meaning “A is independent of B”
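The factorization test P(A, B) = P(A)P(B) is easy to apply directly. A sketch: events on separate dice factor, while an event that involves both dice generally does not (the specific events are illustrative assumptions):

```python
from fractions import Fraction
from itertools import product

omega = set(product(range(1, 7), repeat=2))    # two fair dice

def prob(e):
    return Fraction(len(e), len(omega))

A = {w for w in omega if w[0] % 2 == 0}        # "first die is even"
B = {w for w in omega if w[1] > 4}             # "second die is greater than 4"
D = {w for w in omega if w[0] + w[1] >= 10}    # depends on both dice

independent = prob(A & B) == prob(A) * prob(B)   # separate dice: factors
dependent = prob(A & D) != prob(A) * prob(D)     # shared die: does not factor
```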

  10. Example of Independence • Coin tosses are mutually independent • i.e., knowing the outcome of one coin toss doesn’t change the probability of the outcome for another coin toss • Mutually independent events imply pair-wise independence • Pair-wise independence doesn’t imply mutual independence
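The gap between pairwise and mutual independence can be shown with the classic two-coin construction (not an example from the notes): A = "first coin heads", B = "second coin heads", C = "the coins match". Every pair factors, but the triple does not:

```python
from fractions import Fraction
from itertools import product

omega = set(product("HT", repeat=2))      # two fair coin tosses

def prob(e):
    return Fraction(len(e), len(omega))

A = {w for w in omega if w[0] == "H"}     # first coin heads
B = {w for w in omega if w[1] == "H"}     # second coin heads
C = {w for w in omega if w[0] == w[1]}    # the two coins match

# Pairwise independent: every pair factors
pairwise = all(prob(x & y) == prob(x) * prob(y)
               for x, y in [(A, B), (A, C), (B, C)])

# ...but not mutually independent: P(A, B, C) = 1/4, not 1/8
mutual = prob(A & B & C) == prob(A) * prob(B) * prob(C)
```

Knowing any one of A, B, C tells you nothing about any other single one, yet any two together determine the third.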
