Probabilistic Inference

Presentation Transcript


  1. Probabilistic Inference

  2. Agenda • Review of probability theory • Joint, conditional distributions • Independence • Marginalization and conditioning • Intro to Bayesian Networks

  3. Probabilistic Belief • Consider a world where a dentist agent D meets with a new patient P • D is interested only in whether P has a cavity; so, a state is described with a single proposition – Cavity • Before observing P, D does not know if P has a cavity, but from years of practice, he believes Cavity with some probability p and ¬Cavity with probability 1-p • The proposition is now a boolean random variable and (Cavity, p) is a probabilistic belief

  4. Probabilistic Belief State • The world has only two possible states, described by Cavity and ¬Cavity respectively • The probabilistic belief state of an agent is a probability distribution over all the states that the agent thinks possible • In the dentist example, D's belief state is: P(Cavity) = p, P(¬Cavity) = 1-p

  5. Multivariate Belief State • We now represent the world of the dentist D using three propositions – Cavity, Toothache, and PCatch • D's belief state consists of 2^3 = 8 states, each with some probability: {Cavity∧Toothache∧PCatch, Cavity∧Toothache∧¬PCatch, Cavity∧¬Toothache∧PCatch, ...}

  6. The belief state is defined by the full joint probability of the propositions:

                    Toothache              ¬Toothache
                 PCatch   ¬PCatch        PCatch   ¬PCatch
     Cavity      0.108    0.012          0.072    0.008
     ¬Cavity     0.016    0.064          0.144    0.576

  7. Probabilistic Inference • P(Cavity ∨ Toothache) = 0.108 + 0.012 + ... = 0.28

  8. Probabilistic Inference • P(Cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2

  9. Probabilistic Inference • Marginalization: P(c) = Σt Σpc P(c∧t∧pc), using the conventions that c = Cavity or ¬Cavity and that Σt is the sum over t ∈ {Toothache, ¬Toothache}
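
A minimal sketch of this marginalization in Python (not from the slides): the joint table above as a dict keyed by (cavity, toothache, pcatch) truth values. The 0.576 entry is the one forced by the eight entries summing to 1.

```python
# Full joint distribution of the dentist example, keyed by
# (cavity, toothache, pcatch) truth values.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def p_cavity(c):
    """Marginalization: P(Cavity=c) = sum over t and pc of P(c, t, pc)."""
    return sum(p for (cav, t, pc), p in joint.items() if cav == c)

print(p_cavity(True))                                        # 0.2
print(sum(p for (c, t, pc), p in joint.items() if c or t))   # P(Cavity v Toothache) = 0.28
```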

  10. Kolmogorov's Probability Axioms • 0 ≤ P(a) ≤ 1 • P(true) = 1, P(false) = 0 • P(a ∨ b) = P(a) + P(b) - P(a ∧ b)

  11. Decoding Probability Notation • Capital letters A, B, C denote random variables • Each random variable X can take one of a set of possible values x ∈ Val(X) • Confusingly, this is often left implicit • Probabilities are defined over logical statements, e.g., P(X=x) • Confusingly, this is often written P(x)…

  12. Decoding Probability Notation • The interpretation of P(A) depends on whether A is treated as a logical statement or as a random variable • As a logical statement • Just the probability that A is true • P(¬A) = 1 - P(A) • As a random variable • Denotes both P(A is true) and P(A is false) • For non-boolean variables, denotes P(A=a) for all a ∈ Val(A) • Computation requires marginalization • The belief space A, B, C, D, … is often left implicit • Marginalize the joint distribution over all combinations of B, C, D, … (the event space)

  13. Decoding Probability Notation • How to interpret P(A∧B) = P(A,B)? • As a logical statement • The probability that both A and B are true • As random variables • {P(A=0,B=0), P(A=1,B=0), P(A=0,B=1), P(A=1,B=1)} • In general, P(A=a,B=b) for all a ∈ Val(A), b ∈ Val(B) • Equal to P(A)P(B)? No!!! (except under very special conditions)

  14. Quiz: What does this mean? P(A∨B) = P(A) + P(B) - P(A∧B) • It means P(A=a ∨ B=b) = P(A=a) + P(B=b) - P(A=a ∧ B=b) for all a ∈ Val(A) and b ∈ Val(B)

  15. Marginalization • Express P(A,B) in terms of P(A,B,C) • If C is boolean: P(A,B) = P(A,B,C=0) + P(A,B,C=1) • In general: P(A,B) = Σc∈Val(C) P(A,B,C=c)
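
The same idea as a generic helper, reusing the `joint` dict from the earlier sketch: sum out one variable, identified by its position in the assignment tuple.

```python
def marginalize(joint, var_index):
    """Return the distribution with the variable at var_index summed out."""
    out = {}
    for assignment, p in joint.items():
        key = assignment[:var_index] + assignment[var_index + 1:]
        out[key] = out.get(key, 0.0) + p
    return out

p_ct = marginalize(joint, 2)   # sum out PCatch (position 2)
print(p_ct[(True, True)])      # P(Cavity, Toothache) = 0.108 + 0.012 = 0.12
```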

  16. Conditional Probability • P(A∧B) = P(A|B) P(B) = P(B|A) P(A) • P(A|B) is the posterior probability of A given B • Axiomatic definition: P(A|B) = P(A,B)/P(B)

  17. P(Cavity|Toothache) = P(Cavity ∧ Toothache)/P(Toothache) = (0.108 + 0.012)/(0.108 + 0.012 + 0.016 + 0.064) = 0.6 • Interpretation: after observing Toothache, the patient is no longer an "average" one, and the prior probability (0.2) of Cavity is no longer valid • P(Cavity|Toothache) is calculated by keeping the ratios of the probabilities of the 4 Toothache cases unchanged, and normalizing their sum to 1

  18. Normalization constant • P(Cavity|Toothache) = P(Cavity ∧ Toothache)/P(Toothache) = (0.108 + 0.012)/(0.108 + 0.012 + 0.016 + 0.064) = 0.6 • P(¬Cavity|Toothache) = P(¬Cavity ∧ Toothache)/P(Toothache) = (0.016 + 0.064)/(0.108 + 0.012 + 0.016 + 0.064) = 0.4 • P(C|Toothache) = (P(Cavity|Toothache), P(¬Cavity|Toothache)) = α P(C ∧ Toothache) = α Σpc P(C ∧ Toothache ∧ pc) = α [(0.108, 0.016) + (0.012, 0.064)] = α (0.12, 0.08) = (0.6, 0.4), where α is the normalization constant
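
The filter-and-renormalize computation sketched in Python, reusing the `joint` dict from earlier; dividing by `z` (which equals P(Toothache)) is exactly the α-normalization on the slide.

```python
def p_cavity_given_toothache(joint):
    """P(Cavity | Toothache): keep the Toothache rows, normalize to 1."""
    unnorm = {c: sum(p for (cav, t, pc), p in joint.items() if cav == c and t)
              for c in (True, False)}
    z = sum(unnorm.values())     # P(Toothache) = 0.2
    return {c: p / z for c, p in unnorm.items()}

print(p_cavity_given_toothache(joint))   # {True: 0.6, False: 0.4}
```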

  19. Conditioning • Express P(A) in terms of P(A|B) and P(B) • If B is boolean: P(A) = P(A|B=0) P(B=0) + P(A|B=1) P(B=1) • In general: P(A) = Σb∈Val(B) P(A|B=b) P(B=b)
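
A quick numeric sanity check of this identity on the dentist numbers (all values read off the joint table above):

```python
# P(Cavity) = P(Cavity|t) P(t) + P(Cavity|~t) P(~t)
p_t = 0.2                      # P(Toothache)
p_c_given_t = 0.12 / 0.2       # P(Cavity | Toothache)  = 0.6
p_c_given_not_t = 0.08 / 0.8   # P(Cavity | ~Toothache) = 0.1
print(p_c_given_t * p_t + p_c_given_not_t * (1 - p_t))   # 0.2 = P(Cavity)
```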

  20. Conditional Probability • P(A∧B) = P(A|B) P(B) = P(B|A) P(A) • P(A∧B∧C) = P(A|B,C) P(B∧C) = P(A|B,C) P(B|C) P(C) • P(Cavity) = Σt Σpc P(Cavity∧t∧pc) = Σt Σpc P(Cavity|t,pc) P(t∧pc)

  21. Independence • Two random variables A and B are independent if P(A∧B) = P(A) P(B), hence if P(A|B) = P(A) • Two random variables A and B are independent given C if P(A∧B|C) = P(A|C) P(B|C), hence if P(A|B,C) = P(A|C)
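
For instance, Cavity and Toothache in the dentist table fail the first test, so they are dependent (numbers from the joint table above):

```python
p_c = 0.2                # P(Cavity)
p_t = 0.2                # P(Toothache)
p_c_and_t = 0.12         # P(Cavity ∧ Toothache) = 0.108 + 0.012
print(p_c_and_t, p_c * p_t)   # 0.12 vs 0.04, so not independent
```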

  22. Updating the Belief State • Let D now observe Toothache with probability 0.8 (e.g., "the patient says so") • How should D update its belief state?

  23. Updating the Belief State • Let E be the evidence such that P(Toothache|E) = 0.8 • We want to compute P(c∧t∧pc|E) = P(c∧pc|t,E) P(t|E) • Since E is not directly related to the cavity or the probe catch, we consider that c and pc are independent of E given t, hence: P(c∧pc|t,E) = P(c∧pc|t)

  24. Updating the Belief State • Let E be the evidence such that P(Toothache|E) = 0.8 • We want to compute P(c∧t∧pc|E) = P(c∧pc|t,E) P(t|E) • Since E is not directly related to the cavity or the probe catch, we consider that c and pc are independent of E given t, hence: P(c∧pc|t,E) = P(c∧pc|t) • The updated joint distribution:

                    Toothache              ¬Toothache
                 PCatch   ¬PCatch        PCatch   ¬PCatch
     Cavity      0.432    0.048          0.018    0.002
     ¬Cavity     0.064    0.256          0.036    0.144

  The four Toothache probabilities are obtained by normalizing their sum to 0.8, and the four ¬Toothache probabilities by normalizing their sum to 0.2
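
A sketch of this update in Python, reusing the `joint` dict from the earlier example. The rule is Jeffrey's rule: scale every Toothache entry by 0.8/P(Toothache) and every ¬Toothache entry by 0.2/P(¬Toothache), which preserves the ratios within each block.

```python
def update_soft_evidence(joint, p_t_given_e=0.8):
    """Rescale so the Toothache block sums to 0.8 and the rest to 0.2."""
    p_t = sum(p for (c, t, pc), p in joint.items() if t)   # prior P(Toothache) = 0.2
    factor = {True: p_t_given_e / p_t, False: (1 - p_t_given_e) / (1 - p_t)}
    return {(c, t, pc): p * factor[t] for (c, t, pc), p in joint.items()}

updated = update_soft_evidence(joint)
print(updated[(True, True, True)])   # 0.108 * (0.8 / 0.2) = 0.432
```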

  25. Issues • If a state is described by n propositions, then a belief state contains 2^n states (possibly, some with probability 0) • Modeling difficulty: many numbers must be entered in the first place • Computational issue: memory size and time

  26. Bayes' Rule and Other Probability Manipulations • P(A∧B) = P(A|B) P(B) = P(B|A) P(A) • P(A|B) = P(B|A) P(A) / P(B) • Can manipulate distributions, e.g., P(B) = Σa P(B|A=a) P(A=a) • Can derive P(A|B) and P(B) using only P(B|A) and P(A) • So, if any variables are conditionally independent, we should be able to decompose the joint distribution to take advantage of it

  27. • Toothache and PCatch are independent given Cavity (or ¬Cavity), but this relation is hidden in the numbers! [Verify this; see the sketch below] • Bayesian networks explicitly represent independence among propositions to reduce the number of probabilities defining a belief state
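
The verification the slide asks for, run against the `joint` dict from the earlier sketch:

```python
# Check P(t ∧ pc | c) = P(t | c) P(pc | c) for both values of Cavity.
for c in (True, False):
    p_c    = sum(p for (cav, t, pc), p in joint.items() if cav == c)
    p_t    = sum(p for (cav, t, pc), p in joint.items() if cav == c and t) / p_c
    p_pc   = sum(p for (cav, t, pc), p in joint.items() if cav == c and pc) / p_c
    p_both = sum(p for (cav, t, pc), p in joint.items() if cav == c and t and pc) / p_c
    print(abs(p_both - p_t * p_pc) < 1e-9)   # True for both values of c
```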

  28. Bayesian Network • Notice that Cavity is the "cause" of both Toothache and PCatch, and represent the causality links explicitly: Cavity → Toothache, Cavity → PCatch • Give the prior probability distribution of Cavity • Give the conditional probability tables of Toothache and PCatch • P(c∧t∧pc) = P(t∧pc|c) P(c) = P(t|c) P(pc|c) P(c) • 5 probabilities, instead of 7

  29. Conditional Probability Tables • P(c∧t∧pc) = P(t∧pc|c) P(c) = P(t|c) P(pc|c) P(c) • In each CPT, rows sum to 1 • If X takes n values, just store n-1 entries per row

  30. Naïve Bayes Models • P(Cause, Effect1, …, Effectn) = P(Cause) Πi P(Effecti | Cause) • Structure: Cause → Effect1, Effect2, …, Effectn

  31. Naïve Bayes Classifier • P(Class, Feature1, …, Featuren) = P(Class) Πi P(Featurei | Class) • Given the features, what class? P(C|F1,…,Fk) = P(C,F1,…,Fk)/P(F1,…,Fk) = 1/Z P(C) Πi P(Fi|C) • Example classes: Spam / Not Spam, English / French / Latin, … • Example features: word occurrences
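
A minimal classifier sketch along these lines, with boolean word-occurrence features; the spam/ham priors and likelihoods below are made-up illustration values, not from the slides.

```python
def classify(features, prior, likelihood):
    """Return P(Class | features) as (1/Z) P(C) * prod_i P(F_i | C)."""
    scores = {}
    for c, p_c in prior.items():
        score = p_c
        for f, present in features.items():
            p = likelihood[c][f]                 # P(word f occurs | class c)
            score *= p if present else (1 - p)
        scores[c] = score
    z = sum(scores.values())                     # the 1/Z normalization
    return {c: s / z for c, s in scores.items()}

prior = {"spam": 0.3, "ham": 0.7}                # hypothetical numbers
likelihood = {"spam": {"viagra": 0.4,   "meeting": 0.01},
              "ham":  {"viagra": 0.001, "meeting": 0.2}}
print(classify({"viagra": True, "meeting": False}, prior, likelihood))
```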

  32. Equations Involving Random Variables • C = A ∨ B • C = max(A,B) • An equation constrains the joint probability P(A,B,C) • Nicely encoded as a causality relationship A → C ← B, with the conditional probability of C given by the equation rather than a CPT

  33. A More Complex BN • Causes: Burglary, Earthquake; effects: JohnCalls, MaryCalls via Alarm: Burglary → Alarm ← Earthquake, Alarm → JohnCalls, Alarm → MaryCalls • Intuitive meaning of an arc from x to y: "x has direct influence on y" • The structure is a directed acyclic graph

  34. A More Complex BN • Size of the CPT for a node with k boolean parents: 2^k rows • 10 probabilities, instead of 31

  35. What does the BN encode? • Each of the beliefs JohnCalls and MaryCalls is independent of Burglary and Earthquake given Alarm or ¬Alarm • P(b∧j) ≠ P(b) P(j), but P(b∧j|a) = P(b|a) P(j|a) • For example, John does not observe any burglaries directly

  36. What does the BN encode? • The beliefs JohnCalls and MaryCalls are independent given Alarm or ¬Alarm • P(b∧j|a) = P(b|a) P(j|a), and P(j∧m|a) = P(j|a) P(m|a) • A node is independent of its non-descendants given its parents • For instance, the reasons why John and Mary may not call if there is an alarm are unrelated

  37. What does the BN encode? • The beliefs JohnCalls and MaryCalls are independent given Alarm or ¬Alarm • Burglary and Earthquake are independent • A node is independent of its non-descendants given its parents • For instance, the reasons why John and Mary may not call if there is an alarm are unrelated

  38. Locally Structured World • A world is locally structured (or sparse) if each of its components interacts directly with relatively few other components • In a sparse world, the CPTs are small and the BN contains far fewer probabilities than the full joint distribution • If the number of entries in each CPT is bounded by a constant, i.e., O(1), then the number of probabilities in a BN is linear in n – the number of propositions – instead of 2^n for the joint distribution (see the sketch below)
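
The claim in concrete numbers, for a hypothetical n = 20 propositions with at most k = 3 parents per node:

```python
n, k = 20, 3
print(2**n - 1)    # full joint: 1048575 independent numbers
print(n * 2**k)    # BN upper bound: 160 numbers, linear in n
```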

  39. But does a BN represent a belief state? In other words, can we compute the full joint distribution of the propositions from it?

  40. Calculation of Joint Probability • P(j∧m∧a∧¬b∧¬e) = ?

  41. • P(j∧m∧a∧¬b∧¬e) = P(j∧m|a,¬b,¬e) P(a∧¬b∧¬e) = P(j|a,¬b,¬e) P(m|a,¬b,¬e) P(a∧¬b∧¬e) (J and M are independent given A) • P(j|a,¬b,¬e) = P(j|a) (J and B, and J and E, are independent given A) • P(m|a,¬b,¬e) = P(m|a) • P(a∧¬b∧¬e) = P(a|¬b,¬e) P(¬b|¬e) P(¬e) = P(a|¬b,¬e) P(¬b) P(¬e) (B and E are independent) • P(j∧m∧a∧¬b∧¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)

  42. Calculation of Joint Probability • P(J∧M∧A∧¬B∧¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E) = 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00062

  43. Calculation of Joint Probability • P(j∧m∧a∧¬b∧¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e) = 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00062 • In general: P(x1∧x2∧…∧xn) = Πi=1,…,n P(xi|parents(Xi)) ⇒ the full joint distribution table

  44. Calculation of Joint Probability • P(j∧m∧a∧¬b∧¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e) = 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00062 • In general: P(x1∧x2∧…∧xn) = Πi=1,…,n P(xi|parents(Xi)) ⇒ the full joint distribution table • Since a BN defines the full joint distribution of a set of propositions, it represents a belief state
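
The same computation in Python. The five factors used below match the slide's arithmetic; the remaining CPT entries (0.95, 0.94, 0.29 for the alarm, and P(j|¬a), P(m|¬a)) are the standard textbook values for this example, included only so the CPTs are complete.

```python
p_b, p_e = 0.001, 0.002                               # P(Burglary), P(Earthquake)
p_a = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}    # P(a | b, e)
p_j = {True: 0.90, False: 0.05}                       # P(j | a)
p_m = {True: 0.70, False: 0.01}                       # P(m | a)

# P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
p = p_j[True] * p_m[True] * p_a[(False, False)] * (1 - p_b) * (1 - p_e)
print(p)   # ~0.000628, the slide's 0.00062
```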

  45. Querying the BN • The BN Cavity → Toothache gives P(t|c) • What about P(c|t)? • P(Cavity|t) = P(Cavity ∧ t)/P(t) = P(t|Cavity) P(Cavity) / P(t) [Bayes' rule] • P(c|t) = α P(t|c) P(c) • Querying a BN is just applying the trivial Bayes' rule on a larger scale… algorithms next time
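
A sketch of this query on the two-node net; P(t|Cavity) = 0.6 and P(t|¬Cavity) = 0.1 are the values implied by the joint table earlier.

```python
p_c = 0.2                                  # prior P(Cavity)
p_t_given_c = {True: 0.6, False: 0.1}      # CPT of Toothache given Cavity
unnorm = {c: p_t_given_c[c] * (p_c if c else 1 - p_c) for c in (True, False)}
alpha = 1 / sum(unnorm.values())           # normalizer; the sum is P(Toothache) = 0.2
print({c: alpha * v for c, v in unnorm.items()})   # {True: 0.6, False: 0.4}
```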

  46. A More Complicated Singly-Connected Belief Net • Nodes: Battery, Gas, Radio, SparkPlugs, Starts, Moves

  47. Some Applications of BN • Medical diagnosis • Troubleshooting of hardware/software systems • Fraud/uncollectible debt detection • Data mining • Analysis of genetic sequences • Data interpretation, computer vision, image understanding

  48. [Figure: scene-labeling BN with regions R1–R4 and an "Above" relation between regions; each Region ∈ {Sky, Tree, Grass, Rock}]

  49. BN to evaluate insurance risks

  50. Purposes of Bayesian Networks • Efficient and intuitive modeling of complex causal interactions • Compact representation of joint distributions: O(n) rather than O(2^n) • Algorithms for efficient inference with new evidence (more on this next time)
