
Probability Calculations

Learn about probability calculations and their applications in artificial intelligence. Understand concepts such as conditional probability, joint probability, marginalization, and Bayes' theorem.


Presentation Transcript


  1. Probability Calculations • Matt Huenerfauth • CSE-391: Artificial Intelligence • University of Pennsylvania • April 2005

  2. Exams • The makeup midterm exam will be on Friday at 3pm. If you’ve already talked with Dr. Palmer about this, you should have received an e-mail confirming it. • The final exam for CSE-391 will be on May 2, 2005, from 8:30am to 10:30am. • We don’t yet know the room.

  3. Probability Problems

  4. Probability Calculations • What do these notations mean? • A: a Boolean random variable. • P(A): unconditional probability; the notation P(A) is a shortcut for P(A = true). • P(A ∨ B): probability of A or B, equal to P(A) + P(B) – P(A ∧ B). • P(A ∧ B): joint probability; the probability of A and B together. • P(A | B): probability of A given that we know B is true. • H: a non-Boolean random variable. • P(H = h): probability that H has some value h.

  5. Axioms of Probability • 0 ≤ P(A) ≤ 1 • P(true) = 1 • P(false) = 0 • P(A ∨ B) = P(A) + P(B) – P(A ∧ B) • A probability can be expressed many ways: 1 out of 3, 0.98, 45.65%.

  6. Product Rule • P(A ∧ B) = P(A | B) P(B) • P(A | B) = P(A ∧ B) / P(B) • So, if we can find two of these values someplace (in a chart, from a word problem), then we can calculate the third one.

  7. Using the Product Rule • When there’s a fire, there’s a 99% chance that the alarm will go off: P(A | F) • On any given day, the chance of a fire starting in your house is 1 in 5000: P(F) • What’s the chance of there being a fire and your alarm going off tomorrow? P(A ∧ F) = P(A | F) * P(F)
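The fire/alarm calculation above can be sketched in Python (the variable names are my own, not from the slides):

```python
# Product rule: P(A ∧ F) = P(A | F) * P(F)
p_alarm_given_fire = 0.99   # P(A | F): the alarm goes off given a fire
p_fire = 1 / 5000           # P(F): a fire starts on a given day

p_alarm_and_fire = p_alarm_given_fire * p_fire   # P(A ∧ F)
print(p_alarm_and_fire)     # ≈ 0.000198
```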

  8. Conditioning • Sometimes we call the 2nd form of the product rule the “conditioning rule” because we can use it to calculate a conditional probability from a joint probability and an unconditional one. • P(A | B) = P(A ∧ B) / P(B)

  9. Word Problem • Out of the 1 million words in some corpus, we know that 9100 of those words are “to” being used as a PREPOSITION: P(PREP ∧ “to”) • Further, we know that 2.53% of all the words that appear in the whole corpus are the word “to”: P(“to”) • If we are told that some particular word in a sentence is “to” but we need to guess what part of speech it is, what is the probability the word is a PREPOSITION? What is P(PREP | “to”)? • Just calculate: P(PREP | “to”) = P(PREP ∧ “to”) / P(“to”)
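Working the corpus numbers through the conditioning rule in Python (variable names are mine):

```python
# Conditioning: P(PREP | "to") = P(PREP ∧ "to") / P("to")
p_prep_and_to = 9100 / 1_000_000   # P(PREP ∧ "to") = 0.0091
p_to = 0.0253                      # P("to"): 2.53% of all corpus words

p_prep_given_to = p_prep_and_to / p_to
print(p_prep_given_to)             # ≈ 0.36, i.e. "to" is a PREPOSITION about 36% of the time
```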

  10. Marginalizing • What if we are told only joint probabilities involving a variable H = h? Is there a way to calculate the unconditional probability P(H = h)? • Yes, when we’re told the joint probabilities involving every single value of the other variable…

  11. Word Problem • We have an AI weather forecasting program. We tell it the following information about this weekend, and we want it to tell us the chance of rain. • The probability that there will be rain and lightning is 0.23: P(rain=true ∧ lightning=true) = 0.23 • The probability that there will be rain and no lightning is 0.14: P(rain=true ∧ lightning=false) = 0.14 • What’s the probability that there will be rain, P(rain=true)? Lightning is only ever true or false, so: P(rain=true) = 0.23 + 0.14 = 0.37
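The marginalization step above is just a sum over every value of the other variable; a minimal Python sketch (the dictionary layout is my own choice):

```python
# Marginalizing: P(rain) = sum over all values v of P(rain ∧ lightning=v)
p_rain_and_lightning = {True: 0.23, False: 0.14}   # joint probabilities for each lightning value

p_rain = sum(p_rain_and_lightning.values())        # 0.23 + 0.14
print(p_rain)                                      # ≈ 0.37
```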

  12. Marginalizing

  13. P(AB) Joint Probability Distribution Table • How could we calculate P(A)? • Add up P(AB) and P(A¬B). • Same for P(B). • How about P(AB)? • Two options… • We can read P(AB) from chart and find P(A) and P(B).P(AB)=P(A)+P(B)-P(AB) • Or just add up the proper three cells of the table. Each cell contains a ‘joint’probability of both occurring.

  14. Chain Rule • Is there a way to calculate a really big joint probability if we know lots of different conditional probabilities? • P(f1 ∧ f2 ∧ f3 ∧ f4 ∧ … ∧ fn-1 ∧ fn) = P(f1) * P(f2 | f1) * P(f3 | f1 ∧ f2) * P(f4 | f1 ∧ f2 ∧ f3) * … * P(fn | f1 ∧ f2 ∧ f3 ∧ f4 ∧ … ∧ fn-1)


  16. Word Problem • If we have a white ball, the probability it is a baseball is 0.76: P(baseball | white ∧ ball) • If we have a ball, the probability it is white is 0.35: P(white | ball) • The probability we have a ball is 0.03: P(ball) • So, what’s the probability we have a white ball that is a baseball? P(white ∧ ball ∧ baseball) = 0.76 * 0.35 * 0.03
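The chain-rule multiplication above, sketched in Python (variable names are mine):

```python
# Chain rule: P(white ∧ ball ∧ baseball)
#   = P(ball) * P(white | ball) * P(baseball | white ∧ ball)
p_ball = 0.03
p_white_given_ball = 0.35
p_baseball_given_white_ball = 0.76

p_white_ball_baseball = p_ball * p_white_given_ball * p_baseball_given_white_ball
print(p_white_ball_baseball)   # ≈ 0.008
```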

  17. Bayes’ Rule • Bayes’ Theorem relates conditional probability distributions: • P(h | e) = P(e | h) * P(h) / P(e) • or, with additional conditioning information: • P(h | e ∧ k) = P(e | h ∧ k) * P(h | k) / P(e | k)

  18. Word Problem • The probability I think that my cup of coffee tastes good is 0.80: P(G) = .80 • I add Equal to my coffee 60% of the time: P(E) = .60 • I think that when coffee has Equal in it, it tastes good 50% of the time: P(G | E) = .50 • If I sip my coffee and it tastes good, what is the probability that it has Equal in it? P(E | G) = P(G | E) * P(E) / P(G)
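Plugging the coffee numbers into Bayes' rule in Python (variable names are mine):

```python
# Bayes' rule: P(E | G) = P(G | E) * P(E) / P(G)
p_good = 0.80              # P(G): coffee tastes good
p_equal = 0.60             # P(E): Equal was added
p_good_given_equal = 0.50  # P(G | E): tastes good given Equal

p_equal_given_good = p_good_given_equal * p_equal / p_good
print(p_equal_given_good)  # ≈ 0.375
```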


  20. The Skills You Need • Probability Calculations • Basic Definitions. • Axioms. • Notation. • Rules, Word Problems. • Interpreting Charts. • Working with Bayes’ Rule. • “Diagnosis” Problems.
