Approximate Inference in Bayesian Networks: Sampling Algorithms and Extensions

This chapter discusses sampling algorithms for approximate inference in Bayesian networks, including direct sampling, rejection sampling, likelihood weighting, and Markov chain Monte Carlo (MCMC) simulation. It also touches on extending probability to first-order representations and other approaches to uncertain reasoning, such as Dempster-Shafer theory and fuzzy logic.


Presentation Transcript


  1. March 2, 2004 Chapter 14: Probabilistic Reasoning

  2. 14.5 Approximate Inference in Bayesian Networks • Monte Carlo algorithms • randomized sampling • accuracy depends on the number of samples generated • e.g., estimating π
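The π example from the slide can be sketched as a minimal Monte Carlo estimate: sample points uniformly in the unit square and count the fraction that fall inside the quarter circle. The accuracy improves with the number of samples, as the slide notes.

```python
import random

def estimate_pi(n_samples, seed=0):
    """Monte Carlo estimate of pi: the fraction of uniform points in the
    unit square that land inside the quarter circle approximates pi/4."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples
```

With 100,000 samples the estimate is typically within a few hundredths of π; the error shrinks roughly as 1/√N.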

  3. Direct Sampling • Figure 14.2 • P(x1, …, xn) = N(x1, …, xn) / total samples
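A minimal sketch of direct (prior) sampling on a hypothetical two-node network Cloudy → Rain, with illustrative CPT numbers (the slide's Figure 14.2 network is not reproduced here): sample each variable in topological order from its conditional distribution, then estimate a joint probability as the count N(x1, …, xn) divided by the total number of samples.

```python
import random

def prior_sample(rng):
    """Sample each variable in topological order from its CPT.
    CPT numbers are illustrative assumptions, not from the slides."""
    cloudy = rng.random() < 0.5                      # P(Cloudy) = 0.5
    rain = rng.random() < (0.8 if cloudy else 0.2)   # P(Rain | Cloudy)
    return cloudy, rain

def estimate_joint(n_samples, seed=0):
    """Estimate P(cloudy, rain) as N(cloudy, rain) / total samples."""
    rng = random.Random(seed)
    count = sum(1 for _ in range(n_samples)
                if prior_sample(rng) == (True, True))
    return count / n_samples
```

With these assumed CPTs the true joint is 0.5 × 0.8 = 0.4, and the estimate converges to it as the sample count grows.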

  4. Rejection Sampling • Used to produce samples from a hard-to-sample distribution, given an easy-to-sample distribution • Figure 14.3 • Drawback: can generate useless samples
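The drawback the slide mentions can be seen in a sketch for the same hypothetical Cloudy → Rain network (illustrative CPTs): to estimate P(Rain | Cloudy = true), generate prior samples and discard every one inconsistent with the evidence. The rejected samples are the "useless" work.

```python
import random

def rejection_sample(n_samples, seed=0):
    """Estimate P(Rain = true | Cloudy = true) by rejection sampling.
    CPT numbers are illustrative assumptions."""
    rng = random.Random(seed)
    consistent = rain_count = 0
    for _ in range(n_samples):
        cloudy = rng.random() < 0.5
        rain = rng.random() < (0.8 if cloudy else 0.2)
        if not cloudy:        # inconsistent with evidence: rejected, wasted
            continue
        consistent += 1
        rain_count += rain
    return rain_count / consistent
```

Here roughly half the samples are thrown away; with less likely evidence, the rejected fraction grows and the method becomes unusable.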

  5. Likelihood Weighting • Generates only events that are consistent with the evidence • Figure 14.14 • Page 514
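A minimal sketch of the idea on the same hypothetical network: instead of rejecting samples, fix the evidence variables and weight each sample by the likelihood of the evidence under the sampled values. No sample is wasted. (This simplified version assumes the single-evidence case; the slide's Figure 14.14 gives the general algorithm.)

```python
import random

def likelihood_weighting(n_samples, seed=0):
    """Estimate P(Rain = true | Cloudy = true) by likelihood weighting.
    The evidence (Cloudy = true) is fixed, never sampled; each sample
    carries a weight equal to the evidence's probability.
    CPT numbers are illustrative assumptions."""
    rng = random.Random(seed)
    w_rain = w_total = 0.0
    for _ in range(n_samples):
        w = 0.5                      # weight: P(Cloudy = true)
        rain = rng.random() < 0.8    # sample Rain given Cloudy = true
        w_total += w
        if rain:
            w_rain += w
    return w_rain / w_total
```

Every generated event is consistent with the evidence by construction, which is exactly the property the slide highlights.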

  6. Markov Chain Simulation • MCMC (Markov Chain Monte Carlo) • Make random changes to previous state • Sampling process settles into a dynamic equilibrium in which the long-run fraction of time spent in each state is exactly proportional to its posterior probability

  7. MCMC • Figure 14.15 • Page 516
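The "random changes to the previous state" idea can be sketched as Gibbs sampling (the MCMC variant in the slide's Figure 14.15) on a hypothetical chain A → B → C with evidence C = true; all CPT numbers are illustrative assumptions. Each step resamples one non-evidence variable from its conditional distribution given the current values of the others, and the long-run fraction of time spent in each state approaches its posterior probability.

```python
import random

# Illustrative CPTs for a chain A -> B -> C (assumed, not from the slides).
P_A = 0.5
P_B_GIVEN = {True: 0.8, False: 0.2}   # P(B = true | A)
P_C_GIVEN = {True: 0.7, False: 0.1}   # P(C = true | B)

def cond(p_true, value):
    """P(X = value) given P(X = true)."""
    return p_true if value else 1.0 - p_true

def gibbs(n_steps, seed=0):
    """Estimate P(B = true | C = true) by Gibbs sampling: repeatedly
    resample each non-evidence variable given the rest of the state."""
    rng = random.Random(seed)
    a, b = True, True        # arbitrary initial state; C = true is fixed
    b_count = 0
    for _ in range(n_steps):
        # Resample A: P(A | b) is proportional to P(A) * P(b | A)
        pa = P_A * cond(P_B_GIVEN[True], b)
        pna = (1.0 - P_A) * cond(P_B_GIVEN[False], b)
        a = rng.random() < pa / (pa + pna)
        # Resample B: P(B | a, C=true) proportional to P(B | a) * P(C=true | B)
        pb = P_B_GIVEN[a] * P_C_GIVEN[True]
        pnb = (1.0 - P_B_GIVEN[a]) * P_C_GIVEN[False]
        b = rng.random() < pb / (pb + pnb)
        b_count += b
    return b_count / n_steps
```

With these assumed CPTs the exact posterior is P(B = true | C = true) = 0.875, and the long-run state frequencies converge to it, illustrating the dynamic equilibrium the previous slide describes.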

  8. 14.6 Extending Probability to First Order Representations • RPM: Relational Probability Model • Skip!

  9. 14.7 Other Approaches to Uncertain Reasoning • Dempster-Shafer Theory • Fuzzy Logic

  10. Dempster-Shafer Theory • Notion of “ignorance” instead of “uncertainty” • reliability of Betty = 90% • reliability of Sally = 80% • Betty and Sally both say a limb fell on my car

  11. Dempster-Shafer Theory • If both say the limb fell: Bel(limb fell on car) = 1 - .1 × .2 = .98 • the remaining .1 × .2 = .02 is ignorance, not belief that it didn't fall • If Betty says it did and Sally says it didn't: • only Betty reliable: .9 × .2 = .18 • only Sally reliable: .1 × .8 = .08 • neither reliable: .1 × .2 = .02 • both reliable (.9 × .8 = .72) is contradictory, so that mass is discarded and the rest is renormalized by .28 • Bel(limb fell on car) = .18/.28 • Bel(limb didn't fall on car) = .08/.28
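The conflicting-testimony arithmetic above can be sketched directly; the function name is illustrative, and the numbers are the slide's reliabilities (Betty 0.9, Sally 0.8).

```python
def dempster_conflict():
    """Combine Betty ('the limb fell', reliability 0.9) with Sally
    ('it didn't', reliability 0.8) using Dempster's rule: discard the
    contradictory mass and renormalize the rest."""
    betty, sally = 0.9, 0.8
    only_betty = betty * (1 - sally)      # 0.18 -> supports "fell"
    only_sally = (1 - betty) * sally      # 0.08 -> supports "didn't fall"
    neither = (1 - betty) * (1 - sally)   # 0.02 -> total ignorance
    conflict = betty * sally              # 0.72 -> contradictory, discarded
    norm = 1.0 - conflict                 # 0.28
    bel_fell = only_betty / norm          # 18/28
    bel_not = only_sally / norm           # 8/28
    return bel_fell, bel_not
```

Note that bel_fell + bel_not < 1; the leftover mass (0.02/0.28) is ignorance, the notion Dempster-Shafer theory adds over plain probability.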

  12. Fuzzy Logic • Degree of membership • Typical membership function for concept of “warm” • Union • Intersection • Negation
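A minimal sketch of the slide's four items: a membership function for "warm" (the triangular shape and the 15–35 °C range are assumptions, since the slide's figure is not reproduced), plus the standard min/max/complement definitions of intersection, union, and negation.

```python
def warm(temp_c):
    """Illustrative triangular membership function for 'warm':
    0 below 15 C, rising to 1 at 25 C, falling back to 0 at 35 C."""
    if temp_c <= 15 or temp_c >= 35:
        return 0.0
    if temp_c <= 25:
        return (temp_c - 15) / 10
    return (35 - temp_c) / 10

def fuzzy_and(a, b):   # intersection: minimum of the degrees
    return min(a, b)

def fuzzy_or(a, b):    # union: maximum of the degrees
    return max(a, b)

def fuzzy_not(a):      # negation: complement of the degree
    return 1.0 - a
```

So a 20 °C day is "warm" to degree 0.5, and "warm and not warm" has degree min(0.5, 0.5) = 0.5 rather than 0, a well-known departure from classical logic.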

  13. Fuzzy Logic Applications • Air conditioning • Cruise control • Ship boilers • Video cameras • Washing machines

  14. Fuzzy System Construction • Fuzzification • Inference • Composition • Defuzzification
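The four steps above can be sketched as a tiny fan-speed controller; the rules, membership functions, and output speeds are all illustrative assumptions, and defuzzification is done by the weighted average of singleton outputs (one common scheme).

```python
def fuzzify(temp_c):
    """Fuzzification: map a crisp temperature to membership degrees."""
    cool = max(0.0, min(1.0, (25 - temp_c) / 10))
    warm = max(0.0, min(1.0, (temp_c - 15) / 10))
    return {"cool": cool, "warm": warm}

def fan_speed(temp_c):
    """Inference: each rule ('cool -> slow = 20', 'warm -> fast = 80')
    fires to the degree its antecedent holds.
    Composition + defuzzification: weighted average of rule outputs."""
    m = fuzzify(temp_c)
    numerator = m["cool"] * 20 + m["warm"] * 80
    denominator = m["cool"] + m["warm"]
    return numerator / denominator
```

At 20 °C both rules fire equally, so the controller outputs the midpoint speed of 50; as the temperature rises the output blends smoothly toward 80, which is the practical appeal of fuzzy control in the applications listed on the previous slide.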
