
Bayesian Networks: Conditional Independence, Creating Tables, Notations for Bayesian Networks

Used in Spring 2012, Spring 2013, Winter 2014 (partially). Topics: Bayesian networks, conditional independence, creating tables, notations for Bayesian networks, calculating conditional probabilities from the tables, calculating conditional independence, Markov Chain Monte Carlo, Markov models.



Presentation Transcript


1. Used in Spring 2012, Spring 2013, Winter 2014 (partially) • Bayesian networks: conditional independence, creating tables, notations for Bayesian networks, calculating conditional probabilities from the tables, calculating conditional independence, Markov Chain Monte Carlo, Markov models • Markov models and probabilistic methods in vision

2. Introduction to Probabilistic Robotics • Probabilities • Bayes rule • Bayes filters • Bayes networks • Markov chains

3. Bayesian Networks and Markov Models • Applications in user modeling • Applications in natural language processing • Applications in robotic control • Applications in robot vision

  4. Bayesian Networks (BNs) – Overview • Introduction to BNs • Nodes, structure and probabilities • Reasoning with BNs • Understanding BNs • Extensions of BNs • Decision Networks • Dynamic Bayesian Networks (DBNs)

5. Definition of Bayesian Networks • A data structure that represents the dependence between variables • Gives a concise specification of the joint probability distribution • A Bayesian Network is a directed acyclic graph (DAG) in which the following holds: • A set of random variables makes up the nodes in the network • A set of directed links connects pairs of nodes • Each node has a probability distribution that quantifies the effects of its parents

6. Conditional Independence • The relationship between conditional independence and BN structure is important for understanding how BNs work

7. Conditional Independence – Causal Chains • Causal chains give rise to conditional independence: in a chain A → B → C, A and C are independent given B • Example: “Smoking (A) causes cancer (B), which causes dyspnoea (C)” (a numeric sketch follows below)
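A minimal numeric sketch of this point (the probabilities are made up, not from the slides): computing P(C | B) and P(C | A, B) from the chain factorization gives the same number, so A tells us nothing more about C once B is known.

```python
# Chain A -> B -> C with made-up probabilities: once B is known,
# A adds no information about C, i.e. P(C | A, B) == P(C | B).

P_A = {True: 0.3, False: 0.7}              # P(smoking)
P_B_given_A = {True: 0.2, False: 0.05}     # P(cancer | smoking)
P_C_given_B = {True: 0.6, False: 0.1}      # P(dyspnoea | cancer)

def joint(a, b, c):
    """P(A=a, B=b, C=c) from the chain factorization."""
    pa = P_A[a]
    pb = P_B_given_A[a] if b else 1 - P_B_given_A[a]
    pc = P_C_given_B[b] if c else 1 - P_C_given_B[b]
    return pa * pb * pc

def cond_c(b, a=None):
    """P(C=T | B=b), or P(C=T | A=a, B=b) when a is given."""
    if a is None:
        num = sum(joint(x, b, True) for x in (True, False))
        den = sum(joint(x, b, c) for x in (True, False)
                  for c in (True, False))
    else:
        num = joint(a, b, True)
        den = joint(a, b, True) + joint(a, b, False)
    return num / den

# All three print 0.6: knowing A changes nothing once B is given.
print(cond_c(True), cond_c(True, a=True), cond_c(True, a=False))
```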

8. Conditional Independence – Common Causes • Common causes (or ancestors) also give rise to conditional independence • Example: “Cancer (B) is a common cause of the two symptoms: a positive Xray (A) and dyspnoea (C)”, so (A indep C) | B • I have dyspnoea (C) because of cancer (B), so I do not need an Xray test

9. Conditional Dependence – Common Effects • Common effects (or their descendants) give rise to conditional dependence • Example: “Cancer (B) is a common effect of pollution (A) and smoking (C)” • Given cancer, smoking “explains away” pollution: if we know that you smoke and have cancer, we do not need to assume that your cancer was caused by pollution

10. Joint Distributions for describing uncertain worlds • Researchers have already found numerous and dramatic benefits of joint distributions for describing uncertain worlds • Students in robotics and Artificial Intelligence have to understand the problems with using joint distributions • You should discover how the Bayes Net methodology allows us to build joint distributions in manageable chunks

11. Bayes Net methodology: why do Bayesian methods matter? • Bayesian methods are one of the most important conceptual advances in the Machine Learning / AI field to have emerged since 1995 • A clean, clear, manageable language and methodology for expressing what the robot designer is certain and uncertain about • Already many practical applications in medicine, factories, helpdesks, for instance: • P(this problem | these symptoms) // we will use P for probability • anomalousness of this observation • choosing the next diagnostic test | these observations

12. Problem 1: Creating the Joint Distribution Table • The Joint Distribution Table is an important concept

13. Probabilistic truth table • A truth table of all combinations of the Boolean variables, with a probability attached to each row • You can guess this table, you can take the data from some statistics, • or you can build this table based on some partial tables (a sketch follows below)
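A minimal sketch, not from the slides, of how such a probabilistic truth table can be stored: one row per combination of Boolean variables, one probability per row (the uniform values are placeholders).

```python
from itertools import product

variables = ("S", "R", "W")          # Sprinkler, Rain, Wet

# One entry per combination of truth values; the only hard
# requirement is that the row probabilities sum to 1.
rows = list(product((True, False), repeat=len(variables)))
table = {row: 1.0 / len(rows) for row in rows}   # uniform placeholder

assert abs(sum(table.values()) - 1.0) < 1e-9
print(len(table), "rows for", len(variables), "Boolean variables")  # 8
```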

  14. Idea – use decision diagrams to represent these data.

  15. Use of independence while creating the tables

  16. Wet – Sprinkler – Rain Example

17. Wet-Sprinkler-Rain Example • The network has three nodes: W, S, R

  18. Problem 1: Creating the Joint Table

19. Our Goal is to derive this table • Let us observe that if I know 7 of these values, the eighth is determined uniquely, as their sum = 1 • So I need to guess or calculate or find 2^n − 1 = 7 values (for n = 3 Boolean variables) • But the same data can be stored explicitly or implicitly, not necessarily in the form of a table!! • What extra assumptions can help to create this table? (a sketch follows below)
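A minimal sketch of the normalization point, with made-up probabilities: of the 2^3 = 8 entries, only 7 are free, and the sum-to-1 constraint forces the eighth.

```python
from itertools import product

rows = list(product((True, False), repeat=3))     # (S, R, W) rows
free_values = [0.00198, 0.288, 0.0005, 0.04, 0.02, 0.01, 0.005]

joint = dict(zip(rows[:-1], free_values))
joint[rows[-1]] = 1.0 - sum(free_values)          # the eighth entry

assert abs(sum(joint.values()) - 1.0) < 1e-9
print(joint[rows[-1]])
```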

  20. Wet-Sprinkler-Rain Example

21. Understanding of causation • P(S | R) = the probability that the sprinkler is on under the condition that it rained • You need to understand causation when you create the table (Wet-Sprinkler-Rain example)

22. Independence simplifies probabilities • We use the independence of variables S and R: if S and R are independent, then P(S | R) = P(S), i.e. whether it rained does not change the probability that the sprinkler was on • We can use these probabilities to create the table (Wet-Sprinkler-Rain example; a sketch follows below)
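A minimal sketch with made-up marginals: under independence, the whole joint over (S, R) comes from just the two numbers P(S) and P(R).

```python
P_S = 0.2                      # P(sprinkler was on), made up
P_R = 0.3                      # P(it rained), made up

joint_SR = {
    (True, True):   P_S * P_R,             # P(S, R) = P(S) * P(R)
    (True, False):  P_S * (1 - P_R),
    (False, True):  (1 - P_S) * P_R,
    (False, False): (1 - P_S) * (1 - P_R),
}
assert abs(sum(joint_SR.values()) - 1.0) < 1e-9
```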

23. Wet-Sprinkler-Rain Example: Conditional Probability Table (CPT) • We create the CPT for W based on our knowledge of the problem: S = sprinkler was on, R = it rained, W = grass is wet • What about children playing or a dog? The grass may still get wet; this is captured by the value 0.1 • This first step shows the collected data (a sketch follows below)
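A minimal sketch of how this CPT can be stored; the 0.1 entry is the slide's "other causes" value, which we read here as P(W | ¬S, ¬R) = 0.1, and the remaining numbers are illustrative placeholders.

```python
P_W_given_SR = {
    #  (S,     R):   P(W = True | S, R)
    (True,  True):  0.95,
    (True,  False): 0.90,
    (False, True):  0.90,
    (False, False): 0.10,   # grass can still get wet (children, dog, ...)
}
# P(W = False | s, r) is just 1 - P_W_given_SR[(s, r)].
```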

24. Full joint for only S and R • The independence of S and R is used • Use the chain rule for probabilities with the CPT values shown on the slide (0.95, 0.90, 0.90, 0.01) (Wet-Sprinkler-Rain example)

25. Chain Rule for Probabilities • For random variables X1, …, Xn: P(X1, …, Xn) = P(X1) · P(X2 | X1) · … · P(Xn | X1, …, Xn−1) • For the example: P(S, R, W) = P(S) · P(R | S) · P(W | S, R) (the slide shows the values 0.95, 0.90, 0.90, 0.01; a sketch follows below)
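A minimal sketch of the chain rule on the running example, reusing the made-up numbers from the sketches above (the slide's exact values are not fully recoverable): with S and R independent, P(R | S) reduces to P(R).

```python
# Build every entry of the full joint with
# P(S, R, W) = P(S) * P(R | S) * P(W | S, R), where P(R | S) = P(R).

P_S, P_R = 0.2, 0.3
P_W = {(True, True): 0.95, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.10}

joint = {}
for s in (True, False):
    for r in (True, False):
        for w in (True, False):
            ps = P_S if s else 1 - P_S
            pr = P_R if r else 1 - P_R          # independence: P(R|S)=P(R)
            pw = P_W[s, r] if w else 1 - P_W[s, r]
            joint[(s, r, w)] = ps * pr * pw

assert abs(sum(joint.values()) - 1.0) < 1e-9
```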

26. Full joint probability • You have a table • You want to calculate some probability, e.g. P(~W) (Wet-Sprinkler-Rain example; a sketch follows below)
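A minimal sketch, rebuilding the same made-up toy joint: a marginal such as P(~W) is just a sum of entries of the full joint table.

```python
# Same toy joint as in the earlier sketch.
P_S, P_R = 0.2, 0.3
P_W = {(True, True): 0.95, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.10}
joint = {(s, r, w): (P_S if s else 1 - P_S) * (P_R if r else 1 - P_R)
                    * (P_W[s, r] if w else 1 - P_W[s, r])
         for s in (True, False) for r in (True, False)
         for w in (True, False)}

# P(~W) = sum over s, r of P(S=s, R=r, W=False).
P_not_W = sum(p for (s, r, w), p in joint.items() if not w)
print(P_not_W)   # 0.545 with these placeholder numbers
```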

27. Independence of S and R implies calculating fewer numbers to create the complete joint table for W, S and R • Six numbers: P(S), P(R), and the four entries of P(W | S, R) • We reduced only from seven to six numbers (Wet-Sprinkler-Rain example)

28. Explanation of diagrammatic notations such as Bayes networks • You do not need to build the complete table!!

29. You can build a graph whose nodes correspond to certain types of tables

  30. Wet-Sprinkler-Rain Example

31. Wet-Sprinkler-Rain Example • Conditional Probability Table (CPT) with S = sprinkler was on, R = it rained, W = grass is wet • This first step shows the collected data

32. Full joint probability • You have a table • You want to calculate some probability, e.g. P(~W) • When you have this table you can modify it, and you can also calculate everything!!

  33. Problem 2: Calculating conditional probabilities from the Joint Distribution Table

34. Wet-Sprinkler-Rain Example • P(W=T | S=T) = P(S=T, W=T) / P(S=T): the probability that the grass is wet under the assumption that the sprinkler was on is the probability that S=T and W=T divided by the probability that S=T (a sketch follows below)
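A minimal sketch on the same made-up toy joint: both the numerator and the denominator of the conditional probability are sums of joint-table entries.

```python
# Same toy joint as in the earlier sketch.
P_S, P_R = 0.2, 0.3
P_W = {(True, True): 0.95, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.10}
joint = {(s, r, w): (P_S if s else 1 - P_S) * (P_R if r else 1 - P_R)
                    * (P_W[s, r] if w else 1 - P_W[s, r])
         for s in (True, False) for r in (True, False)
         for w in (True, False)}

num = sum(p for (s, r, w), p in joint.items() if s and w)   # P(S=T, W=T)
den = sum(p for (s, r, w), p in joint.items() if s)         # P(S=T)
print(num / den)   # P(W=T | S=T) = 0.915 with these placeholders
```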

  35. Wet-Sprinkler-Rain Example

36. We showed examples of both causal inference and diagnostic inference • We will use this in the next slide (Wet-Sprinkler-Rain example)

37. “Explaining Away” the facts from the table • As calculated earlier from this table, P(R=T | W=T, S=T) < P(R=T | W=T): learning that the sprinkler was on makes rain a less likely explanation of the wet grass (Wet-Sprinkler-Rain example; a sketch follows below)
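A minimal sketch on the same made-up toy joint, checking the explaining-away inequality numerically.

```python
# Same toy joint as in the earlier sketch.
P_S, P_R = 0.2, 0.3
P_W = {(True, True): 0.95, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.10}
joint = {(s, r, w): (P_S if s else 1 - P_S) * (P_R if r else 1 - P_R)
                    * (P_W[s, r] if w else 1 - P_W[s, r])
         for s in (True, False) for r in (True, False)
         for w in (True, False)}

def cond(num_pred, den_pred):
    """P(num event | den event), both given as predicates on (s, r, w)."""
    num = sum(p for row, p in joint.items() if num_pred(*row))
    den = sum(p for row, p in joint.items() if den_pred(*row))
    return num / den

P_R_given_W = cond(lambda s, r, w: r and w, lambda s, r, w: w)
P_R_given_WS = cond(lambda s, r, w: r and w and s,
                    lambda s, r, w: w and s)
print(P_R_given_WS < P_R_given_W)   # True: S "explains away" R
```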

38. Conclusions on this problem • The table can be used for explaining away • The table can be used to calculate conditional independence • The table can be used to calculate conditional probabilities • The table can be used to determine causality

  39. Problem 3: What if S and R are dependent? Calculating conditional independence

  40. Conditional Independence of S and R Wet-Sprinkler-Rain Example

41. Diagrammatic notation for conditional independence of two variables (Wet-Sprinkler-Rain example extended)

42. Conditional Independence formalized for sets of variables • For sets of variables S1, S2, S3: S1 is conditionally independent of S2 given S3 iff P(S1 | S2, S3) = P(S1 | S3) for all values of the variables (a sketch follows below)
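A brute-force sketch of this definition on the same made-up toy joint, with S1 = {W}, S2 = {S}, S3 = {R}; the check fails here, since S is a parent of W in this network.

```python
# Same toy joint as in the earlier sketch.
P_S, P_R = 0.2, 0.3
P_W = {(True, True): 0.95, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.10}
joint = {(s, r, w): (P_S if s else 1 - P_S) * (P_R if r else 1 - P_R)
                    * (P_W[s, r] if w else 1 - P_W[s, r])
         for s in (True, False) for r in (True, False)
         for w in (True, False)}

def cond(num_pred, den_pred):
    num = sum(p for row, p in joint.items() if num_pred(*row))
    den = sum(p for row, p in joint.items() if den_pred(*row))
    return num / den

lhs = cond(lambda s, r, w: w and s and r,
           lambda s, r, w: s and r)                  # P(W | S, R) = 0.95
rhs = cond(lambda s, r, w: w and r,
           lambda s, r, w: r)                        # P(W | R)    = 0.91
print(abs(lhs - rhs) < 1e-9)   # False: W is not independent of S given R
```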

43. Now we will explain conditional independence using the CLOUDY extension of the Wet-Sprinkler-Rain example

  44. Example – Lung Cancer Diagnosis

45. Example – Lung Cancer Diagnosis • A patient has been suffering from shortness of breath (called dyspnoea) and visits the doctor, worried that he has lung cancer. • The doctor knows that other diseases, such as tuberculosis and bronchitis, are possible causes, as well as lung cancer. • She also knows that other relevant information includes whether or not the patient is a smoker (increasing the chances of cancer and bronchitis) and what sort of air pollution he has been exposed to. • A positive Xray would indicate either TB or lung cancer.

46. Nodes and Values in Bayesian Networks • Q: What are the nodes to represent and what values can they take? A: Nodes can be discrete or continuous • Boolean nodes represent propositions taking binary values. Example: a Cancer node represents the proposition “the patient has cancer” • Ordered values. Example: a Pollution node with values low, medium, high • Integral values. Example: Age with possible values 1–120 (Lung Cancer example)

47. Lung Cancer Example: Nodes and Values • Example of variables as nodes in a BN; Dyspnoea stands for shortness of breath

48. Lung Cancer Example: Bayesian Network Structure • Nodes: Pollution, Smoker, Cancer, Xray, Dyspnoea • Pollution and Smoker are parents of Cancer; Cancer is a parent of both Xray and Dyspnoea (a sketch follows below)
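A minimal sketch, with the structure read off the story on slide 45 rather than from the original figure: the network can be stored as a mapping from each node to its parents, and the DAG property checked directly.

```python
lung_cancer_bn = {
    "Pollution": [],
    "Smoker":    [],
    "Cancer":    ["Pollution", "Smoker"],
    "Xray":      ["Cancer"],
    "Dyspnoea":  ["Cancer"],
}

def ancestors(bn, node, seen=None):
    """Collect all ancestors of `node` by walking parent links."""
    seen = set() if seen is None else seen
    for parent in bn[node]:
        if parent not in seen:
            seen.add(parent)
            ancestors(bn, parent, seen)
    return seen

# DAG sanity check: no node may be its own ancestor.
assert all(n not in ancestors(lung_cancer_bn, n) for n in lung_cancer_bn)
```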

  49. Conditional Probability Tables (CPTs) in Bayesian Networks
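A minimal sketch of how this network's CPTs could be stored, one table per node keyed by the values of its parents; all numbers are illustrative placeholders, not values from the lecture.

```python
cpts = {
    "Pollution": {(): 0.9},                  # P(Pollution = low), say
    "Smoker":    {(): 0.3},                  # P(Smoker = T)
    "Cancer": {                              # P(Cancer=T | Pollution, Smoker)
        (True, True):  0.05, (True, False):  0.02,
        (False, True): 0.03, (False, False): 0.001,
    },
    "Xray":     {(True,): 0.9,  (False,): 0.2},   # P(Xray=pos | Cancer)
    "Dyspnoea": {(True,): 0.65, (False,): 0.3},   # P(Dyspnoea=T | Cancer)
}
# Root nodes have a single unconditional entry; each other node has
# one entry per combination of its parents' values.
```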
