
Introduction to probability theory and graphical models

Introduction to probability theory and graphical models. Translational Neuroimaging Seminar on Bayesian Inference, Spring 2013. Jakob Heinzle, Translational Neuromodeling Unit (TNU), Institute for Biomedical Engineering (IBT), University and ETH Zürich.



Presentation Transcript


1. Introduction to probability theory and graphical models. Translational Neuroimaging Seminar on Bayesian Inference, Spring 2013. Jakob Heinzle, Translational Neuromodeling Unit (TNU), Institute for Biomedical Engineering (IBT), University and ETH Zürich

2. Literature and References • Literature: • Bishop (Chapters 1.2, 1.3, 8.1, 8.2) • MacKay (Chapter 2) • Barber (Chapters 1, 2, 3, 4) • Many images in this lecture are taken from the above references. Bayesian Inference - Introduction to probability theory

3. Probability distribution. A probability distribution, e.g. P(x=true), is defined on a sample space (domain) and assigns to every possible event in the sample space the certainty of its occurrence. Sample space: dom(X) = {0,1}; probabilities sum to one. Bishop, Fig. 1.11

4. Probability theory: basic rules. Sum rule*: p(X) = Σ_Y p(X, Y); p(X) is also called the marginal distribution. Product rule: p(X, Y) = p(Y | X) p(X). (* According to Bishop.)
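The sum and product rules can be checked directly on a small discrete joint table; the numbers below are an arbitrary illustration, not from the lecture:

```python
# Illustrative check of the sum and product rules on a small
# discrete joint distribution p(X, Y); any non-negative table
# that sums to one works here.
joint = {
    (0, 0): 0.1, (0, 1): 0.3,
    (1, 0): 0.2, (1, 1): 0.4,
}

# Sum rule: p(X) = sum_Y p(X, Y)  ->  the marginal distribution
p_x = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}

# Product rule: p(X, Y) = p(Y | X) p(X)
p_y_given_x = {(x, y): joint[(x, y)] / p_x[x] for (x, y) in joint}

# The product rule reconstructs the joint exactly
for (x, y), p in joint.items():
    assert abs(p_y_given_x[(x, y)] * p_x[x] - p) < 1e-12

print(p_x)  # marginal of X: {0: 0.4, 1: 0.6}
```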

5. Conditional and marginal probability

6. Conditional and marginal probability. Bishop, Fig. 1.11

7. Independent variables. Two variables are independent if p(X, Y) = p(X) p(Y). Question for later: what does this mean for Bayes?

8. Probability theory: Bayes' theorem. Bayes' theorem is derived from the product rule: p(Y | X) = p(X | Y) p(Y) / p(X).

9. Rephrasing and naming of Bayes' rule. With D: data, θ: parameters, H: the hypothesis we put into the model, Bayes' rule reads p(θ | D, H) = p(D | θ, H) p(θ | H) / p(D | H), i.e. posterior = likelihood × prior / evidence. MacKay

10. Example: Bishop Fig. 1.9. Box (B): blue (b) or red (r); fruit (F): apple (a) or orange (o). p(B=r) = 0.4, p(B=b) = 0.6. What is the probability of having a red box if one has drawn an orange? Bishop, Fig. 1.9
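The question can be answered with Bayes' rule. Assuming Bishop's standard fruit counts for Fig. 1.9 (red box: 2 apples, 6 oranges; blue box: 3 apples, 1 orange), a minimal sketch:

```python
# Bayes' rule on the box-and-fruit example (Bishop, Fig. 1.9).
# Fruit counts assumed from Bishop: red box holds 2 apples and
# 6 oranges, blue box holds 3 apples and 1 orange.
p_B = {'r': 0.4, 'b': 0.6}                 # prior over boxes
p_F_given_B = {'r': {'a': 2/8, 'o': 6/8},  # likelihoods per box
               'b': {'a': 3/4, 'o': 1/4}}

# Evidence via sum and product rules: p(F=o) = sum_B p(F=o | B) p(B)
p_o = sum(p_F_given_B[b]['o'] * p_B[b] for b in p_B)

# Posterior: p(B=r | F=o) = p(F=o | B=r) p(B=r) / p(F=o)
p_r_given_o = p_F_given_B['r']['o'] * p_B['r'] / p_o

print(p_o, p_r_given_o)  # 0.45 and 2/3
```

Note how the prior p(B=r) = 0.4 is revised upward to 2/3 once the orange is observed, because oranges are much more common in the red box.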

11. Probability density

12. PDF and CDF. Bishop, Fig. 1.12

13. Cumulative distribution. Short example: how to use the cumulative distribution to transform a uniform distribution!
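The idea is inverse-transform sampling: if U is uniform on (0, 1) and F is a CDF, then F⁻¹(U) is distributed with CDF F. A sketch for the exponential distribution (chosen here only as an example with a closed-form inverse CDF):

```python
import math
import random

def exponential_cdf(x, lam=1.0):
    # F(x) = 1 - exp(-lam * x) for x >= 0
    return 1.0 - math.exp(-lam * x)

def exponential_inverse_cdf(u, lam=1.0):
    # Solve F(x) = u for x:  x = -ln(1 - u) / lam
    return -math.log(1.0 - u) / lam

# Round trip: applying F to F^{-1}(u) recovers u
for u in (0.1, 0.5, 0.9):
    assert abs(exponential_cdf(exponential_inverse_cdf(u)) - u) < 1e-12

# Transform uniform draws into exponential draws
random.seed(0)
samples = [exponential_inverse_cdf(random.random()) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to the true mean 1/lam = 1
```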

14. Marginal densities. For continuous variables, integration instead of summing: p(x) = ∫ p(x, y) dy.
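Marginalising a density can be sketched numerically. Here the joint is taken (purely for illustration) as a product of two standard normal densities, so the marginal of x is itself standard normal and can be checked against the analytic value:

```python
import math

def joint(x, y):
    # Product of two standard normal densities (illustrative choice)
    return math.exp(-0.5 * (x * x + y * y)) / (2.0 * math.pi)

def marginal_x(x, lo=-8.0, hi=8.0, n=4000):
    # p(x) = integral over y of p(x, y), via the trapezoidal rule
    h = (hi - lo) / n
    total = 0.5 * (joint(x, lo) + joint(x, hi))
    total += sum(joint(x, lo + i * h) for i in range(1, n))
    return total * h

# Compare with the analytic standard normal density at x = 0
analytic = 1.0 / math.sqrt(2.0 * math.pi)
print(marginal_x(0.0), analytic)  # both approximately 0.3989
```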

15. Two views on probability. • Probability can … • … describe the frequency of outcomes in random experiments: the classical interpretation. • … describe the degree of belief about a particular event: the Bayesian viewpoint or subjective interpretation of probability. MacKay, Chapter 2

16. Expectation of a function. E[f] = Σ_x p(x) f(x), or, for a density, E[f] = ∫ p(x) f(x) dx.
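A small sketch of both forms of the expectation: the exact sum for a discrete variable, and a Monte Carlo average approximating it (the fair-die example is an illustration, not from the slides):

```python
import random

# Exact expectation of f(x) = x^2 under a fair six-sided die:
# E[f] = sum_x p(x) f(x)
p = {x: 1 / 6 for x in range(1, 7)}
f = lambda x: x * x

e_f = sum(p[x] * f(x) for x in p)
print(e_f)  # 91/6, about 15.167

# Monte Carlo version: the sample average of f over draws from p
# approximates E[f]
random.seed(1)
draws = [random.randint(1, 6) for _ in range(200_000)]
mc = sum(f(x) for x in draws) / len(draws)
print(mc)  # close to 91/6
```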

17. Graphical models. They provide a simple way to visualize the structure of a probabilistic model and can be used to design and motivate new models. Insights into the properties of the model, including conditional independence properties, can be obtained by inspection of the graph. Complex computations, required to perform inference and learning in sophisticated models, can be expressed in terms of graphical manipulations, in which underlying mathematical expressions are carried along implicitly. Bishop, Chap. 8

18. Graphical models overview. Directed graph vs. undirected graph. Names: nodes (vertices), edges (links), paths, cycles, loops, neighbours. For a summary of definitions see Barber, Chapter 2.

19. Graphical models overview. Barber, Introduction

20. Graphical models. Bishop, Fig. 8.1

21. Graphical models: parents and children. Node a is a parent of node b; node b is a child of node a. Bishop, Fig. 8.1

22. Belief networks = Bayesian belief networks = Bayesian networks. In general, every probability distribution can be expressed as a directed acyclic graph (DAG). Important: no directed cycles! Bishop, Fig. 8.2

23. Conditional independence. A variable a is conditionally independent of b given c if p(a, b | c) = p(a | c) p(b | c). In Bayesian networks conditional independence can be tested by following some simple rules.

24. Conditional independence: tail-to-tail path. Is a independent of b? Marginally: no! Conditioned on c: yes! Bishop, Chapter 8.2

25. Conditional independence: head-to-tail path. Is a independent of b? Marginally: no! Conditioned on c: yes! Bishop, Chapter 8.2

26. Conditional independence: head-to-head path. Is a independent of b? Marginally: yes! Conditioned on c: no! Bishop, Chapter 8.2
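The head-to-head ("collider") case can be verified numerically: in a → c ← b, the parents are marginally independent but become dependent once c is observed ("explaining away"). The conditional probability table below is an arbitrary illustration:

```python
from itertools import product

# Head-to-head structure a -> c <- b with binary variables.
# Numbers are an arbitrary example, not from the lecture.
p_a = {0: 0.3, 1: 0.7}
p_b = {0: 0.6, 1: 0.4}
# p(c=1 | a, b): c tends to be "on" when either parent is "on"
p_c1 = {(0, 0): 0.1, (0, 1): 0.8, (1, 0): 0.7, (1, 1): 0.95}

def p_joint(a, b, c):
    # DAG factorisation: p(a, b, c) = p(a) p(b) p(c | a, b)
    pc = p_c1[(a, b)] if c == 1 else 1.0 - p_c1[(a, b)]
    return p_a[a] * p_b[b] * pc

# Marginal independence: p(a, b) = p(a) p(b)
for a, b in product((0, 1), repeat=2):
    p_ab = sum(p_joint(a, b, c) for c in (0, 1))
    assert abs(p_ab - p_a[a] * p_b[b]) < 1e-12

# Conditioning on c = 1 induces dependence ("explaining away"):
# p(a=1 | c=1) differs from p(a=1 | c=1, b=1)
p_c_eq_1 = sum(p_joint(a, b, 1) for a, b in product((0, 1), repeat=2))
p_a1_given_c = sum(p_joint(1, b, 1) for b in (0, 1)) / p_c_eq_1
p_a1_given_c_b1 = p_joint(1, 1, 1) / sum(p_joint(a, 1, 1) for a in (0, 1))
print(p_a1_given_c, p_a1_given_c_b1)  # differ, so a and b are dependent given c
```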

27. Conditional independence: notation. Bishop, Chapter 8.2

28. Conditional independence: three basic structures. Bishop, Chapter 8.2.2

29. More conventions in graphical notation. Regression model: the short form and the form with parameters made explicit are equivalent notations. Bishop, Chapter 8

30. More conventions in graphical notation. Complete model used for prediction, trained on data tn. Bishop, Chapter 8

31. Summary – things to remember. • Probabilities and how to compute with them: the product rule, Bayes' rule, the sum rule. • Probability densities: PDF, CDF. • Conditional and marginal distributions. • Basic concepts of graphical models: directed vs. undirected, nodes and edges, parents and children. • Conditional independence in graphs and how to check it. Bishop, Chapter 8.2.2
