
Cognitive Computer Vision



  1. Cognitive Computer Vision Kingsley Sage khs20@sussex.ac.uk and Hilary Buxton hilaryb@sussex.ac.uk Prepared under ECVision Specific Action 8-3 http://www.ecvision.org

  2. Lecture 3 • Graphical models • Probabilistic graphical models • Directed graphs • Undirected graphs • Notation • Rolling out over time

  3. What are graphical models? • Represent salient relationships between variables graphically, e.g. as nodes connected by edges [example figure on slide]

  4. What are probabilistic graphical models? • A probabilistic graphical model is a type of probabilistic network that has roots in AI, statistics and neural networks • Provides a clean mathematical formalism that makes it possible to understand the relationships between a wide variety of network-based approaches to computation • Allows us to see different methods as instances of a broader probabilistic framework

  5. What are probabilistic graphical models? • Probabilistic graphical models use graphs to represent and manipulate joint probability distributions • Graphs can be directed – usually referred to as a belief network or Bayesian network • Graphs can be undirected – usually referred to as a Markov Random Field • They provide a basis for algorithms for computation

  6. Joint probability – a reminder • A probability dependent on more than one variable, e.g. p(AND | a, b) [figures on slide: the discrete case of a logic AND gate, and a continuous case where light values are high]
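A minimal numeric sketch of the discrete case in Python; the probability values and the independent 0.5/0.5 input priors below are assumed purely for illustration, not taken from the lecture:

```python
# Conditional probability table for a noisy logic AND gate:
# p(out = 1 | a, b) is close to 1 only when both inputs are 1 (values assumed).
p_out1_given_ab = {
    (0, 0): 0.01,
    (0, 1): 0.02,
    (1, 0): 0.02,
    (1, 1): 0.97,
}

# Assumed priors over the two inputs.
p_a = {0: 0.5, 1: 0.5}
p_b = {0: 0.5, 1: 0.5}

# Joint probability p(out = 1, a, b) = p(out = 1 | a, b) * p(a) * p(b).
joint_out1 = {(a, b): p_out1_given_ab[(a, b)] * p_a[a] * p_b[b]
              for a in (0, 1) for b in (0, 1)}
print(joint_out1)
```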

  7. Directed graphs [figure: node A with a directed edge to node B] • Intuitively, the notion of causality (although this can be a philosophical argument) • A → B, so the value of A directly determines the value of B • P(A,B) = P(B|A)·P(A)
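The factorisation is easy to check numerically. A small sketch; only the factorisation P(A,B) = P(B|A)·P(A) comes from the slide, the numbers are assumed:

```python
# Directed edge A -> B: the joint factorises as p(A, B) = p(B | A) * p(A).
p_A = {0: 0.7, 1: 0.3}                    # prior on the parent node A (assumed)
p_B_given_A = {0: {0: 0.9, 1: 0.1},       # p(B | A = 0) (assumed)
               1: {0: 0.2, 1: 0.8}}       # p(B | A = 1) (assumed)

# Build the joint table from the two factors.
p_AB = {(a, b): p_B_given_A[a][b] * p_A[a]
        for a in (0, 1) for b in (0, 1)}

# Marginalising the joint over A recovers p(B).
p_B = {b: sum(p_AB[(a, b)] for a in (0, 1)) for b in (0, 1)}
print(p_AB)
print(p_B)
```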

  8. [Figure: the traffic lights model from lecture 2 as a directed graph – hidden state nodes STOP, GET READY TO GO, GO and GET READY TO STOP, each linked to an observable node]
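One way to write such a model down in code is as two conditional probability tables: one for the hidden state transitions and one for what is observed given the hidden state. The state names follow the slide; the observation symbols and all probability values below are assumed for illustration:

```python
# Hidden traffic-light states (from the slide) and illustrative probabilities.
states = ["STOP", "GET READY TO GO", "GO", "GET READY TO STOP"]

# p(next hidden state | current hidden state): a simple cyclic model (assumed values).
transition = {
    "STOP":              {"STOP": 0.7, "GET READY TO GO": 0.3},
    "GET READY TO GO":   {"GET READY TO GO": 0.4, "GO": 0.6},
    "GO":                {"GO": 0.8, "GET READY TO STOP": 0.2},
    "GET READY TO STOP": {"GET READY TO STOP": 0.5, "STOP": 0.5},
}

# p(observation | hidden state): what is seen given the true state (assumed values).
emission = {
    "STOP":              {"red": 0.9, "amber": 0.1},
    "GET READY TO GO":   {"red_amber": 0.9, "red": 0.1},
    "GO":                {"green": 0.95, "amber": 0.05},
    "GET READY TO STOP": {"amber": 0.9, "green": 0.1},
}

print(transition["STOP"], emission["GO"])
```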

  9. Examples of directed graphs • Hidden Markov Models (later in the course) • Kalman filters • Factor analysis • Independent component analysis • Mixtures of Gaussians (later in the course) • Probabilistic expert systems • The list goes on …

  10. Joint probability – conditional independence • Variables are conditionally independent if the value of one does not depend on the value of the other, e.g. [figure: a graph over nodes A, B and C] Here, A and B are conditionally independent, but A and C, and B and C, are not
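A small numeric check of this kind of independence structure, assuming (purely for illustration) that C depends on both A and B through a noisy OR while A and B are independent coin flips:

```python
from itertools import product

# A and B are independent coin flips; C depends on both (a noisy OR, values assumed).
p_a = {0: 0.5, 1: 0.5}
p_b = {0: 0.5, 1: 0.5}
p_c1_given_ab = {(0, 0): 0.05, (0, 1): 0.9, (1, 0): 0.9, (1, 1): 0.99}  # p(C=1 | a, b)

# Full joint p(a, b, c).
joint = {}
for a, b, c in product((0, 1), repeat=3):
    pc = p_c1_given_ab[(a, b)] if c == 1 else 1.0 - p_c1_given_ab[(a, b)]
    joint[(a, b, c)] = p_a[a] * p_b[b] * pc

def marginal(keep):
    """Sum the joint over the variables not listed in `keep` (indices into (a, b, c))."""
    out = {}
    for assignment, p in joint.items():
        key = tuple(assignment[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

p_ab, p_a_m, p_b_m = marginal((0, 1)), marginal((0,)), marginal((1,))
p_ac, p_c_m = marginal((0, 2)), marginal((2,))

print(all(abs(p_ab[(a, b)] - p_a_m[(a,)] * p_b_m[(b,)]) < 1e-12
          for a in (0, 1) for b in (0, 1)))   # True: A and B are independent
print(all(abs(p_ac[(a, c)] - p_a_m[(a,)] * p_c_m[(c,)]) < 1e-12
          for a in (0, 1) for c in (0, 1)))   # False: A and C are dependent
```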

  11. Undirected graphs [figure: nodes A and B joined by an undirected edge] • Intuitively, the notion of correlation (although this can be a philosophical argument) • A – B, so the values of A and B are interdependent • Directed graphs can be converted into undirected graphs (but this is beyond the scope of this course)
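In contrast with the directed case, an undirected edge is usually parameterised by a symmetric potential rather than a conditional distribution. A minimal sketch, with an assumed potential that favours A and B agreeing:

```python
import numpy as np

# Undirected edge A - B: the joint is defined by a potential and a normaliser,
# p(A, B) = psi(A, B) / Z, rather than by a conditional. Potential values assumed.
psi = np.array([[4.0, 1.0],    # psi(A=0, B=0), psi(A=0, B=1)
                [1.0, 4.0]])   # psi(A=1, B=0), psi(A=1, B=1)

Z = psi.sum()                  # normalising constant (partition function)
p_AB = psi / Z
print(p_AB)                    # probability mass concentrates where A and B agree
```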

  12. An undirected graph for a computer vision task [figure on slide]

  13. Notation [figure: example nodes A, B and C] • Squares denote discrete nodes • Circles denote continuous-valued nodes • Clear (unshaded) denotes a hidden node • Shaded denotes an observed node

  14. Rolling out over time • Probabilistic graphical model notation is very good at showing how models are propagated over time • It exposes the dependencies between the different elements of the graphical structure

  15. [Figure: the traffic light example rolled out over 2 time steps (t=1, t=2) – a hidden node and an observable node at each time step, with the hidden node at t=1 linked to the hidden node at t=2]

  16. Remember the concept of the temporal order of a model? [figure: the two-time-step roll-out, with hidden and observable nodes at t=1 and t=2] • In this model, the value of the hidden nodes (and thus the observable ones) at time t+1 depends only on the previous time step t • So this is a first-order temporal model
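A sketch of this first-order roll-out for the traffic-light states, propagating a belief over the hidden state forward one step at a time; the transition probabilities and the initial belief are assumed for illustration:

```python
import numpy as np

# First-order temporal model: the distribution over hidden states at t+1
# depends only on the distribution at t, via a single transition matrix.
states = ["STOP", "GET READY TO GO", "GO", "GET READY TO STOP"]
T = np.array([[0.7, 0.3, 0.0, 0.0],    # p(next | STOP)            (assumed values)
              [0.0, 0.4, 0.6, 0.0],    # p(next | GET READY TO GO)
              [0.0, 0.0, 0.8, 0.2],    # p(next | GO)
              [0.5, 0.0, 0.0, 0.5]])   # p(next | GET READY TO STOP)

belief = np.array([1.0, 0.0, 0.0, 0.0])   # assume we start certain the light is at STOP
for t in range(1, 5):
    belief = belief @ T                   # one time step: only the previous belief matters
    print(f"t={t}", dict(zip(states, belief.round(3))))
```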

  17. Remember the concept of the temporal order of a model? [figure: a roll-out over time steps t=1 … t=4, with a hidden node and an observable node at each step] A second-order temporal model
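For contrast, in a second-order model the state at time t depends on the states at both t-1 and t-2. A rough sketch with an arbitrary, randomly generated transition tensor, since the slide gives no numbers:

```python
import numpy as np

# Second-order temporal model sketch: p(state_t | state_{t-1}, state_{t-2}).
n = 3                                   # number of discrete states (assumed)
rng = np.random.default_rng(0)
T2 = rng.random((n, n, n))              # T2[i, j, k] ~ p(next=k | two steps ago=i, previous=j)
T2 /= T2.sum(axis=2, keepdims=True)     # normalise over the next state

two_steps_ago, previous = 0, 1          # two known past states
p_next = T2[two_steps_ago, previous]    # distribution over the state at time t
print(p_next)
```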

  18. So why are graphical models relevant to Cognitive CV? • Precisely because they allow us to see different methods as instances of a broader probabilistic framework • These methods are the basis for our model of perception guided by expectation • We can put our model of expectation on a solid theoretical foundation • We can develop well-founded methods of learning rather than just being stuck with hand-coded models

  19. Summary • Probabilistic graphical models put these formalisms on a well-founded mathematical basis • We can distinguish directed and undirected graphs • Here we concentrate on directed graphs, which we can roll out over time easily

  20. Next time … • A family of graphical models • A lot of excellent reference material can be found at: http://cosco.hiit.fi/Teaching/GraphicalModels/Fall2003/material.html
