
Markov Random Fields (MRF)


Presentation Transcript


  1. Markov Random Fields (MRF) Presenter: Kuang-Jui Hsu Date: 2011/5/23 (Tues.)

  2. Outline • Introduction • Conditional Independence Properties • Factorization Properties • Illustration: Image De-noising • Relation to Directed Graphs

  3. Introduction • Based on an undirected graph • The MRF model has a simple form and is easy to use • Based on conditional independence properties

  4. Conditional Independence Properties • Consider three sets of nodes in an undirected graph, denoted A, B, and C, where A is conditionally independent of B given C • Shorthand notation: A ⊥ B | C <=> p(A|B, C) = p(A|C) (the conditional independence property)

  5.-11. Testing Methods in a Graph • (Slides 5-11: a sequence of graph figures illustrating how to test whether A is conditionally independent of B given C by inspecting the paths between the node sets; see the sketch below.)
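A minimal sketch of the separation test the figures illustrate (my own code, not from the slides; the adjacency-map representation and function names are assumptions): A ⊥ B | C holds in an undirected graph if every path from A to B passes through C, i.e. removing the nodes in C disconnects A from B.

```python
from collections import deque

def cond_independent(adj, A, B, C):
    """Graph-separation test for A ⊥ B | C in an undirected graph.

    adj maps each node to the set of its neighbours; A, B, C are
    disjoint sets of nodes.  Returns True if removing the nodes in C
    blocks every path from A to B.
    """
    blocked = set(C)
    frontier = deque(n for n in A if n not in blocked)
    visited = set(frontier)
    while frontier:                      # breadth-first search avoiding C
        node = frontier.popleft()
        if node in B:                    # reached B without touching C
            return False
        for nb in adj[node]:
            if nb not in blocked and nb not in visited:
                visited.add(nb)
                frontier.append(nb)
    return True                          # no path from A to B avoids C

# Example: chain a - c - b; conditioning on {"c"} separates a from b.
adj = {"a": {"c"}, "b": {"c"}, "c": {"a", "b"}}
print(cond_independent(adj, {"a"}, {"b"}, {"c"}))   # True
print(cond_independent(adj, {"a"}, {"b"}, set()))   # False
```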

  12. Simple Form • A node is conditionally independent of all other nodes when conditioned only on its neighbouring nodes

  13. Factorization Properties • In a directed graph, the joint distribution factorizes into a product of conditional distributions, one for each node given its parents • Generalized form: p(x) = ∏_k p(x_k | pa_k), where pa_k denotes the parents of x_k

  14. In an Undirected Graph • Consider two nodes x_i and x_j that are not connected by a link • They must be conditionally independent given all the other nodes • So the conditional independence property can be expressed as p(x_i, x_j | x_rest) = p(x_i | x_rest) p(x_j | x_rest), where x_rest denotes the set x of all variables with x_i and x_j removed (the factorization property)

  15. Clique • This leads us to consider a graphical concept: the clique • Clique: a subset of nodes such that there is a link between every pair of nodes in the subset • Maximal clique: a clique to which no other node can be added without it ceasing to be a clique

  16. Potential Function • Define the factors of the joint distribution as potential functions over the cliques of the graph • Generally, it suffices to consider the maximal cliques, because the other cliques must be subsets of maximal cliques

  17. Potential Function • Potential function ψ_C(x_C) over the maximal cliques C of the graph, where x_C is the set of variables in that clique • The joint distribution: p(x) = (1/Z) ∏_C ψ_C(x_C), where each ψ_C(x_C) is equal to zero or positive • Partition function Z = Σ_x ∏_C ψ_C(x_C): a normalization constant (see the sketch below)
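A brute-force sketch of this factorization on a tiny binary example (my own illustration; the clique structure and the potential that rewards agreement are assumptions, not taken from the slides):

```python
from itertools import product

# Three binary variables x1, x2, x3 with values in {-1, +1} and
# maximal cliques {x1, x2} and {x2, x3}.
cliques = [(0, 1), (1, 2)]

def psi(values):
    """Non-negative potential that favours agreement within a clique."""
    return 2.0 if values[0] == values[1] else 1.0

def unnormalized(x):
    p = 1.0
    for c in cliques:
        p *= psi(tuple(x[i] for i in c))   # product over maximal cliques
    return p

states = list(product([-1, +1], repeat=3))
Z = sum(unnormalized(x) for x in states)            # partition function
joint = {x: unnormalized(x) / Z for x in states}    # p(x) = (1/Z) ∏_C ψ_C(x_C)
print(Z, sum(joint.values()))                       # probabilities sum to 1
```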

  18. Partition Function • For a model with M discrete nodes, each having K states, evaluating Z involves summing over K^M states • This exponential growth makes the normalization constant one of the major limitations of undirected models • Z is nevertheless needed for parameter learning, because it is a function of any parameters that govern the potential functions
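As a concrete illustration (the numbers are my own, not from the slides): for binary variables K = 2, so a model with M = 100 nodes already requires summing 2^100 ≈ 1.3 × 10^30 terms to evaluate Z exactly.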

  19. Connection between Conditional Independence and Factorization • Define one set of distributions: those for which, for any node x_i, the conditional property p(x_i | x_rest) = p(x_i | ne(x_i)) holds, where ne(x_i) is the neighbourhood of x_i and x_rest denotes all nodes except x_i • Define a second set: those distributions that can be expressed as a factorization over the maximal cliques, p(x) = (1/Z) ∏_C ψ_C(x_C) • The Hammersley-Clifford theorem states that these two sets are identical

  20. Potential Function Expression • Since the potential functions are restricted to be strictly positive, it is convenient to express them as exponentials: ψ_C(x_C) = exp{-E(x_C)} • E(x_C) is called an energy function, and the exponential representation is called the Boltzmann distribution • The total energy is obtained by adding the energies of each of the maximal cliques

  21. Illustration: Image De-noising • Noisy image: described by an array of binary pixel values y_i ∈ {-1, +1}, where the index i = 1, ..., D runs over all pixels

  22. Illustration: Image De-noising • Noise-free image: described by an array of binary pixel values x_i ∈ {-1, +1} • The noisy image is obtained by randomly flipping the sign of pixels with some small probability

  23. Create the MRF Model • There is a strong correlation between neighbouring pixels x_i and x_j, and a strong correlation between x_i and y_i • MRF model: the graph has two types of cliques, each of which contains two variables • Cliques of the form {x_i, y_i} use an energy function of the form -η x_i y_i • Cliques of the form {x_i, x_j}, where x_i and x_j are neighbours, use an energy function of the form -β x_i x_j • The parameters η and β are positive

  24. The Energy Function • The complete energy function: E(x, y) = h Σ_i x_i - β Σ_{i,j} x_i x_j - η Σ_i x_i y_i • The β and η terms enter with a negative sign, so agreement lowers the energy; the h term enters with a positive sign and biases the pixel values towards one particular sign • The joint distribution: p(x, y) = (1/Z) exp{-E(x, y)} (see the sketch below)
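A minimal sketch of this energy function (my own code, not from the slides; the 4-neighbour clique layout and the default parameter values h = 0, β = 1.0, η = 2.1 are assumptions):

```python
import numpy as np

def energy(x, y, h=0.0, beta=1.0, eta=2.1):
    """E(x, y) = h*sum_i x_i - beta*sum_{i,j} x_i*x_j - eta*sum_i x_i*y_i
    for 2-D binary images x, y with entries in {-1, +1}, where {i, j}
    ranges over 4-connected neighbouring pixels."""
    pairwise = np.sum(x[:, :-1] * x[:, 1:]) + np.sum(x[:-1, :] * x[1:, :])
    return h * np.sum(x) - beta * pairwise - eta * np.sum(x * y)

# The joint distribution is p(x, y) ∝ exp(-E(x, y)),
# so lower energy corresponds to higher probability.
```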

  25. Solve by ICM • For the purpose of image restoration, find an image x having a high probability • Use a simple iterative technique called iterated conditional modes (ICM) • This is simply an application of coordinate-wise gradient ascent

  26. The Steps of ICM • Initialize the variables, e.g. by setting x_i = y_i for all i • Take one node x_j at a time, evaluate the total energy for the two states x_j = -1 and x_j = +1 with all other variables held fixed, set x_j to whichever state gives the lower energy, and update • Repeat over the nodes until convergence (see the sketch below)
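A minimal ICM sketch (my own code; initializing x to the noisy image y and stopping when a full sweep changes nothing are assumptions in line with the usual presentation, and the energy terms match the function above):

```python
import numpy as np

def icm(y, h=0.0, beta=1.0, eta=2.1, max_sweeps=10):
    """Iterated conditional modes for the binary de-noising MRF."""
    x = y.copy()                                   # initialize x_i = y_i
    H, W = x.shape
    for _ in range(max_sweeps):
        changed = False
        for i in range(H):
            for j in range(W):
                # Only the energy terms involving x_ij change when it flips.
                nbrs = sum(x[a, b] for a, b in
                           ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                           if 0 <= a < H and 0 <= b < W)
                def local(s):
                    return h * s - beta * s * nbrs - eta * s * y[i, j]
                best = -1 if local(-1) < local(+1) else +1
                if best != x[i, j]:                # keep the lower-energy state
                    x[i, j] = best
                    changed = True
        if not changed:                            # converged
            break
    return x

# Usage: restored = icm(noisy_image) for a {-1, +1} numpy array noisy_image.
```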

  27. Result • (Figures: restored images, one obtained using ICM and one using graph-cut)

  28. Relation to Directed Graphs • Consider the problem of taking a model that is specified using a directed graph and trying to convert it to an undirected graph • (Figures: the model drawn as a directed graph and as the corresponding undirected graph)

  29. Relation to Directed Graphs • For a chain, with directed factorization p(x) = p(x_1) p(x_2 | x_1) ··· p(x_N | x_{N-1}), this is easily done by identifying ψ_{1,2}(x_1, x_2) = p(x_1) p(x_2 | x_1) and ψ_{n-1,n}(x_{n-1}, x_n) = p(x_n | x_{n-1}) for n = 3, ..., N

  30. Relation to Directed Graphs • Consider how to generalize this construction • This can be achieved if the clique potentials of the undirected graph are given by the conditional distributions of the directed graph • Ensure that the set of variables that appears in each of the conditional distributions is a member of at least one clique of the undirected graph

  31. Generalize This Construction • For nodes having just one parent, this is achieved simply by replacing each directed link with an undirected link

  32. Convert the Directed Graph to the Undirected Graph • For nodes having more than one parent, add undirected links between all pairs of parents of the node and then drop the arrows, giving the moral graph • The conditional distribution involves the four variables, so they must all belong to a single clique if this conditional distribution is to be absorbed into a clique potential • This process has become known as moralization (see the sketch after this slide)
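A sketch of moralization (my own code; representing the directed graph as a parent map is an assumption about the data structure, not the slides' notation): marry all pairs of parents of each node, then drop the directions.

```python
from itertools import combinations

def moralize(parents):
    """Convert a directed graph, given as {node: set of parents},
    into its moral (undirected) graph."""
    nodes = set(parents) | {p for ps in parents.values() for p in ps}
    adj = {n: set() for n in nodes}
    for child, ps in parents.items():
        for p in ps:                       # drop directions on parent-child links
            adj[child].add(p)
            adj[p].add(child)
        for a, b in combinations(ps, 2):   # marry the parents
            adj[a].add(b)
            adj[b].add(a)
    return adj

# Example: x4 has parents x1, x2, x3, so {x1, x2, x3, x4} becomes a clique.
print(moralize({"x4": {"x1", "x2", "x3"}}))
```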

  33. Convert the Directed Graph to the Undirected Graph • The conversion discards some conditional independence properties • In fact, we could simply use a fully connected undirected graph • However, this would discard all conditional independence properties • Moralization adds the fewest extra links and so retains the maximum number of independence properties

  34. Special Graph • There are two types of graph that express how faithfully the conditional independence properties of a distribution are captured • Type 1: dependence map (D-map) • Type 2: independence map (I-map)

  35. Dependence Map (D-Map) • A graph is a D-map of a distribution if every conditional independence statement satisfied by the distribution is reflected in the graph • A completely disconnected graph (containing no links) is a trivial D-map for any distribution

  36. Independence Map (I-Map) • A graph is an I-map of a distribution if every conditional independence statement implied by the graph is satisfied by that distribution • A fully connected graph is a trivial I-map for any distribution • A perfect map: both an I-map and a D-map

  37. Perfect Map • (Venn diagram: the set of all distributions P over a given set of variables, together with the subset of distributions that have a perfect map as a directed graph and the subset that have a perfect map as an undirected graph)
