Belief Propagation on Markov Random Fields

Aggeliki Tsoli

Outline
  • Graphical Models
  • Markov Random Fields (MRFs)
  • Belief Propagation

MLRG

Graphical Models
  • Diagrams
    • Nodes: random variables
    • Edges: statistical dependencies among random variables
  • Advantages:
    • Better visualization
      • conditional independence properties
      • new models design
    • Factorization

Types of Graphical Models
  • Directed
    • causal relationships
    • e.g. Bayesian networks
  • Undirected
    • no constraints imposed on causality of events (“weak dependencies”)
    • Markov Random Fields (MRFs)

Example MRF Application: Image Denoising

[Figure: original binary image and its noisy version, with e.g. 10% of pixels flipped]

  • Question: How can we retrieve the original image given the noisy one?

MRF formulation
  • Nodes
    • For each pixel i,
      • xi : latent variable (value in original image)
      • yi : observed variable (value in noisy image)
      • xi, yi {0,1}

[Figure: MRF with latent nodes x1, x2, …, xi, …, xn, each connected to its observed node y1, y2, …, yi, …, yn and to neighboring latent nodes]


MRF formulation
  • Edges
    • xi, yi of each pixel i are correlated
      • local evidence function φ(xi, yi)
      • e.g. φ(xi, yi) = 0.9 if xi = yi, and φ(xi, yi) = 0.1 otherwise (10% noise)
    • neighboring pixels tend to have similar values
      • compatibility function ψ(xi, xj)


MRF formulation
  • Question: What are the marginal distributions for xi, i = 1, …,n?

P(x1, x2, …, xn) = (1/Z) ∏_(i,j) ψ(xi, xj) ∏_i φ(xi, yi)
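As a sanity check, the factorization above can be sketched in Python. The 0.9/0.1 values mirror the slide's 10% noise example; the function names, the 2-pixel chain, and the observed values are illustrative assumptions, not part of the original deck.

```python
import itertools

def phi(x, y):                        # local evidence: 10% noise assumption
    return 0.9 if x == y else 0.1

def psi(a, b):                        # compatibility: neighbors prefer equal values
    return 0.9 if a == b else 0.1

def joint(xs, ys, edges):
    """Unnormalized P(x1..xn): product of edge potentials psi and node evidences phi."""
    p = 1.0
    for i, j in edges:
        p *= psi(xs[i], xs[j])
    for i, yi in enumerate(ys):
        p *= phi(xs[i], yi)
    return p

# Tiny 2-pixel chain: Z sums the unnormalized joint over all latent configurations.
ys = [1, 0]
edges = [(0, 1)]
Z = sum(joint(xs, ys, edges) for xs in itertools.product((0, 1), repeat=2))
probs = {xs: joint(xs, ys, edges) / Z for xs in itertools.product((0, 1), repeat=2)}
```

Dividing by Z makes the four configuration probabilities sum to one, which is exactly the role of the partition function in the formula above.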

Belief Propagation
  • Goal: compute marginals of the latent nodes of underlying graphical model
  • Attributes:
    • iterative algorithm
    • message passing between neighboring latent-variable nodes
  • Question: Can it also be applied to directed graphs?
  • Answer: Yes, but here we will apply it to MRFs

Belief Propagation Algorithm
  • Select random neighboring latent nodes xi, xj
  • Send message mij from xi to xj
  • Update belief about marginal distribution at node xj
  • Go to step 1, until convergence
      • How is convergence defined?

[Figure: neighboring latent nodes xi and xj with observations yi, yj; message mij sent from xi to xj]

Step 2: Message Passing
  • Message mij from xi to xj : what node xi thinks about the marginal distribution of xj

[Figure: node xi combines its observation yi with messages from neighbors N(i)\j, then sends mij to xj]

mij(xj) = (xi) (xi, yi)(xi, xj)kN(i)\j mki(xi)

  • Messages initially uniformly distributed
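A minimal sketch of this update rule, assuming binary states, messages stored in a dict keyed by (sender, receiver), and the 0.9/0.1 potentials from the denoising example. Normalizing each outgoing message is a common implementation choice to avoid numerical underflow; it is not part of the formula itself.

```python
import math

def phi(x, y):                        # local evidence, 10% noise assumption
    return 0.9 if x == y else 0.1

def psi(a, b):                        # compatibility: neighbors prefer equal values
    return 0.9 if a == b else 0.1

def send_message(i, j, y, messages, neighbors):
    """mij(xj) = sum over xi of phi(xi, yi) * psi(xi, xj) * prod of mki(xi), k in N(i), k != j."""
    m = {}
    for xj in (0, 1):
        m[xj] = sum(
            phi(xi, y[i]) * psi(xi, xj)
            * math.prod(messages[(k, i)][xi] for k in neighbors[i] if k != j)
            for xi in (0, 1))
    s = m[0] + m[1]
    return {0: m[0] / s, 1: m[1] / s}   # normalize for numerical stability

# Two-node example with the uniform initialization mentioned on the slide.
y = [1, 0]
neighbors = {0: [1], 1: [0]}
messages = {(1, 0): {0: 0.5, 1: 0.5}, (0, 1): {0: 0.5, 1: 0.5}}
m01 = send_message(0, 1, y, messages, neighbors)
```

With y0 = 1 observed, the message pushes node x1 toward state 1, reflecting both the local evidence at node 0 and the compatibility preference for equal neighbors.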

Step 3: Belief Update
  • Belief b(xj): what node xj thinks its marginal distribution is

[Figure: node xj combines its observation yj with messages from all neighbors N(j)]

b(xj) = k φ(xj, yj) ∏_{q∈N(j)} mqj(xj), where k is a normalization constant so that Σ_xj b(xj) = 1
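The belief update can be sketched the same way, again assuming binary states and dict-based messages; the constant k is realized by dividing by the sum of the two unnormalized values.

```python
import math

def phi(x, y):                        # local evidence, 10% noise assumption
    return 0.9 if x == y else 0.1

def belief(j, y, messages, neighbors):
    """b(xj): phi(xj, yj) times the product of all incoming messages, normalized to sum to 1."""
    b = {xj: phi(xj, y[j]) * math.prod(messages[(q, j)][xj] for q in neighbors[j])
         for xj in (0, 1)}
    s = b[0] + b[1]
    return {xj: v / s for xj, v in b.items()}

# With uniform incoming messages, the belief just reflects the local evidence.
y = [1]
neighbors = {0: [1]}
messages = {(1, 0): {0: 0.5, 1: 0.5}}
b0 = belief(0, y, messages, neighbors)
```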


Example

  • Compute the belief at node 1.

[Figure: Fig. 12 (Yedidia et al.) — node 2 connected to nodes 1, 3, and 4; messages m32 and m42 flow into node 2, and m21 flows into node 1]
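Putting the pieces together on the tree of Fig. 12: the observations y and the 0.9/0.1 potentials below are assumed for illustration. Because this graph is a tree, the belief BP computes at node 1 should match the brute-force marginal exactly, which the sketch verifies.

```python
import itertools
import math

def phi(x, obs):                      # local evidence, 10% noise assumption
    return 0.9 if x == obs else 0.1

def psi(a, b):                        # compatibility: neighbors prefer equal values
    return 0.9 if a == b else 0.1

# Tree from Fig. 12: node 2 is connected to nodes 1, 3, and 4.
neighbors = {1: [2], 2: [1, 3, 4], 3: [2], 4: [2]}
edges = [(1, 2), (2, 3), (2, 4)]
y = {1: 1, 2: 0, 3: 1, 4: 0}          # illustrative observations

# Leaves-to-root schedule: m32 and m42 first, then m21.
messages = {}
for i, j in [(3, 2), (4, 2), (2, 1)]:
    m = {}
    for xj in (0, 1):
        m[xj] = sum(
            phi(xi, y[i]) * psi(xi, xj)
            * math.prod(messages[(k, i)][xi] for k in neighbors[i] if k != j)
            for xi in (0, 1))
    s = m[0] + m[1]
    messages[(i, j)] = {0: m[0] / s, 1: m[1] / s}

# Belief at node 1 uses only its single incoming message m21.
b1 = {x: phi(x, y[1]) * messages[(2, 1)][x] for x in (0, 1)}
s = b1[0] + b1[1]
b1 = {x: v / s for x, v in b1.items()}

# On a tree, BP is exact: compare with the brute-force marginal of x1.
marg = {0: 0.0, 1: 0.0}
for xs in itertools.product((0, 1), repeat=4):
    x = dict(zip((1, 2, 3, 4), xs))
    p = (math.prod(psi(x[i], x[j]) for i, j in edges)
         * math.prod(phi(x[n], y[n]) for n in (1, 2, 3, 4)))
    marg[x[1]] += p
tot = marg[0] + marg[1]
assert abs(b1[0] - marg[0] / tot) < 1e-9
```

A single leaves-to-root sweep suffices here; on graphs with loops the same updates would be iterated until the messages stop changing.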

Does graph topology matter?
  • The BP procedure itself is the same!
  • Performance varies:
    • failure to converge or to predict accurate beliefs [Murphy, Weiss, Jordan 1999]
    • success at
      • decoding error-correcting codes [Frey and Mackay 1998]
      • computer vision problems where the underlying MRF is full of loops [Freeman, Pasztor, Carmichael 2000]

[Figure: tree-structured graph vs. loopy graph]

How long does it take?
  • The paper gives no explicit running-time analysis
  • In my opinion, it depends on
    • the number of nodes in the graph
    • the graph topology
  • Work on improving the running time of BP (for specific applications)
    • Next time?
