Markov Random Fields & Conditional Random Fields

1. Markov Random Fields & Conditional Random Fields
John Winn, MSR Cambridge

2. Road map
- Markov Random Fields
  - What they are
  - Uses in vision/object recognition
  - Advantages
  - Difficulties
- Conditional Random Fields
  - What they are
  - Further difficulties

3. Markov Random Fields

[Figure: factor graph over variables X1, X2, X3, X4, with pairwise potentials ψ12 and ψ23 and a higher-order potential ψ234 over the clique {X2, X3, X4}.]
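The factor graph on this slide can be sketched in code. This is a minimal illustration, not from the slides: the potential tables `psi12`, `psi23`, `psi234` and their values are hypothetical, chosen only to show how an MRF defines p(x) as a normalised product of clique potentials.

```python
import itertools

# Hypothetical potentials over binary variables x1..x4, mirroring the
# graph on the slide: p(x) = (1/Z) * psi12(x1,x2) * psi23(x2,x3) * psi234(x2,x3,x4)

def psi12(a, b):      # pairwise potential, favours agreement (values are illustrative)
    return 2.0 if a == b else 1.0

def psi23(a, b):
    return 2.0 if a == b else 1.0

def psi234(a, b, c):  # higher-order potential over a clique of three variables
    return 3.0 if a == b == c else 1.0

def unnormalised(x1, x2, x3, x4):
    return psi12(x1, x2) * psi23(x2, x3) * psi234(x2, x3, x4)

# The partition function Z sums over every joint state -- feasible only for
# tiny models, which is exactly the inference difficulty discussed later.
Z = sum(unnormalised(*x) for x in itertools.product([0, 1], repeat=4))

def p(x1, x2, x3, x4):
    return unnormalised(x1, x2, x3, x4) / Z
```

With these agreement-favouring potentials, fully consistent labellings such as (0, 0, 0, 0) receive more mass than inconsistent ones, which is the "soft constraint" behaviour mentioned on the advantages slide.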

4. Examples of use in vision
- Grid-shaped MRFs for pixel labelling, e.g. segmentation
- MRFs (e.g. stars) over part positions for pictorial structures/constellation models

5. Advantages
- Probabilistic model:
  - Captures uncertainty
  - No 'irreversible' decisions
  - Iterative reasoning
  - Principled fusing of different cues
- Undirected model:
  - Allows 'non-causal' relationships (soft constraints)
- Efficient algorithms: inference is now practical for MRFs with millions of variables – can be applied to raw pixels

6. Maximum Likelihood Learning
The gradient of the log-likelihood is the difference of two terms:

∂L/∂θ = (sufficient statistics of data) − (expected model sufficient statistics)
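The two terms on this slide can be computed explicitly for a toy model. The sketch below is illustrative and not from the slides: a three-variable chain MRF with a single hypothetical "agreement" feature, where the model expectation is evaluated by brute-force enumeration (practical only at this scale).

```python
import itertools
import math

# Toy chain MRF with one shared parameter theta on an agreement feature:
#   log p(x; theta) = theta * phi(x) - log Z(theta)
# The log-likelihood gradient is the slide's two terms:
#   d/dtheta = E_data[phi] - E_model[phi]

EDGES = [(0, 1), (1, 2)]  # three-variable chain

def phi(x):
    # sufficient statistic: number of adjacent equal pairs
    return sum(1.0 for i, j in EDGES if x[i] == x[j])

def model_expectation(theta):
    # E_model[phi], by exhaustive enumeration of all 2^3 states
    states = list(itertools.product([0, 1], repeat=3))
    weights = [math.exp(theta * phi(x)) for x in states]
    Z = sum(weights)
    return sum(w * phi(x) for w, x in zip(weights, states)) / Z

def gradient(data, theta):
    # E_data[phi] - E_model[phi]
    empirical = sum(phi(x) for x in data) / len(data)
    return empirical - model_expectation(theta)
```

Gradient ascent with this quantity drives the model's expected statistics towards the data's; the expensive part is `model_expectation`, which in real models must itself be approximated, as the following slides discuss.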

7. Difficulty I: Inference
- Exact inference intractable except in a few cases, e.g. small models
- Must resort to approximate methods:
  - Loopy belief propagation
  - MCMC sampling
  - Alpha expansion (MAP solution only)
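One of the approximate methods listed, MCMC sampling, can be sketched as Gibbs sampling on a small Ising-style grid MRF. This is an illustrative sketch with a hypothetical coupling strength `J`, not anything specified on the slides.

```python
import math
import random

# Gibbs sampling for a small Ising-style grid MRF with spins in {-1, +1}
# and pairwise coupling J between 4-connected neighbours (values hypothetical).

def neighbours(i, j, h, w):
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < h and 0 <= nj < w:
            yield ni, nj

def gibbs_sweeps(h, w, J, sweeps, rng):
    # start from a random configuration
    x = [[rng.choice([-1, 1]) for _ in range(w)] for _ in range(h)]
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                # local field from the current neighbours
                field = J * sum(x[ni][nj] for ni, nj in neighbours(i, j, h, w))
                # p(x_ij = +1 | rest) for the Ising conditional
                p_plus = 1.0 / (1.0 + math.exp(-2.0 * field))
                x[i][j] = 1 if rng.random() < p_plus else -1
    return x

rng = random.Random(0)
sample = gibbs_sweeps(4, 4, J=1.5, sweeps=50, rng=rng)
```

Each update resamples one pixel conditioned on its neighbours; after many sweeps the samples approximate draws from the MRF. The slow mixing of exactly this kind of sampler is what makes it painful at test time for models like Field of Experts, as noted two slides later.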

8. Difficulty II: Learning
- Gradient descent – vulnerable to local minima
- Slow – must perform expensive inference at each iteration
- Can stop inference early…
  - Contrastive divergence
  - Piecewise training + variants
- Need fast + accurate methods
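Contrastive divergence, mentioned above as a way to "stop inference early", can be sketched for the same kind of toy agreement-feature chain. This is an illustrative CD-1 sketch, not the slides' method: the exact model expectation in the gradient is replaced by statistics of samples obtained from a single Gibbs sweep started at the data.

```python
import math
import random

EDGES = [(0, 1), (1, 2)]  # toy three-variable chain

def phi(x):
    # agreement feature: number of adjacent equal pairs
    return sum(1.0 for i, j in EDGES if x[i] == x[j])

def local_score(x, v, val, theta):
    # contribution of setting x[v] = val, given its neighbours
    return theta * sum(1.0 for i, j in EDGES
                       if v in (i, j) and val == x[j if v == i else i])

def gibbs_sweep(x, theta, rng):
    # one full sweep, resampling each variable from its conditional
    x = list(x)
    for v in range(len(x)):
        w0 = math.exp(local_score(x, v, 0, theta))
        w1 = math.exp(local_score(x, v, 1, theta))
        x[v] = 1 if rng.random() < w1 / (w0 + w1) else 0
    return tuple(x)

def cd1_update(data, theta, lr, rng):
    pos = sum(phi(x) for x in data) / len(data)                      # data statistics
    neg = sum(phi(gibbs_sweep(x, theta, rng)) for x in data) / len(data)  # 1-step samples
    return theta + lr * (pos - neg)

rng = random.Random(0)
theta = 0.0
for _ in range(100):
    theta = cd1_update([(0, 0, 0), (1, 1, 1)], theta, 0.1, rng)
```

Because the negative phase uses only one sweep instead of a full run to equilibrium, each update is cheap; the price is a biased gradient estimate, which is why the slide still calls for faster and more accurate methods.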

9. Difficulty III: Large cliques
- For images, we want to look at patches, not pairs of pixels, and would therefore like to use large cliques
- Cost of inference (memory and CPU) is typically exponential in clique size
- Example: Field of Experts, Black + Roth
  - Training: contrastive divergence – over a week on a cluster of 50+ machines
  - Test: Gibbs sampling – very slow

10. Other MRF issues…
- Local minima when performing inference in high-dimensional latent spaces
- MRF models often require making inaccurate independence assumptions about the observations

11. Conditional Random Fields
Lafferty et al., 2001

[Figure: the same factor graph – variables X1–X4 with potentials ψ12, ψ23, ψ234 – but with the potentials now conditioned on the observed image I.]
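The change from MRF to CRF can be shown in the same toy style. This sketch is illustrative, not from the slides: the per-variable features, potential forms, and weight `w` are hypothetical. The key point is that the potentials now depend on the observed image, so the model defines p(x | I) with an image-dependent partition function Z(I).

```python
import itertools
import math

# Toy CRF over a 4-variable chain of binary labels, conditioned on
# hypothetical per-variable image features.

EDGES = [(0, 1), (1, 2), (2, 3)]

def unary(x_v, feat, w):
    # image-dependent unary potential: rewards label 1 where the feature is strong
    return math.exp(w * feat) if x_v == 1 else 1.0

def pairwise(a, b):
    # label-smoothness potential (image-independent in this sketch)
    return 2.0 if a == b else 1.0

def score(x, feats, w):
    s = 1.0
    for v in range(len(x)):
        s *= unary(x[v], feats[v], w)
    for i, j in EDGES:
        s *= pairwise(x[i], x[j])
    return s

def p_cond(x, feats, w):
    # p(x | I): normalised over labellings for THIS image's features
    states = itertools.product([0, 1], repeat=len(feats))
    Z = sum(score(s, feats, w) for s in states)
    return score(x, feats, w) / Z
```

Because the model is conditional, it never has to represent the distribution of the image itself, which is how CRFs sidestep the inaccurate observation-independence assumptions listed on the previous slide.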

12. Examples of use in vision
- Grid-shaped CRFs for pixel labelling (e.g. segmentation), using boosted classifiers

13. Difficulty IV: CRF Learning
The conditional log-likelihood gradient is again a difference of two terms:

∂L/∂θ = (sufficient statistics of labels given the image) − (expected sufficient statistics given the image)
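The slide's two terms can be made concrete for a toy conditional model. This sketch is illustrative and not from the slides: the feature functions are hypothetical, and the conditional expectation is computed by enumeration. The difference from the earlier MRF gradient is that both terms are now computed *given the image*, so inference must be run per training image.

```python
import itertools
import math

EDGES = [(0, 1), (1, 2)]  # toy three-variable chain of labels

def phi(x, feats):
    # hypothetical sufficient statistics: (label agreement, label/feature correlation)
    agree = sum(1.0 for i, j in EDGES if x[i] == x[j])
    corr = sum(f for x_v, f in zip(x, feats) if x_v == 1)
    return (agree, corr)

def cond_expectation(feats, theta):
    # E[phi(x, I) | I] under p(x | I; theta), by enumeration
    states = list(itertools.product([0, 1], repeat=len(feats)))
    ws = [math.exp(sum(t * f for t, f in zip(theta, phi(x, feats)))) for x in states]
    Z = sum(ws)
    exp_agree = sum(w * phi(x, feats)[0] for w, x in zip(ws, states)) / Z
    exp_corr = sum(w * phi(x, feats)[1] for w, x in zip(ws, states)) / Z
    return (exp_agree, exp_corr)

def gradient(x, feats, theta):
    # (statistics of the observed labels given the image)
    # minus (expected statistics given the image)
    stats = phi(x, feats)
    exp_stats = cond_expectation(feats, theta)
    return tuple(s - e for s, e in zip(stats, exp_stats))
```

Note that `x` here is a labelled training example: without labels there is no positive term at all, which leads directly to the scarcity-of-labels difficulty on the final slide.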

14. Difficulty V: Scarcity of labels
- The CRF is a conditional model – it needs labels
- Labels are expensive and increasingly hard to define
- Labels are also inherently lower-dimensional than the data, and hence support learning fewer parameters than generative models do