
Presentation Transcript


  1. Solving Markov Random Fields using Dynamic Graph Cuts & Second Order Cone Programming Relaxations. M. Pawan Kumar, Pushmeet Kohli, Philip Torr

  2. Talk Outline • Dynamic Graph Cuts • Fast re-estimation of the cut • Useful for video • Object-specific segmentation • Estimation of non-submodular MRFs • Relaxations beyond linear!

  3. Example: Video Segmentation

  4. Model-Based Segmentation [Figure: input image, segmentation, and pose estimate. Images courtesy: M. Black, L. Sigal]

  5. Min-Marginals [Figure: input image, MAP solution, and foreground belief shown on a colour scale from 0 to 1, for low, moderate, and high smoothness]

  6. Uses of Min-Marginals • Estimate of the true marginals (uncertainty) • Parameter learning • Easily obtain the n best solutions

  7. Dynamic Graph Cuts [Diagram: solving problem PA is a computationally expensive operation that gives solution SA; when problem PB is similar to PA, the differences between A and B define a simpler problem PB*, and its solution SB is obtained by a cheaper operation]

  8. Our Algorithm [Diagram: maximum flow on the first segmentation problem Ga gives the MAP solution and the residual graph Gr; the difference between Ga and Gb (the second segmentation problem) is applied to Gr to obtain the updated residual graph G`, which is then solved for Gb]

  9. Computing the st-mincut from Max-flow Algorithms • The max-flow problem: edge capacity and flow balance constraints • Notation: residual capacity = edge capacity - current flow; augmenting path • Simple augmenting-path based algorithms: repeatedly find augmenting paths and push flow; the saturated edges constitute the st-mincut [Ford-Fulkerson theorem] [Figure: example graph with source s (label 0), sink t (label 1), nodes a1, a2, and edge capacities 2, 9, 1, 2, 5, 4]
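The simple augmenting-path scheme on this slide is easy to sketch in code. The following is a minimal, self-contained Python illustration (Edmonds-Karp, i.e. BFS augmenting paths); it is not the authors' implementation, and the small example graph and its capacities are only loosely modelled on the slide's figure.

```python
# A minimal sketch of augmenting-path max-flow (Edmonds-Karp with BFS), not the
# authors' implementation. Node names and capacities below are illustrative.
from collections import defaultdict, deque

def max_flow(capacity, s, t):
    """capacity: dict mapping directed edge (u, v) -> capacity.
    Returns (max-flow value, residual capacities)."""
    residual = defaultdict(int)
    for (u, v), c in capacity.items():
        residual[(u, v)] += c
    nodes = {u for u, _ in capacity} | {v for _, v in capacity}
    flow = 0
    while True:
        # Breadth-first search for an augmenting path in the residual graph.
        parent, queue = {s: None}, deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v in nodes:
                if v not in parent and residual[(u, v)] > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            break  # no augmenting path left: the flow is maximum
        # Trace the path back from t and find its bottleneck residual capacity.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        delta = min(residual[e] for e in path)
        for u, v in path:               # push flow; update residual capacities
            residual[(u, v)] -= delta
            residual[(v, u)] += delta
        flow += delta
    return flow, residual

# Illustrative two-node example in the spirit of the slide's figure.
caps = {('s', 'a1'): 2, ('s', 'a2'): 9, ('a1', 'a2'): 1,
        ('a2', 'a1'): 2, ('a1', 't'): 5, ('a2', 't'): 4}
value, residual = max_flow(caps, 's', 't')   # saturated edges form the st-mincut
```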

  10. Reparametrization • Key observation: adding a constant α to both the t-edges of a node does not change the edges constituting the st-mincut [Figure: source (0), sink (1), nodes a1, a2; the t-edge capacities of a2 change from 9 and 4 to 9 + α and 4 + α] • E(a1, a2) = 2a1 + 5ā1 + 9a2 + 4ā2 + 2a1ā2 + ā1a2 • E*(a1, a2) = E(a1, a2) + α(a2 + ā2) = E(a1, a2) + α, since a2 + ā2 = 1
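To make the key observation concrete, here is a small Python check over the slide's example energy, with alpha chosen arbitrarily; it simply enumerates all four labellings and confirms that the shift is a constant, so the minimizing labelling is unchanged.

```python
# A tiny check of the slide's claim on its example energy: adding alpha to both
# t-edges of node a2 adds the constant alpha to every labelling's energy, so
# the minimizing labelling (the st-mincut) is unchanged. Alpha is arbitrary.
from itertools import product

def E(a1, a2):
    # E(a1, a2) = 2*a1 + 5*ā1 + 9*a2 + 4*ā2 + 2*a1*ā2 + ā1*a2, with ā = 1 - a
    return (2*a1 + 5*(1 - a1) + 9*a2 + 4*(1 - a2)
            + 2*a1*(1 - a2) + (1 - a1)*a2)

alpha = 3.5
for a1, a2 in product((0, 1), repeat=2):
    E_star = E(a1, a2) + alpha * (a2 + (1 - a2))   # alpha * (a2 + ā2) = alpha
    assert E_star == E(a1, a2) + alpha

# The argmin over the four labellings is therefore identical for E and E*.
print(min(product((0, 1), repeat=2), key=lambda lab: E(*lab)))
```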

  11. Reparametrization, second type • The other type of reparametrization shifts α between an n-edge and the t-edges of its endpoints [Figure: the same graph with some capacities shifted by α: 9 + α, 1 - α, 2 + α, 5 + α; the remaining capacities 2 and 4 are unchanged] • All reparametrizations of the graph are sums of these two types • Both maintain the solution and add a constant α to the energy

  12. Reparametrization • Nice result (easy to prove) • All other reparametrizations can be viewed in terms of these two basic operations. • Proof in Hammer, and also in one of Vlad’s recent papers.

  13. Graph Re-parameterization [Figure: original graph G with nodes xi, xj between source s and sink t; edge labels show flow/residual capacity: 0/7, 0/1, 0/5, 0/9, 0/2, 0/4]

  14. Graph Re-parameterization [Figure: computing max-flow turns the original graph G (0/7, 0/1, 0/5, 0/9, 0/2, 0/4) into the residual graph Gr (5/2, 1/0, 3/2, 0/12, 2/0, 4/0); the saturated edges that are cut give the st-mincut]
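Once the max-flow has been computed, the st-mincut can be read off the residual graph exactly as this slide describes. Below is a hedged continuation of the earlier sketch (it assumes the max_flow helper and its defaultdict residual from the block after slide 9), not the authors' code.

```python
# Sketch of reading the st-mincut off the residual graph: nodes still reachable
# from s through positive residual capacity form the source side; the original
# edges that cross to the sink side are saturated and constitute the cut.
from collections import deque

def st_mincut(capacity, residual, s):
    """capacity: original capacities; residual: as returned by max_flow."""
    nodes = {u for u, _ in capacity} | {v for _, v in capacity}
    reachable, queue = {s}, deque([s])
    while queue:
        u = queue.popleft()
        for v in nodes:
            if v not in reachable and residual[(u, v)] > 0:
                reachable.add(v)
                queue.append(v)
    # Every original edge from the reachable (source) side to the rest is
    # saturated; together these edges form the st-mincut.
    return [(u, v) for (u, v) in capacity
            if u in reachable and v not in reachable]
```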

  15. Update t-edge Capacities [Figure: residual graph Gr with flow/residual capacities 5/2, 1/0, 3/2, 0/12, 2/0, 4/0]

  16. Update t-edge Capacities • The capacity of the t-edge into xi changes from 7 to 4 [Figure: residual graph Gr with 5/2, 1/0, 3/2, 0/12, 2/0, 4/0]

  17. Update t-edge Capacities • Capacity changes from 7 to 4 • Edge capacity constraint violated! (flow > capacity): the edge now shows 5/-1 • Excess flow e = flow - new capacity = 5 - 4 = 1 [Figure: updated residual graph G` with 5/-1, 1/0, 3/2, 0/12, 2/0, 4/0]

  18. Update t-edge Capacities • Capacity changes from 7 to 4 • Edge capacity constraint violated! (flow > capacity) • Excess flow e = flow - new capacity = 5 - 4 = 1 • Fix: add e to both t-edges connected to node xi [Figure: updated residual graph G` with 5/-1, 1/0, 3/2, 0/12, 2/0, 4/0]

  19. Update t-edge Capacities • Capacity changes from 7 to 4 • Excess flow e = flow - new capacity = 5 - 4 = 1 • Adding e to both t-edges of xi restores the capacity constraint [Figure: updated residual graph G` with 5/0, 1/0, 3/2, 0/12, 2/1, 4/0]
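The t-edge update in slides 16-19 is essentially a local reparametrization. Below is a minimal Python sketch under an assumed data layout (a per-node dictionary of t-edge flow and capacity); it illustrates the idea and is not the authors' code.

```python
# Hedged sketch of the t-edge update in slides 16-19 (assumed data layout). If
# the new t-edge capacity is smaller than the flow already routed through it,
# the capacity constraint is violated; adding the excess e = flow - new
# capacity to BOTH t-edges of the node is a reparametrization, so it restores
# feasibility without changing the st-mincut.

def update_t_edge(node, new_source_cap, graph):
    """graph[node] = {'source': {'flow': f, 'cap': c}, 'sink': {...}}."""
    src, snk = graph[node]['source'], graph[node]['sink']
    src['cap'] = new_source_cap
    excess = src['flow'] - src['cap']
    if excess > 0:                 # flow > capacity: constraint violated
        src['cap'] += excess       # add e to the source-side t-edge ...
        snk['cap'] += excess       # ... and to the sink-side t-edge of the node

# The slides' numbers: flow 5 on a t-edge whose capacity drops from 7 to 4
# gives excess 1, so both t-edges of xi gain one unit of capacity.
graph = {'xi': {'source': {'flow': 5, 'cap': 7},
                'sink':   {'flow': 2, 'cap': 2}}}
update_t_edge('xi', 4, graph)      # graph['xi']['source']['cap'] is now 5
```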

  20. Update n-edge Capacities • The capacity of the n-edge between xi and xj changes from 5 to 2 [Figure: residual graph Gr with 5/2, 1/0, 3/2, 0/12, 2/0, 4/0]

  21. Update n-edge Capacities • Capacity changes from 5 to 2 • Edge capacity constraint violated! (the edge now shows 3/-1) [Figure: updated residual graph G` with 5/2, 1/0, 3/-1, 0/12, 2/0, 4/0]

  22. Update n-edge Capacities • Capacity changes from 5 to 2 • Edge capacity constraint violated! • Reduce the flow to satisfy the constraint [Figure: updated residual graph G` with 5/2, 1/0, 3/-1, 0/12, 2/0, 4/0]

  23. Update n-edge Capacities • Capacity changes from 5 to 2 • Edge capacity constraint violated! • Reducing the flow satisfies the constraint but causes a flow imbalance: an excess at xi and a deficiency at xj [Figure: updated residual graph G` with 5/2, 1/0, 2/0, 0/11, 2/0, 4/0]

  24. Update n-edge Capacities • Capacity changes from 5 to 2 • Edge capacity constraint violated! • Reducing the flow causes a flow imbalance (excess at xi, deficiency at xj) • Fix: push the excess flow to/from the terminals • Create the required capacity by adding α = excess to both t-edges [Figure: updated residual graph G` with 5/2, 1/0, 2/0, 0/11, 2/0, 4/0]

  25. Update n-edge Capacities • Capacity changes from 5 to 2, the constraint is violated, and the flow is reduced, causing an imbalance • The excess flow is pushed to the terminals by adding α = excess to both t-edges [Figure: updated residual graph G` with 5/3, 2/0, 2/0, 0/11, 3/0, 4/1]

  26. Update n-edge Capacities (same content as slide 25: the final state after the n-edge update)
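Slides 20-26 walk through the trickier n-edge case. The sketch below illustrates the idea under assumed data structures; the exact bookkeeping of which t-edges receive the extra capacity follows the slides' description only loosely, so treat it as an illustration rather than the authors' algorithm.

```python
# Hedged sketch of the n-edge update in slides 20-26 (assumed data layout).
# When the new pairwise capacity is below the current flow, the flow is
# clipped to the new capacity; this leaves an excess of flow at the tail node
# and a matching deficiency at the head node. Adding alpha = excess to both
# t-edges of the affected nodes is a reparametrization that creates the
# capacity needed to push the surplus to, and pull the deficit from, the
# terminals.

def update_n_edge(edge, new_cap, flow, t_caps):
    """edge = (i, j); flow[edge] is the current flow on the n-edge;
    t_caps[node] = {'source': cap, 'sink': cap} are t-edge capacities."""
    i, j = edge
    excess = flow[edge] - new_cap
    if excess > 0:                     # capacity constraint violated
        flow[edge] = new_cap           # reduce the flow to satisfy it
        for node in (i, j):            # absorb the imbalance via the t-edges
            t_caps[node]['source'] += excess
            t_caps[node]['sink'] += excess
    # A subsequent max-flow computation on the updated residual graph then
    # restores optimality while reusing all of the previously routed flow.
```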

  27. Complexity Analysis of MRF Update Operations [Table summarizing the cost of each update operation; footnote: * requires k edge update operations, where k is the degree of the node]

  28. Improving the Algorithm • Finding augmenting paths is time consuming • Dual-tree max-flow algorithm [Boykov & Kolmogorov, PAMI 2004]: reuses search trees after each augmentation; empirically shown to be substantially faster • Our idea: reuse the search trees from the previous graph cut computation • Saves the search-tree creation time [O(#edges)] • The search trees have to be modified to make them consistent with the new graph • Constrain the search for augmenting paths: new paths must contain at least one updated edge

  29. Reusing Search Trees • c' = a measure of the change in the energy • Running time: dynamic algorithm: c' + time to re-create the search trees; improved dynamic algorithm: c' • Video segmentation example: for duplicate image frames, no time is needed

  30. Dynamic Graph Cuts vs Active Cuts • Our method: flow recycling • Active Cuts (AC): cut recycling • Both methods: tree recycling

  31. Experimental Analysis [Figure: running time of the dynamic algorithm on an MRF consisting of 2×10^5 latent variables connected in a 4-neighborhood]

  32. Part II: SOCP for MRF

  33. Aim • Accurate MAP estimation of pairwise Markov random fields [Figure: example MRF over random variables V = {V1, ..., V4} with label set L = {-1, 1}; the numbers are the unary and pairwise costs for labels '1' and '-1'] • Labelling m = {1, -1, -1, 1}

  34.-40. Aim (continued) • [Same example MRF] The cost of the labelling m = {1, -1, -1, 1} is accumulated term by term over the unary and pairwise costs: Cost(m) = 2, then 2 + 1, 2 + 1 + 2, 2 + 1 + 2 + 1, 2 + 1 + 2 + 1 + 3, 2 + 1 + 2 + 1 + 3 + 1, and finally 2 + 1 + 2 + 1 + 3 + 1 + 3

  41. Aim (continued) • [Same example MRF] Cost(m) = 2 + 1 + 2 + 1 + 3 + 1 + 3 = 13 • Pr(m) ∝ exp(-Cost(m)) • The minimum cost labelling is the MAP estimate
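The cost in slides 33-41 is just the sum of unary and pairwise terms for the chosen labelling. A minimal Python sketch follows; the unary and pairwise cost tables are illustrative placeholders (the exact numbers in the slides' figure are not fully recoverable from the transcript), so the printed total will not reproduce the slides' value of 13.

```python
# Minimal sketch of the cost computed in slides 33-41: one unary cost per
# variable plus one pairwise cost per neighbouring pair. The cost tables are
# illustrative placeholders, not the exact numbers from the figure.
import math

unary = {                              # unary[variable][label]
    'V1': {-1: 5, 1: 2}, 'V2': {-1: 2, 1: 4},
    'V3': {-1: 2, 1: 6}, 'V4': {-1: 0, 1: 3},
}
pairwise = {                           # pairwise[(u, v)][(label_u, label_v)]
    ('V1', 'V2'): {(-1, -1): 0, (-1, 1): 1, (1, -1): 1, (1, 1): 3},
    ('V2', 'V3'): {(-1, -1): 1, (-1, 1): 4, (1, -1): 0, (1, 1): 2},
    ('V3', 'V4'): {(-1, -1): 1, (-1, 1): 3, (1, -1): 2, (1, 1): 0},
}

def cost(m):
    """Energy of a labelling m (dict: variable -> label in {-1, 1})."""
    c = sum(unary[v][m[v]] for v in m)
    c += sum(table[(m[u], m[v])] for (u, v), table in pairwise.items())
    return c

m = {'V1': 1, 'V2': -1, 'V3': -1, 'V4': 1}   # the labelling from the slides
print(cost(m), math.exp(-cost(m)))           # Pr(m) is proportional to exp(-Cost(m))
```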

  42. Aim • Accurate MAP estimation of pairwise Markov random fields [Same example MRF as before] • Objectives: applicable to all types of neighbourhood relationships; applicable to all forms of pairwise costs; guaranteed to converge (convex approximation)

  43. Motivation: Subgraph Matching (Torr 2003; Schellewald et al. 2005) [Figure: model graph G1 with vertices A-D, target graph G2, and the corresponding MRF over variables V1-V3 whose labels are the candidate matches A-D] • Unary costs are uniform

  44. Motivation: Subgraph Matching (Torr 2003; Schellewald et al. 2005) • Pairwise costs follow a Potts model: a configuration is valid if |d(mi, mj) - d(Vi, Vj)| < ε [Figure: graphs G1 and G2; valid ('YES') and invalid ('NO') configurations receive the two Potts costs shown as 1 and 2]
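As an illustration of the pairwise term on this slide, here is a hedged Python sketch; the threshold and the two cost values are assumptions, since the transcript only gives the form of the validity test.

```python
# Hedged sketch of the Potts-style pairwise cost described on this slide. The
# threshold eps and the two cost levels are assumptions for illustration; the
# slide only states that configurations satisfying |d(mi, mj) - d(Vi, Vj)| < eps
# get one cost and all others another.
EPS = 0.1            # assumed validity threshold
COST_VALID = 1.0     # assumed cost for a valid pairwise configuration
COST_INVALID = 2.0   # assumed cost for an invalid one

def potts_pairwise(d_labels, d_vars, eps=EPS):
    """d_labels = d(mi, mj): distance between the vertices assigned as labels;
    d_vars = d(Vi, Vj): distance between the corresponding variables' vertices."""
    return COST_VALID if abs(d_labels - d_vars) < eps else COST_INVALID
```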

  45. Motivation: Subgraph Matching (Torr 2003; Schellewald et al. 2005) [Figure: candidate matches of labels A-D to the MRF variables V1-V3]

  46. Motivation: Subgraph Matching (Torr 2003; Schellewald et al. 2005) [Figure: same as slide 45]

  47. Motivation: Matching Pictorial Structures (Felzenszwalb et al. 2001) [Figure: image and MRF over parts P1, P2, P3, each with pose parameters (x, y, θ, σ)] • The part likelihood combines outline and texture • A spatial prior links the parts

  48. Motivation: Matching Pictorial Structures (Felzenszwalb et al. 2001) • Unary potentials are negative log likelihoods • Pairwise costs: a Potts model over valid pairwise configurations (two cost levels, shown in the figure as 1 and 2, for valid vs. invalid) [Figure: image and MRF over parts P1, P2, P3]

  49. Motivation: Matching Pictorial Structures (Felzenszwalb et al. 2001) • Unary potentials are negative log likelihoods, e.g. from Pr(Cow) • Pairwise costs: Potts model over valid pairwise configurations [Figure: image, the Pr(Cow) likelihood, and the MRF over parts P1, P2, P3]
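Putting slides 48 and 49 together, a pictorial-structures energy could look like the sketch below. All concrete names, thresholds and cost levels are assumptions added for illustration; the slides only state that the unaries are negative log likelihoods and the pairwise term is a Potts model over valid configurations.

```python
# Hedged sketch combining the two ingredients in slides 48-49: unary potentials
# as negative log likelihoods (e.g. from a part detector such as Pr(Cow)) and a
# Potts-style spatial prior over valid pairwise part configurations. Every
# name, threshold and cost level here is an assumption for illustration; this
# is not the authors' formulation in detail.
import math

POTTS_VALID, POTTS_INVALID = 1.0, 2.0    # assumed Potts cost levels

def unary(likelihood):
    """Negative log likelihood of a part at a candidate pose."""
    return -math.log(max(likelihood, 1e-12))

def spatial_prior(pose_i, pose_j, is_valid):
    """Potts spatial prior: low cost if the relative pose of parts i and j is
    a valid configuration, high cost otherwise. `is_valid` is a predicate
    supplied by the model (assumed here)."""
    return POTTS_VALID if is_valid(pose_i, pose_j) else POTTS_INVALID

def energy(poses, likelihoods, edges, is_valid):
    """Total MRF energy for a candidate set of part poses."""
    e = sum(unary(likelihoods[p]) for p in poses)
    e += sum(spatial_prior(poses[i], poses[j], is_valid) for i, j in edges)
    return e
```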

  50. Outline • Integer Programming Formulation • Previous Work • Our Approach • Second Order Cone Programming (SOCP) • SOCP Relaxation • Robust Truncated Model • Applications • Subgraph Matching • Pictorial Structures
