
Solving Markov Random Fields using Second Order Cone Programming Relaxations


Presentation Transcript


  1. Solving Markov Random Fields using Second Order Cone Programming Relaxations. M. Pawan Kumar, Philip Torr, Andrew Zisserman

  2. Aim • Accurate MAP estimation of pairwise Markov random fields. Random Variables V = {V1,..,V4}; Label Set L = {0,1}; Labelling m = {1, 0, 0, 1}. [Figure: four-node MRF over V1..V4 with unary costs for labels '0' and '1' and pairwise costs on the edges]

  3. Aim • Accurate MAP estimation of pairwise Markov random fields. [MRF figure as above] Cost(m) = 2

  4. Aim • Accurate MAP estimation of pairwise Markov random fields. [MRF figure as above] Cost(m) = 2 + 1

  5. Aim • Accurate MAP estimation of pairwise Markov random fields. [MRF figure as above] Cost(m) = 2 + 1 + 2

  6. Aim • Accurate MAP estimation of pairwise Markov random fields. [MRF figure as above] Cost(m) = 2 + 1 + 2 + 1

  7. Aim • Accurate MAP estimation of pairwise Markov random fields. [MRF figure as above] Cost(m) = 2 + 1 + 2 + 1 + 3

  8. Aim • Accurate MAP estimation of pairwise Markov random fields. [MRF figure as above] Cost(m) = 2 + 1 + 2 + 1 + 3 + 1

  9. Aim • Accurate MAP estimation of pairwise Markov random fields. [MRF figure as above] Cost(m) = 2 + 1 + 2 + 1 + 3 + 1 + 3

  10. Aim • Accurate MAP estimation of pairwise Markov random fields. [MRF figure as above] Cost(m) = 2 + 1 + 2 + 1 + 3 + 1 + 3 = 13. Pr(m) ∝ exp(-Cost(m)), so the Minimum Cost Labelling is the MAP estimate.
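Since Pr(m) ∝ exp(-Cost(m)), MAP estimation amounts to finding the minimum-cost labelling. As a minimal illustration (not the authors' method), the Python sketch below evaluates Cost(m) on a tiny four-variable MRF and brute-forces the MAP labelling; the unary and pairwise values are illustrative placeholders rather than the exact numbers from the slide's figure.

```python
from itertools import product

# Toy pairwise MRF: 4 variables, labels {0, 1}.
# Illustrative costs (NOT the exact values from the slides' figure).
unary = [
    {0: 5, 1: 2},   # V1
    {0: 2, 1: 4},   # V2
    {0: 1, 1: 3},   # V3
    {0: 6, 1: 0},   # V4
]
# Pairwise costs on the edges of a chain V1-V2-V3-V4.
edges = {
    (0, 1): {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 3},
    (1, 2): {(0, 0): 0, (0, 1): 2, (1, 0): 2, (1, 1): 0},
    (2, 3): {(0, 0): 1, (0, 1): 0, (1, 0): 3, (1, 1): 1},
}

def cost(m):
    """Sum of unary costs plus pairwise costs along the edges."""
    c = sum(unary[i][m[i]] for i in range(len(m)))
    c += sum(p[(m[i], m[j])] for (i, j), p in edges.items())
    return c

# Brute-force MAP estimate = minimum-cost labelling (only feasible for tiny MRFs).
best = min(product([0, 1], repeat=4), key=cost)
print("MAP labelling:", best, "with cost", cost(best))
```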

  11. Aim • Accurate MAP estimation of pairwise Markov random fields. [MRF figure as above] Objectives: • Applicable for all neighbourhood relationships • Applicable for all forms of pairwise costs • Guaranteed to converge

  12. Motivation: Subgraph Matching - Torr, 2003; Schellewald et al., 2005. Unary costs are uniform. [Figure: model graph G1 with nodes A, B, C, D; graph G2; and the MRF over V1, V2, V3 whose label set is {A, B, C, D}]

  13. Motivation: Subgraph Matching - Torr, 2003; Schellewald et al., 2005. Pairwise Costs: a pair of assignments is valid if |d(mi, mj) - d(Vi, Vj)| < ε; the pairwise cost is a Potts model over valid (YES) and invalid (NO) configurations. [Figure: graphs G1 and G2 with the Potts cost table]
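To make the pairwise cost concrete, here is a hedged Python sketch (not from the paper): it treats a pair of putative matches as valid when the distance between the matched nodes of G1 is preserved, within a tolerance ε, by the corresponding nodes of G2, and charges a Potts-style cost. The point coordinates, the tolerance eps and the two cost values are hypothetical.

```python
import math

def dist(p, q):
    """Euclidean distance between two 2D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pairwise_cost(mi, mj, vi, vj, pts_g1, pts_g2, eps=0.5,
                  cost_valid=1.0, cost_invalid=2.0):
    """Potts-style pairwise cost for subgraph matching (illustrative values).

    (mi, mj): candidate labels, i.e. nodes of G1 assigned to the variables.
    (vi, vj): the MRF variables, i.e. nodes of G2 being matched.
    The configuration is valid if |d(mi, mj) - d(Vi, Vj)| < eps.
    """
    valid = abs(dist(pts_g1[mi], pts_g1[mj]) - dist(pts_g2[vi], pts_g2[vj])) < eps
    return cost_valid if valid else cost_invalid

# Hypothetical point positions for the nodes of G1 and G2.
pts_g1 = {"A": (0, 0), "B": (1, 0), "C": (1, 1), "D": (0, 1)}
pts_g2 = {"V1": (2, 2), "V2": (3, 2), "V3": (3, 3)}
print(pairwise_cost("A", "B", "V1", "V2", pts_g1, pts_g2))
```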

  14. Motivation: Subgraph Matching - Torr, 2003; Schellewald et al., 2005. [Figure: candidate matching of nodes A, B, C, D in G1 to the MRF variables V1, V2, V3]

  15. Motivation: Subgraph Matching - Torr, 2003; Schellewald et al., 2005. [Figure: another candidate matching of A, B, C, D to V1, V2, V3]

  16. Motivation: Matching Pictorial Structures - Felzenszwalb et al., 2001. Each part Pi is parameterised by its pose (x, y, θ, σ); outline and texture give the part likelihood, and the MRF over P1, P2, P3 encodes the spatial prior. [Figure: image with parts P1, P2, P3 and the corresponding MRF]

  17. Motivation: Matching Pictorial Structures - Felzenszwalb et al., 2001. • Unary potentials are negative log likelihoods • Pairwise costs follow a Potts model over valid (YES) and invalid (NO) pairwise configurations. [Figure: parts P1, P2, P3 with pose (x, y, θ, σ), the image and the MRF]

  18. Motivation: Matching Pictorial Structures - Felzenszwalb et al., 2001. • Unary potentials are negative log likelihoods • Pairwise costs follow a Potts model over valid (YES) and invalid (NO) pairwise configurations. [Figure: parts P1, P2, P3 matched to the image, giving Pr(Cow)]

  19. Outline • Integer Programming Formulation • Previous Work • Our Approach • Second Order Cone Programming (SOCP) • SOCP Relaxation • Robust Truncated Model • Applications • Subgraph Matching • Pictorial Structures

  20. Integer Programming Formulation. Two variables V1, V2 with labels {0, 1}; Labelling m = {1, 0}. Unary Cost Vector u = [5 ; 2 ; 2 ; 4]^T, one entry per (variable, label) pair: the first entry is the cost of V1 = 0, the second the cost of V1 = 1, and so on. [Figure: two-node MRF with unary and pairwise costs]

  21. Integer Programming Formulation. Labelling m = {1, 0}; Unary Cost Vector u = [5 ; 2 ; 2 ; 4]^T. Label vector x = [-1 1 ; 1 -1]^T: the entry for a (variable, label) pair is 1 if the variable takes that label (e.g. V1 = 1) and -1 otherwise (e.g. V1 ≠ 0). Recall that the aim is to find the optimal x. [Figure: two-node MRF as above]

  22. Integer Programming Formulation. Labelling m = {1, 0}; u = [5 ; 2 ; 2 ; 4]^T; x = [-1 1 ; 1 -1]^T. Sum of Unary Costs = (1/2) ∑i ui (1 + xi). [Figure: two-node MRF as above]
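To see why this works: (1 + xi)/2 equals 1 exactly when xi = 1 (the label is taken) and 0 when xi = -1, so only the chosen labels' unary costs survive. A quick check in Python, using the u and x reconstructed above:

```python
# u and x as on the slide: entries ordered (V1=0, V1=1, V2=0, V2=1).
u = [5, 2, 2, 4]
x = [-1, 1, 1, -1]          # labelling m = {1, 0}

unary_sum = 0.5 * sum(ui * (1 + xi) for ui, xi in zip(u, x))
print(unary_sum)             # 4.0 = cost(V1=1) + cost(V2=0) = 2 + 2
```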

  23. Integer Programming Formulation. Pairwise Cost Matrix P: one row and one column per (variable, label) pair; each entry is the cost of the corresponding pair of assignments, e.g. the cost of V1 = 0 and V2 = 0, or of V1 = 0 and V2 = 1; entries for pairs within the same variable (e.g. V1 = 0 and V1 = 0) are 0. Labelling m = {1, 0}. [Figure: 4 x 4 pairwise cost matrix for the two-node MRF]

  24. Integer Programming Formulation. Labelling m = {1, 0}. Sum of Pairwise Costs = (1/4) ∑ij Pij (1 + xi)(1 + xj). [Figure: pairwise cost matrix P as above]

  25. Integer Programming Formulation. Labelling m = {1, 0}. Sum of Pairwise Costs = (1/4) ∑ij Pij (1 + xi + xj + xixj) = (1/4) ∑ij Pij (1 + xi + xj + Xij), where X = x x^T, i.e. Xij = xi xj. [Figure: pairwise cost matrix P as above]
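A small numerical check of the same expansion, with an illustrative (placeholder) pairwise cost matrix P over the four (variable, label) pairs:

```python
import numpy as np

# Illustrative 4x4 pairwise cost matrix over (V1=0, V1=1, V2=0, V2=1);
# within-variable blocks are zero. The nonzero values are placeholders.
P = np.array([[0, 0, 0, 1],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
x = np.array([-1, 1, 1, -1], dtype=float)   # labelling m = {1, 0}
X = np.outer(x, x)                          # X = x x^T, so X_ij = x_i x_j

direct   = 0.25 * sum(P[i, j] * (1 + x[i]) * (1 + x[j])
                      for i in range(4) for j in range(4))
expanded = 0.25 * sum(P[i, j] * (1 + x[i] + x[j] + X[i, j])
                      for i in range(4) for j in range(4))
assert np.isclose(direct, expanded)
print(direct, expanded)
```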

  26. Integer Programming Formulation: Constraints • Each variable should be assigned a unique label: ∑_{i ∈ Va} xi = 2 - |L| • Marginalization constraint: ∑_{j ∈ Vb} Xij = (2 - |L|) xi

  27. Integer Programming Formulation (Chekuri et al., SODA 2001): x* = argmin (1/4) ∑ Pij (1 + xi + xj + Xij) + (1/2) ∑ ui (1 + xi), subject to ∑_{i ∈ Va} xi = 2 - |L| and ∑_{j ∈ Vb} Xij = (2 - |L|) xi (convex), and xi ∈ {-1,1}, X = x x^T (non-convex).
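Putting the pieces together, the integer program described on this slide can be written out as follows (a plain restatement of the slide's formulation in LaTeX):

```latex
\begin{align*}
x^{*} = \arg\min_{x,\,X}\;&
  \frac{1}{4}\sum_{ij} P_{ij}\,(1 + x_i + x_j + X_{ij})
  + \frac{1}{2}\sum_{i} u_i\,(1 + x_i) \\
\text{s.t.}\;&
  \sum_{i \in V_a} x_i = 2 - |L|, \qquad
  \sum_{j \in V_b} X_{ij} = (2 - |L|)\,x_i, \\
& x_i \in \{-1, 1\}, \qquad X = x\,x^{\top}.
\end{align*}
```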

  28. Outline • Integer Programming Formulation • Previous Work • Our Approach • Second Order Cone Programming (SOCP) • SOCP Relaxation • Robust Truncated Model • Applications • Subgraph Matching • Pictorial Structures

  29. Linear Programming Formulation (Chekuri et al., SODA 2001). Retain the convex part: ∑_{i ∈ Va} xi = 2 - |L| and ∑_{j ∈ Vb} Xij = (2 - |L|) xi. Relax the non-convex constraints: xi ∈ {-1,1} and X = x x^T. Objective: x* = argmin (1/4) ∑ Pij (1 + xi + xj + Xij) + (1/2) ∑ ui (1 + xi).

  30. Linear Programming Formulation (Chekuri et al., SODA 2001). xi ∈ {-1,1} is relaxed to xi ∈ [-1,1]; the non-convex constraint X = x x^T remains to be relaxed. Objective as above.

  31. Linear Programming Formulation (Chekuri et al., SODA 2001): x* = argmin (1/4) ∑ Pij (1 + xi + xj + Xij) + (1/2) ∑ ui (1 + xi), subject to ∑_{i ∈ Va} xi = 2 - |L|, ∑_{j ∈ Vb} Xij = (2 - |L|) xi, xi ∈ [-1,1]; the constraint X = x x^T is dropped altogether.
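A minimal sketch of this LP relaxation for the two-variable example, written with cvxpy (assuming cvxpy and one of its bundled LP solvers are installed). The unary vector follows the reconstruction above, while the pairwise matrix P is an illustrative placeholder; the constant terms are kept so the optimal value is a bound on the integer cost.

```python
import cvxpy as cp
import numpy as np

# Two variables, two labels; indices ordered (V1=0, V1=1, V2=0, V2=1).
u = np.array([5, 2, 2, 4], dtype=float)
P = np.array([[0, 0, 0, 1],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)   # placeholder pairwise costs
num_labels = 2

x = cp.Variable(4)
X = cp.Variable((4, 4), symmetric=True)

cost = 0.5 * cp.sum(cp.multiply(u, 1 + x)) + 0.25 * sum(
    P[i, j] * (1 + x[i] + x[j] + X[i, j]) for i in range(4) for j in range(4))

constraints = [
    x >= -1, x <= 1, X >= -1, X <= 1,      # relaxed {-1,1} variables
    x[0] + x[1] == 2 - num_labels,         # V1 takes exactly one label
    x[2] + x[3] == 2 - num_labels,         # V2 takes exactly one label
    # marginalisation: sum over the labels of the neighbouring variable
    X[0, 2] + X[0, 3] == (2 - num_labels) * x[0],
    X[1, 2] + X[1, 3] == (2 - num_labels) * x[1],
    X[2, 0] + X[2, 1] == (2 - num_labels) * x[2],
    X[3, 0] + X[3, 1] == (2 - num_labels) * x[3],
]

cp.Problem(cp.Minimize(cost), constraints).solve()
print("LP bound:", cost.value, "x:", np.round(x.value, 3))
```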

  32. Linear Programming Formulation. Feasible Region (IP): x ∈ {-1,1}, X = x²

  33. Linear Programming Formulation. Feasible Region (IP): x ∈ {-1,1}, X = x². Feasible Region (Relaxation 1): x ∈ [-1,1], X = x²

  34. Linear Programming Formulation. Feasible Region (IP): x ∈ {-1,1}, X = x². Feasible Region (Relaxation 1): x ∈ [-1,1], X = x². Feasible Region (Relaxation 2): x ∈ [-1,1]

  35. Linear Programming Formulation • Bounded algorithms proposed by Chekuri et al., SODA 2001 • α-expansion - Komodakis and Tziritas, ICCV 2005 • TRW - Wainwright et al., NIPS 2002 • TRW-S - Kolmogorov, AISTATS 2005 • Efficient because it uses Linear Programming • Not accurate

  36. Semidefinite Programming Formulation (Lovasz and Schrijver, SIAM Optimization, 1990). Retain the convex part: ∑_{i ∈ Va} xi = 2 - |L| and ∑_{j ∈ Vb} Xij = (2 - |L|) xi. Relax the non-convex constraints: xi ∈ {-1,1} and X = x x^T. Objective: x* = argmin (1/4) ∑ Pij (1 + xi + xj + Xij) + (1/2) ∑ ui (1 + xi).

  37. Semidefinite Programming Formulation (Lovasz and Schrijver, SIAM Optimization, 1990). xi ∈ {-1,1} is relaxed to xi ∈ [-1,1]; the non-convex constraint X = x x^T remains to be relaxed. Objective as above.

  38. Semidefinite Programming Formulation. X = x x^T is equivalent to requiring that the matrix [1 x^T ; x X], whose rows and columns are indexed by 1, x1, x2, ..., xn, satisfies: Xii = 1 (convex), positive semidefinite (convex), and Rank = 1 (non-convex).

  39. Semidefinite Programming Formulation. Drop the rank-1 condition and keep the convex conditions: Xii = 1 and [1 x^T ; x X] positive semidefinite.

  40. Schur's Complement: [A B ; B^T C] = [I 0 ; B^T A^-1 I] [A 0 ; 0 C - B^T A^-1 B] [I A^-1 B ; 0 I]. Hence, for A ≻ 0: [A B ; B^T C] ⪰ 0 ⟺ C - B^T A^-1 B ⪰ 0.
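A quick numerical sanity check of this factorisation and of the resulting equivalence, on random matrices (numpy only; the sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2
A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)   # A positive definite
B = rng.standard_normal((n, m))
C = rng.standard_normal((m, m)); C = C @ C.T                    # symmetric C

M = np.block([[A, B], [B.T, C]])
S = C - B.T @ np.linalg.inv(A) @ B                               # Schur complement

# Factorisation: M = L diag(A, S) L^T with L = [[I, 0], [B^T A^{-1}, I]]
L = np.block([[np.eye(n), np.zeros((n, m))],
              [B.T @ np.linalg.inv(A), np.eye(m)]])
D = np.block([[A, np.zeros((n, m))], [np.zeros((m, n)), S]])
assert np.allclose(M, L @ D @ L.T)

# With A > 0: M is PSD if and only if the Schur complement S is PSD.
def is_psd(Z, tol=1e-9):
    return np.all(np.linalg.eigvalsh((Z + Z.T) / 2) >= -tol)

print(is_psd(M), is_psd(S))   # the two flags agree
```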

  41. Semidefinite Programming Formulation. [1 x^T ; x X] = [1 0 ; x I] [1 0 ; 0 X - x x^T] [1 x^T ; 0 I]. By Schur's complement, [1 x^T ; x X] ⪰ 0 ⟺ X - x x^T ⪰ 0.

  42. Semidefinite Programming Formulation (Lovasz and Schrijver, SIAM Optimization, 1990). With xi ∈ [-1,1], relax the remaining non-convex constraint X = x x^T. Objective: x* = argmin (1/4) ∑ Pij (1 + xi + xj + Xij) + (1/2) ∑ ui (1 + xi).

  43. Semidefinite Programming Formulation (Lovasz and Schrijver, SIAM Optimization, 1990): x* = argmin (1/4) ∑ Pij (1 + xi + xj + Xij) + (1/2) ∑ ui (1 + xi), subject to ∑_{i ∈ Va} xi = 2 - |L|, ∑_{j ∈ Vb} Xij = (2 - |L|) xi, xi ∈ [-1,1], Xii = 1, X - x x^T ⪰ 0.
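A compact cvxpy sketch of this SDP relaxation for the same two-variable example (again assuming cvxpy with a bundled SDP-capable solver, and using a placeholder P). Rather than writing X - x x^T ⪰ 0 directly, the sketch declares the full matrix M = [1 x^T ; x X] as a positive semidefinite variable, which enforces the same condition via the Schur complement.

```python
import cvxpy as cp
import numpy as np

u = np.array([5, 2, 2, 4], dtype=float)
P = np.array([[0, 0, 0, 1],
              [0, 0, 1, 0],
              [0, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)   # placeholder pairwise costs
num_labels = 2

# M = [[1, x^T], [x, X]]; requiring M to be PSD enforces X - x x^T >= 0
# (Schur complement), which is the SDP relaxation of X = x x^T.
M = cp.Variable((5, 5), PSD=True)
x = M[0, 1:]
X = M[1:, 1:]

cost = 0.5 * cp.sum(cp.multiply(u, 1 + x)) + 0.25 * sum(
    P[i, j] * (1 + x[i] + x[j] + X[i, j]) for i in range(4) for j in range(4))

constraints = [
    M[0, 0] == 1,
    cp.diag(X) == 1,                       # X_ii = 1 (implies x_i in [-1, 1])
    x[0] + x[1] == 2 - num_labels,         # uniqueness for V1
    x[2] + x[3] == 2 - num_labels,         # uniqueness for V2
    X[0, 2] + X[0, 3] == (2 - num_labels) * x[0],   # marginalisation
    X[1, 2] + X[1, 3] == (2 - num_labels) * x[1],
    X[2, 0] + X[2, 1] == (2 - num_labels) * x[2],
    X[3, 0] + X[3, 1] == (2 - num_labels) * x[3],
]

cp.Problem(cp.Minimize(cost), constraints).solve()
print("SDP bound:", cost.value)
```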

  44. Semidefinite Programming Formulation. Feasible Region (IP): x ∈ {-1,1}, X = x²

  45. Semidefinite Programming Formulation. Feasible Region (IP): x ∈ {-1,1}, X = x². Feasible Region (Relaxation 1): x ∈ [-1,1], X = x²

  46. Semidefinite Programming Formulation. Feasible Region (IP): x ∈ {-1,1}, X = x². Feasible Region (Relaxation 1): x ∈ [-1,1], X = x². Feasible Region (Relaxation 2): x ∈ [-1,1], X ≥ x²

  47. Semidefinite Programming Formulation • Formulated by Lovasz and Schrijver, 1990 • Finds a full X matrix • Max-cut - Goemans and Williamson, JACM 1995 • Max-k-cut - de Klerk et al., 2000 • Accurate • Not efficient because of Semidefinite Programming

  48. Previous Work - Overview Is there a Middle Path ???

  49. Outline • Integer Programming Formulation • Previous Work • Our Approach • Second Order Cone Programming (SOCP) • SOCP Relaxation • Robust Truncated Model • Applications • Subgraph Matching • Pictorial Structures

  50. Second Order Cone Programming. Second Order Cone: the set of (v, t) with || v || ≤ t (e.g. x² + y² ≤ z²), OR, in rotated form, || v ||² ≤ st with s, t ≥ 0.
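For concreteness, membership in these cones is easy to check numerically; the tiny numpy sketch below verifies the two forms on arbitrary example values (the specific numbers are just for illustration):

```python
import numpy as np

def in_second_order_cone(v, t):
    """True if (v, t) lies in the second order cone ||v|| <= t."""
    return np.linalg.norm(v) <= t

def in_rotated_cone(v, s, t):
    """True if (v, s, t) lies in the rotated cone ||v||^2 <= s*t, s,t >= 0."""
    return s >= 0 and t >= 0 and np.dot(v, v) <= s * t

print(in_second_order_cone(np.array([3.0, 4.0]), 5.0))       # True: 5 <= 5
print(in_rotated_cone(np.array([1.0, 2.0]), 2.0, 3.0))       # True: 5 <= 6
```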
