
Discrete Optimization Lecture 4 – Part 1





  1. Discrete Optimization Lecture 4 – Part 1 M. Pawan Kumar pawan.kumar@ecp.fr Slides available online http://cvn.ecp.fr/personnel/pawan/

  2. Metric Labeling Variables V = {V1, V2, …, Vn}

  3. Metric Labeling Variables V = {V1, V2, …, Vn}

  4. Metric Labeling Variables V = {V1, V2, …, Vn} Labels L = {l1, l2, …, lh} Labeling f: {1, 2, …, n} → {1, 2, …, h} Va Vb Unary potential θa(f(a)) Pairwise potential wab d(f(a),f(b)), wab ≥ 0, d is a metric minf E(f) = Σa θa(f(a)) + Σ(a,b) wab d(f(a),f(b))

  5. Metric Labeling Va Vb minf E(f) = Σa θa(f(a)) + Σ(a,b) wab d(f(a),f(b)) NP-hard Low-level vision applications

  6. Outline • Approximate Algorithms • Comparison • Rounding-based Moves

  7. Move-Making Algorithms (Boykov, Veksler and Zabih): efficiency Convex Relaxations (Kleinberg and Tardos): accuracy

  8. Move-Making Algorithms (Kolmogorov and Boykov): efficiency Convex Relaxations (Chekuri, Khanna, Naor and Zosin): accuracy

  9. Outline • Approximate Algorithms • Move-Making Algorithms • Linear Programming Relaxation • Comparison • Rounding-based Moves

  10. Move-Making Algorithms Space of All Labelings f

  11. Expansion Algorithm Variables take label lα or retain current label Boykov, Veksler and Zabih, 2001 Slide courtesy Pushmeet Kohli

  12. Expansion Algorithm Variables take label lα or retain current label Labels: Tree, Ground, House, Sky Status: Initialize with Tree, Expand Ground, Expand House, Expand Sky Boykov, Veksler and Zabih, 2001 Slide courtesy Pushmeet Kohli
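A sketch of one expansion sweep. The inner minimization (every variable keeps its label or switches to α) is done by brute force here, so it only works on toy instances; Boykov, Veksler and Zabih solve that inner step exactly with a single graph cut. `energy` is an assumed callable returning E(f):

```python
import itertools

def expansion_sweep(f, labels, energy):
    """One sweep of the expansion algorithm: for each label alpha,
    take the best move in which every variable either keeps its
    current label or switches to alpha."""
    for alpha in labels:
        best, best_e = list(f), energy(f)
        # Enumerate all keep/switch-to-alpha combinations (toy-sized only).
        for mask in itertools.product([False, True], repeat=len(f)):
            g = [alpha if m else fa for m, fa in zip(mask, f)]
            e = energy(g)
            if e < best_e:
                best, best_e = g, e
        f = best
    return f
```

With a unary-only toy energy, a sweep over both labels reaches the optimum from any start.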

  13. Multiplicative Bounds f*: Optimal Labeling f: Estimated Labeling Σaθa(f(a)) + Σ(a,b)wabd(f(a),f(b)) ≥ Σaθa(f*(a)) + Σ(a,b)wabd(f*(a),f*(b))

  14. Multiplicative Bounds f*: Optimal Labeling f: Estimated Labeling Σaθa(f(a)) + Σ(a,b)wabd(f(a),f(b)) ≤ B Σaθa(f*(a)) + Σ(a,b)wabd(f*(a),f*(b)) Ask me the obvious question
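On instances small enough to enumerate, the multiplicative bound can be checked directly by computing the optimal labeling f* exhaustively. A sketch, with the same illustrative `theta`/`w`/`d` containers as before:

```python
import itertools

def brute_force_optimum(theta, w, d):
    """Find the optimal labeling f* and its energy by enumerating all
    h^n labelings, so E(f) <= B * E(f*) can be verified for any f."""
    n, h = len(theta), len(theta[0])
    def E(f):
        return (sum(theta[a][f[a]] for a in range(n))
                + sum(w_ab * d(f[a], f[b]) for (a, b), w_ab in w.items()))
    f_star = min(itertools.product(range(h), repeat=n), key=E)
    return list(f_star), E(f_star)
```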

  15. Outline • Approximate Algorithms • Move-Making Algorithms • Linear Programming Relaxation • Comparison • Rounding-based Moves

  16. Integer Linear Program Minimize a linear function over a set of feasible solutions Indicator xa(i) ∈ {0,1} for each variable Va and label li Indicator xab(i,k) ∈ {0,1} for each neighbor (Va,Vb) and labels li, lk Number of facets grows exponentially in problem size

  17. Linear Programming Relaxation Indicator xa(i) ∈ {0,1} for each variable Va and label li Indicator xab(i,k) ∈ {0,1} for each neighbor (Va,Vb) and labels li, lk Schlesinger, 1976; Chekuri et al., 2001; Wainwright et al., 2003

  18. Linear Programming Relaxation Indicator xa(i) ∈ [0,1] for each variable Va and label li Indicator xab(i,k) ∈ [0,1] for each neighbor (Va,Vb) and labels li, lk Schlesinger, 1976; Chekuri et al., 2001; Wainwright et al., 2003
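Written out, the relaxation minimizes the linear objective over the standard local consistency (marginalization) constraints from the cited works; the slides do not spell the constraints out, so this is a sketch in standard notation:

```latex
\min_{x \ge 0}\;
  \sum_{a}\sum_{i} \theta_a(i)\, x_a(i)
  \;+\; \sum_{(a,b)}\sum_{i,k} w_{ab}\, d(i,k)\, x_{ab}(i,k)
\quad\text{s.t.}\quad
\sum_{i} x_a(i) = 1 \;\;\forall\, a,
\qquad
\sum_{k} x_{ab}(i,k) = x_a(i) \;\;\forall\, (a,b),\, i .
```

Dropping the integrality constraints xa(i) ∈ {0,1} to xa(i) ∈ [0,1] is exactly what makes this a polynomial-time solvable LP.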

  19. Approximation Factor x*: LP Optimal Solution x: Estimated Integral Solution ΣaΣiθa(i)xa(i) + Σ(a,b)Σ(i,k) wabd(i,k)xab(i,k) ≥ ΣaΣiθa(i)x*a(i) + Σ(a,b)Σ(i,k) wabd(i,k)x*ab(i,k)

  20. Approximation Factor x*: LP Optimal Solution x: Estimated Integral Solution ΣaΣiθa(i)xa(i) + Σ(a,b)Σ(i,k) wabd(i,k)xab(i,k) ≤ F ΣaΣiθa(i)x*a(i) + Σ(a,b)Σ(i,k) wabd(i,k)x*ab(i,k)

  21. Outline • Approximate Algorithms • Comparison • Rounding-based Moves

  22. Theoretical Guarantees M = ratio of maximum and minimum non-zero distance
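The constant M used in the guarantees is simple to compute for any concrete distance; a sketch (`d` is a distance function on labels 0..h−1):

```python
def distance_ratio(d, h):
    """M = (maximum non-zero distance) / (minimum non-zero distance)
    over all pairs of the h labels."""
    vals = [d(i, k) for i in range(h) for k in range(h) if d(i, k) > 0]
    return max(vals) / min(vals)
```

For the truncated linear distance min(|i − k|, 2) on 4 labels, M = 2; for the plain linear distance, M = h − 1.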

  23. Outline • Approximate Algorithms • Comparison • Rounding-based Moves • Complete Rounding • Interval Rounding • Hierarchical Rounding

  24. Complete Rounding Treat xa(i) ∈ [0,1] as probability that f(a) = i Cumulative probability ya(i) = Σj≤i xa(j) (figure: number line 0, ya(1), ya(2), …, ya(i), …, ya(k), …, ya(h) = 1 with draw r) Generate a random number r ∈ (0,1] Assign the label li such that ya(i−1) < r ≤ ya(i)
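A sketch of complete rounding in Python. A single draw r is shared by all variables, matching the later remark that all labels are assigned in one iteration; the `min(..., len(y) - 1)` guard is only there because floating-point sums can fall just short of 1:

```python
import bisect
import itertools

def complete_rounding(x, r):
    """Round a fractional solution x (x[a][i] = probability that
    f(a) = i) with one draw r in (0,1]: each variable takes the label
    whose cumulative-probability interval (y_a(i-1), y_a(i)] contains r.
    Labels are 0-based here."""
    f = []
    for xa in x:
        y = list(itertools.accumulate(xa))   # y[i] = sum_{j <= i} xa[j]
        f.append(min(bisect.bisect_left(y, r), len(y) - 1))
    return f
```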

  25. Example (figure: cumulative distributions for three variables) ya: 0.25, 0.5, 0.75, 1.0 yb: 0.7, 0.8, 0.9, 1.0 yc: 0.1, 0.2, 0.3, 1.0 A single draw r selects one label per variable

  26. Complete Move A move that mimics complete rounding Considers all random variables and labels Assigns labels in one iteration

  27. Key Observation If d is submodular, i.e. d(i,k) + d(i+1,k+1) ≤ d(i,k+1) + d(i+1,k) for all i, k, the energy can be minimized via minimum cut Schlesinger and Flach, 2003
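The submodularity condition is easy to test numerically; a sketch (`d` is a distance function on labels 0..h−1):

```python
def is_submodular(d, h):
    """Check d(i,k) + d(i+1,k+1) <= d(i,k+1) + d(i+1,k)
    for all pairs of the h labels."""
    return all(d(i, k) + d(i + 1, k + 1) <= d(i, k + 1) + d(i + 1, k)
               for i in range(h - 1) for k in range(h - 1))
```

The linear distance |i − k| satisfies the condition, while the truncated linear distance min(|i − k|, T) violates it once h > T + 1, which is why the submodular overestimation d' on the following slides is needed.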

  28. Complete Move Va Vb θab(i,k) = wabd(i,k) NP-hard

  29. Complete Move Va Vb d’(i,k) ≥ d(i,k) d’ is submodular θab(i,k) = wabd’(i,k)

  30. Complete Move Va Vb d’(i,k) ≥ d(i,k) d’ is submodular θab(i,k) = wabd’(i,k)
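The two requirements on d' can be verified directly. A sketch; the truncated linear distance with its plain linear overestimate is an assumed example, since the slides do not fix a particular d:

```python
def check_overestimate(d, d_prime, h):
    """Verify that d' dominates d (d'(i,k) >= d(i,k) for all label
    pairs) and that d' is submodular
    (d'(i,k) + d'(i+1,k+1) <= d'(i,k+1) + d'(i+1,k))."""
    dominates = all(d_prime(i, k) >= d(i, k)
                    for i in range(h) for k in range(h))
    submod = all(d_prime(i, k) + d_prime(i + 1, k + 1)
                 <= d_prime(i, k + 1) + d_prime(i + 1, k)
                 for i in range(h - 1) for k in range(h - 1))
    return dominates and submod
```

For example, the linear distance |i − k| is a valid submodular overestimate of min(|i − k|, 2), whereas the truncated distance itself fails the submodularity half of the check.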

  31. Complete Move New problem can be solved using minimum cut Same multiplicative bound as complete rounding Multiplicative bound is tight

  32. Outline • Approximate Algorithms • Comparison • Rounding-based Moves • Complete Rounding • Interval Rounding • Hierarchical Rounding

  33. Interval Rounding Treat xa(i) ∈ [0,1] as probability that f(a) = i Cumulative probability ya(i) = Σj≤i xa(j) (figure: number line 0, ya(1), ya(2), …, ya(i), …, ya(k), …, ya(h) = 1) Choose an interval of length h’

  34. Interval Rounding Treat xa(i) ∈ [0,1] as probability that f(a) = i Cumulative probability ya(i) = Σj≤i xa(j) (figure: shifted number line from 0 to ya(k) − ya(i) with draw r) REPEAT Choose an interval of length h’ Generate a random number r ∈ (0,1] Assign the label whose shifted cumulative interval contains r, provided r falls within the chosen interval
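A sketch of interval rounding; labels are 0-based. For determinism the randomly chosen intervals and draws are passed in as a `moves` list of (start, length, r) triples, whereas the algorithm itself samples both inside the REPEAT loop until every variable is assigned:

```python
import itertools

def interval_rounding(x, moves):
    """Each pass considers labels start..start+length-1: an unassigned
    variable a takes label j in that interval if r lands in
    (y_a(j) - y_a(start), y_a(j+1) - y_a(start)] on the shifted
    cumulative axis; otherwise it waits for a later pass."""
    n, h = len(x), len(x[0])
    f = [None] * n
    for start, length, r in moves:
        for a in range(n):
            if f[a] is not None:
                continue
            y = [0.0] + list(itertools.accumulate(x[a]))
            for j in range(start, min(start + length, h)):
                if y[j] - y[start] < r <= y[j + 1] - y[start]:
                    f[a] = j
                    break
    return f
```

In the two-pass run below, the first variable is caught by the first interval and the second only by the second.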

  35. Example (figure: the cumulative distributions of slide 25) ya: 0.25, 0.5, 0.75, 1.0 yb: 0.7, 0.8, 0.9, 1.0 yc: 0.1, 0.2, 0.3, 1.0

  36. Example (figure: interval over labels l1, l2, with draw r) ya: 0.25, 0.5 yb: 0.7, 0.8 yc: 0.1, 0.2

  37. Example (figure: the cumulative distributions of slide 25) ya: 0.25, 0.5, 0.75, 1.0 yb: 0.7, 0.8, 0.9, 1.0 yc: 0.1, 0.2, 0.3, 1.0

  38. Example (figure: cumulative distribution of the remaining unassigned variable) yc: 0.1, 0.2, 0.3, 1.0

  39. Example (figure: interval over labels l2, l3, cumulative probabilities shifted by −yc(1), with draw r) yc(2) − yc(1) = 0.1, yc(3) − yc(1) = 0.2

  40. Example (figure: the cumulative distributions of slide 25) ya: 0.25, 0.5, 0.75, 1.0 yb: 0.7, 0.8, 0.9, 1.0 yc: 0.1, 0.2, 0.3, 1.0

  41. Interval Move A move that mimics interval rounding Considers all variables and an interval of labels Changes labeling iteratively

  42. Key Observation If d is submodular, i.e. d(i,k) + d(i+1,k+1) ≤ d(i,k+1) + d(i+1,k) for all i, k, the energy can be minimized via minimum cut Schlesinger and Flach, 2003

  43. Interval Move Choose an interval of length h’ Va Vb θab(i,k) = wabd(i,k)

  44. Interval Move Choose an interval of length h’ Add the current labels Va Vb θab(i,k) = wabd(i,k)

  45. Interval Move Choose an interval of length h’ Add the current labels Va Vb θab(i,k) = wabd’(i,k), where d’(i,k) ≥ d(i,k) and d’ is submodular Solve to update labels Repeat until convergence

  46. Interval Move Each problem can be solved using minimum cut Same multiplicative bound as interval rounding Multiplicative bound is tight

  47. Move-Making Algorithms (Boykov, Veksler and Zabih): length of interval = 1 Convex Relaxations (Kleinberg and Tardos): length of interval = 1

  48. Move-Making Algorithms (Boykov, Veksler and Zabih): length of interval = 1 Convex Relaxations (Chekuri, Khanna, Naor and Zosin): optimal interval length

  49. Theoretical Guarantees M = ratio of maximum and minimum non-zero distance
