Combinatorial Optimization and Computer Vision

Philip Torr

Presentation Transcript
Story
  • How an attempt to solve one problem led into many different areas of computer vision and produced some interesting results.
Aim
  • Given an image, to segment the object.

[Figure: an object category model applied to a cow image produces a segmented cow.]

  • Segmentation should (ideally) be:
  • shaped like the object, e.g. cow-like
  • obtained efficiently in an unsupervised manner
  • able to handle self-occlusion
Challenges
  • Shape variability
  • Appearance variability
  • Self-occlusion

Motivation
  • Current methods require user intervention:
  • Object and background seed pixels (Boykov and Jolly, ICCV 01)
  • Bounding box of object (Rother et al., SIGGRAPH 04)

[Figure: a cow image marked with object and background seed pixels, and the resulting segmented image.]
Motivation
  • Problems:
  • Manually intensive
  • Segmentation is not guaranteed to be 'object-like'

[Figure: a non-object-like segmentation.]

MRF for Image Segmentation

Boykov and Jolly [ICCV 2001]

EnergyMRF = Unary likelihood + Contrast term + Pair-wise terms (Potts model)

Maximum a posteriori (MAP) solution: x* = arg min_x E(x)

[Figure: the data D, the unary likelihood, the pair-wise terms, and the resulting MAP solution on the cow image.]
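To make the energy concrete, here is a minimal sketch of evaluating it for a given labelling, assuming a grayscale image on a 4-connected grid; the weights `lam` and `sigma` are illustrative, not values from the talk:

```python
# A minimal sketch of the MRF energy above, assuming a grayscale image
# on a 4-connected grid; lam and sigma are illustrative, not from the talk.
import numpy as np

def mrf_energy(labels, unary, image, lam=1.0, sigma=5.0):
    """E(x) = sum_i phi(D|x_i) + sum_(i,j) [contrast-weighted Potts]."""
    H, W = labels.shape
    # Unary likelihood: unary[y, x, l] = -log P(pixel (y, x) | label l).
    e = unary[np.arange(H)[:, None], np.arange(W)[None, :], labels].sum()
    for dy, dx in [(0, 1), (1, 0)]:  # horizontal and vertical neighbours
        a, b = labels[: H - dy, : W - dx], labels[dy:, dx:]
        diff = image[: H - dy, : W - dx] - image[dy:, dx:]
        # Potts penalty for a label discontinuity, reduced where the image
        # contrast is strong (a likely object boundary).
        e += (lam * np.exp(-(diff ** 2) / (2 * sigma ** 2)) * (a != b)).sum()
    return e
```

The MAP labelling x* minimizes this energy; the next slides show how the minimization is done exactly with a graph cut.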

GraphCut for Inference

[Figure: the image pixels sit between a source terminal (foreground) and a sink terminal (background); a cut separates the two.]

Cut: a collection of edges which separates the Source from the Sink.

MinCut: the cut with minimum weight (sum of edge weights).

Solution: the global optimum (MinCut) is found in polynomial time.

Energy Minimization using Graph Cuts

Graph construction for Boolean random variables, built up term by term:

EMRF(a1,a2) = 2a1 + 5ā1 + 9a2 + 4ā2 + 2a1ā2 + ā1a2

[Figure: nodes a1 and a2 between Source (0) and Sink (1). The unary terms become t-edges: capacities 2 and 9 on the Source edges to a1 and a2 (for a1 and a2), and capacities 5 and 4 on their Sink edges (for ā1 and ā2). The pair-wise terms become n-edges: capacity 2 on a2 → a1 (for 2a1ā2) and capacity 1 on a1 → a2 (for ā1a2).]


Energy Minimization using Graph Cuts

EMRF(a1,a2) = 2a1 + 5ā1 + 9a2 + 4ā2 + 2a1ā2 + ā1a2

[Figure: the st-cut corresponding to a1 = 1, a2 = 1 severs the t-edges of weight 2 and 9.]

Cost of st-cut = 11, so EMRF(1,1) = 11.

Energy Minimization using Graph Cuts

EMRF(a1,a2) = 2a1 + 5ā1 + 9a2 + 4ā2 + 2a1ā2 + ā1a2

[Figure: the st-cut corresponding to a1 = 1, a2 = 0 severs the edges of weight 2, 4 and 2.]

Cost of st-cut = 8, so EMRF(1,0) = 8. This is the minimum over the four labellings, so the st-mincut gives the MAP solution.
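A quick sketch of this two-variable construction, using networkx purely as a convenience (the talk does not prescribe a library):

```python
# The two-variable construction above, checked with networkx's min-cut
# (the library choice is an assumption of convenience).
import networkx as nx

G = nx.DiGraph()
# t-edges: coefficient of a_i on Source->a_i, of a-bar_i on a_i->Sink.
G.add_edge("s", "a1", capacity=2)
G.add_edge("a1", "t", capacity=5)
G.add_edge("s", "a2", capacity=9)
G.add_edge("a2", "t", capacity=4)
# n-edges for the pair-wise terms 2*a1*abar2 and abar1*a2.
G.add_edge("a2", "a1", capacity=2)
G.add_edge("a1", "a2", capacity=1)

cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
print(cut_value)           # 8  == EMRF(1, 0)
print(sink_side - {"t"})   # {'a1'}: a1 takes label 1, a2 takes label 0
```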

Computing the st-mincut from Max-flow Algorithms

  • The max-flow problem:
  • Edge capacity and flow balance constraints
  • Notation: residual capacity = edge capacity − current flow
  • Simple augmenting-path based algorithms:
  • Repeatedly find augmenting paths and push flow.
  • Saturated edges constitute the st-mincut [Ford-Fulkerson theorem].

[Figure: the same two-variable graph with its edge capacities.]

Minimum s-t Cut Algorithms
  • Augmenting paths [Ford & Fulkerson, 1962]
  • Push-relabel [Goldberg & Tarjan, 1986]
Augmenting Paths

[Figure: a graph with two terminals, source S and sink T.]

  • Find a path from S to T along non-saturated edges.
  • Increase flow along this path until some edge saturates.
  • Find the next path, increase the flow again...
  • Iterate until all paths from S to T have at least one saturated edge.

The saturated edges then form the MIN CUT, and the total flow pushed is the MAX FLOW. A sketch of this loop follows below.
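Here is a compact sketch of the augmenting-path loop, using breadth-first search for the next path (i.e. Edmonds-Karp); written for illustration, not efficiency:

```python
# An augmenting-path (Edmonds-Karp) sketch: BFS for a non-saturated
# s-t path, push the bottleneck flow, repeat until no path remains.
from collections import deque

def max_flow(graph, s, t):
    """graph: dict u -> {v: residual capacity}; mutated in place."""
    flow = 0
    while True:
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:    # BFS along non-saturated edges
            u = queue.popleft()
            for v, cap in graph.get(u, {}).items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow                     # every s-t path is saturated
        path, v = [], t                     # recover the augmenting path
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(graph[u][v] for u, v in path)   # bottleneck capacity
        for u, v in path:
            graph[u][v] -= push                    # use up forward capacity
            rev = graph.setdefault(v, {})
            rev[u] = rev.get(u, 0) + push          # grow the reverse residual
        flow += push

# The two-variable graph from the earlier slides:
g = {"s": {"a1": 2, "a2": 9}, "a1": {"t": 5, "a2": 1}, "a2": {"t": 4, "a1": 2}}
print(max_flow(g, "s", "t"))  # 8
```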

MRF, Graphical Model

  • The probability of a labelling consists of:
  • Likelihood: a unary potential Φx(D|mx) based on the colour of the pixel
  • Prior: Ψxy(mx,my), which favours the same label for neighbours (pairwise potentials)

[Figure: the labels m sit above the image plane of pixels D; each label mx is tied to its pixel x by the unary potential and to its neighbour my by the prior.]

Example

[Figure: cow image with object and background seed pixels. The seeds define the colour likelihood ratio Φx(D|obj) / Φx(D|bkg); the prior Ψxy(mx,my) gives the pair-wise terms.]

Contrast-Dependent MRF

  • The probability of a labelling in addition has:
  • A contrast term Φ(D|mx,my), which favours boundaries that lie on image edges.

[Figure: the graphical model again, with the contrast term linking the pixel data to each pair of neighbouring labels.]

Example

[Figure: with the same seeds, the pair-wise term becomes Ψxy(mx,my) + Φ(D|mx,my), i.e. prior + contrast, alongside the colour likelihood ratio.]

Object Graphical Model

  • The probability of a labelling in addition has:
  • A unary potential Φx(mx|Θ) which depends on the distance from Θ (the shape parameter).

[Figure: the object-category-specific MRF: the shape parameter Θ sits above the labels and contributes a unary potential to each.]

Example

[Figure: with the shape prior Θ, the unary term combines the colour likelihood with the distance from Θ, while the pair-wise term remains prior + contrast.]

Thought
  • Rather than using user input to define the colour histograms, we can use object detection.
Shape Model
  • BMVC 2004
  • Yuille, '91
  • Brunelli & Poggio, '93
  • Lades, v.d. Malsburg et al., '93
  • Cootes, Lanitis, Taylor et al., '95
  • Amit & Geman, '95, '99
  • Perona et al., '95, '96, '98, '00

[Figure: the pictorial structure of Fischler & Elschlager, 1973.]

Layered Pictorial Structures (LPS)
  • Generative model
  • Composition of parts + spatial layout
  • Parts in Layer 2 can occlude parts in Layer 1.

[Figure: parts in Layer 2 and Layer 1, with their spatial layout (pairwise configuration).]

Layered Pictorial Structures (LPS)

[Figure: transformations Θ map the layered parts onto image instances. Plausible cow instances score highly, e.g. P(Θ1) = 0.9 and P(Θ2) = 0.8, while an unlikely instance scores P(Θ3) = 0.01.]

How to Learn LPS
  • From video via motion segmentation; see Kumar, Torr and Zisserman, ICCV 2005.
  • A graph-cut based method.
LPS for Detection
  • Learning
    • Learnt automatically using a set of examples
  • Detection
    • Matches LPS to image using Loopy Belief Propagation
    • Localizes object parts
Detection
  • Detection acts like a proposal process.
Pictorial Structures (PS)

Fischler and Elschlager, 1973

PS = 2D parts + configuration

Aim: learn pictorial structures in an unsupervised manner.

Layered Pictorial Structures (LPS) = parts + configuration + relative depth:
  • Identify parts
  • Learn configuration
  • Learn relative depth of parts
Motivation

Matching Pictorial Structures - Felzenszwalb et al., 2001

[Figure: an MRF over parts P1, P2, P3, each with a pose (x, y, …), matched to the image. Outline and texture give the part likelihood; the MRF edges encode the spatial prior.]

Motivation

Matching Pictorial Structures - Felzenszwalb et al., 2001

  • Unary potentials are negative log likelihoods.
  • Pairwise potentials form a Potts model scoring valid pairwise configurations (YES) against invalid ones (NO).

[Figure: valid and invalid relative placements of parts P1, P2, P3; matching the structure to the image yields Pr(Cow).]

Bayesian Formulation (MRF)
  • D = image
  • Di = pixels Є pi, given li (PDF projection theorem; z = sufficient statistics)
  • ψ(li,lj) = const if valid configuration, = 0 otherwise (Potts model)

Combinatorial Optimization
  • SDP formulation (Torr 2001, AI Stats): best bound
  • SOCP formulation (Kumar, Torr & Zisserman, this conference): good compromise of speed and accuracy
  • LBP (Huttenlocher, many others): worst bound
Defining the Likelihood
  • We want a likelihood that combines both the outline and the interior appearance of a part.
  • Define features which are sufficient statistics to discriminate foreground from background:
Features
  • Outline: z1, chamfer distance
  • Interior: z2, textons
  • Model the joint distribution of z1 and z2 as a 2D Gaussian.
Chamfer Match Score
  • Outline (z1): minimum chamfer distance over multiple outline exemplars
  • dcham = (1/n) Σi min{ minj ||ui − vj||, τ }

[Figure: image → edge image → distance transform.]
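A sketch of this score computed via a distance transform; scipy here is an assumption of convenience, since the slide only names the transform itself:

```python
# Truncated chamfer score via a distance transform (scipy is an
# assumption of convenience; the slide only names the transform).
import numpy as np
from scipy.ndimage import distance_transform_edt

def chamfer_score(edge_image, template_points, tau=20.0):
    """d_cham = (1/n) * sum_i min(min_j ||u_i - v_j||, tau).

    edge_image: binary H x W array marking image edge pixels (the v_j).
    template_points: (n, 2) integer array of (row, col) outline points u_i.
    """
    # At each pixel, distance to the nearest edge pixel.
    dt = distance_transform_edt(~edge_image.astype(bool))
    d = dt[template_points[:, 0], template_points[:, 1]]
    # Truncation at tau stops stray outline points dominating the score.
    return np.minimum(d, tau).mean()
```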

Texton Match Score
  • Texture (z2): an MRF classifier (Varma and Zisserman, CVPR '03)
  • Multiple texture exemplars x of class t
  • Textons: 3 x 3 square neighbourhood
  • Vector quantization (VQ) in texton space
  • Descriptor: histogram of the texton labelling
  • Histograms compared with the χ2 distance (sketched below)
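The χ2 histogram distance in its usual symmetric form, as a minimal sketch (the eps guard for empty bins is my addition):

```python
# Chi-squared distance between two texton histograms, in the usual
# symmetric form; the eps guard against empty bins is an addition.
import numpy as np

def chi2_distance(h1, h2, eps=1e-10):
    h1 = h1 / h1.sum()   # normalise so regions of different size compare
    h2 = h2 / h2.sum()
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))
```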
Bag of Words / Histogram of Textons
  • Having slagged off bag-of-words models, I reveal we used one all along; no big deal.
  • So this is like a spatially aware bag-of-words model...
  • Using a spatially flexible set of templates to work out our bag of words.
2. Fitting the Model
  • Cascades of classifiers
    • Efficient likelihood evaluation
  • Solving the MRF
    • LBP, using a fast algorithm
    • GBP if LBP doesn't converge
    • Could use semidefinite programming (2003)
    • Recent work: a second-order cone programming method is best (CVPR 2006)
Efficient Detection of Parts
  • Cascade of classifiers
  • At the top level, use the chamfer distance and distance transform for efficient pre-filtering.
  • At lower levels, use the full texture model for verification, with efficient nearest-neighbour speed-ups.
Cascade of Classifiers - for each part
  • Y. Amit and D. Geman, '97(?); S. Baker and S. Nayar, '95
Low Levels on Texture
  • The top levels of the tree use the outline to eliminate patches of the image.
  • Efficiency: uses the chamfer distance and a precomputed distance map.
  • The remaining candidates are evaluated using the full texture model.
Efficient Nearest Neighbour
  • Goldstein, Platt and Burges (MSR Tech Report, 2003)
  • Conversion from fixed-distance search to rectangle search.
  • bitvectorij(Rk) = 1 if exemplar Rk Є Ii in dimension j, = 0 otherwise.
  • Nearest neighbour of x:
  • Find the intervals containing x in all dimensions
  • 'AND' the appropriate bitvectors
  • Run nearest-neighbour search on the pruned exemplars (sketched below)
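A sketch of the prune-then-search flow; in the real method the per-interval bitvectors are precomputed, whereas this illustration forms them on the fly:

```python
# Sketch of bitvector pruning: AND per-dimension membership masks, then
# search the survivors. The real method precomputes the per-interval
# bitvectors; here they are formed on the fly for illustration.
import numpy as np

def nn_with_bitvectors(x, exemplars, radius):
    n, d = exemplars.shape
    mask = np.ones(n, dtype=bool)
    for j in range(d):
        # "Bitvector" for the rectangle side containing x in dimension j.
        mask &= np.abs(exemplars[:, j] - x[j]) <= radius   # the 'AND' step
    candidates = np.flatnonzero(mask)
    if candidates.size == 0:
        candidates = np.arange(n)  # fall back if pruning removed everything
    d2 = ((exemplars[candidates] - x) ** 2).sum(axis=1)
    return candidates[np.argmin(d2)]   # exact search on the pruned set
```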

Inspiration
  • ICCV 2003, Stenger et al.
  • System developed for tracking articulated objects such as hands or bodies, based on efficient detection.
Evaluation at Multiple Resolutions
  • Tree: 9000 templates of hand pointing, rigid
Marginalize out Pose
  • Get an initial estimate of pose distribution.
  • Use EM to marginalize out pose.
Results

[Figures: segmentations obtained using the LPS model for several cow images, including one with no clear boundary between object and background, and for horse images.]

[Figure: comparison on the same images of our method against Leibe and Schiele.]

Thoughts

Object models can help segmentation.

But good models are hard to obtain.

Do we really need accurate models?
  • The segmentation boundary can be extracted from edges.
  • A rough 3D shape prior is enough for region disambiguation.
Energy of the Pose-specific MRF

[Equation: the energy to be minimized is a sum of unary terms (likelihood plus shape prior) and pairwise potentials (Potts model plus contrast).]

But what should the value of θ be?
The Different Terms of the MRF

[Figure: original image; likelihood of being foreground given a foreground histogram; shape prior model (distance transform); likelihood of being foreground given all the terms; Grimson-Stauffer segmentation; resulting graph-cuts segmentation.]

Solve via Gradient Descent
  • Comparable to level-set methods
  • Could use other approaches (e.g. ObjCut)
  • Needs a graph cut per function evaluation, as the sketch below makes explicit.
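A minimal sketch of that loop, with `segment_energy` as a hypothetical stand-in for "rebuild the shape-prior unaries and solve the graph cut":

```python
# Gradient descent over the pose theta; segment_energy is a hypothetical
# stand-in for "rebuild shape-prior unaries, solve a graph cut, return
# the min energy". Note: one graph cut per function evaluation.
import numpy as np

def optimize_pose(theta0, segment_energy, step=0.1, h=1e-2, iters=50):
    theta = np.asarray(theta0, dtype=float)
    for _ in range(iters):
        # Finite-difference gradient: two graph cuts per pose coordinate.
        grad = np.array([
            (segment_energy(theta + h * e) - segment_energy(theta - h * e))
            / (2 * h)
            for e in np.eye(theta.size)
        ])
        theta -= step * grad
    return theta
```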
But...

... to compute the MAP of E(x) w.r.t. the pose, the unary terms change at EACH iteration and the max-flow must be recomputed!

However...

  • Kohli and Torr showed how dynamic graph cuts can be used to efficiently find MAP solutions for MRFs that change minimally from one time instant to the next: Dynamic Graph Cuts (ICCV05).

Dynamic Graph Cuts

[Figure: solving problem PA is a computationally expensive operation yielding solution SA. When PA and PB are similar, the differences between A and B define a simpler problem PB*, and SA can be updated to SB by a cheaper operation than solving PB from scratch.]

Dynamic Image Segmentation

[Figure: an image, the segmentation obtained, and the flows in the n-edges.]

Reparametrization

Key observation: adding a constant to both t-edges of a node does not change the edges constituting the st-mincut.

[Figure: the two-variable graph with α added to both t-edges of a2, giving weights 9 + α and 4 + α.]

E(a1,a2) = 2a1 + 5ā1 + 9a2 + 4ā2 + 2a1ā2 + ā1a2
E*(a1,a2) = E(a1,a2) + α(a2 + ā2)
          = E(a1,a2) + α      [since a2 + ā2 = 1]
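A quick numeric check of the observation, reusing the earlier networkx construction (again just a convenience, not the talk's tooling):

```python
# Numeric check: adding alpha to both t-edges of a2 shifts the cut value
# by alpha but leaves the cut edges (and hence the labelling) unchanged.
import networkx as nx

def two_var_graph(alpha=0):
    G = nx.DiGraph()
    G.add_edge("s", "a1", capacity=2)
    G.add_edge("a1", "t", capacity=5)
    G.add_edge("s", "a2", capacity=9 + alpha)  # t-edge of a2, reparametrized
    G.add_edge("a2", "t", capacity=4 + alpha)  # t-edge of a2, reparametrized
    G.add_edge("a2", "a1", capacity=2)
    G.add_edge("a1", "a2", capacity=1)
    return G

for alpha in (0, 3):
    value, partition = nx.minimum_cut(two_var_graph(alpha), "s", "t")
    print(alpha, value, sorted(partition[1]))
# 0 8 ['a1', 't']   -- same partition both times;
# 3 11 ['a1', 't']  -- the energy is just shifted by alpha
```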

Reparametrization, Second Type

[Figure: the same graph with weights 9 + α (s → a2), 5 + α (a1 → t), 2 + α (a2 → a1) and 1 − α (a1 → a2).]

E*(a1,a2) = E(a1,a2) + αā1 + αa2 + αa1ā2 − αā1a2
          = E(a1,a2) + α(ā1 + a2 + a1(1 − a2) − ā1a2)
          = E(a1,a2) + α

All reparametrizations of the graph are sums of these two types. Both maintain the solution and add a constant α to the energy.

Reparametrization
  • Nice result (easy to prove)
  • All other reparametrizations can be viewed in terms of these two basic operations.
  • Proof in Hammer, and also in one of Vlad’s recent papers.
Graph Re-parameterization

[Figure: original graph G, edges labelled flow/residual capacity: s → xi 0/7, s → xj 0/1, xi → xj 0/5, xj → xi 0/9, xi → t 0/2, xj → t 0/4.]

Graph Re-parameterization

Compute max-flow on G; the saturated edges give the st-mincut.

[Figure: residual graph Gr after max-flow: s → xi 5/2, s → xj 1/0, xi → xj 3/2, xj → xi 0/12, xi → t 2/0, xj → t 4/0. The saturated t-edges xi → t and xj → t are the edges cut.]

Update t-edge Capacities

Suppose, in the residual graph Gr above, the capacity of the t-edge s → xi changes from 7 to 4. The edge already carries flow 5, so the edge capacity constraint is violated (flow > capacity).

excess flow (e) = flow − new capacity = 5 − 4 = 1

Fix: add e to both t-edges connected to node i. This is exactly the first reparametrization, so the st-mincut is unchanged.

[Figure: updated residual graph G`: s → xi becomes 5/0 (capacity 5) and xi → t becomes 2/1 (capacity 3); all constraints hold again.]
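A sketch of this update rule on a toy flow/capacity representation (plain dicts keyed by edge; illustrative, not Kohli and Torr's implementation):

```python
# t-edge update rule, on a toy dict-of-edges representation (illustrative,
# not Kohli & Torr's implementation).
def update_t_edge(flow, cap, node, new_cap):
    """Change the capacity of the s->node t-edge; if flow now exceeds it,
    restore feasibility via the first reparametrization: add the excess to
    BOTH t-edges of the node (energy shifts by a constant, cut unchanged)."""
    cap[("s", node)] = new_cap
    excess = flow[("s", node)] - new_cap
    if excess > 0:                      # constraint violated: flow > capacity
        cap[("s", node)] += excess
        cap[(node, "t")] += excess

# The slides' example: s->xi capacity drops from 7 to 4 while carrying flow 5.
flow = {("s", "xi"): 5, ("xi", "t"): 2}
cap = {("s", "xi"): 7, ("xi", "t"): 2}
update_t_edge(flow, cap, "xi", 4)
print(cap)  # {('s', 'xi'): 5, ('xi', 't'): 3} -> residuals 0 and 1, as on the slide
```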

Update n-edge Capacities

Suppose instead the capacity of the n-edge xi → xj changes from 5 to 2. The edge carries flow 3, so the edge capacity constraint is violated.

  • Reduce the flow to satisfy the constraint; this causes a flow imbalance: an excess at xi and a deficiency at xj.
  • Push the excess flow to/from the terminals.
  • Create the needed capacity by adding α = excess to both t-edges of the affected nodes.

[Figure: updated residual graph G`: s → xi 5/3, xi → t 3/0, s → xj 2/0, xj → t 4/1, with xi → xj 2/0 and xj → xi 0/11.]
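A companion sketch for the n-edge rule, simplified to the single-edge case shown on the slides (same illustrative dict representation as before):

```python
# n-edge update rule, simplified to the single-edge case on the slides
# (same illustrative dict representation as the t-edge sketch).
def update_n_edge(flow, cap, i, j, new_cap):
    cap[(i, j)] = new_cap
    excess = flow[(i, j)] - new_cap
    if excess > 0:
        flow[(i, j)] = new_cap          # reduce flow: i gains an excess,
        for node in (i, j):             # j a deficiency
            cap[("s", node)] += excess  # create capacity on both t-edges
            cap[(node, "t")] += excess  # (reparametrization again)
        flow[(i, "t")] += excess        # drain the excess at i to the sink
        flow[("s", j)] += excess        # supply the deficiency at j from s

flow = {("s","xi"): 5, ("xi","t"): 2, ("s","xj"): 1, ("xj","t"): 4, ("xi","xj"): 3}
cap  = {("s","xi"): 7, ("xi","t"): 2, ("s","xj"): 1, ("xj","t"): 4, ("xi","xj"): 5}
update_n_edge(flow, cap, "xi", "xj", 2)
# Residuals now match the slide: s->xi 5/3, xi->t 3/0, s->xj 2/0, xj->t 4/1
```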

Our Algorithm

[Flowchart: solve the first segmentation problem on graph Ga (maximum flow → MAP solution); take the difference between Ga and Gb, update the residual graph Gr to G`, then solve the second segmentation problem on Gb starting from the updated residual graph.]

Dynamic Graph Cut vs Active Cuts
  • Our method: flow recycling
  • AC: cut recycling
  • Both methods: tree recycling
Experimental Analysis

Running time of the dynamic algorithm on an MRF consisting of 2×10^5 latent variables connected in a 4-neighbourhood.

Experimental Analysis

Image segmentation in videos (unary & pairwise terms), at image resolution 720x576: static graph cuts 220 msec; dynamic graph cuts (optimized) 50 msec.

Segmentation Comparison

[Figure: Grimson-Stauffer vs. Bathia04 vs. our method.]

Segmentation + Pose Inference

[Images courtesy: M. Black, L. Sigal; Vicon]

Max-Marginals for Parameter Learning
  • Use max-marginals instead of pseudo-marginals from LBP (from Sanjiv Kumar).
Volumetric Graph Cuts
  • Can apply to 3D.

[Figure: a volumetric min-cut between source and sink surfaces.]

Results
  • Model house
  • Stone carving
  • Haniwa
Conclusion
  • Combining pose inference and segmentation is worth investigating.
  • There is lots more to do to extend MRF models.
  • Combinatorial optimization is a very interesting and hot area in vision at the moment.
  • Algorithms are as important as models.