
Mean-Field Theory and Its Applications in Computer Vision 5


Presentation Transcript


  1. Mean-Field Theory and Its Applications in Computer Vision 5

  2. Global Co-occurrence Terms • Encourages global consistency and co-occurrence of objects (figure: segmentations without and with the co-occurrence term)

  3. Global Co-occurrence Terms • Defined on subsets of labels • Associates a cost with each possible subset

  4. Properties of cost function • Non-decreasing: adding a label to a subset never decreases the cost (figure: example subset costs)

  5. Properties of cost function • We represent our cost as a second-order cost function defined on a binary label-indicator vector:
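As a sketch (the function name and the toy costs below are illustrative, not values from the slides), such a second-order cost on a binary indicator vector b can be written as a linear term plus a quadratic term:

```python
import numpy as np

# Sketch of a second-order co-occurrence cost on a binary label
# indicator vector b (b[l] = 1 iff label l appears in the image).
# c1 holds per-label costs, c2 holds pairwise co-occurrence costs;
# both are illustrative placeholders.
def cooc_cost(b, c1, c2):
    b = np.asarray(b, dtype=float)
    return float(c1 @ b + 0.5 * b @ c2 @ b)

c1 = np.array([1.0, 2.0])            # cost of each label being present
c2 = np.array([[0.0, 4.0],
               [4.0, 0.0]])          # cost of the two labels co-occurring
print(cooc_cost([1, 0], c1, c2))     # only label 0 present
print(cooc_cost([1, 1], c1, c2))     # both labels present, pairwise cost added
```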

  6. Complexity • Complexity of the exact form: O(NL²) • Two relaxed (approximate) versions of this form • Complexity: O(NL + L²)

  7. Our model • Represent the second-order cost by binary latent variables • Unary cost per latent variable (figure: one binary latent node (0/1) per label)

  8. Our model • Represent the second-order cost by binary latent variables • Pairwise cost between latent variables

  9. Global Co-occurrence Cost • Two approximations for including this cost in the fully connected CRF

  10. Global Co-occurrence Terms • First model

  11. Global Co-occurrence Terms • Model

  12. Global Co-occurrence Terms • Constraints (let's take one set of connections): if the latent variable is off, no image variable takes that label; if the latent variable is on, at least one image variable takes that label

  13. Global Co-occurrence Terms • Pay a cost K for violating first constraint

  14. Global Co-occurrence Terms • Pay a cost K for violating the second constraint

  15. Global Co-occurrence Terms • Cost for first model:
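The cost itself is not reproduced in the transcript; a minimal sketch, under the simplifying assumption that each violation is charged a flat penalty K (the function and values below are illustrative, not the authors' implementation):

```python
import numpy as np

K = 10.0  # hypothetical violation penalty

# Sketch of the first-model consistency cost: pay K for each image
# variable that takes label l while the latent variable for l is off
# (first constraint), and K for each latent variable that is on while
# no image variable takes its label (second constraint).
def consistency_cost(labels, y, K=K):
    labels = np.asarray(labels)
    cost = 0.0
    for l, yl in enumerate(y):
        n_l = int(np.sum(labels == l))
        if yl == 0:
            cost += K * n_l      # violations of the first constraint
        elif n_l == 0:
            cost += K            # violation of the second constraint
    return cost

# Toy example: 3 labels; label 1 is used but its latent is off,
# label 2's latent is on but unused -> two kinds of violation.
print(consistency_cost([0, 0, 1], [1, 0, 1]))
```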

  16. Global Co-occurrence Terms • Second model • Each latent node is connected to every variable node

  17. Global Co-occurrence Terms • Constraints (let's take one set of connections): if the latent variable is off, no image variable takes that label; if the latent variable is on, at least one image variable takes that label

  18. Global Co-occurrence Terms • Pay a cost K for violating the constraint

  19. Global Co-occurrence Terms • Cost for second model:
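Again the formula is missing from the transcript; since in this model the penalty is paid whenever an image variable takes label l while latent variable Y_l is 0, the cost can be sketched as (K and the toy labelling are illustrative):

```python
import numpy as np

# Sketch of the second-model consistency cost: every latent node is
# connected to every image variable, and we pay K for each image
# variable that takes label l while latent variable Y_l is off.
def second_model_cost(labels, y, K=10.0):
    labels = np.asarray(labels)
    return K * float(sum(np.sum(labels == l)
                         for l, yl in enumerate(y) if yl == 0))

# Toy example: one pixel takes label 1 while Y_1 = 0 -> one violation.
print(second_model_cost([0, 0, 1, 2], [1, 0, 1]))
```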

  20. Global Co-occurrence Terms • Expectation evaluation for variable Y_l • Case 1: Y_l takes label 0


  23. Global Co-occurrence Terms • Expectation evaluation for variable Y_l • Case 2: Y_l takes label 1


  25. Global Co-occurrence Terms • Expectation evaluation for variable Y_l

  26. Global Co-occurrence Terms • Latent variable updates:

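The update equations themselves do not survive in the transcript. A mean-field update for one binary latent variable under the second model can be sketched as follows; the marginal arrays, the costs c1/c2, and the two-way normalisation are all assumptions of this sketch, not the authors' code:

```python
import numpy as np

# Sketch of a mean-field update for one binary latent variable Y_l.
# Q: (N, L) current pixel marginals; q_y: (L,) current Q(Y_m = 1);
# c1, c2: assumed unary/pairwise co-occurrence costs; K: penalty for
# a pixel taking label l while Y_l is off.
def update_latent(l, Q, q_y, c1, c2, K):
    # Expected cost if Y_l = 0: pay K for each pixel expected to take l
    e0 = K * Q[:, l].sum()
    # Expected cost if Y_l = 1: unary cost plus expected pairwise cost
    # with the other latent variables under their current marginals
    e1 = c1[l] + c2[l] @ q_y - c2[l, l] * q_y[l]
    # Normalise exp(-e0), exp(-e1) to get the new marginal Q(Y_l = 1)
    m = min(e0, e1)
    z0, z1 = np.exp(-(e0 - m)), np.exp(-(e1 - m))
    return z1 / (z0 + z1)

# One pixel that certainly takes label 0: turning Y_0 on becomes the
# cheaper option, so Q(Y_0 = 1) rises above 0.5.
print(update_latent(0, np.array([[1.0, 0.0]]), np.array([0.5, 0.5]),
                    np.zeros(2), np.zeros((2, 2)), K=1.0))
```

Each update touches the N pixel marginals for one label plus the L other latent marginals, which is consistent with the complexity claim on the next slides.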

  28. Global Co-occurrence Terms • Pay a cost K if a variable takes label l and the corresponding latent variable takes label 0

  29. Complexity Expectation updates for latent variable Y_l

  30. Complexity • Expectation updates for latent variable Y_l • Overall complexity: does not increase the original complexity

  31. PascalVOC-10 dataset • Qualitative analysis: we observe an improvement over competing methods

  32. PascalVOC-10 dataset • We observe an improvement of almost 2.3% • Almost 8-9 times faster than the alpha-expansion based method

  33. Mean-field vs. Graph-cuts • Measure the intersection/union (I/U) score on PascalVOC-10 segmentation • Increase the standard deviation for mean-field • Increase the window size for the graph-cuts method • Both achieve similar accuracy

  34. Window sizes • Comparison on matched energies: impact of adding more complex costs and increasing the window size

  35. PascalVOC-10 dataset • Per-class quantitative results


  37. Mean-field vs. Graph-cuts • Measure the intersection/union (I/U) score on PascalVOC-10 segmentation • Increase the standard deviation for mean-field • Increase the window size for the graph-cuts method • The time complexity of graph cuts is very high, making it infeasible to work with large neighbourhood systems
