Hierarchical Image-Motion Segmentation using Swendsen-Wang Cuts

Presentation Transcript


  1. Hierarchical Image-Motion Segmentation using Swendsen-Wang Cuts Adrian Barbu Siemens Corporate Research Princeton, NJ Acknowledgements: S.C. Zhu, Y.N. Wu, A.L. Yuille et al.

  2. Talk Outline • The Swendsen-Wang Cuts algorithm • The original Swendsen-Wang algorithm • Generalization to arbitrary probabilities • Multi-Grid and Multi-Level Swendsen-Wang Cuts • Application: Hierarchical Image-Motion Segmentation • Conclusions and future work

  3. Swendsen-Wang for Ising / Potts Models Swendsen-Wang (1987) is an extremely smart idea that flips a whole patch of sites at a time. Each edge e=<s,t> in the lattice is assigned a probability q = e^{-β}. 1. If s and t have different labels at the current state, e is turned off. If s and t have the same label, e is turned off with probability q. Thus each object is broken into a number of connected components (subgraphs). 2. One or more components are chosen at random. 3. The collective label of each chosen component is changed at random to any of the labels.
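For concreteness, here is a minimal Python sketch of the sweep just described, for a Potts model on a 4-connected lattice. The lattice size, number of states, and the union-find bookkeeping are implementation choices of this sketch, not details from the talk, and the "relabel every cluster" variant is used for steps 2-3.

```python
import numpy as np

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]    # path compression
        i = parent[i]
    return i

def union(parent, i, j):
    ri, rj = find(parent, i), find(parent, j)
    if ri != rj:
        parent[rj] = ri

def sw_sweep(labels, beta, n_labels, rng):
    """One sweep of the original Swendsen-Wang move on a 4-connected lattice."""
    H, W = labels.shape
    q = np.exp(-beta)                    # probability of turning a same-label edge off
    parent = np.arange(H * W)            # union-find over pixel indices
    # Step 1: same-label neighbors are bonded with probability 1 - q;
    # edges between different labels are always off.
    for y in range(H):
        for x in range(W):
            i = y * W + x
            if x + 1 < W and labels[y, x] == labels[y, x + 1] and rng.random() > q:
                union(parent, i, i + 1)
            if y + 1 < H and labels[y, x] == labels[y + 1, x] and rng.random() > q:
                union(parent, i, i + W)
    # Steps 2-3: every connected component gets a fresh label drawn uniformly
    # at random (the "relabel all clusters" variant of SW).
    out = np.empty(H * W, dtype=labels.dtype)
    new_label = {}
    for i in range(H * W):
        r = find(parent, i)
        if r not in new_label:
            new_label[r] = rng.integers(n_labels)
        out[i] = new_label[r]
    return out.reshape(H, W)

# Example usage: a few sweeps on a small 3-state Potts lattice
rng = np.random.default_rng(0)
state = rng.integers(3, size=(32, 32))
for _ in range(10):
    state = sw_sweep(state, beta=1.0, n_labels=3, rng=rng)
```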

  4. The Swendsen-Wang Algorithm Pros: • Computationally efficient in sampling the Ising/Potts models Cons: • Limited to Ising/Potts models and factorized distributions • Not informed by the data, so it slows down in the presence of an external field (data term) Swendsen-Wang Cuts: • Generalizes Swendsen-Wang to arbitrary posterior probabilities • Improves the clustering step by using the image data

  5. SW Cuts: the Acceptance Probability Theorem (Barbu, Zhu '03). The acceptance probability for the Swendsen-Wang Cuts algorithm is a product of a cut-weight ratio, a label-proposal ratio, and a posterior ratio (written out below). Theorem (Metropolis-Hastings). For any proposal probability q(A→B) and target probability p(A), if the Markov chain moves by taking samples from q(A→B) which are accepted with probability α(A→B) = min(1, q(B→A)p(B) / (q(A→B)p(A))), then the Markov chain is reversible with respect to p and has stationary distribution p.
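The two acceptance probabilities referenced on this slide, written out in LaTeX. The Metropolis-Hastings form is standard; the SW Cuts form is a sketch following the notation of the PAMI 2005 paper cited on slide 18, with V0 the selected component, l its label in state A, l' its proposed label in state B, and C(V0, Vl \ V0) the cut between V0 and the rest of subgraph Gl; treat it as a reconstruction of the theorem's structure rather than a verbatim transcription.

```latex
% Metropolis-Hastings acceptance probability (standard form)
\alpha(A \to B) = \min\!\left(1,\; \frac{q(B \to A)\, p(B)}{q(A \to B)\, p(A)}\right)

% Swendsen-Wang Cuts acceptance probability (sketch, PAMI 2005 notation)
\alpha(A \to B) = \min\!\left(1,\;
   \frac{\prod_{e \in C(V_0,\, V_{l'} \setminus V_0)} (1 - q_e)}
        {\prod_{e \in C(V_0,\, V_{l} \setminus V_0)} (1 - q_e)}
   \cdot \frac{q(l \mid V_0, B)}{q(l' \mid V_0, A)}
   \cdot \frac{p(B \mid I)}{p(A \mid I)} \right)
```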

  6. The Swendsen-Wang Cuts Algorithm Swendsen-Wang Cuts: SWC. Input: Go=<V, Eo>, discriminative probabilities qe, e ∈ Eo, and generative posterior probability p(W|I). Output: Samples W ~ p(W|I). 1. Initialize a graph partition. 2. Repeat, for the current state A = π: 3. Repeat for each subgraph Gl=<Vl, El>, l=1,2,...,n in A: 4. For e ∈ El, turn e = "on" with probability qe. 5. Partition Gl into nl connected components: gli=<Vli, Eli>, i=1,...,nl. 6. Collect all the connected components in CP = {Vli : l=1,...,n, i=1,...,nl}. 7. Select a connected component V0 ∈ CP at random. 8. Propose to reassign V0 to a subgraph Gl', where l' follows a probability q(l'|V0, A). 9. Accept the move with probability α(A→B).
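A Python sketch of one SWC move on a general adjacency graph. It assumes integer labels, an `edges` list of (s, t, q_e) triples, a `posterior(labels)` callable returning p(W|I) up to a constant, and a uniform label proposal whose ratio is treated as cancelling; all of these are simplifications of this sketch, not specifics from the talk.

```python
import random
from collections import defaultdict

def swc_move(labels, edges, posterior, rng=random):
    """One Swendsen-Wang Cuts move. labels: dict node -> int label (state A)."""
    # Steps 3-6: among same-label edges, keep each "on" with probability q_e,
    # then collect the connected components.
    parent = {v: v for v in labels}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for s, t, qe in edges:
        if labels[s] == labels[t] and rng.random() < qe:
            parent[find(s)] = find(t)
    comps = defaultdict(list)
    for v in labels:
        comps[find(v)].append(v)

    # Step 7: pick one connected component V0 at random.
    V0 = rng.choice(list(comps.values()))
    old_label = labels[V0[0]]

    # Step 8: propose a new label l' (uniform over existing labels + one new one).
    candidates = sorted(set(labels.values())) + [max(labels.values()) + 1]
    new_label = rng.choice(candidates)
    proposal = dict(labels)
    for v in V0:
        proposal[v] = new_label

    # Step 9: acceptance = cut-weight ratio times posterior ratio
    # (the uniform label proposal is treated as cancelling in this sketch).
    in_V0 = set(V0)
    cut_old = cut_new = 1.0
    for s, t, qe in edges:
        if (s in in_V0) != (t in in_V0):          # edge crossing the cut around V0
            other = t if s in in_V0 else s
            if labels[other] == old_label:
                cut_old *= (1.0 - qe)             # cut to V0's old subgraph (state A)
            if labels[other] == new_label:
                cut_new *= (1.0 - qe)             # cut to V0's new subgraph (state B)
    alpha = min(1.0, (cut_new / max(cut_old, 1e-300)) *
                     (posterior(proposal) / posterior(labels)))
    return proposal if rng.random() < alpha else labels
```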

  7. Advantages of the SW Cuts Algorithm • Our algorithm bridges the gap between the specialized and generic algorithms: • Generally applicable – allows usage of complex models beyond the scope of the specialized algorithms • Computationally efficient – performance comparable with the specialized algorithms • Reversible and ergodic – theoretically guaranteed to eventually find the global optimum

  8. Hierarchical Image-Motion Segmentation Three-level representation: • Level 2: Intensity regions are grouped into moving objects Oi with motion parameters θi • Level 1: Atomic regions are grouped into intensity regions Rij of coherent motion, with intensity models Hij • Level 0: Pixels are grouped into atomic regions rijk of relatively constant motion and intensity • motion parameters (uijk, vijk) • intensity histogram hijk
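One possible way to organize this three-level state in code; the class and field names below are illustrative choices, not taken from the talk.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class AtomicRegion:                      # level 0: r_ijk
    pixels: List[Tuple[int, int]]
    motion: Tuple[float, float]          # (u_ijk, v_ijk), roughly constant
    histogram: List[float]               # intensity histogram h_ijk

@dataclass
class IntensityRegion:                   # level 1: R_ij
    atomic_regions: List[AtomicRegion]
    intensity_model: List[float]         # intensity model H_ij

@dataclass
class MovingObject:                      # level 2: O_i
    regions: List[IntensityRegion]
    theta: Tuple[float, ...]             # motion parameters θ_i

@dataclass
class SceneState:                        # the full state W sampled by SWC
    objects: List[MovingObject] = field(default_factory=list)
```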

  9. Multi-Grid SWC Select an attention window Λ ⊂ G. Cluster the vertices within Λ and select a connected component R. Swap the label of R. Accept the swap with probability α, using the labels outside Λ as the boundary condition.

  10. Multi-Level SWC • Select a level s, usually in increasing order. • Cluster the vertices in G(s) and select a connected component R. • Swap the label of R. • Accept the swap with probability α, using the lower levels, denoted by X(<s), as boundary conditions.
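A sketch of how the multi-grid and multi-level moves of slides 9 and 10 could be scheduled around the `swc_move` sketch above; the window policy and the way the conditional posteriors are passed in are assumptions of this sketch, not the talk's implementation.

```python
import random

def multigrid_move(labels, edges, posterior, window, rng=random):
    """Multi-grid move: run one SWC move restricted to an attention window
    (a set of nodes); labels outside the window stay fixed and act as the
    boundary condition through the conditional posterior."""
    inside = set(window)
    sub_edges = [(s, t, q) for s, t, q in edges if s in inside and t in inside]
    sub_labels = {v: labels[v] for v in inside}
    def conditional_posterior(sub):
        full = dict(labels)
        full.update(sub)                 # labels outside the window are unchanged
        return posterior(full)
    updated = swc_move(sub_labels, sub_edges, conditional_posterior, rng)
    out = dict(labels)
    out.update(updated)
    return out

def multilevel_sweep(states, graphs, conditional_posteriors, rng=random):
    """Multi-level sweep: visit levels s = 0, 1, ... in increasing order.
    conditional_posteriors[s] is assumed to take (labels_at_level_s, lower_states),
    so the lower levels X(<s) enter as boundary conditions."""
    for s in range(len(states)):
        lower = states[:s]
        cond = lambda lab, s=s, lower=lower: conditional_posteriors[s](lab, lower)
        states[s] = swc_move(states[s], graphs[s], cond, rng)
    return states
```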

  11. Hierarchical Image-Motion Segmentation Modeling occlusion: • Accreted (disoccluded) pixels • Motion pixels Bayesian formulation: the posterior combines a term for the accreted pixels, a term for the motion pixels explained by motion, and an intensity segmentation factor with generative and histogram models (sketched below).
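A LaTeX sketch of one plausible factorization matching the terms named on this slide; the concrete densities (a histogram likelihood for accreted pixels, a Gaussian residual for motion-explained pixels) are assumptions for illustration, not the talk's exact model.

```latex
% Illustrative factorization of the posterior over the hierarchical state W,
% given two consecutive frames I_t and I_{t+1}; the densities are placeholders
% for the factors named on the slide.
p(W \mid I_t, I_{t+1}) \;\propto\;
  \underbrace{\prod_{x \in \Lambda_{\mathrm{accreted}}} h_{R(x)}\big(I_{t+1}(x)\big)}_{\text{accreted pixels}}
  \cdot
  \underbrace{\prod_{x \in \Lambda_{\mathrm{motion}}} \mathcal{N}\big(I_{t+1}(x) - I_t(x - u_{\theta}(x));\, 0, \sigma^2\big)}_{\text{motion pixels explained by motion}}
  \cdot
  \underbrace{p_{\mathrm{int}}(W \mid I_{t+1})}_{\text{intensity segmentation factor}}
  \cdot\; p(W)
```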

  12. Hierarchical Image-Motion Segmentation The prior has factors for • Smoothness of motion • Main motion for each object • Boundary length • Number of labels
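An illustrative prior with one energy term per factor listed above; the exact term definitions and the weights are placeholders of this sketch, not the definitions used in the talk.

```latex
% Illustrative prior, one energy term per factor on the slide.
p(W) \;\propto\; \exp\Big(
    - \alpha_1\, E_{\text{motion-smoothness}}(W)   % smoothness of motion
    - \alpha_2\, E_{\text{main-motion}}(W)         % main motion for each object
    - \alpha_3 \sum_{i,j} |\partial R_{ij}|        % total boundary length
    - \alpha_4\, |L| \Big)                         % number of labels
```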

  13. Designing the Edge Weights • Level 0: pixel similarity and common motion • Level 1: similarity of the intensity histograms Hi and Hj of the two regions • Level 2: similarity of the motion histograms Mi and Mj of the two objects
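A Python sketch of discriminative edge weights q_e built from these cues; the symmetrized-KL similarity and the scale parameters are assumptions of this sketch rather than the talk's exact definitions.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-8):
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def edge_weight_level0(intensity_s, intensity_t, motion_s, motion_t,
                       sigma_i=10.0, sigma_m=1.0):
    """Level 0: pixel similarity plus common motion between neighboring pixels."""
    d_int = (intensity_s - intensity_t) ** 2 / (2 * sigma_i ** 2)
    d_mot = np.sum((np.array(motion_s) - np.array(motion_t)) ** 2) / (2 * sigma_m ** 2)
    return float(np.exp(-(d_int + d_mot)))

def edge_weight_level1(hist_i, hist_j, scale=1.0):
    """Level 1: symmetrized KL divergence between intensity histograms Hi, Hj."""
    d = 0.5 * (kl_divergence(hist_i, hist_j) + kl_divergence(hist_j, hist_i))
    return float(np.exp(-scale * d))

def edge_weight_level2(motion_hist_i, motion_hist_j, scale=1.0):
    """Level 2: the same similarity, applied to motion histograms Mi, Mj."""
    return edge_weight_level1(motion_hist_i, motion_hist_j, scale)
```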

  14. Experiments Results on two input sequences, showing for each the input sequence, the image segmentation, and the motion segmentation.

  15. Experiments Results on two further input sequences, again showing the input sequence, the image segmentation, and the motion segmentation.

  16. Conclusion Two extensions: • Swendsen-Wang Cuts • Samples arbitrary probabilities on graph partitions • Efficient by using data-driven techniques • Hundreds of times faster than the Gibbs sampler • Marginal Space Learning • Constrains the search by learning in Marginal Spaces • Six orders of magnitude speedup with high accuracy • Robust, complex statistical model learned by supervised learning

  17. Future Work • Algorithm Boosting • Any algorithm has a success rate and an error rate • Can combine algorithms into a more robust algorithm by supervised learning • Proof of concept for Image Registration • Hierarchical Computing • Efficient representation of Top-Down and Bottom-Up communication using specialized dictionaries • Robust integration of multiple MSL paths by Algorithm Boosting • Applications to medical imaging • 3D curve localization and tracking • Brain segmentation • Lymph node detection

  18. References • A. Barbu, S.C. Zhu. Generalizing Swendsen-Wang to sampling arbitrary posterior probabilities. IEEE Trans. PAMI, August 2005. http://www.stat.ucla.edu/~abarbu/Research/partition-pami.pdf • A. Barbu, S.C. Zhu. Generalizing Swendsen-Wang for Image Analysis. To appear in J. Comp. Graph. Stat. http://www.stat.ucla.edu/~abarbu/Research/jcgs.pdf Thank You!
