
Spectral Matting

Spectral Matting. A. Levin, D. Lischinski, and Y. Weiss. A Closed Form Solution to Natural Image Matting. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), June 2006, New York.


Presentation Transcript


  1. Spectral Matting. A. Levin, D. Lischinski, and Y. Weiss. A Closed Form Solution to Natural Image Matting. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), June 2006, New York. A. Levin, A. Rav-Acha, D. Lischinski. Spectral Matting. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), Minneapolis, June 2007 (best paper award runner-up). A. Levin (1,2), A. Rav-Acha (1), D. Lischinski (1). Spectral Matting. IEEE Trans. Pattern Analysis and Machine Intelligence, Oct 2008. (1) School of CS & Eng, The Hebrew University; (2) CSAIL, MIT

  2. Hard segmentation and matting. [Figure: source image → hard segmentation → compositing, versus source image → matte → compositing]

  3. Previous approaches to segmentation and matting, unsupervised (input → hard output / matte output): spectral segmentation, e.g. Shi and Malik 97, Yu and Shi 03, Weiss 99, Ng et al 01, Zelnik and Perona 05, Tolliver and Miller 06.

  4. Previous approaches to segmentation and matting, supervised (hard segmentation from user input): Boykov and Jolly 01, Rother et al 04, Li et al 04.

  5. Previous approaches to segmentation and matting, supervised matting (matte output from user input). Trimap interface: Bayesian Matting (Chuang et al 01), Poisson Matting (Sun et al 04), Random Walk (Grady et al 05). Scribbles interface: Wang & Cohen 05, Levin et al 06, Easy Matting (Guan et al 06).

  6. User-guided interface. [Figure: scribble and trimap inputs with the corresponding matting results]

  7. Generalized compositing equation, 2-layer compositing: I = α1 F1 + α2 F2, with α1 + α2 = 1 (the familiar I = α F + (1 − α) B).

  8. Generalized compositing equation, K-layer compositing: I = α1 F1 + α2 F2 + … + αK FK, with Σk αk = 1. The per-pixel weights αk are the matting components.

  9. Generalized compositing equation, K-layer compositing: I = Σk αk Fk, Σk αk = 1. Matting components are "sparse" layers: 0/1 for most image pixels.
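
To make the notation concrete, here is a minimal numpy sketch of the K-layer compositing equation; the function name and array shapes are illustrative, not from the slides.

```python
import numpy as np

def composite(layers, alphas):
    """Compose an image from K color layers and their K matting components.

    layers : list of K arrays, each H x W x 3 (per-layer colors F_k)
    alphas : list of K arrays, each H x W, summing to 1 at every pixel
    """
    assert np.allclose(sum(alphas), 1.0)            # the components partition the opacity
    return sum(a[..., None] * F for a, F in zip(alphas, layers))
```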

  10. Unsupervised matting: input image and automatically computed matting components.

  11. Building a foreground object by simple addition of components (component + component + component = foreground matte).

  12. Spectral segmentation: analyzing the smallest eigenvectors of a graph Laplacian L. E.g.: Shi and Malik 97, Yu and Shi 03, Weiss 99, Ng et al 01, Meila and Shi 01, Zelnik and Perona 05, Tolliver and Miller 06.
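
A small scipy sketch of the basic spectral step referred to here: form the graph Laplacian L = D − W from an affinity matrix and take its smallest eigenvectors. Function names are illustrative; the tiny negative shift is only a numerical convenience for the shift-invert solver, since L is singular.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

def graph_laplacian(W):
    """L = D - W for a sparse symmetric affinity matrix W."""
    d = np.asarray(W.sum(axis=1)).ravel()
    return sp.diags(d) - W

def smallest_eigenvectors(L, k=10):
    """The k eigenvectors of L with the smallest eigenvalues."""
    # shift-invert around a small negative value keeps the factorization well-posed
    vals, vecs = eigsh(L.tocsc(), k=k, sigma=-1e-8, which='LM')
    return vals, vecs
```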

  13. Problem formulation: compositing equation I = α F + (1 − α) B. Locally, α is an affine function of the image, αi = a Ii + b (for a color image a is a 3-vector), assuming a and b are constant in a small window.

  14. Derivation of the cost function

  15. Derivation
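
The equations behind slides 13-15, reconstructed from the closed-form matting formulation of Levin et al. (CVPR 2006); the grayscale coefficients are shown for brevity (for color images a_k is a 3-vector).

```latex
% Compositing equation at pixel i:
I_i = \alpha_i F_i + (1 - \alpha_i) B_i
% If F and B are approximately constant over a small window w_k, then for i \in w_k
% alpha is an affine function of the image, with a_k, b_k constant in the window:
\alpha_i \approx a_k I_i + b_k, \qquad a_k = \tfrac{1}{F - B}, \quad b_k = -\tfrac{B}{F - B}
% Quadratic fitting cost over all windows, with a small regularizer on a_k:
J(\alpha, a, b) = \sum_k \sum_{i \in w_k} \left( \alpha_i - a_k I_i - b_k \right)^2
                  + \varepsilon\, a_k^2
% Minimizing over a_k, b_k in closed form leaves a quadratic cost in alpha alone:
J(\alpha) = \min_{a, b} J(\alpha, a, b) = \alpha^{\top} L\, \alpha
```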

  16. The matting Laplacian L: a symmetric, positive semidefinite, sparse matrix that is a local function of the image: L(i,j) = Σ_{k | (i,j) ∈ wk} [ δij − (1/|wk|) (1 + (Ii − μk)ᵀ (Σk + (ε/|wk|) I3)⁻¹ (Ij − μk)) ], where μk and Σk are the color mean and covariance of window wk.
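
A direct, unoptimized Python sketch of this construction, following the formula above; border windows are skipped and all names are illustrative (practical implementations use vectorized or sparse per-window tricks).

```python
import numpy as np
import scipy.sparse as sp

def matting_laplacian(img, eps=1e-7, r=1):
    """Matting Laplacian of an H x W x 3 image (dense per-window sketch).

    eps : regularization added to the window color covariance
    r   : window radius, i.e. windows of size (2r+1) x (2r+1)
    """
    H, W, _ = img.shape
    n = H * W
    win_size = (2 * r + 1) ** 2
    idx = np.arange(n).reshape(H, W)

    rows, cols, vals = [], [], []
    for y in range(r, H - r):              # border windows skipped for brevity
        for x in range(r, W - r):
            win_idx = idx[y - r:y + r + 1, x - r:x + r + 1].ravel()
            win = img[y - r:y + r + 1, x - r:x + r + 1].reshape(win_size, 3)
            mu = win.mean(axis=0)
            cov = win.T @ win / win_size - np.outer(mu, mu)
            inv = np.linalg.inv(cov + (eps / win_size) * np.eye(3))
            d = win - mu
            A = (1.0 + d @ inv @ d.T) / win_size   # window affinity term
            block = np.eye(win_size) - A           # delta_ij minus affinity
            rows.append(np.repeat(win_idx, win_size))
            cols.append(np.tile(win_idx, win_size))
            vals.append(block.ravel())

    return sp.coo_matrix((np.concatenate(vals),
                          (np.concatenate(rows), np.concatenate(cols))),
                         shape=(n, n)).tocsr()      # duplicate entries are summed over windows
```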

  17. The matting affinity

  18. The matting affinity. [Figure: input image and its local color distributions]

  19. Matting and spectral segmentation: a typical affinity function versus the matting affinity function.

  20. Eigenvectors of the input image. [Figure: input image and the smallest eigenvectors of its matting Laplacian]

  21. Spectral segmentation. Fully separated classes: the binary class indicator vectors belong to the nullspace of the Laplacian matrix. General case: class indicators are approximated as linear combinations of the smallest eigenvectors.

  22. Spectral segmentation. Fully separated classes: the binary class indicator vectors are zero eigenvectors of the Laplacian matrix. General case: class indicators are approximated as linear combinations of the smallest eigenvectors, so the smallest eigenvectors recover the class indicators only up to a linear transformation.
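
Both claims can be checked on a toy graph in a few lines of numpy (everything here is illustrative).

```python
import numpy as np

# Toy graph with two fully separated groups: {0, 1, 2} and {3, 4}
W = np.zeros((5, 5))
W[0, 1] = W[1, 2] = W[0, 2] = 1.0   # edges inside group A
W[3, 4] = 1.0                       # edge inside group B
W = W + W.T
L = np.diag(W.sum(axis=1)) - W      # graph Laplacian

vals, vecs = np.linalg.eigh(L)
null = vecs[:, vals < 1e-9]         # basis of the nullspace (two zero eigenvalues)

# The binary class indicator vectors lie in the nullspace...
ind = np.stack([[1, 1, 1, 0, 0], [0, 0, 0, 1, 1]], axis=1).astype(float)
print(np.allclose(L @ ind, 0))                      # True

# ...but the eigen-solver returns that subspace only up to a linear transformation:
coeffs, *_ = np.linalg.lstsq(null, ind, rcond=None)
print(np.allclose(null @ coeffs, ind))              # True
```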

  23. From eigenvectors to matting components: recover the components as a linear transformation of the smallest eigenvectors.

  24. From eigenvectors to matting components: enforce sparsity of the matting components by minimizing Σi,k |αi^k|^γ + |1 − αi^k|^γ with γ < 1, where each component is a linear combination of the smallest eigenvectors and the components sum to 1 at every pixel.

  25. From eigenvectors to matting components: minimize the sparsity cost with Newton's method, given a good initialization.

  26. From eigenvectors to matting components: 1) Initialization: k-means clustering in the space of the smallest eigenvectors, followed by projection of the resulting hard segments onto the eigenvector span. 2) Non-linear optimization for sparse components.
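
A rough sketch of the initialization and of the sparsity objective; the Newton-type optimization itself is not reproduced, and the function names and the exact value of gamma are illustrative assumptions.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def init_components(E, K):
    """Initial matting components from the n x m matrix E of smallest eigenvectors.

    k-means in the eigenvector embedding gives hard clusters; projecting each
    cluster's indicator vector onto span(E) gives soft initial components.
    """
    _, labels = kmeans2(E, K, minit='++')
    ind = np.stack([(labels == k).astype(float) for k in range(K)], axis=1)  # n x K
    coeffs, *_ = np.linalg.lstsq(E, ind, rcond=None)   # least-squares projection onto span(E)
    return E @ coeffs                                   # n x K soft components

def sparsity_cost(comps, gamma=0.9):
    """Sparsity score to minimize; gamma < 1 pushes component values toward 0 or 1."""
    return np.sum(np.abs(comps) ** gamma + np.abs(1.0 - comps) ** gamma)
```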

  27. Extracted Matting Components

  28. Brief summary: construct the matting Laplacian, compute its smallest eigenvectors, find a linear transformation of them, and obtain the matting components.
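
Tying the stages together with the illustrative functions sketched above (all names hypothetical, sizes arbitrary):

```python
import numpy as np

img = np.random.rand(40, 40, 3)              # stand-in for a real H x W x 3 image
L = matting_laplacian(img)                   # slide 16 sketch
vals, E = smallest_eigenvectors(L, k=50)     # slide 12 sketch, applied to the matting Laplacian
comps = init_components(E, K=10)             # slide 26 sketch
# ...followed by the non-linear sparsity optimization of slides 24-25
alpha_maps = comps.reshape(img.shape[0], img.shape[1], -1)   # K per-pixel matting components
```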

  29. Grouping components (component + component + component = foreground matte).

  30. Grouping components into a complete foreground matte: • Unsupervised matting • User-guided matting.

  31. Unsupervised matting: each hypothesis is a subset of components, encoded by an indicating vector b; the candidate matte (the sum of the selected components) is scored with the matting cost function J(α) = αᵀ L α.
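
A brute-force sketch of this hypothesis search, only feasible for a small number of components; the actual method uses a cheaper approximation of the cost, which is not shown here, and all names are illustrative.

```python
import itertools
import numpy as np

def best_grouping(comps, L):
    """Score every non-trivial subset of matting components by alpha^T L alpha.

    comps : n x K matrix of matting components (columns sum to ~1 per pixel)
    L     : n x n matting Laplacian (sparse)
    """
    K = comps.shape[1]
    best = (np.inf, None, None)
    for r in range(1, K):                           # skip the empty and all-components subsets
        for subset in itertools.combinations(range(K), r):
            b = np.zeros(K)
            b[list(subset)] = 1.0                   # indicating vector over components
            alpha = comps @ b                       # candidate foreground matte
            cost = float(alpha @ (L @ alpha))
            if cost < best[0]:
                best = (cost, subset, alpha)
    return best
```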

  32. Unsupervised matting results

  33. User-guided matting, graph cut method: an energy function over the components with a unary term, a pairwise term, and scribble-constrained components.
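
A hedged sketch of the energy structure named on this slide, with exhaustive search standing in for graph cuts and with purely illustrative unary/pairwise inputs.

```python
import itertools
import numpy as np

def label_components(unary, pairwise, constrained=None):
    """Minimize a two-label energy over K components (illustrative structure only).

    unary       : K x 2 array, cost of labeling component k background (0) / foreground (1)
    pairwise    : K x K array, cost added when components k and l receive different labels
    constrained : dict {component_index: forced_label} coming from user scribbles
    For small K exhaustive search suffices; graph cuts would handle larger problems.
    """
    K = unary.shape[0]
    best = (np.inf, None)
    for labels in itertools.product((0, 1), repeat=K):
        if constrained and any(labels[k] != v for k, v in constrained.items()):
            continue                                   # respect scribble-constrained components
        e = sum(unary[k, labels[k]] for k in range(K))
        e += sum(pairwise[k, l] for k in range(K) for l in range(k + 1, K)
                 if labels[k] != labels[l])
        if e < best[0]:
            best = (e, labels)
    return best
```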

  34. Components with the scribble interface: comparison of components (our approach) with Levin et al CVPR 06, Wang & Cohen 05, Poisson matting, and Random Walk.

  35. Components with the scribble interface, second example: components (our approach) versus Levin et al CVPR 06, Wang & Cohen 05, Poisson matting, and Random Walk.

  36. Direct component-picking interface: building the foreground object by simple addition of components.

  37. Results

  38. Quantitative evaluation

  39. Spectral matting versus obtaining trimaps from a hard segmentation

  40. Limitations: number of eigenvectors. [Figure: ground-truth matte versus mattes recovered from 70 and from 400 eigenvectors]

  41. Limitations • Number of matting components

  42. Conclusion • Derived an analogy between hard spectral segmentation and image matting • Automatically extract matting components from the eigenvectors of the matting Laplacian • Automate the matte extraction process and suggest new modes of user interaction
