
Matrix Extensions to Sparse Recovery


Presentation Transcript


  1. Matrix Extensions to Sparse Recovery
     Yi Ma¹,² Allen Yang³ John Wright¹
     ¹Microsoft Research Asia ²University of Illinois at Urbana-Champaign ³University of California, Berkeley
     CVPR Tutorial, June 20, 2009

  2. FINAL TOPIC – Generalizations: sparsity to degeneracy
     The tools and phenomena underlying sparse recovery generalize very nicely to low-rank matrix recovery.

  3. FINAL TOPIC – Generalizations: sparsity to degeneracy
     The tools and phenomena underlying sparse recovery generalize very nicely to low-rank matrix recovery:
     • Matrix completion: given an incomplete subset of the entries of a low-rank matrix, fill in the missing values.
     • Robust PCA: given a low-rank matrix which has been grossly corrupted, recover the original matrix.

  4. THIS TALK – From sparse recovery to low-rank recovery
     Examples of degenerate data:
     • Face images – Degeneracy: illumination models. Errors: occlusion, corruption.
     • Relevance data – Degeneracy: user preferences co-predict. Errors: missing rankings, manipulation.
     • Video – Degeneracy: temporal, dynamic structures. Errors: anomalous events, mismatches…

  5. KEY ANALOGY – Connections between rank and sparsity
     Sparsity is to vectors what rank is to matrices: $\operatorname{rank}(A)$ counts the nonzero singular values of $A$, just as $\|x\|_0$ counts the nonzero entries of $x$, and the nuclear norm $\|A\|_* = \sum_i \sigma_i(A)$ plays the role that the $\ell^1$ norm plays for sparse vectors.

  6. KEY ANALOGY – Connections between rank and sparsity This talk: exploiting this connection for matrix completion and RPCA
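To make the analogy concrete: the rank of a matrix is the number of nonzero singular values (an $\ell^0$ count), and the nuclear norm is their sum (an $\ell^1$ norm). A minimal numpy check of this correspondence (the matrix here is a synthetic example, not data from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3)) @ rng.standard_normal((3, 8))  # rank-3 matrix
s = np.linalg.svd(A, compute_uv=False)                         # singular values

print(int(np.sum(s > 1e-9)))  # rank(A): the l0 "norm" of the singular values -> 3
print(float(np.sum(s)))       # nuclear norm ||A||_*: the l1 norm of the singular values
```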

  7. CLASSICAL PCA – Fitting degenerate data
     If degenerate observations $d_1, \dots, d_n \in \mathbb{R}^m$ are stacked as columns of a matrix $D \in \mathbb{R}^{m \times n}$, then $D$ is (approximately) low-rank: $\operatorname{rank}(D) \ll \min(m, n)$.

  8. CLASSICAL PCA – Fitting degenerate data
     If degenerate observations are stacked as columns of a matrix $D$, then $D$ is (approximately) low-rank, and we seek $\hat{A} = \arg\min_{\operatorname{rank}(A) \le r} \|D - A\|_F$.
     Principal Component Analysis via the singular value decomposition:
     • Stable, efficient computation
     • Optimal estimate of the low-rank $A$ under i.i.d. Gaussian noise
     • Fundamental statistical tool, with huge impact in vision, search, and bioinformatics

  9. CLASSICAL PCA – Fitting degenerate data
     If degenerate observations are stacked as columns of a matrix $D$, then $D$ is (approximately) low-rank, and we seek $\hat{A} = \arg\min_{\operatorname{rank}(A) \le r} \|D - A\|_F$.
     Principal Component Analysis via the singular value decomposition:
     • Stable, efficient computation
     • Optimal estimate of the low-rank $A$ under i.i.d. Gaussian noise
     • Fundamental statistical tool, with huge impact in vision, search, and bioinformatics
     But… PCA breaks down under even a single corrupted observation.
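A minimal numpy sketch of classical PCA as a truncated SVD (Eckart–Young), followed by the breakdown phenomenon; the sizes, noise level, and corruption are assumptions chosen for illustration:

```python
import numpy as np

def pca_rank_r(D, r):
    """Best rank-r approximation of D in Frobenius norm, via truncated SVD
    (Eckart-Young theorem)."""
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 100))  # rank-2 data
D = A + 0.01 * rng.standard_normal(A.shape)                       # small dense noise

print(np.linalg.norm(pca_rank_r(D, 2) - A))   # small: PCA is stable to dense noise

D[0, 0] += 1e6                                # one grossly corrupted entry...
print(np.linalg.norm(pca_rank_r(D, 2) - A))   # ...and the estimate is ruined
```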

  10. ROBUST PCA – Problem formulation
     Observation $D$, low-rank structure $A$, sparse errors $E$.
     Problem: Given $D = A + E$, recover $A$ (and $E$).
     Properties of the errors:
     • Each multivariate data sample (column) may be corrupted in some entries
     • Corruption can be arbitrarily large in magnitude (not Gaussian!)

  11. ROBUST PCA – Problem formulation
     Problem: Given $D = A + E$, recover the low-rank $A$ and the sparse $E$.
     Numerous heuristic methods in the literature:
     • Random sampling [Fischler and Bolles '81]
     • Multivariate trimming [Gnanadesikan and Kettenring '72]
     • Alternating minimization [Ke and Kanade '03]
     • Influence functions [de la Torre and Black '03]
     No polynomial-time algorithm with strong performance guarantees!

  12. ROBUST PCA – Semidefinite programming formulation
     Seek the lowest-rank $A$ that agrees with the data up to some sparse error:
     $$\min_{A, E} \ \operatorname{rank}(A) + \gamma \|E\|_0 \quad \text{subject to} \quad A + E = D.$$

  13. ROBUST PCA – Semidefinite programming formulation
     Seek the lowest-rank $A$ that agrees with the data up to some sparse error:
     $$\min_{A, E} \ \operatorname{rank}(A) + \gamma \|E\|_0 \quad \text{subject to} \quad A + E = D.$$
     Not directly tractable, so relax:
     $$\min_{A, E} \ \|A\|_* + \lambda \|E\|_1 \quad \text{subject to} \quad A + E = D.$$

  14. ROBUST PCA – Semidefinite programming formulation
     Seek the lowest-rank $A$ that agrees with the data up to some sparse error, and relax the intractable program to
     $$\min_{A, E} \ \|A\|_* + \lambda \|E\|_1 \quad \text{subject to} \quad A + E = D,$$
     where the nuclear norm $\|A\|_* = \sum_i \sigma_i(A)$ is the convex envelope of $\operatorname{rank}(A)$ over the operator-norm ball, and $\|E\|_1 = \sum_{ij} |E_{ij}|$ is the convex envelope of $\|E\|_0$ over the $\ell^\infty$ ball. The result is a semidefinite program, solvable in polynomial time.
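The relaxed program can be prototyped directly in an off-the-shelf convex solver. A minimal cvxpy sketch on synthetic data; the weight $\lambda = 1/\sqrt{\max(m,n)}$ is borrowed from the later RPCA literature and, like the problem sizes, is an assumption rather than the tutorial's setting:

```python
import cvxpy as cp
import numpy as np

m, n, r = 20, 20, 2
rng = np.random.default_rng(0)
A0 = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # low-rank part
E0 = np.zeros((m, n))
support = rng.random((m, n)) < 0.05                             # sparse support
E0[support] = 10 * rng.standard_normal(support.sum())           # gross errors
D = A0 + E0

A, E = cp.Variable((m, n)), cp.Variable((m, n))
lam = 1 / np.sqrt(max(m, n))  # assumed weight on the l1 term
prob = cp.Problem(cp.Minimize(cp.norm(A, "nuc") + lam * cp.sum(cp.abs(E))),
                  [A + E == D])
prob.solve()

print(np.linalg.norm(A.value - A0) / np.linalg.norm(A0))  # near 0 on easy instances
```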

  15. MATRIX COMPLETION – Motivation for the nuclear norm
     Related problem: we observe only a small known subset of the entries of a rank-$r$ matrix $A$. Can we exactly recover $A$?

  16. MATRIX COMPLETION – Motivation for the nuclear norm
     Related problem: recover a rank-$r$ matrix $A$ from a known subset $\Omega$ of its entries.
     Convex optimization heuristic [Candès and Recht '08]:
     $$\min_A \ \|A\|_* \quad \text{subject to} \quad A_{ij} = D_{ij} \ \text{for all} \ (i, j) \in \Omega.$$
     For incoherent $A$, exact recovery from on the order of $m r\,\mathrm{polylog}(m)$ sampled entries [Candès and Tao '09]. Spectral trimming also succeeds with $O(mr)$ entries for small $r$ [Keshavan, Montanari and Oh '09].
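The completion heuristic admits an equally short sketch, again in cvxpy with made-up sizes; the elementwise mask keeps the observed-entry constraint simple:

```python
import cvxpy as cp
import numpy as np

m, r = 30, 2
rng = np.random.default_rng(1)
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, m))  # rank-r target
mask = (rng.random((m, m)) < 0.5).astype(float)                # 1 = observed entry

A = cp.Variable((m, m))
prob = cp.Problem(cp.Minimize(cp.norm(A, "nuc")),
                  [cp.multiply(mask, A) == mask * M])          # agree on observed entries
prob.solve()

print(np.linalg.norm(A.value - M) / np.linalg.norm(M))  # small when recovery succeeds
```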

  17. ROBUST PCA – Exact recovery?
     CONJECTURE: If $D = A_0 + E_0$ with $A_0$ sufficiently low-rank and $E_0$ sufficiently sparse, then solving the relaxed program exactly recovers $(A_0, E_0)$.
     [Figure: empirical probability of correct recovery as the rank (one axis) and the sparsity of the error (other axis) vary; a large region exhibits perfect recovery.]

  18. ROBUST PCA – Which matrices and which errors?
     Fundamental ambiguity – very sparse matrices are also low-rank. For example, for $D = e_1 e_1^{\mathsf{T}}$: decompose as $A = e_1 e_1^{\mathsf{T}},\, E = 0$ (rank-1, 0-sparse error) or as $A = 0,\, E = e_1 e_1^{\mathsf{T}}$ (rank-0, 1-sparse error)?
     Obviously we can only hope to uniquely recover matrices $A$ that are incoherent with the standard basis. Can we recover almost all low-rank matrices from almost all sparse errors?

  19. ROBUST PCA – Which matrices and which errors?
     Random orthogonal model (of rank $r$) [Candès & Recht '08]: $A = U \Sigma V^{\mathsf{T}}$, with $U$ and $V$ independent samples from the invariant measure on the Stiefel manifold of rank-$r$ orthobases; the singular values are arbitrary.

  20. ROBUST PCA – Which matrices and which errors?
     Random orthogonal model (of rank $r$) [Candès & Recht '08]: $A = U \Sigma V^{\mathsf{T}}$, with $U$ and $V$ independent samples from the invariant measure on the Stiefel manifold of rank-$r$ orthobases; the singular values are arbitrary.
     Bernoulli error signs-and-support (with parameter $\rho_s$): each entry of $E$ is independently nonzero with probability $\rho_s$, with equally likely signs; the magnitude of each nonzero entry of $E$ is arbitrary.
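A small sampler for the Bernoulli signs-and-support model described above; the function name and interface are mine, not the tutorial's:

```python
import numpy as np

def bernoulli_signs_and_support(m, n, rho_s, rng):
    """Each entry is +1 or -1 with probability rho_s/2 each, and 0 with
    probability 1 - rho_s; magnitudes may then be chosen arbitrarily."""
    u = rng.random((m, n))
    return np.where(u < rho_s / 2, 1.0,
                    np.where(u < rho_s, -1.0, 0.0))

signs = bernoulli_signs_and_support(100, 100, 0.1, np.random.default_rng(0))
E = signs * 50.0  # arbitrary (here constant) magnitudes on the support
```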

  21. MAIN RESULT – Exact Solution of Robust PCA
     “Convex optimization recovers almost any sufficiently low-rank matrix from errors affecting a fixed fraction of the observations!”

  22. BONUS RESULT – Matrix completion in proportional growth
     “Convex optimization exactly recovers matrices of rank proportional to the dimension, even with a constant fraction of the entries missing!”

  23. MATRIX COMPLETION – Contrast with literature
     • [Candès and Tao 2009]: correct completion w.h.p. from on the order of $m r\,\mathrm{polylog}(m)$ observed entries; does not apply to the large-rank case.
     • This work: correct completion w.h.p. even with rank proportional to $m$ and a constant fraction of entries missing.
     Proof exploits rich regularity and independence in the random orthogonal model.
     Caveats:
     • [C-T '09] is tighter for small $r$.
     • [C-T '09] generalizes better to other matrix ensembles.

  24. MAIN RESULT – Exact Solution of Robust PCA
     “Convex optimization recovers almost any sufficiently low-rank matrix from errors affecting a fixed fraction of the observations!”

  25. ROBUST PCA – Solving the convex program
     A semidefinite program in millions of unknowns – off-the-shelf interior-point solvers do not scale.
     Scalable solution: apply a first-order method with $O(1/k^2)$ convergence, built from a sequence of separable quadratic approximations [Nesterov; Beck & Teboulle]. Each iteration is solved in closed form via soft thresholding (for $E$) and singular value thresholding (for $A$).
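A plain, unaccelerated proximal-gradient sketch of these two closed-form steps. The talk's algorithm adds Nesterov-style momentum and continuation on the penalty parameter $\mu$, both omitted here; the penalized objective $\mu(\|A\|_* + \lambda \|E\|_1) + \tfrac{1}{2}\|D - A - E\|_F^2$ and all parameter defaults are assumptions for illustration:

```python
import numpy as np

def shrink(X, tau):
    """Soft thresholding: the prox operator of tau * (l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    """Singular value thresholding: the prox operator of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * shrink(s, tau)) @ Vt

def rpca_prox_grad(D, lam, mu=1e-2, n_iter=500):
    """Minimize mu*(||A||_* + lam*||E||_1) + 0.5*||D - A - E||_F^2."""
    A = np.zeros_like(D)
    E = np.zeros_like(D)
    L = 2.0  # Lipschitz constant of the smooth term's gradient in (A, E)
    for _ in range(n_iter):
        G = D - A - E                        # negative gradient for both blocks
        A = svt(A + G / L, mu / L)           # singular value thresholding step
        E = shrink(E + G / L, mu * lam / L)  # soft thresholding step
    return A, E
```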

  26. ROBUST PCA – Solving the convex program
     • Iteration complexity $O(1/\sqrt{\varepsilon})$ to reach an $\varepsilon$-suboptimal solution.
     • Dramatic practical gains from continuation (gradually decreasing the penalty parameter across iterations).

  27. SIMULATION – Recovery in various growth scenarios
     Correct recovery with the rank and error fractions fixed and the dimension $m$ increasing. Empirically, an almost constant number of iterations: provably robust PCA at only a constant factor more computation than conventional PCA.

  28. SIMULATION – Phase Transition in Rank and Sparsity
     [Figure: fraction of successes as the rank and error fractions vary. Left pair of panels: $[0,1] \times [0,1]$ and zoom to $[0, 0.4] \times [0, 0.4]$ (10 trials). Right pair: $[0,1] \times [0,1]$ and zoom to $[0, 0.5] \times [0, 0.5]$ (65 trials).]

  29. EXAMPLE – Background modeling from video
     Static camera surveillance video: 200 frames, 72 × 88 pixels, significant foreground motion.
     [Panels: video | low-rank approximation | sparse error]

  30. EXAMPLE – Background modeling from video
     Static camera surveillance video: 550 frames, 64 × 80 pixels, significant illumination variation.
     [Panels: video | low-rank approximation (background variation) | sparse error (anomalous activity)]

  31. EXAMPLE – Faces under varying illumination
     29 images of one person under varying lighting.
     [Images: input faces → RPCA → low-rank and sparse components]

  32. EXAMPLE – Faces under varying illumination
     29 images of one person under varying lighting; RPCA isolates specularities and self-shadowing in the sparse error term.
     [Images: input faces → RPCA → low-rank component + sparse errors (specularity, self-shadowing)]

  33. EXAMPLE – Face tracking and alignment
     Initial alignment, inappropriate for recognition. [Slides 34–42 step through the alignment animation frame by frame.]


  43. EXAMPLE – Face tracking and alignment Final result: per-pixel alignment


  45. SIMULATION – Phase Transition in Rank and Sparsity
     [Figure repeated from slide 28: fraction of successes as the rank and error fractions vary, over the full unit square and zoomed regions.]

  46. CONJECTURES – Phase Transition in Rank and Sparsity
     Hypothesized breakdown behavior as $m \to \infty$. [Figure: phase-transition diagram over the unit square of rank and error fractions.]

  47. CONJECTURES – Phase Transition in Rank and Sparsity
     What we know so far. [Figure: the same diagram, with regions labeled “This work” and “Classical PCA”.]

  48. CONJECTURES – Phase Transition in Rank and Sparsity
     CONJECTURE I: convex programming succeeds in proportional growth, i.e., with the rank and the number of corrupted entries growing as fixed fractions of the dimension. [Figure: conjectured success region in the phase plane.]

  49. CONJECTURES – Phase Transition in Rank and Sparsity
     CONJECTURE II: for small rank fractions, any fraction of errors can eventually be corrected. Similar to Dense Error Correction via $\ell^1$ Minimization [Wright and Ma '08].

  50. CONJECTURES – Phase Transition in Rank and Sparsity
     CONJECTURE III: for any rank fraction, there exists a nonzero fraction of errors that can eventually be corrected with high probability.
