
Introduction to several works and Some Ideas




  1. Introduction to several works and Some Ideas Songcan Chen 2012.9.4

  2. Outline • Introduction to several works • Some ideas motivated by these works

  3. Introduction to Several works • A Least-Squares Framework for Component Analysis (CA) [1] • On the convexity of log-sum-exp functions with positive definite matrices [2]

  4. Some Ideas • Motivated by CA framework [1] • Motivated by Log-Sum-Exp [2] • Motivated by Sparsity Aware [3-4]

  5. CA framework

  6. Proposes a unified least-squares framework, called least-squares weighted kernel reduced rank regression (LS-WKRRR), to formulate many CA methods; as a result, PCA, LDA, CCA, SC, LE, and their kernel versions become its special cases. • LS-WKRRR's benefits: (1) provides a clean connection between many CA techniques; (2) yields efficient numerical schemes to solve CA techniques; (3) overcomes the small sample size (SSS) problem; (4) provides a framework to easily extend CA methods, for example to weighted generalizations of PCA, LDA, SC, and CCA, and to several new CA techniques.

  7. The LS-WKRRR problem minimizes the following expression:
  E(A, B) = ||W_r (Γ − B Aᵀ Υ) W_c||²_F (1)
  Factors: A, B (the low-rank regression factors). Weights: W_r (rows), W_c (columns). Data: Γ (outputs), Υ (inputs, possibly kernel-mapped); cf. the noise model Γ = B Aᵀ Υ + E + O on slide 37.

  8. Solutions to A and B: setting the gradients of (1) with respect to A and B to zero couples the two factors; eliminating one of them reduces the problem to a generalized eigenvalue problem (GEP). See [1] for the explicit GEP matrices.

  9. Computational Aspects • Subspace Iteration • Alternated Least Squares (ALS) • Gradient Descent and Second-Order Methods. Important to note: both the ALS and the gradient-based algorithms effectively solve the SSS problem, unlike methods that directly solve the GEP. A minimal ALS sketch is given below.
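A minimal NumPy sketch of the ALS idea for (1) in the unweighted case (W_r = W_c = I); the function name `als_wkrrr`, the ridge term, and the iteration count are illustrative choices, not from [1]:

```python
import numpy as np

def als_wkrrr(Gamma, Upsilon, k, n_iter=100, ridge=1e-8):
    """Alternating least squares for min ||Gamma - B @ A.T @ Upsilon||_F^2.

    Unweighted sketch (W_r = W_c = I); the small ridge term keeps the
    linear solves stable when covariances are rank-deficient.
    """
    dx, n = Upsilon.shape
    rng = np.random.default_rng(0)
    A = rng.standard_normal((dx, k))
    for _ in range(n_iter):
        Z = A.T @ Upsilon                      # k x n latent codes
        # B-step: regress Gamma on the codes Z
        B = Gamma @ Z.T @ np.linalg.inv(Z @ Z.T + ridge * np.eye(k))
        # A-step: find codes that best reconstruct Gamma through B,
        # then regress those codes on Upsilon
        BtB = B.T @ B + ridge * np.eye(k)
        T = np.linalg.solve(BtB, B.T @ Gamma)  # k x n target codes
        A = np.linalg.solve(Upsilon @ Upsilon.T + ridge * np.eye(dx),
                            Upsilon @ T.T)
    return A, B
```

Each half-step is an ordinary least-squares solve, which is why ALS never forms the possibly singular GEP matrices and thus sidesteps the SSS problem.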

  10. PCA, KPCA AND WEIGHTED EXTENSIONS • PCA: that is, in (1), set Γ = D (the data), Υ = I, and W_r = W_c = I, giving min_{A,B} ||D − B Aᵀ||²_F; or the alternative formulation min_B ||D − B Bᵀ D||²_F with BᵀB = I.
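For context, the classical Eckart–Young argument (a standard fact, added here; not on the original slide) shows why this rank-k least-squares problem recovers the principal subspace:

```latex
% Rank-k least squares and the SVD: with D = U \Sigma V^{\top},
\min_{\operatorname{rank}(BA^{\top}) \le k} \|D - BA^{\top}\|_F^2
  \;=\; \sum_{j > k} \sigma_j^2(D),
% attained at B = U_k (the top-k left singular vectors, i.e. the
% principal directions) and A^{\top} = \Sigma_k V_k^{\top} (the scores).
```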

  11. KPCA & WEIGHTED EXTENSIONS • KPCA: replace the inputs by their kernel mapping, so that only the kernel matrix K = ΦᵀΦ is needed; • Weighted PCA: keep the PCA setting but choose nontrivial weights W_r, W_c in (1). See [1] for the exact matrices. A KPCA sketch follows.
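A minimal KPCA sketch following the standard recipe (center the kernel matrix, then eigendecompose) rather than the LS-WKRRR derivation in [1]; the RBF kernel and `gamma` are illustrative assumptions:

```python
import numpy as np

def kpca(X, k, gamma=1.0):
    """Kernel PCA sketch: RBF kernel, double centering, eigendecomposition."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    Kc = J @ K @ J                        # kernel of centered features
    w, V = np.linalg.eigh(Kc)             # eigenvalues in ascending order
    w, V = w[::-1][:k], V[:, ::-1][:, :k]
    # projections of the training points onto the top-k kernel components
    return V * np.sqrt(np.maximum(w, 0.0))
```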

  12. LDA, KLDA and Weighted Extensions • LDA: in (1), set the outputs to the label matrix G and the inputs to the data, with the weights given in [1]. G is the label matrix using one-of-c encoding for c classes! A small construction of G and of the classical LDA directions follows.
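A small sketch of the one-of-c label matrix G and the classical LDA directions via the generalized eigenproblem S_b w = λ S_w w (the standard route; [1] reaches the same directions from (1)); the 1e-8 regularizer is an illustrative stability choice:

```python
import numpy as np
from scipy.linalg import eigh

def lda_directions(X, y, k):
    """Classical LDA: build G (one-of-c), form scatter matrices, solve GEP."""
    n, d = X.shape
    classes = np.unique(y)
    G = (y[:, None] == classes[None, :]).astype(float)  # n x c one-of-c
    mu = X.mean(axis=0)
    Sw = np.zeros((d, d))   # within-class scatter
    Sb = np.zeros((d, d))   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)
    # generalized symmetric eigenproblem; small ridge keeps S_w definite
    w, W = eigh(Sb, Sw + 1e-8 * np.eye(d))
    return G, W[:, ::-1][:, :k]   # top-k discriminant directions
```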

  13. CCA, KCCA and Weighted Extensions • CCA: in (1), set one view as the outputs and the other as the inputs, with whitening weight matrices (see [1] for the exact choice of data and weight matrices).
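For reference, the classical CCA objective and its GEP (standard material, added for completeness; [1] shows how (1) specializes to it):

```latex
% Classical CCA: maximally correlated projections of the two views,
\max_{a,b}\; a^{\top}\Sigma_{xy}\, b
\quad \text{s.t.}\quad a^{\top}\Sigma_{xx}\, a = b^{\top}\Sigma_{yy}\, b = 1,
% which leads to the generalized eigenvalue problem
\begin{pmatrix} 0 & \Sigma_{xy} \\ \Sigma_{yx} & 0 \end{pmatrix}
\begin{pmatrix} a \\ b \end{pmatrix}
= \lambda
\begin{pmatrix} \Sigma_{xx} & 0 \\ 0 & \Sigma_{yy} \end{pmatrix}
\begin{pmatrix} a \\ b \end{pmatrix}.
```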

  14. The relations to LLE, LE etc. • Please refer to [1]

  15. On the convexity of log-sum-exp functions with positive definite (PD) matrices [2]

  16. Log-Sum-Exp (LSE) function • One of the fundamental functions in convex analysis is the LSE function, whose convexity is the core ingredient in the methodology of geometric programming (GP), which has made considerable impact in different fields, e.g., power control in communication theory! • This paper extends these results and considers the convexity of the log-determinant of a sum of rank-one PD matrices with scalar exponential weights!

  17. LSE function (convex): f(x) = log Σ_{i=1}^{n} exp(x_i), x ∈ Rⁿ. A numerically stable implementation is sketched below.
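A numerically stable LSE implementation with the usual max-shift trick, plus a quick midpoint-convexity check; this is a generic sketch, not code from [2]:

```python
import numpy as np

def logsumexp(x):
    """Stable LSE: log(sum(exp(x))) via the max-shift trick, so that
    large entries of x do not overflow the exponential."""
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

# convexity check on a random segment: f((x+y)/2) <= (f(x)+f(y))/2
rng = np.random.default_rng(1)
x, y = rng.standard_normal(5), rng.standard_normal(5)
assert logsumexp(0.5 * (x + y)) <= 0.5 * (logsumexp(x) + logsumexp(y)) + 1e-12
```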

  18. Extending convexity of vector functions to matrix variables for PD. A general convexity definition: f is convex iff f((1−λ) q_0 + λ q_1) ≤ (1−λ) f(q_0) + λ f(q_1) for all λ ∈ [0, 1], between any two points q_0 and q_1 in the domain.

  19. Several Definitions: the central object is the matrix analogue of LSE, F(q) = log det(Σ_i exp(q_i) a_i a_iᵀ), the log-determinant of a sum of rank-one terms with scalar exponential weights, shown to be convex in q (see [2] for the precise definitions).

  20. More generally, the convexity result extends from rank-one terms to weighted combinations of PD matrices; see [2] for the general statement.

  21. Applications • Robust covariance estimation • Kronecker structured covariance estimation • Hybrid Robust Kronecker model

  22. Robust covariance estimation. Assume: x_i ~ N(0, q_i Σ), i = 1, …, n, with unknown positive scales q_i (a scaled Gaussian model). The ML objective: minimize over Σ ≻ 0 and q_i > 0 the sum Σ_i [ log det(q_i Σ) + x_iᵀ Σ⁻¹ x_i / q_i ].

  23. The objective is convex in 1/q_i and its minimizers are q_i = x_iᵀ Σ⁻¹ x_i / p, where p is the dimension of x_i. Plugging this solution back into the objective results in the concentrated problem (37): minimize over Σ ≻ 0 the function n log det Σ + p Σ_i log(x_iᵀ Σ⁻¹ x_i), up to additive constants. A key lemma (Lemma 4 of [2]) handles the minimization of such log-det objectives.

  24. Applying this lemma to (37) yields a closed-form partial minimizer; plugging it back into the objective yields the concentrated form in [2]. The resulting estimator satisfies the Tyler-type fixed-point condition Σ = (p/n) Σ_i x_i x_iᵀ / (x_iᵀ Σ⁻¹ x_i); a sketch follows.
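A minimal sketch of the resulting Tyler-type fixed-point iteration; the trace normalization (the scatter matrix is identifiable only up to scale) and the tolerance are illustrative choices:

```python
import numpy as np

def tyler_cov(X, n_iter=200, tol=1e-8):
    """Tyler-type fixed point for the robust scatter matrix:
    Sigma = (p/n) * sum_i x_i x_i^T / (x_i^T Sigma^{-1} x_i)."""
    n, p = X.shape
    Sigma = np.eye(p)
    for _ in range(n_iter):
        Sinv = np.linalg.inv(Sigma)
        d = np.einsum('ij,jk,ik->i', X, Sinv, X)   # x_i^T Sigma^{-1} x_i
        S_new = (p / n) * (X.T * (1.0 / d)) @ X     # weighted outer products
        S_new *= p / np.trace(S_new)                # fix the scale ambiguity
        done = np.linalg.norm(S_new - Sigma, 'fro') < tol
        Sigma = S_new
        if done:
            break
    return Sigma
```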

  25. To avoid ill-conditioning, regularize (37) and minimize the penalized objective given in [2].

  26. Other priors can be added if available: 1) bounded peak values; 2) bounded second moment; 3) smoothness; 4) sparsity.

  27. Kronecker structured covariance estimation. The basic Kronecker model is Σ = A ⊗ B with PD factors A and B. The ML objective (58): minimize over A, B ≻ 0 the function n log det(A ⊗ B) + Σ_i x_iᵀ (A ⊗ B)⁻¹ x_i.

  28. Use the identities det(A ⊗ B) = (det A)^{d_B} (det B)^{d_A} and (A ⊗ B)⁻¹ = A⁻¹ ⊗ B⁻¹. The problem (58) then turns into coupled subproblems in A and B that can be solved by alternating (flip-flop) updates; a sketch follows.
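A minimal flip-flop sketch for the Kronecker MLE, assuming matrix samples X_i of size p × q with vec(X_i) ~ N(0, A ⊗ B) under column-stacking vec, so that A is q × q (column covariance) and B is p × p (row covariance); this is the standard alternating scheme, and the conventions in [2] may differ:

```python
import numpy as np

def kronecker_mle(Xs, n_iter=50):
    """Flip-flop updates for Sigma = A kron B from matrix samples X_i.

    Note: A and B are only identifiable up to a reciprocal scaling."""
    n = len(Xs)
    p, q = Xs[0].shape
    A, B = np.eye(q), np.eye(p)
    for _ in range(n_iter):
        Binv = np.linalg.inv(B)
        A = sum(X.T @ Binv @ X for X in Xs) / (n * p)   # q x q update
        Ainv = np.linalg.inv(A)
        B = sum(X @ Ainv @ X.T for X in Xs) / (n * q)   # p x p update
    return A, B
```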

  29. Hybrid Robust Kronecker Model. The ML objective combines the scaled (robust) model with the Kronecker structure. Solving for Σ ≻ 0 again via Lemma 4 yields the concentrated problem (73).

  30. The problem (73) reduces to (75); solve (75) using the fixed-point iteration. An arbitrary PD matrix can be used as the initial iterate.

  31. Some Ideas • Motivated by CA framework [1] • Motivated by Log-Sum-Exp [2] • Motivated by Sparsity Aware [3][4]

  32. Motivated by CA framework [1] • Recall the LS-WKRRR objective (1) and its model Γ ≈ B Aᵀ Υ.

  33. … …

  34. Motivated by Log-Sum-Exp [2] 1) Metric Learning (ML): ML&CL, relative-distance constraints, LMNN-like, … 2) Classification learning: predictive function f(X) = tr(WᵀX) + b; the objective is an LSE-type (hence convex) loss, e.g. the instance below.
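One standard instance of such an LSE-type objective (shown for illustration; not necessarily the exact objective on the original slide) is the logistic loss, a log-sum-exp of affine functions of (W, b) and therefore convex:

```latex
% Logistic loss as an LSE of affine functions, hence convex in (W, b):
\min_{W,b}\; \sum_{i=1}^{n}
  \log\!\Big(1 + \exp\big(-y_i\,(\operatorname{tr}(W^{\top}X_i) + b)\big)\Big)
\;=\; \sum_{i=1}^{n} \operatorname{LSE}\big(0,\; -y_i f(X_i)\big).
```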

  35. ML across heterogeneous domains, 2 lines: 1) Line 1: …; 2) Line 2 (for ML&CL): symmetry and PSD. With an indefinite measure ({U_i} is the base and {α_i} is sparsified), the 2 lines can be unified into a common indefinite ML!

  36. Motivated by Sparsity Aware [3][4] • Noise model: x_ci = m_c + U_c y_ci + e_ci + o_ci, where c indexes the class or cluster, e_ci is noise, and o_ci is an outlier term with ||o_ci|| ≠ 0 if x_ci is an outlier and 0 otherwise. • Discuss: 1) U_c = 0, o_ci = 0: e_ci ~ N(0, dI) → means; ~ Lap(0, dI) → medians; other priors → other statistics. 2) U_c ≠ 0, o_ci = 0: e_ci ~ N(0, dI) → PCA; ~ Lap(0, dI) → L1-PCA; other priors → other PCAs.

  37. 3) U_c = 0, o_ci ≠ 0: e_ci ~ N(0, dI) → robust (k-)means; ~ Lap(0, dI) → robust (k-)medians. 4) Subspace, U_c ≠ 0, o_ci ≠ 0: e_ci ~ N(0, dI) → robust k-subspaces. 5) m_c = 0 …… 6) Robust (semi-)NMF …… 7) Robust CA ……, where the noise model is Γ = B Aᵀ Υ + E + O. A sketch of case 3) follows.
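A minimal sketch of case 3), robust k-means via outlier-sparsity regularization in the spirit of [3][4]; the block-coordinate scheme and the choice of λ are illustrative, not the exact algorithm of [4]:

```python
import numpy as np

def robust_kmeans(X, k, lam=2.0, n_iter=50, seed=0):
    """Robust k-means with outlier-sparsity regularization:
    min_{M, O} sum_i ||x_i - m_{c(i)} - o_i||^2 + lam * sum_i ||o_i||_2.
    Block-coordinate descent: assign, update means, group soft-threshold O."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    M = X[rng.choice(n, k, replace=False)]      # initial means
    O = np.zeros((n, d))                        # per-point outlier vectors
    for _ in range(n_iter):
        R = X - O                               # outlier-corrected data
        # 1) assignment step on corrected data
        dist = ((R[:, None, :] - M[None, :, :]) ** 2).sum(-1)
        c = dist.argmin(1)
        # 2) mean update on corrected data (skip empty clusters)
        for j in range(k):
            if np.any(c == j):
                M[j] = R[c == j].mean(0)
        # 3) outlier update: group soft-thresholding of the residuals
        E = X - M[c]
        norms = np.linalg.norm(E, axis=1, keepdims=True)
        shrink = np.maximum(1 - lam / (2 * np.clip(norms, 1e-12, None)), 0)
        O = shrink * E
    outliers = np.linalg.norm(O, axis=1) > 0    # points with nonzero o_i
    return M, c, outliers
```

Group soft-thresholding zeroes o_i for points that fit their cluster, so the support of O directly flags the outliers; this is the sparsity-aware mechanism of [3][4].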

  38. References
  [1] Fernando De la Torre. A Least-Squares Framework for Component Analysis. IEEE TPAMI, 34(6), 2012: 1041-1055.
  [2] Ami Wiesel. On the convexity of log-sum-exp functions with positive definite matrices. Available at http://www.cs.huji.ac.il/~amiw/
  [3] Gonzalo Mateos and Georgios B. Giannakis. Robust PCA as Bilinear Decomposition with Outlier-Sparsity Regularization. Available at the homepage of Georgios B. Giannakis.
  [4] Pedro A. Forero, Vassilis Kekatos and Georgios B. Giannakis. Robust Clustering Using Outlier-Sparsity Regularization. Available at the homepage of Georgios B. Giannakis.

  39. Thanks! Q&A
