
Half-Year Work Summary


Presentation Transcript


  1. Half-Year Work Summary. Presenter: 吕小惠. August 25, 2011

  2. Outline. Studied the convergence proofs for Non-negative Matrix Factorization; studied the Sparse Non-negative Matrix Factorization algorithm; studied basic linear-algebra material such as subspaces; studied the theory of Generalized PCA proposed by Yi Ma et al.
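
The slides do not reproduce the algorithms named here, but the first two items most likely refer to the multiplicative-update algorithm of Lee and Seung and its convergence proof. The sketch below is a minimal illustration of those updates, not material from the report; the matrix names V, W, H and the rank r are arbitrary.

    import numpy as np

    def nmf(V, r, iters=200, eps=1e-9):
        # Lee-Seung multiplicative updates for V ~ W @ H with V, W, H >= 0.
        m, n = V.shape
        rng = np.random.default_rng(0)
        W = rng.random((m, r))
        H = rng.random((r, n))
        for _ in range(iters):
            # Each update keeps the factors non-negative; the convergence
            # proof shows ||V - W H||_F^2 is non-increasing under them.
            H *= (W.T @ V) / (W.T @ W @ H + eps)
            W *= (V @ H.T) / (W @ H @ H.T + eps)
        return W, H

    V = np.abs(np.random.default_rng(1).standard_normal((20, 30)))
    W, H = nmf(V, r=5)
    print(np.linalg.norm(V - W @ H))   # reconstruction error after the updates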

  3. GPCA Paper Information. Generalized Principal Component Analysis (GPCA). René Vidal, Member, IEEE; Yi Ma, Member, IEEE; Shankar Sastry, Fellow, IEEE. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 27, No. 12, pp. 1945-1959, December 2005

  4. GPCA finds basis vectors from the data

  5. GPCA Abstract This paper presents an algebro-geometric solution to the problem of segmenting an unknown number of subspaces of unknown and varying dimensions from sample data points.

  6. GPCA Abstract We represent the subspaces with a set of homogeneous polynomials whose degree is the number of subspaces and whose derivatives at a data point give normal vectors to the subspace passing through that point.
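
As a concrete illustration of this slide (my own example, not from the paper): for two planes through the origin in R^3 with unit normals b1 and b2, the degree-2 polynomial p(x) = (b1.x)(b2.x) vanishes exactly on their union, and its gradient at a point of one plane is parallel to that plane's normal. A minimal numpy sketch with arbitrarily chosen normals and test point:

    import numpy as np

    b1 = np.array([1.0, 0.0, 0.0])                 # normal of plane 1 (x1 = 0)
    b2 = np.array([0.0, 1.0, 1.0]) / np.sqrt(2.0)  # normal of plane 2

    def p(x):
        # degree-2 homogeneous polynomial that vanishes on the union of the planes
        return (b1 @ x) * (b2 @ x)

    def grad_p(x):
        # product rule: grad p(x) = b1 (b2.x) + b2 (b1.x)
        return b1 * (b2 @ x) + b2 * (b1 @ x)

    x = np.array([0.0, 2.0, -1.0])                 # a point lying on plane 1
    print(p(x))                                    # ~0: x is on the union
    print(grad_p(x) / np.linalg.norm(grad_p(x)))   # parallel to b1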

  7. GPCA Abstract When the number of subspaces is known, we show that these polynomials can be estimated linearly from data; hence subspace segmentation is reduced to classifying one point per subspace.
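
A sketch of what "estimated linearly" amounts to, on synthetic data for two planes in R^3 (n = 2 subspaces, D = 3): every sample x gives one linear constraint c . v(x) = 0 on the coefficient vector c, where v is the degree-2 Veronese (monomial) embedding, so c can be read off the null space of the embedded data matrix. The data, normals, and helper names below are my own, not the paper's notation:

    import numpy as np

    rng = np.random.default_rng(0)

    def veronese2(x):
        # all degree-2 monomials of x = (x1, x2, x3)
        x1, x2, x3 = x
        return np.array([x1*x1, x1*x2, x1*x3, x2*x2, x2*x3, x3*x3])

    def sample_plane(b, k):
        # k random points on the plane through the origin with unit normal b
        pts = rng.standard_normal((k, 3))
        return pts - np.outer(pts @ b, b)

    b1 = np.array([1.0, 0.0, 0.0])
    b2 = np.array([0.0, 1.0, 1.0]) / np.sqrt(2.0)
    X = np.vstack([sample_plane(b1, 50), sample_plane(b2, 50)])

    L = np.array([veronese2(x) for x in X])   # one linear constraint per point
    _, _, Vt = np.linalg.svd(L)
    c = Vt[-1]                                # null-space vector = polynomial coefficients
    print(np.abs(L @ c).max())                # ~0: the polynomial vanishes on all samples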

  8. GPCA Abstract We select these points optimally from the data set by minimizing a certain distance function, thus dealing automatically with moderate noise in the data. A basis for the complement of each subspace is then recovered by applying standard PCA to the collection of derivatives (normal vectors).
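
Continuing the synthetic two-plane example from the previous sketch (it reuses X, L, c, b1, b2 and the numpy import): one representative data point is picked and the gradient of the fitted polynomial at that point gives the plane's normal, i.e. a basis for its 1-D orthogonal complement; for higher-codimension subspaces one would stack the gradients of all fitted polynomials and apply PCA. The ratio |p(x)| / ||grad p(x)|| below is only a simple first-order stand-in for the distance function the abstract refers to:

    def grad_veronese2(x):
        # gradients of the six degree-2 monomials, rows in veronese2() order
        x1, x2, x3 = x
        return np.array([[2*x1, 0, 0], [x2, x1, 0], [x3, 0, x1],
                         [0, 2*x2, 0], [0, x3, x2], [0, 0, 2*x3]])

    def grad_p(x):
        # gradient of the fitted polynomial p_c(x) = c . veronese2(x)
        return grad_veronese2(x).T @ c

    grads = np.array([grad_p(x) for x in X])
    scores = np.abs(L @ c) / np.linalg.norm(grads, axis=1)
    y = X[np.argmin(scores)]                       # a point close to the subspace arrangement
    normal = grad_p(y) / np.linalg.norm(grad_p(y))
    print(normal)                                  # approximately +/- b1 or +/- b2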

  9. GPCA Abstract Extensions of GPCA that deal with data in a high-dimensional space and with an unknown number of subspaces are also presented. Our experiments on low-dimensional data show that GPCA outperforms existing algebraic algorithms based on polynomial factorization and provides a good initialization to iterative techniques such as K-subspaces and Expectation Maximization.

  10. GPCA Abstract We also present applications of GPCA to computer vision problems such as face clustering, temporal video segmentation, and 3-D motion segmentation from point correspondences in multiple affine views.

  11. Subspace Segmentation We consider the following alternative extension of PCA to the case of data lying in a union of subspaces, as illustrated in Figure 1 for two subspaces in R^3.

  12. Subspace Segmentation

  13. GPCA Theorem 1 (Generalized Principal Component Analysis): A union of n subspaces of R^D can be represented with a set of homogeneous polynomials of degree n in D variables. These polynomials can be estimated linearly given enough sample points in general position in the subspaces.
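
A tiny worked instance of the theorem (my own illustration, not from the slides): take n = 2 lines through the origin in R^2 (so D = 2), L_1 = {x_2 = 0} and L_2 = {x_1 = x_2}. Then

\[
p(x_1, x_2) = x_2 (x_1 - x_2) = x_1 x_2 - x_2^2
\]

is a single homogeneous polynomial of degree n = 2 in D = 2 variables that vanishes exactly on the union of L_1 and L_2. Its coefficients live in the span of the C(n+D-1, n) = 3 monomials x_1^2, x_1 x_2, x_2^2, which is why enough sample points in general position determine them by solving a linear system.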

  14. GPCA A basis for the complement of each subspace can be obtained from the derivatives of these polynomials at a point in each of the subspaces. Such points can be recursively selected via polynomial division. Therefore, the subspace segmentation problem is mathematically equivalent to fitting, differentiating, and dividing a set of homogeneous polynomials.
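
In the hyperplane case these three operations take a particularly compact form. The following summary is schematic and assumes the fitted polynomial factors into one linear form per subspace:

\[
p_n(x) = \prod_{j=1}^{n} (b_j^{\top} x), \qquad
b_n = \frac{\nabla p_n(y_n)}{\lVert \nabla p_n(y_n) \rVert}, \qquad
p_{n-1}(x) = \frac{p_n(x)}{b_n^{\top} x},
\]

where y_n is the point selected on the n-th subspace. The quotient p_{n-1} vanishes on the remaining n - 1 subspaces, so selecting a point, differentiating, and dividing can be repeated until every subspace has been assigned its normal vectors.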

  15. Thank you! Presenter: 吕小惠. August 25, 2011
