
Local Fisher Discriminant Analysis for Supervised Dimensionality Reduction


Presentation Transcript


  1. Local Fisher Discriminant Analysis for Supervised Dimensionality Reduction Masashi Sugiyama Presented by Xianwang Wang

  2. Dimensionality Reduction • Goal • Embed high-dimensional data into a low-dimensional space • Preserve intrinsic information • Example: high-dimensional data embedded into 3 dimensions

  3. Categories • Nonlinear • ISOMAP • Locally Linear Embedding (LLE) • Laplacian Eigenmap (LE) • Linear • Principal Component Analysis (PCA) • Locality-Preserving Projection (LPP) • Fisher Discriminant Analysis (FDA) • Unsupervised • ISOMAP, LLE, PCA, LPP • Supervised • S-ISOMAP, S-LLE, FDA

  4. Formulation • Number of samples: n • d-dimensional samples: x_i ∈ R^d (i = 1, …, n) • Class labels: y_i ∈ {1, 2, …, c} • Number of samples in class l: n_l • Data matrix: X = (x_1 | x_2 | … | x_n) ∈ R^{d×n} • Embedded samples: z_i ∈ R^r (1 ≤ r ≤ d)

  5. Goal for Linear Dimensionality Reduction • Find a transformation matrix T ∈ R^{d×r} and embed each sample as z_i = T^T x_i • Use Iris data for demos (http://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data) • Attribute Information: • sepal length in cm • sepal width in cm • petal length in cm • petal width in cm • class: • Iris Setosa; Iris Versicolour; Iris Virginica
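The linear embedding above is just a matrix product z_i = T^T x_i. A minimal sketch in numpy, using randomly generated placeholder data in place of the Iris CSV (which would require a network download), with the column-wise data-matrix convention from the Formulation slide:

```python
import numpy as np

# Placeholder for the Iris data: n samples in d dimensions with class labels.
# (The slide uses the real Iris CSV; random data keeps the sketch self-contained.)
rng = np.random.default_rng(0)
n, d, r = 150, 4, 2             # 150 samples, 4 features, embed into 2 dimensions
X = rng.normal(size=(d, n))     # data matrix, one sample per column (slide 4)
y = rng.integers(0, 3, size=n)  # 3 classes, like Iris

# A linear embedding is z_i = T^T x_i for a d-by-r transformation matrix T.
T = rng.normal(size=(d, r))     # here an arbitrary T; FDA/LPP/LFDA choose it
Z = T.T @ X                     # r-by-n matrix of embedded samples
print(Z.shape)                  # (2, 150)
```

The rest of the talk is about how each method picks T; only the choice of T differs between FDA, LPP, and LFDA.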

  6. FDA(1) • Mean of samples in class l: μ_l = (1/n_l) Σ_{i: y_i = l} x_i • Mean of all samples: μ = (1/n) Σ_{i=1}^{n} x_i • Within-class scatter matrix: S^(w) = Σ_{l=1}^{c} Σ_{i: y_i = l} (x_i − μ_l)(x_i − μ_l)^T • Between-class scatter matrix: S^(b) = Σ_{l=1}^{c} n_l (μ_l − μ)(μ_l − μ)^T
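These two scatter matrices can be computed directly from the definitions. A sketch (the column-wise d-by-n data matrix follows slide 4; a useful sanity check is that S^(w) + S^(b) equals the total scatter around the global mean):

```python
import numpy as np

def fda_scatter(X, y):
    """Within-class and between-class scatter matrices of FDA.

    X: d-by-n data matrix (one sample per column); y: length-n class labels.
    """
    d, n = X.shape
    mu = X.mean(axis=1, keepdims=True)          # mean of all samples
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for l in np.unique(y):
        Xl = X[:, y == l]                       # samples of class l
        nl = Xl.shape[1]                        # n_l
        mul = Xl.mean(axis=1, keepdims=True)    # class mean
        diff = Xl - mul
        Sw += diff @ diff.T                     # within-class scatter
        Sb += nl * (mul - mu) @ (mul - mu).T    # between-class scatter
    return Sw, Sb
```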

  7. FDA(2) • Maximize the objective J(T) = tr[(T^T S^(w) T)^{−1} T^T S^(b) T] • Equivalently, solve the constrained optimization problem: max_T tr(T^T S^(b) T) subject to T^T S^(w) T = I • Using the Lagrangian and applying the KKT conditions yields the generalized eigenvalue problem S^(b) φ = λ S^(w) φ; T is built from the eigenvectors with the r largest eigenvalues • Demo
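The generalized eigenproblem above can be solved with plain numpy, under the assumption that S^(w) is invertible so it reduces to the ordinary eigenproblem S^(w)⁻¹ S^(b) φ = λ φ (a dedicated generalized solver such as `scipy.linalg.eigh(Sb, Sw)` is the more robust choice). A sketch, including a small demo on two well-separated Gaussian classes:

```python
import numpy as np

def fda(X, y, r):
    """FDA transform from the generalized eigenproblem S^(b) φ = λ S^(w) φ.

    Sketch only: assumes S^(w) is invertible and solves the equivalent
    ordinary eigenproblem S^(w)^{-1} S^(b) φ = λ φ.
    """
    d, n = X.shape
    mu = X.mean(axis=1, keepdims=True)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for l in np.unique(y):
        Xl = X[:, y == l]
        nl = Xl.shape[1]
        mul = Xl.mean(axis=1, keepdims=True)
        Sw += (Xl - mul) @ (Xl - mul).T
        Sb += nl * (mul - mu) @ (mul - mu).T
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1]    # largest eigenvalues first
    return evecs[:, order[:r]].real         # d-by-r transformation matrix

# Two well-separated 2-D classes: the leading FDA direction separates them.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(0, 1, size=(2, 50)), rng.normal(5, 1, size=(2, 50))])
y = np.array([0] * 50 + [1] * 50)
T = fda(X, y, 1)
z = T.T @ X    # 1-D embedding; the two class means end up far apart
```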

  8. LPP • Minimize (1/2) Σ_{i,j} A_{ij} ‖z_i − z_j‖^2, where A_{ij} is the affinity between x_i and x_j • Equivalently, minimize tr(T^T X L X^T T) subject to T^T X D X^T T = I, with degree matrix D = diag(Σ_j A_{ij}) and graph Laplacian L = D − A • We can get T from the generalized eigenvalue problem X L X^T φ = λ X D X^T φ (eigenvectors with the r smallest eigenvalues) • Demo
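A compact LPP sketch along those lines. The Gaussian affinity A_ij = exp(−‖x_i − x_j‖²/σ²) is an assumption here (one common choice; the slide does not fix the affinity), and as before the generalized eigenproblem is reduced to an ordinary one assuming X D X^T is invertible:

```python
import numpy as np

def lpp(X, r, sigma=1.0):
    """Locality-Preserving Projection sketch.

    Minimizes (1/2) Σ_ij A_ij ||z_i - z_j||^2 subject to T^T X D X^T T = I,
    via the generalized eigenproblem X L X^T φ = λ X D X^T φ with L = D - A,
    keeping the eigenvectors with the SMALLEST eigenvalues.
    """
    d, n = X.shape
    # Pairwise squared distances, then Gaussian affinity (assumed choice).
    sq = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
    A = np.exp(-sq / sigma**2)
    D = np.diag(A.sum(axis=1))      # degree matrix
    L = D - A                       # graph Laplacian
    M1 = X @ L @ X.T
    M2 = X @ D @ X.T
    evals, evecs = np.linalg.eig(np.linalg.solve(M2, M1))
    order = np.argsort(evals.real)  # smallest eigenvalues first
    return evecs[:, order[:r]].real
```

Note the contrast with FDA: LPP keeps the *smallest* eigenvalues (it minimizes distortion of local neighborhoods), while FDA keeps the largest (it maximizes class separation).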

  9. Local Fisher Discriminant Analysis (LFDA) • FDA can perform poorly if samples in some class form several separate clusters • LPP can cause samples of different classes to overlap if they are close in the original high-dimensional space • LFDA combines the ideas of FDA and LPP

  10. LFDA(1) • Reformulating FDA in pairwise form: S^(w) = (1/2) Σ_{i,j} W^(w)_{ij} (x_i − x_j)(x_i − x_j)^T, S^(b) = (1/2) Σ_{i,j} W^(b)_{ij} (x_i − x_j)(x_i − x_j)^T • with weights W^(w)_{ij} = 1/n_l if y_i = y_j = l (0 otherwise) and W^(b)_{ij} = 1/n − 1/n_l if y_i = y_j = l (1/n otherwise)

  11. LFDA(2) • Definition of LFDA: localize the pairwise weights with an affinity A_{ij} between x_i and x_j: W̄^(w)_{ij} = A_{ij}/n_l if y_i = y_j = l (0 otherwise), W̄^(b)_{ij} = A_{ij}(1/n − 1/n_l) if y_i = y_j = l (1/n otherwise) • These define the local scatter matrices S̄^(w) and S̄^(b), so only nearby same-class pairs are pulled together
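A direct (unoptimized) sketch of the localized scatter matrices from these pairwise weights. A useful check, reflected in the test: with the trivial affinity A_ij = 1 the local weights reduce to the FDA pairwise weights of the previous slide, so S̄^(w) recovers FDA's within-class scatter exactly:

```python
import numpy as np

def lfda_scatters(X, y, A):
    """Local within/between scatter matrices of LFDA.

    S̄ = (1/2) Σ_ij W̄_ij (x_i - x_j)(x_i - x_j)^T, with the affinity-weighted
    pairwise weights:
        W̄^(w)_ij = A_ij / n_l           if y_i = y_j = l, else 0
        W̄^(b)_ij = A_ij (1/n - 1/n_l)   if y_i = y_j = l, else 1/n
    Naive O(n^2) double loop; fine for a sketch, not for large n.
    """
    d, n = X.shape
    counts = {l: int(np.sum(y == l)) for l in np.unique(y)}
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for i in range(n):
        for j in range(n):
            diff = (X[:, i] - X[:, j])[:, None]
            outer = diff @ diff.T
            if y[i] == y[j]:
                nl = counts[y[i]]
                Sw += 0.5 * (A[i, j] / nl) * outer
                Sb += 0.5 * A[i, j] * (1.0 / n - 1.0 / nl) * outer
            else:
                Sb += 0.5 * (1.0 / n) * outer
    return Sw, Sb
```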

  12. LFDA(3) • Maximize the objective tr[(T^T S̄^(w) T)^{−1} T^T S̄^(b) T] • Equivalently, max_T tr(T^T S̄^(b) T) subject to T^T S̄^(w) T = I • Similarly, we can get T from the generalized eigenvalue problem S̄^(b) φ = λ S̄^(w) φ (eigenvectors with the r largest eigenvalues) • Demo
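The final step mirrors FDA exactly, only with the localized scatter matrices. A sketch that again assumes S̄^(w) is invertible and uses plain numpy in place of a dedicated generalized eigensolver:

```python
import numpy as np

def lfda_transform(Sw_bar, Sb_bar, r):
    """LFDA transform: eigenvectors of S̄^(b) φ = λ S̄^(w) φ with the r
    largest eigenvalues (assumes S̄^(w) invertible)."""
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw_bar, Sb_bar))
    order = np.argsort(evals.real)[::-1]   # largest eigenvalues first
    return evecs[:, order[:r]].real        # d-by-r transformation matrix
```

With S̄^(w) = I the problem degenerates to an ordinary eigendecomposition of S̄^(b), which makes the eigenvector selection easy to check by hand.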

  13. Conclusion • LFDA yields more clearly separated embeddings than FDA and LPP • FDA captures global class structure, while LFDA preserves local structure within each class • The paper further discusses efficient computation of the LFDA transformation matrix and kernel LFDA

  14. Questions?
