Three Algorithms for Nonlinear Dimensionality Reduction

Haixuan Yang
Group Meeting

Jan. 11, 2005

Outline
  • Problem
  • PCA (Principal Component Analysis)
  • MDS (Multidimensional Scaling)
  • Isomap (isometric mapping)
    • A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science, 290(5500), 2319-2323, 2000.
  • LLE (locally linear embedding)
    • Nonlinear Dimensionality Reduction by Locally Linear Embedding. Science, 290(5500), 2323-2326, 2000.
  • Eigenmap
    • Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering. NIPS 2001.
Problem
  • Given a set x1, …, xk of k points in R^l, find a set of points y1, …, yk in R^m (m << l) such that yi “represents” xi as accurately as possible.

  • If the data xi lie on a hyperplane (a linear subspace) of the high-dimensional space, the traditional algorithms, such as PCA and MDS, work well.
  • However, when the data xi lie on a nonlinear manifold in the high-dimensional space, these linear-algebra techniques no longer work.
    • A nonlinear manifold can be roughly understood as a distorted hyperplane, which may be twisted, folded, or curved.
PCA (Principal Component Analysis)
  • Reduce dimensionality of data by transforming correlated variables (bands) into a smaller number of uncorrelated components
  • Reveals meaningful latent information
  • Best preserves the variance as measured in the high-dimensional input space.
  • Nonlinear structure is invisible to PCA
First, a graphical look at the problem…

[Figure: scatter plot of two correlated bands of data, with axes Band 1 and Band 2.]

Rotate axes to create two orthogonal (uncorrelated) components

[Figure: the same scatter with the axes rotated so that PC1 lies along the direction of greatest spread and PC2 is orthogonal to it; the rotated axes are “reflected” x- and y-axes of the Band 1 / Band 2 plot.]

Partitioning of Variance

[Figure: the same plot annotated with Var(PC1) and Var(PC2), showing that most of the variance falls along PC1.]

PCA: algorithm description
  • Step 1: Calculate the average x̄ of the xi.
  • Step 2: Estimate the covariance matrix by M = (1/k) Σi (xi − x̄)(xi − x̄)^T.
  • Step 3: Let λp be the p-th eigenvalue (in decreasing order) of the matrix M, and vp the corresponding eigenvector. Then set the p-th component of the d-dimensional coordinate vector yi equal to vp · (xi − x̄).
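A minimal NumPy sketch of these three steps (the function name pca and the use of numpy.linalg.eigh are illustrative choices, not from the slides):

```python
import numpy as np

def pca(X, d):
    """Project the k rows of X (points in R^l) onto the top-d principal components."""
    x_bar = X.mean(axis=0)                 # Step 1: average of the x_i
    Xc = X - x_bar                         # center the data
    M = Xc.T @ Xc / len(X)                 # Step 2: covariance matrix estimate
    eigvals, eigvecs = np.linalg.eigh(M)   # eigh returns ascending eigenvalues
    top = np.argsort(eigvals)[::-1][:d]    # Step 3: d largest eigenvalues
    V = eigvecs[:, top]                    # columns are v_1, ..., v_d
    return Xc @ V                          # p-th coord of y_i is v_p . (x_i - x_bar)
```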
MDS
  • Step 1: Given the distance d(i, j) between each pair of points i and j.
  • Step 2: From d(i, j), get the matrix M (a Gram matrix playing the role of the covariance matrix) by double centering: M = −(1/2) H S H, where S = (d(i, j)²) and H = I − (1/k)ee^T is the centering matrix.
  • Step 3: The same as PCA: take the d largest eigenvalues λp and eigenvectors vp of M, and set the p-th component of yi equal to √λp vpi.
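A sketch of classical MDS under the same notation (classical_mds is an illustrative name; D is assumed to be the k × k matrix of pairwise distances d(i, j)):

```python
import numpy as np

def classical_mds(D, d):
    """Embed points into R^d from a k x k distance matrix D."""
    k = len(D)
    H = np.eye(k) - np.ones((k, k)) / k    # centering matrix H = I - (1/k)ee^T
    M = -0.5 * H @ (D ** 2) @ H            # Step 2: double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(M)
    top = np.argsort(eigvals)[::-1][:d]    # Step 3: d largest eigenvalues
    lam = np.clip(eigvals[top], 0, None)   # guard against tiny negative values
    return eigvecs[:, top] * np.sqrt(lam)  # y_i^p = sqrt(lambda_p) * v_p^i
```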
An example of embedding a two-dimensional manifold into a three-dimensional space

[Figure: a curved two-dimensional surface sitting in three-dimensional space. The straight-line (Euclidean) distance between two points is not the true distance; the geodesic distance along the manifold is the true distance.]
Isomap: basic idea
  • Learn the global distances from the local distances.
  • The local distance, calculated as the Euclidean distance, is relatively accurate: a small patch of the nonlinear manifold looks like a plane, so within that patch the direct Euclidean distance approximates the true distance.
  • The global distance calculated as the Euclidean distance is not accurate, because the manifold is curved.
  • Best preserve the estimated distances in the embedded space, in the same way as MDS.
Isomap: algorithm description

Step 1: Construct neighborhood graph

Define the graph over all data points by connecting points i and j if they are closer than ε (ε-Isomap), or if i is one of the K nearest neighbors of j (K-Isomap). Set edge lengths equal to dX(i,j).

Step 2: Compute shortest paths

Initialize dG(i,j) = dX(i,j) if i and j are linked by an edge, and dG(i,j) = ∞ otherwise. Then compute the shortest-path distances dG(i,j) between all pairs of points in the weighted graph G. Let DG = (dG(i,j)).

Step 3: Construct d-dimensional embedding

Let λp be the p-th eigenvalue (in decreasing order) of the matrix τ(DG) = −(1/2) H S H, where S = (dG(i,j)²) and H is the centering matrix from MDS, and let vpi be the i-th component of the p-th eigenvector. Then set the p-th component of the d-dimensional coordinate vector yi equal to √λp vpi.
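A sketch of the full K-Isomap pipeline (the default n_neighbors=10 and the use of scipy.sparse.csgraph.shortest_path with Dijkstra's method are illustrative choices):

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import cdist

def isomap(X, d, n_neighbors=10):
    """K-Isomap sketch: neighborhood graph -> shortest paths -> MDS."""
    k = len(X)
    DX = cdist(X, X)                              # Euclidean distances d_X(i, j)
    # Step 1: connect each point to its K nearest neighbors
    nn = np.argsort(DX, axis=1)[:, 1:n_neighbors + 1]
    G = np.full((k, k), np.inf)                   # inf marks "no edge"
    for i in range(k):
        G[i, nn[i]] = DX[i, nn[i]]
        G[nn[i], i] = DX[i, nn[i]]                # keep the graph symmetric
    # Step 2: geodesic distance estimates d_G(i, j)
    DG = shortest_path(G, method="D")             # Dijkstra on the weighted graph
    # Step 3: classical MDS on tau(D_G)
    H = np.eye(k) - np.ones((k, k)) / k
    M = -0.5 * H @ (DG ** 2) @ H
    eigvals, eigvecs = np.linalg.eigh(M)
    top = np.argsort(eigvals)[::-1][:d]
    lam = np.clip(eigvals[top], 0, None)
    return eigvecs[:, top] * np.sqrt(lam)         # y_i^p = sqrt(lambda_p) * v_p^i
```

If the neighborhood graph is disconnected, some dG(i,j) remain infinite; in practice the embedding is computed on the largest connected component.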

LLE: basic idea
  • Learn the local linear relations from the local data.
  • The local data are relatively linear, because a small patch of the nonlinear manifold looks like a plane.
  • Globally the data are not linear, because the manifold is curved.
  • Best preserve the local linear relations in the embedded space, in a way similar to PCA.
LLE: algorithm description

Step 1: Discovering the Adjacency Information

For each xi, find its n nearest neighbors.

Step 2: Constructing the Approximation Matrix

Choose the weights Wij by minimizing ε(W) = Σi |xi − Σj Wij xj|², under the condition that Σj Wij = 1 (and Wij = 0 whenever xj is not a neighbor of xi).

Step 3: Computing the Embedding

The embedding vectors yi can be found by minimizing Φ(Y) = Σi |yi − Σj Wij yj|², with the Wij fixed from Step 2.
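A sketch of the three LLE steps (the small regularization term reg stabilizes the local Gram matrix when it is near-singular; it is a standard numerical safeguard, not something the slides specify):

```python
import numpy as np
from scipy.spatial.distance import cdist

def lle(X, d, n_neighbors=10, reg=1e-3):
    """LLE sketch: reconstruction weights W, then bottom eigenvectors of (I-W)^T(I-W)."""
    k = len(X)
    nn = np.argsort(cdist(X, X), axis=1)[:, 1:n_neighbors + 1]  # Step 1
    W = np.zeros((k, k))
    for i in range(k):
        Z = X[nn[i]] - X[i]                       # neighbors shifted to x_i
        C = Z @ Z.T                               # local Gram matrix
        C += reg * np.trace(C) * np.eye(len(C))   # regularize for stability
        w = np.linalg.solve(C, np.ones(len(C)))   # Step 2: minimize eps(W) ...
        W[i, nn[i]] = w / w.sum()                 # ... subject to sum_j W_ij = 1
    # Step 3: minimizing Phi(Y) amounts to taking the bottom eigenvectors of
    # (I - W)^T (I - W); the very first (constant) eigenvector is discarded.
    I = np.eye(k)
    M = (I - W).T @ (I - W)
    eigvals, eigvecs = np.linalg.eigh(M)          # ascending eigenvalues
    return eigvecs[:, 1:d + 1]
```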

Eigenmap: Basic Idea
  • Use the local information to determine the embedded coordinates.
  • Motivated by the way heat diffuses from one point to another.
Eigenmap

Step 1: Construct neighborhood graph

The same as Isomap.

Step 2: Compute the weights of the graph

If node i and node j are connected, put Wij = e^(−‖xi − xj‖²/t) (the heat kernel); otherwise Wij = 0.

Step 3: Construct d-dimensional embedding

Compute the eigenvalues and eigenvectors of the generalized eigenvector problem Lf = λDf, where D is the diagonal matrix with Dii = Σj Wji, and L = D − W is the Laplacian matrix.

Cont.

Let f0, …, fk−1 be the solutions of the above equation, ordered increasingly according to their eigenvalues (0 = λ0 ≤ λ1 ≤ … ≤ λk−1):

Lf0 = λ0Df0
Lf1 = λ1Df1
…
Lfk−1 = λk−1Dfk−1

Leaving out the constant solution f0, yi is then given by the i-th components of the d eigenvectors f1, …, fd.
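A sketch of the whole Eigenmap procedure (the heat-kernel parameter t=1.0 and n_neighbors=10 are illustrative defaults; scipy.linalg.eigh solves the generalized problem Lf = λDf directly):

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def laplacian_eigenmap(X, d, n_neighbors=10, t=1.0):
    """Laplacian Eigenmap sketch with heat-kernel weights."""
    k = len(X)
    DX = cdist(X, X)
    nn = np.argsort(DX, axis=1)[:, 1:n_neighbors + 1]   # Step 1: neighborhood graph
    W = np.zeros((k, k))
    for i in range(k):                                  # Step 2: heat-kernel weights
        W[i, nn[i]] = np.exp(-DX[i, nn[i]] ** 2 / t)
    W = np.maximum(W, W.T)                              # symmetrize the adjacency
    D = np.diag(W.sum(axis=1))                          # D_ii = sum_j W_ji
    L = D - W                                           # graph Laplacian
    # Step 3: generalized eigenproblem L f = lambda D f, ascending eigenvalues;
    # skip f_0 (the constant solution) and keep f_1, ..., f_d.
    eigvals, eigvecs = eigh(L, D)
    return eigvecs[:, 1:d + 1]
```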

Conclusion
  • Isomap, LLE, and Eigenmap can find the meaningful low-dimensional structure hidden in high-dimensional observations.
  • These three algorithms work well even when the data lie on a nonlinear manifold, a case where linear methods such as PCA and MDS fail.