Agenda



  1. Agenda • Dimension reduction • Principal component analysis (PCA) • Multi-dimensional scaling (MDS) • Microarray visualization

  2. Why Dimension Reduction • Computation: the complexity grows exponentially with the dimension. • Visualization: projection of high-dimensional data to 2D or 3D. • Interpretation: the intrinsic dimension may be small.

  3. 1. Dimension reduction • Principal component analysis (PCA) • Multi-dimensional scaling (MDS)

  4. Philosophy of PCA • PCA is concerned with explaining the variance-covariance structure of a set of variables through a few linear combinations. • We typically have a data matrix of n observations on p correlated variables x1, x2, …, xp. • PCA looks for a transformation of the xi into p new variables yi that are uncorrelated. • We want to represent x1, x2, …, xp with a few yi's without losing much information.

  5. PCA • Looking for a transformation of the data matrix X (n×p) such that Y = αᵀX = α1X1 + α2X2 + … + αpXp • where α = (α1, α2, …, αp)ᵀ is a column vector of weights with α1² + α2² + … + αp² = 1

  6. Maximize the variance of the projection of the observations on the Y variables • Find α so that Var(αᵀX) = αᵀ Var(X) α is maximal • Var(X) is the covariance matrix of the Xi variables

  7. [Figure: two candidate projection directions compared, labeled "Good" and "Better"]

  8. Eigenvectors and Eigenvalues

  9. PCA

  10. Covariance matrix

  11. And so… we find that • The direction of α is given by the eigenvector e1 corresponding to the largest eigenvalue of the matrix Σ. • The second vector, orthogonal (uncorrelated) to the first, is the one with the second highest variance: the eigenvector corresponding to the second largest eigenvalue. • And so on …
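
A minimal sketch of this recipe in R, on simulated data (the data and dimensions are assumed purely for illustration):

```r
## Minimal sketch, assumed simulated data: principal directions via the
## eigen-decomposition of the sample covariance matrix.
set.seed(1)
X <- matrix(rnorm(100 * 3), ncol = 3)   # n = 100 observations, p = 3 variables
Sigma <- cov(X)                         # sample covariance matrix
e <- eigen(Sigma)                       # eigenvalues are returned in decreasing order
alpha1 <- e$vectors[, 1]                # e1: direction of maximal variance
sum(alpha1^2)                           # the weights are normalized to 1
e$values[1]                             # variance captured along that direction
```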

  12. So PCA gives • New variables Yi that are linear combinations of the original variables xi: • Yi = ei1x1 + ei2x2 + … + eipxp ;  i = 1, …, p • The new variables Yi are derived in decreasing order of importance; they are called 'principal components'
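
In practice these components come from prcomp(); a minimal sketch on the same kind of simulated data as above:

```r
## Minimal sketch, simulated data: the principal components via prcomp(),
## which centers the data and computes the decomposition by SVD.
set.seed(1)
X <- matrix(rnorm(100 * 3), ncol = 3)
pc <- prcomp(X)
pc$rotation        # columns are the weight vectors e_i
Y <- pc$x          # the new variables Y_i (the principal components)
apply(Y, 2, var)   # variances in decreasing order of importance
```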

  13. Scale before PCA • PCA is sensitive to scale • PCA should be applied to data that have approximately the same scale in each variable
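
A minimal sketch of this scale sensitivity, with an artificial variable on a much larger scale:

```r
## Minimal sketch, artificial data: a variable on a 100x larger scale
## dominates unscaled PCA; scale. = TRUE restores the balance.
set.seed(2)
X <- cbind(small = rnorm(50), big = 100 * rnorm(50))
prcomp(X)$rotation[, 1]                  # PC1 points almost entirely along 'big'
prcomp(X, scale. = TRUE)$rotation[, 1]   # both variables now contribute
```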

  14. Johnson RA and Wichern DW. Applied Multivariate Statistical Analysis. Pearson Education, 2003.

  15. How many PCs to keep
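
Two common rules of thumb — the "elbow" of a scree plot, or a cumulative-variance cutoff — can be read off directly in R; a minimal sketch on simulated data:

```r
## Minimal sketch, simulated data: scree plot and cumulative variance explained.
set.seed(3)
X <- matrix(rnorm(100 * 6), ncol = 6)
pc <- prcomp(X)
screeplot(pc, type = "lines")        # look for the "elbow"
cumsum(pc$sdev^2) / sum(pc$sdev^2)   # cumulative proportion of variance explained
```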

  16. SVD (singular value decomposition) Johnson RA and Wichern DW. Applied Multivariate Statistical Analysis. Pearson Education, 2003.

  17. SVD Berrar DP and Dubitzky GM. A Practical Approach to Microarray Data Analysis. Springer, 2003.

  18. SVD and PCA
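
The PCA-SVD connection can be verified numerically; a minimal sketch on simulated data:

```r
## Minimal sketch, simulated data: PCA from the SVD of the centered data matrix.
set.seed(4)
X <- matrix(rnorm(20 * 5), ncol = 5)
Xc <- scale(X, center = TRUE, scale = FALSE)   # column-center X
s <- svd(Xc)                                   # Xc = U D V^T
## The right singular vectors V are the principal directions (up to sign),
## and the scores are U %*% diag(s$d).
all.equal(abs(s$v), abs(prcomp(X)$rotation), check.attributes = FALSE)   # TRUE
```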

  19. PCA application: genomic study • Population stratification: allele frequency differences between cases and controls due to systematic ancestry differences; this can cause spurious associations in disease studies. • PCA can be used to infer the underlying population structure.

  20. Figure 2. Nature Genetics 38, 904–909 (2006). Principal components analysis corrects for stratification in genome-wide association studies. Alkes L Price, Nick J Patterson, Robert M Plenge, Michael E Weinblatt, Nancy A Shadick & David Reich

  21. Chao Tian, Peter K. Gregersen and Michael F. Seldin. (2008) Accounting for ancestry: population substructure and genome-wide association studies.

  22. 1.1 PCA • PCA case study: Transcriptional regulation and function during the human cell cycle, Cho et al. (2001) Nature Genetics 27, 48–54 -- Goal: to identify cell-cycle-regulated transcripts in human cells. -- Primary fibroblasts prepared from human foreskin were grown to approximately 30% confluence and synchronized in late G1 using a double thymidine block protocol. Cultures were then released from arrest, and cells were collected every 2 hours for 24 hours, covering nearly 2 complete cell cycles. -- Messenger RNA was isolated, labeled and hybridized to sets of arrays containing probes for approximately 40,000 human genes and non-overlapping ESTs. The entire synchronization experiment was carried out in duplicate under identical conditions for 6,800 genes on an Affymetrix array. The two data sets were averaged and analyzed using both supervised and unsupervised clustering of expression patterns.

  23. 1.1 PCA • PCA case study: unsynchronized cells

  24. 1.1 PCA

  25. 1.1 PCA • PCA projection: 387 genes in 13-dim space (time points) are projected to 2D space using the correlation matrix. Gene phases: 1: G1; 4: S; 2: G2; 3: M

  26. 1.1 PCA • Variance in the data explained by the first n principal components

  27. 1.1 PCA • The weights of the 13 principal directions

  28. 1.1 PCA • PCA projection: 13 samples (time points) in 387-dim space (genes) are projected to 2D space using the correlation matrix; each sample is labeled by its time point

  29. 1.1 PCA • Potential pitfalls of PCA: principal components do not always capture the information that is actually needed. • PCA projection from 2D to 1D: cluster information will be lost, as in the sketch below.
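
A minimal sketch of this pitfall, with two simulated clusters that differ only along a low-variance direction:

```r
## Minimal sketch, simulated data: PC1 follows the high-variance direction
## and throws away the cluster separation, which only PC2 retains.
set.seed(7)
x <- rnorm(200, sd = 10)                          # large spread -> becomes PC1
y <- c(rnorm(100, -2, 0.5), rnorm(100, 2, 0.5))   # two clusters, split along y
pc <- prcomp(cbind(x, y))
hist(pc$x[, 1])   # projection onto PC1 alone: clusters indistinguishable
hist(pc$x[, 2])   # PC2 still separates the two clusters
```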

  30. 1.2 Multidimensional scaling (MDS) • Suppose we are given the distance structure of the following 10 cities, and we have no knowledge of the cities' locations or a map of the US. Can we map these cities to a 2D space that best presents their distance structure?

  31. 1.2 Multidimensional scaling (MDS) • MDS deals with the following problem: given a set of observed similarities (or distances) between every pair of N items, find a representation of the items in few dimensions such that the inter-item proximities "nearly match" the original similarities (or distances). • The numerical measure of how closely the distances in the low-dimensional representation match the original distances is called the stress.
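
R ships with a 10-city distance matrix of exactly this kind (UScitiesD; not necessarily the same ten cities as on the slide). A minimal sketch of the classical MDS solution:

```r
## Minimal sketch: classical MDS maps R's built-in 10-city distance matrix
## (UScitiesD) to 2D coordinates that best preserve the distances.
coords <- cmdscale(UScitiesD, k = 2)
plot(coords, type = "n", asp = 1, xlab = "", ylab = "")
text(coords, labels = labels(UScitiesD))
## Any rotation or reflection of these coordinates fits equally well.
```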

  32. 1.2 MDS

  33. 1.2 MDS

  34. 1.2 MDS • Mapping to 3D is also possible, but the result is more difficult to visualize and interpret.

  35. 1.2 MDS • MDS attempts to map objects to a visible 2D or 3D Euclidean space. The goal is to best preserve the distance structure after the mapping. • The original data can live in a high-dimensional or even non-metric space; the method only cares about the distance (dissimilarity) structure. • The resulting mapping is not unique: any rotation or reflection of a solution is also a solution. • It can be shown that the results of PCA are exactly those of classical MDS if the distances calculated from the data matrix are Euclidean.
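
That last equivalence is easy to check numerically; a minimal sketch on simulated data:

```r
## Minimal sketch, simulated data: classical MDS on Euclidean distances
## reproduces the PCA scores up to sign/reflection.
set.seed(5)
X <- matrix(rnorm(30 * 4), ncol = 4)
mds <- cmdscale(dist(X), k = 2)   # classical (metric) MDS
pca <- prcomp(X)$x[, 1:2]         # first two principal components
all.equal(abs(mds), abs(pca), check.attributes = FALSE)   # TRUE
```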

  36. 2. Microarray visualization • Data matrix • Data: X = {xij}n×d, an n (genes) × d (samples) matrix.

  37. 2. Microarray visualization • Heatmap • Each cell shows the log-ratio of the target sample to the reference sample: log(target/reference). • Gradient color: RED: positive; GREEN: negative; BLACK: 0; LIGHT GREY: missing value.
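
A minimal sketch of such a heatmap on simulated log-ratios (the gene and sample names are made up):

```r
## Minimal sketch, simulated log-ratios: green = negative, black = 0,
## red = positive, following the convention described above.
set.seed(6)
M <- matrix(rnorm(40 * 6), nrow = 40,
            dimnames = list(paste0("gene", 1:40), paste0("sample", 1:6)))
pal <- colorRampPalette(c("green", "black", "red"))(25)
heatmap(M, col = pal, scale = "none")   # clusters rows and columns by default
```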

  38. 2. Microarray visualization Treeview software developed by Mike Eisen

  39. 3. Software for dimension reduction & visualization • PCA in R: prcomp (stats), principal components analysis (preferred); princomp (stats), principal components analysis; screeplot (stats), scree plot of PCA results • PCA in IMSL (a commercial C library) • MDS in R: isoMDS (MASS), Kruskal's non-metric multidimensional scaling; cmdscale (stats), classical (metric) multidimensional scaling; sammon (MASS), Sammon's non-linear mapping • MDS: various software and resources about MDS: http://www.granular.com/MDS/ • Heatmap visualization: Treeview: http://rana.lbl.gov/EisenSoftware.htm
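
As a minimal usage sketch, Kruskal's non-metric MDS from MASS on the same built-in city distances used earlier:

```r
## Minimal sketch: non-metric MDS with isoMDS(); the fit reports the
## final stress of the 2D configuration (as a percentage).
library(MASS)
fit <- isoMDS(UScitiesD, k = 2)
fit$stress
plot(fit$points, type = "n", asp = 1)
text(fit$points, labels = labels(UScitiesD))
```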
