Applications of Matrix-based Learning Algorithms
Instructor: Kai Zhang, Fall 2019

Presentation Transcript


  1. Applications of Matrix-based Learning Algorithms Instructor: Kai Zhang Fall 2019

  2. Data Intensive Research • Modes of research: empirical, theoretical, simulation, and data-intensive

  3. Symmetric Matrices • Kernel (Gram) matrix • Graph adjacency matrix • Kernel methods • Support vector machines • Kernel PCA, LDA, CCA • Gaussian process regression • Graph-based algorithms • Manifold learning, dimension reduction • Clustering, semi-supervised learning • Random walk, graph propagation. Manipulating these matrices is costly in both time and space!
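The deck gives no code here; as a rough illustration, here is a minimal sketch assuming NumPy (my assumption, not part of the slides) that builds the RBF kernel (Gram) matrix and shows why manipulating it strains time and space: for n points the matrix has n^2 entries.

    # Minimal sketch (assumes NumPy): building the full RBF kernel (Gram) matrix.
    # For n points the matrix has n^2 entries -- O(n^2) memory and O(n^2 d) time,
    # which is why the low-rank approximations on the next slide matter at scale.
    import numpy as np

    def rbf_kernel_matrix(X, gamma=1.0):
        """K[i, j] = exp(-gamma * ||x_i - x_j||^2) for rows x_i of X."""
        sq_norms = np.sum(X**2, axis=1)                      # ||x_i||^2
        sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
        np.maximum(sq_dists, 0.0, out=sq_dists)              # guard tiny negatives
        return np.exp(-gamma * sq_dists)

    X = np.random.randn(1000, 20)        # 1000 points in 20 dimensions
    K = rbf_kernel_matrix(X, gamma=0.1)
    print(K.shape, np.allclose(K, K.T))  # (1000, 1000) True -- symmetric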

  4. Universal Applicability Low-rank Approximation: Matrix Decomposition Comes Into Play! • Reduce problem size, complexity • Structure mining • Fundamental building block in modern computing • Information Technology (information retrieval, recommender systems) • Computer Vision (image processing, face recognition, video surveillance) • Computer Graphics (3D reconstruction, relighting, rendering) • Bioinformatics (clustering, network analysis, gene expression analysis) • Deep Networks (model compression, energy saving) • Optimization (linear systems, Newton's method) [Figure: growth of survey data gathered since 1998; 12th release: 4 million spectra]
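A minimal sketch of the low-rank idea, assuming NumPy: truncated SVD yields the best rank-k approximation in the Frobenius and spectral norms (Eckart-Young), which is the basic building block behind the applications listed above.

    # Minimal sketch (assumes NumPy): rank-k approximation via truncated SVD.
    import numpy as np

    def low_rank_approx(A, k):
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    A = np.random.randn(200, 50) @ np.random.randn(50, 300)  # true rank <= 50
    A_k = low_rank_approx(A, k=10)
    err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
    print(f"relative Frobenius error at rank 10: {err:.3f}")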

  5. More Applications • Wide applications of matrix (singular-value) decomposition

  6. Graph kernel • Motivation: predict whether molecules are toxic, given a set of known examples. [Figure: molecule graphs with vertices labeled A-F; some molecules are known toxic, some known non-toxic, and one is unknown]
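The slides do not fix a particular kernel; one classic, simplified instance is the random-walk graph kernel, sketched here assuming NumPy. It counts common walks in two graphs through their direct product graph, whose adjacency matrix is the Kronecker product of the two adjacency matrices; a kernel like this could feed the toxicity task above into an SVM.

    # Minimal sketch (assumes NumPy): a geometric random-walk graph kernel.
    import numpy as np

    def random_walk_kernel(A1, A2, lam=0.01):
        """Sum over common walks of all lengths, downweighted by lam^length.
        lam must be small enough that the Neumann series converges."""
        Ax = np.kron(A1, A2)                  # direct-product adjacency
        n = Ax.shape[0]
        # 1^T (I - lam*Ax)^{-1} 1, computed via one linear solve
        w = np.linalg.solve(np.eye(n) - lam * Ax, np.ones(n))
        return w.sum()

    # Two toy "molecule" graphs: a triangle and a 4-cycle.
    A_tri  = np.array([[0,1,1],[1,0,1],[1,1,0]], dtype=float)
    A_quad = np.array([[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]], dtype=float)
    print(random_walk_kernel(A_tri, A_quad))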

  7. Manifold Learning • Manifold learning (or non-linear dimensionality reduction) embeds data that originally lies in a high-dimensional space into a lower-dimensional space, while preserving characteristic properties. • A manifold is a topological space that locally resembles Euclidean space near each point.
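A minimal sketch of manifold learning in practice, assuming scikit-learn (not part of the slides): Isomap unrolls a swiss-roll surface, a 2-D manifold curled up in 3-D, by preserving geodesic (along-the-manifold) distances.

    # Minimal sketch (assumes scikit-learn): non-linear dimensionality reduction.
    from sklearn.datasets import make_swiss_roll
    from sklearn.manifold import Isomap

    X, color = make_swiss_roll(n_samples=1500, random_state=0)  # X is (1500, 3)
    embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
    print(embedding.shape)  # (1500, 2): the roll flattened into the plane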

  8. Mapmaking problem [Figure: the same two points A and B shown on the Earth (a sphere) and on a planar map]

  9. Image Examples Objective: to find a small number of features that represent a large number of observed dimensions. For each image: there are 64x64 = 4096 pixels (observed dimensions) Assumption: High-dimensional data often lies on or near a much lower dimensional, curved manifold.

  10. Visualization of 6,000 digits from the MNIST dataset produced by t-SNE.

  11. The COIL20 dataset Each object is rotated about a vertical axis to produce a closed one-dimensional manifold of images.

  12. Visualization of COIL20 produced by t-SNE.

  13. Time Series Embedding Example • BOLD (blood-oxygen-level-dependent) signal. [Figure: the signal of a single brain region, and the signals of all 90 brain regions]

  14. Impact of perplexity • Perplexity controls the range of the neighborhood used in computing the probability distribution.
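A sketch of how one would probe this, assuming scikit-learn's TSNE: rerunning the embedding at several perplexity values changes the effective neighborhood size that the pairwise probabilities are computed over (small values emphasize local structure, large values more global structure).

    # Minimal sketch (assumes scikit-learn): varying t-SNE's perplexity.
    from sklearn.datasets import load_digits
    from sklearn.manifold import TSNE

    X, y = load_digits(return_X_y=True)          # 1797 digit images, 64-D each
    for perplexity in (5, 30, 100):
        Y = TSNE(n_components=2, perplexity=perplexity,
                 random_state=0).fit_transform(X)
        print(perplexity, Y.shape)               # one 2-D map per setting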

  15. Computer Graphics Applications • Rendering on meshed surfaces

  16. Spring networks • View edges as rubber bands or ideal linear springs. • Nail down some vertices, let the rest settle. • When stretched to length $\ell$, a spring's potential energy is $\frac{1}{2}k\ell^2$.

  17. Spring networks • Nail down some vertices, let the rest settle. • Physics: the equilibrium position minimizes the total potential energy subject to the boundary constraints (nails).
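This minimization has a closed form; the following is a minimal sketch assuming NumPy (my assumption, not from the slides). With unit springs, first-order optimality of the energy is a linear system in the graph Laplacian, and its solution places each free vertex at the average of its neighbors, which is exactly the spring drawing on the next slides.

    # Minimal sketch (assumes NumPy): spring equilibrium via a Laplacian solve.
    import numpy as np

    def spring_layout(A, nailed, positions):
        """A: adjacency matrix; nailed: indices of fixed vertices;
        positions: (len(nailed), 2) fixed coordinates of the nails."""
        n = A.shape[0]
        L = np.diag(A.sum(axis=1)) - A                 # graph Laplacian
        free = np.setdiff1d(np.arange(n), nailed)
        X = np.zeros((n, 2))
        X[nailed] = positions
        # L_ff X_f = -L_fb X_b  (first-order optimality of the energy)
        X[free] = np.linalg.solve(L[np.ix_(free, free)],
                                  -L[np.ix_(free, nailed)] @ X[nailed])
        return X

    # A 4-cycle plus a center vertex; nail the cycle to a square.
    A = np.array([[0,1,0,1,1],
                  [1,0,1,0,1],
                  [0,1,0,1,1],
                  [1,0,1,0,1],
                  [1,1,1,1,0]], dtype=float)
    square = np.array([[0,0],[1,0],[1,1],[0,1]], dtype=float)
    print(spring_layout(A, np.arange(4), square)[4])   # center settles at ~[0.5 0.5]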

  18.-28. Drawing by Spring Networks (Tutte '63) A sequence of examples: nail the vertices of one face in place and let the spring network settle. If the graph is planar, then the spring drawing has no crossing edges!

  29. Unsupervised feature learning with a neural network • The network is trained to output its input (learn the identity function). • Trivial solution unless we: • constrain the number of units in Layer 2 (learn a compressed representation), or • constrain Layer 2 to be sparse. [Figure: autoencoder with inputs x1-x6 in Layer 1, hidden units a1-a3 plus bias +1 in Layer 2, and outputs x1-x6 in Layer 3; Layer 1 to 2 is the encoding, Layer 2 to 3 the decoding]
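A minimal sketch of such an autoencoder, assuming NumPy and plain gradient descent (the slides do not fix an implementation); the sizes loosely mirror the figure (6 inputs x1-x6, a 3-unit bottleneck a1-a3), and all names are illustrative.

    # Minimal sketch (assumes NumPy): a one-hidden-layer autoencoder trained
    # to reproduce its input. The 3-unit bottleneck forces a compressed code.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d, h, lr = 500, 6, 3, 0.1

    # Data with hidden 3-D structure, so a 3-unit bottleneck can reconstruct it.
    Z = rng.normal(size=(n, 3))
    X = Z @ rng.normal(size=(3, d))

    W1, b1 = rng.normal(scale=0.1, size=(d, h)), np.zeros(h)   # encoder
    W2, b2 = rng.normal(scale=0.1, size=(h, d)), np.zeros(d)   # decoder

    for step in range(2000):
        H = np.tanh(X @ W1 + b1)            # encoding (Layer 2 activations)
        X_hat = H @ W2 + b2                 # decoding (Layer 3, linear)
        err = X_hat - X                     # reconstruction error
        # Backpropagation of the mean-squared-error loss.
        gW2, gb2 = H.T @ err / n, err.mean(axis=0)
        dH = (err @ W2.T) * (1 - H**2)      # tanh derivative
        gW1, gb1 = X.T @ dH / n, dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    print("reconstruction MSE:", np.mean((X_hat - X) ** 2))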
