
Review of Linear Algebra


Fall 2014

The University of Iowa

Tianbao Yang

- TA: Shiyao Wang
- Office hours: 3:30-5:00 pm Tu/Th
- Office Location: 201C
- Homework-1 is available on ICON

- Vector and Matrix
- Operation on Matrices/Vectors
- Singular value decomposition
- Norms
- An Application in Text Analysis

- Scalar
- a real number: 7

- Vector
- one-dimensional array
- representation as a column vector: x = (x_1, ..., x_n)^T
- representation as a row vector: (x_1, ..., x_n)

- Dimensionality or size: n
- the number of scalars

- Vector Space
- the set of all vectors of the same dimension, R^n
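As a quick sketch (using numpy, which is not part of the slides), scalars, column vectors, and row vectors look like this:

```python
import numpy as np

# A scalar is a single real number.
a = 7.0

# A vector is a one-dimensional array; by convention a column vector.
x = np.array([1.0, 2.0, 3.0])   # shape (3,)
col = x.reshape(-1, 1)          # explicit column vector, shape (3, 1)
row = x.reshape(1, -1)          # row vector, shape (1, 3)

# Dimensionality = number of scalars in the vector.
print(x.shape[0])               # 3
```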

- Matrix: two-dimensional array
- Representation: A = [a_ij]
- (i,j)-th element: a_ij
- A set of column vectors
- vector: a special matrix with a single column (or a single row)


- Dimensionality or size
- m*n (m rows and n columns)

- Matrix Space: the set of all m*n matrices
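A small numpy illustration of a matrix and its size (values are made up):

```python
import numpy as np

# A 2*3 matrix: two-dimensional array with 2 rows and 3 columns.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

m, n = A.shape                 # m rows, n columns
print(m, n)                    # 2 3
print(A[0, 2])                 # (1,3)-th element (0-indexed in numpy): 3.0

# A single column of A is itself a vector.
first_col = A[:, 0]
```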

- Vector and Matrix
- Operation on Matrices/Vectors
- Singular value decomposition
- Norms
- An Application in Text Analysis

- Matrix addition: C = A + B
- requires two matrices of the same size
- (i,j)-th element: c_ij = a_ij + b_ij

- Scalar multiplication: c * A
- results in a matrix of the same size, (i,j)-th element c * a_ij

- Matrix subtraction: A - B, (i,j)-th element a_ij - b_ij
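These elementwise operations can be sketched with numpy (illustrative values):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

C = A + B        # matrix addition: same size, elementwise
D = 2.0 * A      # scalar multiplication: result has the same size
E = A - B        # matrix subtraction, also elementwise
```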

- Multiplication of a row vector and a column vector: x^T y = x_1 y_1 + ... + x_n y_n (a scalar)
- Matrix Multiplication: C = A B
- A is m*k, B is k*n, C is m*n; c_ij = sum over k of a_ik * b_kj
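A numpy sketch of the inner product and matrix multiplication (sizes chosen for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# Row vector times column vector = inner product (a scalar).
s = x @ y                       # 1*4 + 2*5 + 3*6 = 32.0

# Matrix multiplication: (m*k) @ (k*n) gives an m*n result.
A = np.ones((2, 3))
B = np.ones((3, 4))
C = A @ B                       # shape (2, 4); each entry sums k=3 terms
```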

- Transpose: A^T
- (i,j)-th element of A^T: a_ji
- transpose of a column vector: a row vector

- Rules: (A^T)^T = A, (A + B)^T = A^T + B^T, (A B)^T = B^T A^T
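The standard transpose rules can be checked numerically in numpy (random matrices, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))

# (i,j)-th element of A^T is the (j,i)-th element of A.
assert A.T[1, 0] == A[0, 1]

# (A B)^T = B^T A^T  (note the reversed order).
assert np.allclose((A @ B).T, B.T @ A.T)

# (A^T)^T = A.
assert np.allclose(A.T.T, A)
```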

- Square matrix: m = n
- Symmetric matrix: A = A^T
- Zero matrix: 0
- all elements are zeros

- Identity Matrix: I
- each column (or row) is a standard basis vector
- A I = I A = A

- (Square) Matrix Inverse
- similar to the inverse of a scalar: a * (1/a) = 1 for a != 0
- inverse of a square matrix A: A^{-1}
- defined if there exists A^{-1} such that A A^{-1} = A^{-1} A = I

Non-singular: a square matrix whose inverse exists
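A minimal numpy sketch of the inverse and its defining property (the example matrix is made up):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])     # non-singular: determinant 5, not 0

A_inv = np.linalg.inv(A)

# Defining property, analogous to a * (1/a) = 1 for a nonzero scalar:
I = np.eye(2)
print(np.allclose(A @ A_inv, I))   # True
print(np.allclose(A_inv @ A, I))   # True
```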

- Trace of a square matrix: tr(A)
- definition: the sum of the diagonal elements, a_11 + ... + a_nn
- rules: tr(A + B) = tr(A) + tr(B), tr(c A) = c tr(A), tr(A B) = tr(B A)
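The cyclic rule tr(AB) = tr(BA) holds even when AB and BA have different sizes, as this numpy sketch shows (random matrices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 3))

S = A @ B   # 3*3, while B @ A is 4*4

# Definition: trace = sum of diagonal elements.
assert np.isclose(np.trace(S), sum(S[i, i] for i in range(3)))

# Cyclic rule: tr(AB) = tr(BA).
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```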

- Vector and Matrix
- Operation on Matrices/Vectors
- Singular value decomposition
- Norms
- An Application in Text Analysis

- A matrix A of size m*n: A = U Σ V^T
- Singular Value Decomposition (SVD)
- The columns of U (m*m) are left singular vectors
- The columns of V (n*n) are right singular vectors
- Σ (m*n) is a diagonal matrix with the singular values (positive values) on its diagonal
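The SVD can be computed and verified with numpy (a random matrix, for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3))            # m=4, n=3

# Full SVD: U is m*m, s holds the singular values, Vt is V^T (n*n).
U, s, Vt = np.linalg.svd(A, full_matrices=True)

assert U.shape == (4, 4) and Vt.shape == (3, 3)
assert np.all(s >= 0)                      # singular values are non-negative

# Rebuild A = U Σ V^T, with Σ the m*n diagonal matrix.
Sigma = np.zeros((4, 3))
np.fill_diagonal(Sigma, s)
assert np.allclose(U @ Sigma @ Vt, A)
```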

- Illustration of SVD dimensions and sparseness: U is m*m, Σ is m*n and diagonal, V is n*n

- Rank of a Matrix
- organize the singular values in descending order: σ_1 >= σ_2 >= ... >= 0
- rank = the largest index i such that σ_i is non-zero
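A numpy sketch of computing rank from the singular values (the example matrix is constructed so its third row is the sum of the first two, giving rank 2):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])            # row 3 = row 1 + row 2

s = np.linalg.svd(A, compute_uv=False)     # singular values, descending
rank = int(np.sum(s > 1e-10))              # count of non-zero singular values

assert rank == np.linalg.matrix_rank(A) == 2
```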

- Eigenvectors (for a square m*m matrix S)
- Example

S v = λ v, where v is a (right) eigenvector and λ is an eigenvalue

S = U Λ U^T

- This holds generally for a symmetric square matrix: S = U Λ U^T
- Columns of U are eigenvectors of S
- Diagonal elements of Λ are eigenvalues of S
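For a symmetric matrix, numpy's `eigh` returns real eigenvalues and orthonormal eigenvectors, so the decomposition can be checked directly (example matrix is illustrative):

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])                # symmetric

# eigh is specialized for symmetric matrices.
eigvals, U = np.linalg.eigh(S)            # eigenvalues in ascending order

# S = U Λ U^T
Lam = np.diag(eigvals)
assert np.allclose(U @ Lam @ U.T, S)

# Each column u of U satisfies S u = λ u.
assert np.allclose(S @ U[:, 0], eigvals[0] * U[:, 0])
```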

- A symmetric matrix S of size n*n: S = U Λ U^T
- Eigen-value Decomposition
- The columns of U (n*n) are eigenvectors
- Λ (n*n) is a diagonal matrix with real eigenvalues

- A symmetric matrix S of size n*n: S = U Λ U^T
- Eigen-value Decomposition
- The columns of U are eigenvectors
- S is positive definite if Λ is a diagonal matrix with positive eigenvalues
- S is positive semi-definite if Λ is a diagonal matrix with non-negative eigenvalues

- Vector and Matrix
- Operation on Matrices/Vectors
- Singular value decomposition
- Norms
- An Application in Text Analysis

- inner product between two vectors: x^T y = x_1 y_1 + ... + x_n y_n
- Norm of a Vector (Euclidean norm, ℓ2 norm): ||x||_2 = sqrt(x^T x)

- Cauchy-Schwarz Inequality: |x^T y| <= ||x||_2 ||y||_2
- Triangle Inequality: ||x + y||_2 <= ||x||_2 + ||y||_2

- p-norm: ||x||_p = (|x_1|^p + ... + |x_n|^p)^(1/p)
- p = 1: ℓ1 norm, the sum of absolute values
- p = 2: Euclidean norm
- p = ∞: max norm, the maximum of |x_i|
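These vector norms map directly onto numpy's `norm` with the `ord` argument (example vector chosen so the values are exact):

```python
import numpy as np

x = np.array([3.0, -4.0])

l1 = np.linalg.norm(x, 1)         # |3| + |-4| = 7
l2 = np.linalg.norm(x, 2)         # sqrt(9 + 16) = 5
linf = np.linalg.norm(x, np.inf)  # max(|3|, |-4|) = 4
```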

- Inner Product between two matrices: <A, B> = tr(A^T B)
- Norm of a Matrix (Frobenius norm): ||A||_F = sqrt(sum of a_ij^2 over all i, j)

- Induced Norm (operator norm): ||A||_p = max over x != 0 of ||A x||_p / ||x||_p
- p = 2, spectral norm: maximum singular value
- p = 1: maximum absolute column sum
- p = ∞: maximum absolute row sum

- Schatten Norm: ||A||_p = (sum of σ_i^p)^(1/p), where σ_i are the singular values
- p = 1: trace norm (or nuclear norm)
- p = 2: Frobenius norm
- p = ∞: spectral norm
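The matrix norms above are all available through numpy's `norm` (a diagonal example so the singular values are 4 and 3 and every result is easy to check):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, -4.0]])            # singular values: 4 and 3

fro = np.linalg.norm(A, 'fro')         # sqrt(9 + 16) = 5
spec = np.linalg.norm(A, 2)            # largest singular value = 4
nuc = np.linalg.norm(A, 'nuc')         # sum of singular values = 7

# Induced 1-norm / inf-norm: max absolute column / row sums.
col_sum = np.linalg.norm(A, 1)         # max(3, 4) = 4
row_sum = np.linalg.norm(A, np.inf)    # max(3, 4) = 4
```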

- Solve the following problems


- Vector and Matrix
- Operation on Matrices/Vectors
- Singular value decomposition
- Norms
- An Application in Search Engines

- A database of Webpages
- A user-typed query
- generate a list of relevant webpages
- A ranking problem

- Example: https://www.facebook.com/
- relevant pages contain the query words (LSI)
- relevant pages have a lot of links to them (PageRank)

- webpage is a document
- document contains many terms (words)
- To represent a document
- collect all meaningful terms
- count the occurrence of each term in a document

- Term-Document Matrix

- Represent the query in the same way
- e.g. query: “computer system”

Query vector: q = (0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0)^T, with a 1 in each entry whose term appears in the query

- Retrieve Similar Documents
- Query: q, represented in the same term space
- Similarity between q and each document (column of A):
- inner product
- normalized inner product (cosine similarity)
- Assume A is column-normalized and q is normalized; then the cosine similarities are A^T q
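A toy numpy sketch of cosine-similarity retrieval (the term-document counts are made up for illustration):

```python
import numpy as np

# Toy term-document matrix: rows are terms, columns are documents.
A = np.array([[2.0, 0.0, 1.0],    # e.g. "computer"
              [1.0, 3.0, 0.0],    # e.g. "system"
              [0.0, 1.0, 2.0]])   # e.g. "user"

# Query "computer system": 1 in the rows for those terms.
q = np.array([1.0, 1.0, 0.0])

# Cosine similarity = normalized inner product per document column.
A_n = A / np.linalg.norm(A, axis=0)    # normalize each column
q_n = q / np.linalg.norm(q)
scores = A_n.T @ q_n

ranking = np.argsort(-scores)          # best-matching documents first
```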

(LSI figure: in A = U Σ V^T, the columns of U represent the concepts in term space and the columns of V represent the concepts in document space.)

- Latent Semantic Indexing (LSI)
- apply SVD to the term-document matrix: A = U Σ V^T

- Low rank approximation:
- approximate the matrix using only the k largest singular values and the corresponding singular vectors

Rank-k approximation

- Why Low rank approximation:
- data compression: billions to thousands
- filter out noise
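A minimal numpy sketch of the rank-k approximation (random data, for illustration); by the Eckart-Young theorem, the Frobenius error equals the norm of the dropped singular values:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 5))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the k largest singular values and singular vectors.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The approximation error is exactly the dropped singular values.
err = np.linalg.norm(A - A_k, 'fro')
assert np.isclose(err, np.sqrt(np.sum(s[k:] ** 2)))
assert np.linalg.matrix_rank(A_k) == k
```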

Rank-k approximation


Top three left singular vectors (one row per term):

-0.2214 -0.1132  0.2890
-0.1976 -0.0721  0.1350
-0.2405  0.0432 -0.1644
-0.4036  0.0571 -0.3378
-0.6445 -0.1673  0.3611
-0.2650  0.1072 -0.4260
-0.2650  0.1072 -0.4260
-0.3008 -0.1413  0.3303
-0.2059  0.2736 -0.1776
-0.0127  0.4902  0.2311
-0.0361  0.6228  0.2231
-0.0318  0.4505  0.1411
