
Linear Algebra Review




  1. Linear Algebra Review CS479/679 Pattern Recognition, Dr. George Bebis

  2. n-dimensional Vector • An n-dimensional vector v is written as a column of its components, v = (x1, x2, . . . , xn)T. • Its transpose vT is the corresponding row vector, vT = (x1, x2, . . . , xn).

  3. Inner (or dot) product • Given vT = (x1, x2, . . . , xn) and wT = (y1, y2, . . . , yn), their dot product is defined as the scalar vTw = x1y1 + x2y2 + . . . + xnyn.
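
The definition above can be checked numerically; a minimal NumPy sketch with illustrative values (not taken from the slides):

```python
import numpy as np

# Two example vectors in R^3 (illustrative values)
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, -1.0, 2.0])

# Inner (dot) product: the scalar x1*y1 + x2*y2 + x3*y3
dot = v @ w  # equivalently np.dot(v, w) or (v * w).sum()
```

Here `dot` is 1*4 + 2*(-1) + 3*2 = 8.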

  4. Orthogonal / Orthonormal vectors • A set of vectors x1, x2, . . . , xn is orthogonal if xiTxj = 0 for all i ≠ j. • The set is orthonormal if, in addition, every vector has unit length, i.e., xiTxi = 1.
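
The pairwise-dot-product conditions can be tested directly; a sketch using the standard basis of R^3 (the helper name is ours, not from the slides):

```python
import numpy as np

# The standard basis of R^3 is orthonormal
vectors = [np.array([1.0, 0.0, 0.0]),
           np.array([0.0, 1.0, 0.0]),
           np.array([0.0, 0.0, 1.0])]

def is_orthonormal(vecs, tol=1e-12):
    # Dot products must be 0 for distinct vectors, 1 for a vector with itself
    for a, xi in enumerate(vecs):
        for b, xj in enumerate(vecs):
            target = 1.0 if a == b else 0.0
            if abs(xi @ xj - target) > tol:
                return False
    return True

result = is_orthonormal(vectors)
```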

  5. Linear combinations • A vector v is a linear combination of the vectors v1, . . . , vk if v = c1v1 + c2v2 + . . . + ckvk, where c1, . . . , ck are constants. • Example: every vector in R3 can be expressed as a linear combination of the unit vectors i = (1, 0, 0), j = (0, 1, 0), and k = (0, 0, 1).
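
The R^3 example can be written out in code; the coefficients below are illustrative:

```python
import numpy as np

# The unit vectors i, j, k of R^3 (rows of the 3 x 3 identity matrix)
i, j, k = np.eye(3)

# A linear combination with c1 = 2, c2 = 3, c3 = -1
v = 2.0 * i + 3.0 * j - 1.0 * k
```

The coefficients of the combination are exactly the components of `v`.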

  6. Space spanning • A set of vectors S = (v1, v2, . . . , vk) spans some space W if every vector in W can be written as a linear combination of the vectors in S. • Example: the unit vectors i, j, and k span R3.

  7. Linear dependence • A set of vectors v1, . . . , vk is linearly dependent if at least one of them, say vj, is a linear combination of the others: vj = c1v1 + . . . + ckvk, where vj does not appear on the right side.

  8. Linear independence • A set of vectors v1, . . . , vk is linearly independent if no vector vj can be represented as a linear combination of the remaining vectors, i.e., c1v1 + . . . + ckvk = 0 only when c1 = . . . = ck = 0. • Example: i = (1, 0) and j = (0, 1) are independent, since c1i + c2j = 0 forces c1 = c2 = 0.
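
One practical test for independence (anticipating the rank slides later in the deck): stack the vectors as columns and compare the matrix rank to the number of vectors. A sketch with a deliberately dependent set:

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2                      # dependent on v1 and v2 by construction

# Vectors as columns; independent iff rank equals the number of columns
M = np.column_stack([v1, v2, v3])
independent = np.linalg.matrix_rank(M) == M.shape[1]
```

Since v3 = v1 + v2, the rank is 2, not 3, so the set is dependent.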

  9. Vector basis • A set of vectors v1, . . . , vk forms a basis in some vector space W if: (1) v1, . . . , vk span W, and (2) v1, . . . , vk are linearly independent. • Standard bases: the unit coordinate vectors of R2, R3, . . . , Rn.

  10. Matrix Operations • Matrix addition/subtraction: add/subtract corresponding elements; the matrices must be of the same size. • Matrix multiplication: an m x n matrix A can multiply a q x p matrix B only when n = q, and the result AB is m x p.
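
The dimension rule can be checked with NumPy's `@` operator; the matrices below are illustrative:

```python
import numpy as np

A = np.arange(6.0).reshape(2, 3)    # 2 x 3
B = np.arange(12.0).reshape(3, 4)   # 3 x 4: inner dimensions match (3 = 3)

C = A @ B                           # result is 2 x 4
```

Multiplying in the other order, `B @ A`, would raise an error because the inner dimensions (4 and 2) do not match.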

  11. Diagonal/Identity Matrix • A diagonal matrix has nonzero entries only on its main diagonal; the identity matrix I is the diagonal matrix whose diagonal entries are all 1, so AI = IA = A.

  12. Matrix Transpose • The transpose AT is obtained by interchanging the rows and columns of A, i.e., (AT)ij = Aji; note that (AB)T = BTAT.

  13. Symmetric Matrices • A matrix A is symmetric if A = AT, i.e., aij = aji for all i, j.

  14. Determinants • 2 x 2: det(A) = a11a22 - a12a21. • 3 x 3 and, in general, n x n determinants can be computed by cofactor expansion along any row or column (e.g., along the kth column). • Properties: det(AB) = det(A)det(B) and det(AT) = det(A).
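
The 2 x 2 formula and the product property can be verified numerically; a sketch with illustrative matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# 2 x 2 determinant by the formula a11*a22 - a12*a21
det_formula = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
det_numpy = np.linalg.det(A)

# Property: det(AB) = det(A) * det(B)
B = np.array([[2.0, 0.0],
              [1.0, 1.0]])
product_rule = np.isclose(np.linalg.det(A @ B),
                          det_numpy * np.linalg.det(B))
```

For this A, the determinant is 1*4 - 2*3 = -2.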

  15. Matrix Inverse • The inverse of a matrix A, denoted A-1, has the property A A-1 = A-1A = I. • A-1 exists only if det(A) ≠ 0. • Terminology: • Singular matrix: A-1 does not exist. • Ill-conditioned matrix: A is “close” to being singular.
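
The defining property A A-1 = A-1A = I can be checked directly; a sketch with an illustrative invertible matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])        # det(A) = 1, so A is invertible

A_inv = np.linalg.inv(A)

# Defining property of the inverse: both products give the identity
left = A @ A_inv
right = A_inv @ A
```

A singular matrix (det = 0) would make `np.linalg.inv` raise `LinAlgError`; a large `np.linalg.cond(A)` signals an ill-conditioned matrix.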

  16. Matrix Inverse (cont’d) • Properties of the inverse: (A-1)-1 = A, (AB)-1 = B-1A-1, and (AT)-1 = (A-1)T.

  17. Matrix trace • The trace tr(A) is the sum of the diagonal elements of A. • Properties: tr(A + B) = tr(A) + tr(B) and tr(AB) = tr(BA).
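
The cyclic property tr(AB) = tr(BA) holds even though AB ≠ BA in general; a sketch with random illustrative matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Trace: sum of the diagonal elements
tr_A = np.trace(A)

# Cyclic property: tr(AB) = tr(BA)
cyclic = np.isclose(np.trace(A @ B), np.trace(B @ A))
```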

  18. Rank of matrix • Equal to the dimension of the largest square sub-matrix of A that has a non-zero determinant. • Example: a matrix whose largest non-singular square sub-matrix is 3 x 3 has rank 3.

  19. Rank of matrix (cont’d) • Alternative definition: the maximum number of linearly independent columns (or rows) of A. • Example: a 4 x 4 matrix in which some rows are linear combinations of the others has fewer than 4 independent rows, i.e., its rank is not 4.
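
The alternative definition can be illustrated with a 4 x 4 matrix built so that two rows are combinations of the others (the matrix is ours, not the one from the slide):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],   # 2 x row 1 (dependent)
              [0.0, 1.0, 0.0, 1.0],
              [1.0, 3.0, 3.0, 5.0]])  # row 1 + row 3 (dependent)

rank = np.linalg.matrix_rank(A)       # only rows 1 and 3 are independent
```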

  20. Rank of matrix (cont’d)

  21. Eigenvalues and Eigenvectors • The vector v is an eigenvector of matrix A, and λ is an eigenvalue of A, if Av = λv (assume v is non-zero). • Geometric interpretation: the linear transformation implied by A cannot change the direction of the eigenvectors v, only their magnitude.

  22. Computing λ and v • To find the eigenvalues λ of a matrix A, find the roots of the characteristic polynomial det(A - λI) = 0; the eigenvectors v for each λ then solve (A - λI)v = 0.
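
The example image on this slide is lost; a numerically worked substitute with an illustrative symmetric matrix whose characteristic polynomial (2 - λ)² - 1 = 0 has roots λ = 1 and λ = 3:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues are the roots of det(A - lambda*I) = 0
eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are the eigenvectors

# Verify A v = lambda v for every eigenpair
checks = [np.allclose(A @ eigvecs[:, i], eigvals[i] * eigvecs[:, i])
          for i in range(len(eigvals))]
```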

  23. Properties of λ and v • Eigenvalues and eigenvectors are only defined for square matrices. • Eigenvectors are not unique (e.g., if v is an eigenvector, so is kv). • Suppose λ1, λ2, . . . , λn are the eigenvalues of A; then det(A) = λ1λ2 · · · λn and tr(A) = λ1 + λ2 + . . . + λn.
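
The determinant/trace identities are easy to confirm on a triangular matrix, whose eigenvalues are its diagonal entries (the matrix is illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # triangular: eigenvalues are 2 and 3

lam = np.linalg.eigvals(A)

# Product of eigenvalues equals det(A); their sum equals tr(A)
prod_ok = np.isclose(np.prod(lam), np.linalg.det(A))
sum_ok = np.isclose(np.sum(lam), np.trace(A))
```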

  24. Matrix diagonalization • Given an n x n matrix A, find P such that P-1AP = Λ, where Λ is diagonal. • Solution: set P = [v1 v2 . . . vn], where v1, v2, . . . , vn are the eigenvectors of A; then Λ = diag(λ1, . . . , λn).

  25. Matrix diagonalization (cont’d)
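
The worked example on this slide did not survive extraction; a numerical substitute with an illustrative matrix whose eigenvalues (5 and 2) are distinct, so it is diagonalizable:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # trace 7, det 10 => eigenvalues 5 and 2

lam, P = np.linalg.eig(A)             # P = [v1 v2], columns are eigenvectors
Lambda = np.linalg.inv(P) @ A @ P     # P^-1 A P

diagonalized = np.allclose(Lambda, np.diag(lam))
```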

  26. Matrix diagonalization (cont’d) • If A is diagonalizable, then the corresponding eigenvectors v1, v2, . . . , vn form a basis in Rn. • If A is a symmetric matrix, its eigenvalues are real and its eigenvectors are orthogonal.

  27. Are all n x n matrices diagonalizable? • An n x n matrix A is diagonalizable iff it has n linearly independent eigenvectors, i.e., iff P-1 exists, that is, rank(P) = n. • Theorem: if the eigenvalues of A are all distinct, their corresponding eigenvectors are linearly independent (i.e., A is diagonalizable).

  28. Matrix decomposition • If A is diagonalizable, then A can be decomposed as A = PΛP-1.

  29. Matrix decomposition (cont’d) • Matrix decomposition can be simplified in the case of symmetric matrices (i.e., orthogonal eigenvectors): choosing unit-length eigenvectors gives P-1 = PT, so A = PDPT.
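
For symmetric matrices, NumPy's `eigh` returns real eigenvalues and orthonormal eigenvectors directly, which makes the A = PDPT factorization easy to verify (the matrix is illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # symmetric: A = A^T

# eigh is specialized for symmetric matrices: real eigenvalues,
# orthonormal eigenvector columns
lam, P = np.linalg.eigh(A)

orthogonal = np.allclose(P.T @ P, np.eye(2))            # P^-1 = P^T
reconstructed = np.allclose(P @ np.diag(lam) @ P.T, A)  # A = P D P^T
```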

  30. Covariance Matrix Decomposition • A covariance matrix Σ is symmetric, so it decomposes as Σ = ΦΛΦT, where Φ holds the (orthonormal) eigenvectors, Λ the eigenvalues, and Φ-1 = ΦT.

  31. Linear Transformations of random variables • Projecting a random vector onto the eigenvectors of its covariance matrix (y = ΦTx) produces components that are uncorrelated: the covariance of y is the diagonal matrix Λ.
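
The decorrelation effect can be demonstrated end to end on simulated data (the mixing matrix and sample size are our assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(42)

# Correlated 2-D samples: mix independent Gaussians (rows are observations)
X = rng.standard_normal((1000, 2)) @ np.array([[2.0, 0.0],
                                               [1.0, 1.0]])

Sigma = np.cov(X, rowvar=False)       # sample covariance (symmetric)
lam, Phi = np.linalg.eigh(Sigma)      # Sigma = Phi diag(lam) Phi^T

# Project each sample onto the eigenvectors: y = Phi^T x
Y = X @ Phi
Sigma_Y = np.cov(Y, rowvar=False)

# The transformed components are uncorrelated: off-diagonal entry ~ 0
uncorrelated = np.isclose(Sigma_Y[0, 1], 0.0, atol=1e-8)
```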
