
Math 2999 presentation


Presentation Transcript


  1. Math 2999 presentation Singular Value Decomposition and its applications Yip Ka Wa (2009137234)

  2. 1. Mathematics foundation

  3. Singular Value Decomposition • Suppose we are given a matrix that is not known to have any special property, such as being diagonal or normal. We can still always decompose it into a product of three matrices. This is called the singular value decomposition:

  4. The statement:
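
The statement on this slide is an image that did not survive the transcript; the standard form of the theorem, which the later slides rely on, is:

```latex
\[
  A = U \Sigma V^{*},
\]
where $A \in \mathbb{C}^{m\times n}$, $U \in \mathbb{C}^{m\times m}$ and
$V \in \mathbb{C}^{n\times n}$ are unitary, and $\Sigma \in \mathbb{R}^{m\times n}$
is diagonal with nonnegative entries
$\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r > 0$ (the singular values),
where $r = \operatorname{rank}(A)$.
```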

  5. The picture:

  6. General idea of construction (proof omitted)
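
The construction slide is also an image; the usual outline, consistent with the "proof omitted" remark and the next slides on positive semidefinite matrices and singular values, is sketched below.

```latex
% Usual construction sketch: diagonalize A*A, take square roots, build U from A.
\[
  A^{*}A\, v_i = \sigma_i^{2}\, v_i, \qquad
  u_i = \frac{A v_i}{\sigma_i} \ \ (\sigma_i > 0), \qquad
  A = U \Sigma V^{*},
\]
where $\{v_i\}$ is an orthonormal eigenbasis of the positive semidefinite matrix
$A^{*}A$ and $\{u_i\}$ is extended to an orthonormal basis of $\mathbb{C}^{m}$.
```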

  7. Positive semidefinite matrix
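
The body of this slide is an image; the standard definition needed here (since $A^{*}A$ is always of this type) is:

```latex
\[
  M = M^{*} \quad \text{and} \quad x^{*} M x \ge 0 \ \ \text{for all } x,
\]
equivalently, $M$ is Hermitian with all eigenvalues nonnegative.
```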

  8. Singular Value
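
Again the slide content is an image; the standard definition, consistent with the construction above, is:

```latex
\[
  \sigma_i(A) = \sqrt{\lambda_i\!\left(A^{*}A\right)}, \qquad
  \sigma_1 \ge \sigma_2 \ge \cdots \ge 0,
\]
where $\lambda_i(A^{*}A)$ are the (nonnegative) eigenvalues of $A^{*}A$.
```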

  9. Spectral Decomposition of Hermitian matrices
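
The statement presumably used on this slide is the spectral theorem:

```latex
\[
  M = M^{*} \;\Longrightarrow\; M = Q \Lambda Q^{*},
\]
with $Q$ unitary and $\Lambda$ real diagonal; applying this to $A^{*}A$ yields the
$V$ and $\Sigma$ factors of the SVD.
```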

  10. Recall:

  11. Compact form
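
The slide's formula is missing, but the standard compact (thin) form keeps only the columns belonging to nonzero singular values:

```latex
\[
  A = U_r \Sigma_r V_r^{*},
\]
where $U_r \in \mathbb{C}^{m\times r}$ and $V_r \in \mathbb{C}^{n\times r}$ consist of
the first $r$ columns of $U$ and $V$, and
$\Sigma_r = \operatorname{diag}(\sigma_1,\dots,\sigma_r)$.
```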

  12. Example
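
The worked example on this slide is not in the transcript; the following is a hypothetical substitute showing how an SVD can be computed and checked numerically with NumPy.

```python
# Hypothetical example (not the slide's own): compute an SVD with NumPy and
# verify the factorization A = U Sigma V^T.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=True)  # s = singular values, descending
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

print("singular values:", s)
print("reconstruction error:", np.linalg.norm(A - U @ Sigma @ Vt))
```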

  13. 2. Image compression

  14. Image compression

  15. Outer product expansion
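
The expansion itself (the slide's formula is an image) writes the SVD as a sum of rank-one matrices:

```latex
\[
  A = \sum_{i=1}^{r} \sigma_i\, u_i v_i^{*},
\]
where $u_i$ and $v_i$ are the $i$-th columns of $U$ and $V$; each term
$u_i v_i^{*}$ is a rank-one matrix.
```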

  16. Truncation of terms • To make sure that memory use is actually reduced, however, we can do a bit more: delete some of the nonzero singular values and keep only the k largest of the r nonzero singular values, so that the r terms in the outer product expansion are reduced to k terms.
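
A minimal NumPy sketch of this truncation (the function and variable names are illustrative, not from the slides):

```python
# Keep only the k largest singular values: A_k = sum_{i<k} sigma_i u_i v_i^T.
import numpy as np

def truncate_svd(A: np.ndarray, k: int) -> np.ndarray:
    """Rank-k SVD approximation of A."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
```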

  17. Memory reduction
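
The slide's figures are not in the transcript; the standard count is that storing $A$ directly takes $mn$ numbers, while storing the $k$ retained triples $(\sigma_i, u_i, v_i)$ takes $k(m+n+1)$:

```latex
\[
  \underbrace{mn}_{\text{full image}}
  \quad\text{vs.}\quad
  \underbrace{k(m+n+1)}_{\text{rank-}k\text{ approximation}},
\]
so memory is saved whenever $k(m+n+1) < mn$; for a $512 \times 512$ image and
$k = 30$ this is $30 \cdot 1025 = 30{,}750$ numbers instead of $262{,}144$.
```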

  18. Block-based SVD • Divide the image matrix into blocks and apply SVD compression to each block separately.
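
A minimal sketch of the block-based scheme, assuming the image dimensions are multiples of the block size (the block size and rank below are illustrative choices):

```python
# Apply rank-k SVD compression to each block of the image separately.
import numpy as np

def block_svd_compress(image: np.ndarray, block: int = 8, k: int = 2) -> np.ndarray:
    out = np.empty_like(image, dtype=float)
    for i in range(0, image.shape[0], block):
        for j in range(0, image.shape[1], block):
            tile = image[i:i + block, j:j + block]
            U, s, Vt = np.linalg.svd(tile, full_matrices=False)
            out[i:i + block, j:j + block] = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
    return out
```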

  19. Result without block-based compression

  20. Special topic Investigation of the kinds of images for which low-rank SVD approximations can give very good results

  21. Quality of image: Error ratio:
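
The formulas on this slide are images; a choice consistent with the Frobenius-norm discussion later in the talk is the relative Frobenius error:

```latex
\[
  \text{error ratio}
  = \frac{\lVert A - A_k \rVert_F}{\lVert A \rVert_F}
  = \sqrt{\frac{\sigma_{k+1}^{2} + \cdots + \sigma_{r}^{2}}
               {\sigma_{1}^{2} + \cdots + \sigma_{r}^{2}}}.
\]
```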

  22. Complex? • Simple? • How do I know whether I can delete some of the singular values and still preserve quality?

  23. Some suggestions • 1. The difference (in Euclidean norm) between two columns/rows is small compared to the Frobenius norm of the matrix. • 2. The Euclidean norm of some columns/rows themselves (their difference from the zero column/row) is small compared to the Frobenius norm of the matrix. • 3. Some columns/rows are close to other columns/rows in terms of the angle between them, i.e. the cosine of the angle is near 1.
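
A minimal NumPy sketch that checks these three heuristics for the columns of a real matrix (the function name, and the assumption that the matrix has at least two nonzero columns, are mine, not the slides'):

```python
# Evaluate the three column heuristics above relative to the Frobenius norm.
import numpy as np

def column_heuristics(A: np.ndarray) -> dict:
    fro = np.linalg.norm(A)          # Frobenius norm of A
    cols = list(A.T)                 # columns of A (assumed nonzero, at least two)
    pairs = [(i, j) for i in range(len(cols)) for j in range(i + 1, len(cols))]
    # 1. smallest pairwise column difference relative to ||A||_F
    min_diff = min(np.linalg.norm(cols[i] - cols[j]) for i, j in pairs) / fro
    # 2. smallest column norm relative to ||A||_F (distance to a zero column)
    min_norm = min(np.linalg.norm(c) for c in cols) / fro
    # 3. largest cosine of the angle between two distinct columns
    max_cos = max(abs(cols[i] @ cols[j]) /
                  (np.linalg.norm(cols[i]) * np.linalg.norm(cols[j]))
                  for i, j in pairs)
    return {"min_diff": min_diff, "min_norm": min_norm, "max_cos": max_cos}
```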

  24. An example illustrating the Eckart-Young theorem:
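
For reference, the theorem behind the example (in the Frobenius norm used on the next slides) states:

```latex
\[
  \min_{\operatorname{rank}(B) \le k} \lVert A - B \rVert_F
  = \lVert A - A_k \rVert_F
  = \sqrt{\sigma_{k+1}^{2} + \cdots + \sigma_{r}^{2}},
  \qquad
  A_k = \sum_{i=1}^{k} \sigma_i\, u_i v_i^{*}.
\]
```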

  25. Best rank 2 approximation

  26. Obvious rank 2 matrix:

  27. Frobenius norm difference
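
As a reminder (the slide's own computation is not in the transcript), the Frobenius norm can be written as:

```latex
\[
  \lVert A \rVert_F
  = \sqrt{\sum_{i,j} \lvert a_{ij} \rvert^{2}}
  = \sqrt{\sigma_1^{2} + \cdots + \sigma_r^{2}} .
\]
```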

  28. Upper bound of singular values

  29. Sometimes we can easily spot a matrix B whose rank is one lower than that of A, and observe that the Frobenius norm of the difference A - B is small; for example, one column of A is close to a multiple of another column. Since that Frobenius norm is an upper bound for the last nonzero singular value of A, the last nonzero singular value of A must be very small, so it can be truncated in the SVD approximation without much loss of accuracy.
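
In symbols, this observation is a consequence of the Eckart-Young theorem with $k = r - 1$:

```latex
\[
  \operatorname{rank}(B) \le r - 1 \;\Longrightarrow\;
  \sigma_r(A) \le \lVert A - B \rVert_2 \le \lVert A - B \rVert_F ,
\]
so any nearby matrix of lower rank certifies that the smallest nonzero singular
value of $A$ is small.
```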
