
Krylov-Subspace Methods - II


Presentation Transcript


  1. Krylov-Subspace Methods - II. Lecture 7, Alessandra Nardi. Thanks to Prof. Jacob White, Deepak Ramaswamy, Michal Rewienski, and Karen Veroy.

  2. Last lectures review
  • Overview of iterative methods to solve Mx = b
  • Stationary
  • Non-stationary
  • QR factorization
  • Modified Gram-Schmidt algorithm
  • Minimization view of QR
  • General subspace minimization algorithm
  • Generalized Conjugate Residual algorithm
  • Krylov subspace
  • Simplification in the symmetric case
  • Convergence properties
  • Eigenvalue and eigenvector review
  • Norms and spectral radius
  • Spectral mapping theorem

  3. Arbitrary Subspace Methods: Residual Minimization

  4. Arbitrary Subspace Methods: Residual Minimization. Use Gram-Schmidt on the Mw_i's!

  5. Krylov Subspace Methods: the Krylov subspace, and the connection between its iterates and a k-th order polynomial in M (definitions restated below).
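The slide's equations were images and did not survive the transcript; as a reference, the standard definitions they correspond to are:

\[ \mathcal{K}^k(r^0, M) = \operatorname{span}\{ r^0,\ M r^0,\ M^2 r^0,\ \dots,\ M^{k-1} r^0 \} \]

so any iterate x^k drawn from this subspace is a matrix polynomial applied to the initial residual,

\[ x^k = \sum_{i=0}^{k-1} a_i M^i r^0 = p_{k-1}(M)\, r^0, \]

and the corresponding residual is a k-th order polynomial in M with value 1 at zero:

\[ r^k = r^0 - M x^k = \bigl(I - M\, p_{k-1}(M)\bigr) r^0 = q_k(M)\, r^0, \qquad q_k(0) = 1. \]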

  6. Krylov Subspace Methods: Subspace Generation. The set of residuals can also be used as a representation of the Krylov subspace. This makes the Generalized Conjugate Residual algorithm attractive, because the residuals generate the next search directions.

  7. Krylov-Subspace Methods: Generalized Conjugate Residual Method (k-th step). Determine the optimal stepsize in the k-th search direction; update the solution (trying to minimize the residual) and the residual; compute the new orthogonalized search direction (using the most recent residual). A sketch in code follows.
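The formulas on this slide were images and are missing from the transcript. The following is a minimal Python sketch of the three steps named above (optimal stepsize, solution/residual update, orthogonalized new direction), assuming M is an n-by-n NumPy array or SciPy sparse matrix and the initial guess is x0 = 0; the function name gcr and the tolerance are illustrative choices, not the lecture's exact pseudocode.

import numpy as np

def gcr(M, b, tol=1e-10, max_iter=None):
    """Generalized Conjugate Residual: minimal sketch of the k-th step."""
    n = b.shape[0]
    if max_iter is None:
        max_iter = n
    x = np.zeros(n)
    r = b.astype(float).copy()      # r0 = b - M x0, with x0 = 0
    P, MP = [], []                  # search directions p_j and their images M p_j
    for k in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        # New search direction starts from the most recent residual
        p = r.copy()
        Mp = M @ p
        # Orthogonalize M p against previous (orthonormal) M p_j,
        # applying the same updates to p so that Mp stays equal to M p
        for pj, Mpj in zip(P, MP):
            beta = Mp @ Mpj
            p = p - beta * pj
            Mp = Mp - beta * Mpj
        nrm = np.linalg.norm(Mp)
        if nrm == 0.0:              # exact convergence, nothing left to search
            break
        p /= nrm
        Mp /= nrm
        # Optimal stepsize in the k-th search direction (Mp has unit norm)
        alpha = r @ Mp
        x += alpha * p              # update the solution (minimizes residual)
        r -= alpha * Mp             # update the residual
        P.append(p)
        MP.append(Mp)
    return x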

  8. Krylov-Subspace Methods: Generalized Conjugate Residual Method (computational complexity of the k-th step). Vector inner products: O(n). Matrix-vector product: O(n) if M is sparse. Vector adds: O(n). Orthogonalization against the previous directions: O(k) inner products, for a total cost of O(kn) per step. So even if M is sparse, as k (the number of iterations) approaches n the accumulated cost approaches O(n^3). Better converge fast!

  9. Summary
  • What an iterative non-stationary method is: x(k+1) = x(k) + ak pk
  • How to calculate:
    • the search directions (pk)
    • the step along the search directions (ak)
  • Krylov subspace → GCR
  • GCR is O(k²n)
  • Better converge fast! → Now look at the convergence properties of GCR

  10. Krylov Methods Convergence Analysis: Basic properties

  11. Krylov Methods Convergence Analysis: Optimality of the GCR polynomial. GCR optimality property: the k-th residual is minimal over all residual polynomials of order k. Therefore any polynomial which satisfies the constraints can be used to get an upper bound on the residual norm (restated below).
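In symbols (a standard statement of the property, since the slide's formulas were lost): with r^k = b - M x^k and x^k restricted to the Krylov subspace,

\[ \|r^k\|_2 = \min_{\substack{p \in \mathcal{P}_k,\ p(0) = 1}} \|p(M)\, r^0\|_2 \;\le\; \|q(M)\, r^0\|_2 \]

for any particular polynomial q of order at most k with q(0) = 1.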

  12. Eigenvalues and eigenvectors review: Induced norms. Theorem: any induced norm is an upper bound on the spectral radius. A proof sketch follows.
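A sketch of the proof that was lost with the slide's formulas, using only the definition of an induced norm: if M u_i = \lambda_i u_i with u_i \neq 0, then

\[ \|M\| = \max_{x \neq 0} \frac{\|Mx\|}{\|x\|} \;\ge\; \frac{\|M u_i\|}{\|u_i\|} = |\lambda_i| \quad \text{for every } i, \]

and hence \|M\| \ge \rho(M) = \max_i |\lambda_i|.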

  13. Useful Eigenproperties: Spectral Mapping Theorem. Given a polynomial, apply it to a matrix; then the eigenvalues of the resulting matrix are the polynomial applied to the eigenvalues of the original matrix (statement written out below).
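Written out (the standard statement, restated here because the slide's formulas were images): given f(x) = a_0 + a_1 x + \cdots + a_k x^k, define

\[ f(M) = a_0 I + a_1 M + \cdots + a_k M^k. \]

If M u = \lambda u, then f(M) u = f(\lambda) u, so the eigenvalues of f(M) are exactly f(\lambda_i) for the eigenvalues \lambda_i of M.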

  14. Krylov Methods Convergence Analysis: Overview. Combining the matrix norm property with the GCR optimality property: where p is any (k+1)-th order polynomial subject to p(0) = 1, the quantity ||p(M) r0|| may be used to get an upper bound on the residual norm.

  15. Krylov Methods Convergence Analysis: Overview
  • Review of eigenvalues and eigenvectors
  • Induced norms: relate matrix eigenvalues to matrix norms
  • Spectral mapping theorem: relates matrix eigenvalues to matrix polynomials
  • Now ready to relate the convergence properties of Krylov subspace methods to the eigenvalues of M

  16. Krylov Methods Convergence Analysis: Norm of matrix polynomials. The bound introduces cond(V), the condition number of the eigenvector matrix V (derivation sketched below).
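The derivation on the slide was an image; the standard argument it most likely followed, assuming M is diagonalizable as M = V \Lambda V^{-1}, is:

\[ p(M) = V\, p(\Lambda)\, V^{-1} \quad\Rightarrow\quad \|p(M)\|_2 \le \|V\|_2 \,\|V^{-1}\|_2 \,\max_i |p(\lambda_i)| = \operatorname{cond}(V)\, \max_i |p(\lambda_i)|. \]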

  17. Krylov Methods Convergence Analysis: Norm of matrix polynomials (continued)

  18. Krylov Methods Convergence Analysis: Important observations. 1) The GCR algorithm converges to the exact solution in at most n steps. 2) If M has only q distinct eigenvalues, the GCR algorithm converges in at most q steps.
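Observation 2 follows by exhibiting a q-th order polynomial that equals 1 at zero and vanishes at every distinct eigenvalue (a standard argument, restated since the slide's formulas were lost):

\[ p_q(x) = \prod_{i=1}^{q} \left(1 - \frac{x}{\lambda_i}\right), \qquad p_q(0) = 1, \quad p_q(\lambda_i) = 0, \]

so the matrix-polynomial bound above gives \|r^q\| \le \operatorname{cond}(V)\, \max_i |p_q(\lambda_i)|\, \|r^0\| = 0.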

  19. Krylov Methods Convergence Analysis: Convergence for M^T = M, residual polynomial. If M = M^T then: 1) M has orthonormal eigenvectors; 2) M has real eigenvalues.

  20. Krylov Methods Convergence Analysis: Residual Polynomial Picture (n = 10). [Figure: residual polynomials plotted along the real axis; * marks the eigenvalues of M, with a 5th-order and an 8th-order residual polynomial shown as curves.]

  21. Krylov Methods Convergence Analysis: Residual Polynomial Picture (n = 10). Strategically place the zeros of the polynomial.

  22. Krylov Methods Convergence Analysis: Convergence for M^T = M, the polynomial min-max problem.

  23. Krylov Methods Convergence Analysis: Convergence for M^T = M, Chebyshev solves the min-max problem. The Chebyshev polynomial is the solution; its definition is restated below.
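The definition the lost formula most likely stated: the Chebyshev polynomials are

\[ T_0(x) = 1, \quad T_1(x) = x, \quad T_{k+1}(x) = 2x\, T_k(x) - T_{k-1}(x), \qquad T_k(x) = \cos(k \arccos x) \ \text{ for } |x| \le 1, \]

and the min-max problem over [\lambda_{\min}, \lambda_{\max}] with p(0) = 1 is solved by the shifted, normalized Chebyshev polynomial

\[ p_k(x) = \frac{T_k\!\left( \dfrac{\lambda_{\max} + \lambda_{\min} - 2x}{\lambda_{\max} - \lambda_{\min}} \right)}{T_k\!\left( \dfrac{\lambda_{\max} + \lambda_{\min}}{\lambda_{\max} - \lambda_{\min}} \right)}. \]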

  24. Chebyshev polynomials minimizing over [1, 10]

  25. Krylov Methods Convergence Analysis: Convergence for M^T = M, Chebyshev bounds.

  26. Krylov Methods Convergence Analysis: Convergence for M^T = M, Chebyshev result.
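The standard form of the result this slide derives, assuming M is symmetric with positive eigenvalues (so cond(V) = 1 because the eigenvectors are orthonormal):

\[ \frac{\|r^k\|_2}{\|r^0\|_2} \le 2 \left( \frac{\sqrt{\kappa} - 1}{\sqrt{\kappa} + 1} \right)^{k}, \qquad \kappa = \frac{\lambda_{\max}}{\lambda_{\min}}, \]

so convergence is fast when the eigenvalue ratio \kappa is close to 1.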

  27. Krylov Methods Convergence Analysis: Examples. For which problem will GCR converge faster?

  28. Which convergence curve is GCR? [Figure: convergence curves plotted against iteration count.]

  29. Krylov Methods Convergence Analysis: Chebyshev is a bound. The Chebyshev estimate is only an upper bound: the GCR algorithm can eliminate outlying eigenvalues by placing polynomial zeros directly on them.

  30. Now we know: Iterative Methods - CG. Convergence is related to:
  • the number of distinct eigenvalues
  • the ratio between the max and min eigenvalue
  Why? How? (A small numerical illustration follows.)
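A small experiment illustrating both effects, using SciPy's conjugate gradient solver (CG applies here because the test matrices are symmetric positive definite). The diagonal test matrices and the particular eigenvalue choices are assumptions for demonstration, not from the lecture.

import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

def cg_iterations(eigenvalues):
    """Solve a diagonal SPD system with CG and count iterations."""
    A = diags(eigenvalues)
    b = np.ones(len(eigenvalues))
    counts = []
    cg(A, b, callback=lambda xk: counts.append(1))  # called once per iteration
    return len(counts)

# 300 unknowns but only 3 distinct eigenvalues:
# in exact arithmetic CG terminates in at most 3 iterations (cf. slide 18).
clustered = np.repeat([1.0, 5.0, 10.0], 100)
print("clustered:", cg_iterations(clustered))

# Same size and same max/min ratio (kappa = 10), but 300 distinct eigenvalues:
# noticeably more iterations, consistent with the Chebyshev bound in kappa.
spread = np.linspace(1.0, 10.0, 300)
print("spread:   ", cg_iterations(spread))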

  31. Summary
  • Reminder about GCR
  • Residual-minimizing solution
  • Krylov subspace
  • Polynomial connection
  • Review of eigenvalues
  • Induced norms bound the spectral radius
  • Spectral mapping theorem
  • Estimating convergence rate
  • Chebyshev polynomials
