
A Comparison of Some Iterative Methods in Scientific Computing

Shawn Sickel, Man-Chung Yeung, Jon Held (Department of Mathematics)



Introduction: Large sparse linear systems affect our lives from every direction. In high school, everybody is taught the Gaussian Elimination method. If a 1000x1000 matrix were solved using Gaussian Elimination, it would take about 334,333,000 arithmetic steps, which would take well over 300 years to complete by hand. In this presentation, I will compare and contrast the methods used today for solving large sparse linear systems.

Discussion: The top graph, which compares the iteration speed of the Jacobi Method and the Gauss-Seidel Method, clearly shows that the Gauss-Seidel Method is better. Both methods require the spectral radius of the iteration matrix (the magnitude of its largest eigenvalue) to be less than 1. Gauss-Seidel converges faster because its spectral radius is smaller than Jacobi's: the smaller the spectral radius, the faster the iterations converge. The second graph, which compares the Conjugate Gradient Method and the BiConjugate Gradient Method on a non-SPD (not symmetric positive definite) matrix, shows that CG does not converge but BiCG does. BiCG was designed to handle general linear systems, whereas CG was designed to solve SPD systems only. The third graph compares the BiConjugate Gradient Method and the BiConjugate Gradient Stabilized Method. In this case, both methods converge. BiCG can handle general linear systems, but it requires products with the transpose of A; if A-transpose is not available, it cannot proceed. BiCGSTAB was created for situations that lack such information.

Purpose:
How is the Gaussian Elimination (GE) method outdated?
What is an iterative method? An iterative method uses educated guesses to find closer and more accurate approximations at each step.
Which algorithms are compared in this work? In this work, the Gauss-Seidel (GS), Jacobi, Conjugate Gradient (CG), BiConjugate Gradient (BiCG), and BiConjugate Gradient Stabilized (BiCGSTAB) Methods are evaluated.
Which iterative method is the fastest?
Each Krylov subspace method and each basic iterative method has unique conditions that limit the kinds of systems it can solve.

Acknowledgements: For the success of my paper, I would like to thank my fellow SRAPers and the SRAP staff for encouragement and support. I would not have been able to complete this research paper without the help and guidance of Man-Chung Yeung and Jon Held. They taught me how to program in Matlab, helped edit my paper, and showed me how the different iterative methods work.

Methods: My professor and a graduate student guided me through learning the different methods, as well as scientific computing. First, I learned how to solve linear systems by hand using GE. Then I was introduced to the GS and Jacobi methods. Once I had experienced the time it takes to solve even a simple 10x10 matrix with those, I learned that in real life people deal with matrices every day, some with dimensions of over a million by a million. I was also taught to use Matlab 7.0; this programming tool was used to plot the iteration graphs and to solve for eigenvalues instantly.
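The methods above can be sketched in a few lines. The following is a minimal pure-Python illustration (the original experiments used Matlab 7.0, not Python) of Jacobi, Gauss-Seidel, and Conjugate Gradient; the 4x4 test system is a hypothetical example, chosen to be symmetric positive definite and strictly diagonally dominant so that all three methods are guaranteed to converge.

```python
def matvec(A, x):
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

def residual_norm(A, x, b):
    r = [bi - axi for bi, axi in zip(b, matvec(A, x))]
    return sum(ri * ri for ri in r) ** 0.5

def jacobi(A, b, tol=1e-8, max_iter=1000):
    n = len(b)
    x = [0.0] * n
    for k in range(1, max_iter + 1):
        # Every update uses only values from the previous sweep.
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
        if residual_norm(A, x, b) < tol:
            return x, k
    return x, max_iter

def gauss_seidel(A, b, tol=1e-8, max_iter=1000):
    n = len(b)
    x = [0.0] * n
    for k in range(1, max_iter + 1):
        # Updated components are reused immediately within the same sweep,
        # which is why Gauss-Seidel typically needs fewer iterations.
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
        if residual_norm(A, x, b) < tol:
            return x, k
    return x, max_iter

def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
    # Standard CG; valid only when A is symmetric positive definite.
    n = len(b)
    x = [0.0] * n
    r = list(b)          # residual of the zero initial guess: r = b - A*0
    p = list(r)
    rs_old = sum(ri * ri for ri in r)
    for k in range(1, max_iter + 1):
        Ap = matvec(A, p)
        alpha = rs_old / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new ** 0.5 < tol:
            return x, k
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x, max_iter

# Hypothetical test system: symmetric, strictly diagonally dominant, hence SPD.
A = [[10.0, -1.0,  2.0,  0.0],
     [-1.0, 11.0, -1.0,  3.0],
     [ 2.0, -1.0, 10.0, -1.0],
     [ 0.0,  3.0, -1.0,  8.0]]
b = [6.0, 25.0, -11.0, 15.0]

x_j,  it_j  = jacobi(A, b)
x_gs, it_gs = gauss_seidel(A, b)
x_cg, it_cg = conjugate_gradient(A, b)
print(f"Jacobi: {it_j} sweeps, Gauss-Seidel: {it_gs} sweeps, CG: {it_cg} iterations")
```

On this system Gauss-Seidel needs fewer sweeps than Jacobi, illustrating the spectral-radius argument from the Discussion, and CG finishes in at most n = 4 iterations, since in exact arithmetic CG on an SPD n x n system converges within n steps.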
