This guide introduces the key concepts in parallel computing, from computational models to algorithm design to performance evaluation. Learn about Flynn's Taxonomy, parallel computational models, granularity, speedup metrics, and efficiency analysis. Discover why parallel and distributed computing is needed to overcome the limitations of sequential systems on large-scale, real-time, and inherently distributed problems. Delve into Amdahl's and Gustafson's laws, and explore the reasons for studying parallel and distributed computing.
Chapter 1 Introduction and General Concepts
Outline
• Need for Parallel & Distributed Computing
• Flynn's Taxonomy of Parallel Computers
• Two Main Types of MIMD Computers
• Examples of Computational Models
• Data Parallel & Functional/Control/Job Parallel
• Granularity
• Analysis of Parallel Algorithms
• Elementary Steps: computational and routing steps
• Running Time & Time Optimal
• Parallel Speedup (previewed in the sketch below)
• Speedup
• Cost and Work
• Efficiency
• Linear and Superlinear Speedup
• Speedup and Slowdown Folklore Theorems
• Amdahl's and Gustafson's Laws
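As a quick preview of the speedup metrics named in the outline, the short C sketch below shows how speedup, cost, efficiency, and the Amdahl bound are typically computed. The running times, processor count, and serial fraction are illustrative assumptions, not measurements from the course.

```c
/* Minimal sketch of the speedup metrics listed in the outline.
 * All numeric values below are illustrative assumptions. */
#include <stdio.h>

int main(void) {
    double t_serial   = 64.0;  /* assumed sequential running time T(1) */
    double t_parallel = 4.5;   /* assumed parallel running time T(p)   */
    int    p          = 16;    /* number of processors                 */

    double speedup    = t_serial / t_parallel;  /* S(p) = T(1) / T(p) */
    double cost       = p * t_parallel;         /* cost = p * T(p)    */
    double efficiency = speedup / p;            /* E(p) = S(p) / p    */

    /* Amdahl's law: with inherently sequential fraction f, speedup is
     * bounded by 1 / (f + (1 - f) / p), approaching 1/f as p grows.   */
    double f = 0.05;                            /* assumed serial fraction */
    double amdahl_bound = 1.0 / (f + (1.0 - f) / p);

    printf("speedup    = %.2f\n", speedup);
    printf("cost       = %.2f\n", cost);
    printf("efficiency = %.2f\n", efficiency);
    printf("Amdahl bound for f=%.2f, p=%d: %.2f\n", f, p, amdahl_bound);
    return 0;
}
```

With these assumed numbers the speedup is about 14.2 on 16 processors (efficiency about 0.89), while Amdahl's law caps the achievable speedup at 1/f = 20 no matter how many processors are added; Gustafson's law, covered later, relaxes this by letting the problem size grow with p.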
Reasons to Study Parallel & Distributed Computing
• Sequential computers have severe limits on memory size.
• Significant slowdowns occur when accessing data stored on external devices.
• Sequential computation times for most large problems are unacceptable.
• Sequential computers cannot meet the deadlines of many real-time problems.
• Many problems are distributed in nature and are naturally suited to distributed computation.