
Types of Parallelism



  1. Types of Parallelism Chapter 17 Justin Bellomi

  2. Characterizations of Parallelism • Computer architects characterize the type and amount of parallelism that a design has, instead of simply classifying it as parallel or non-parallel. • Virtually all computer systems have some sort of parallelism. • Flynn defined a list of key characterizations using a system of names in 1966.

  3. Flynn Characterizations: Terms used to describe Parallel systems • Microscopic Vs. Macroscopic • Symmetric Vs. Asymmetric • Fine-grain Vs. Coarse-grain • Explicit Vs. Implicit

  4. Microscopic Vs. Macroscopic • “Parallelism is so fundamental that an architect cannot design a computer without thinking about parallel hardware” ( Comer 280 ). • Microscopic refers to aspects of parallelism that are not especially visible. • Macroscopic refers to the use of parallelism as the basic design principle of a system.

  5. Microscopic • The term Microscopic is used to describe parallelism that is present in a system, but not necessarily visible. • To be more specific, microscopic parallelism refers to the use of parallel hardware within a specific component. • “Without parallel hardware, various components of a computer system cannot operate at high speed” ( Comer 280 ).

  6. Examples of Microscopic Parallelism: • ALU – Most ALUs perform integer computation by processing multiple bits at a time. An ALU can be designed to compute an XOR on a pair of integers in a single operation. • Registers – General Purpose registers in a CPU heavily use microscopic parallelism. Each bit in a register is implemented by a separate circuit, and parallel hardware is used to move data from the registers to the ALUs.

  7. Examples of Microscopic Parallelism: Physical Memory – fetch and store operations use hardware that is designed to transfer an entire word on each operation.
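The ALU example above can be pictured in software. The sketch below (a minimal illustration, assuming a 32-bit word; the function names are invented) contrasts one word-wide XOR, in which the hardware combines every bit position simultaneously, with a bit-serial loop that mimics hardware lacking that microscopic parallelism.

```python
# Assumed word size for illustration; real ALUs fix this in hardware.
WORD_BITS = 32

def xor_word(a: int, b: int) -> int:
    """One word-wide XOR: an ALU with one XOR gate per bit position
    computes all bits of the result in a single operation."""
    return (a ^ b) & ((1 << WORD_BITS) - 1)

def xor_bit_serial(a: int, b: int) -> int:
    """The same result computed one bit at a time, mimicking hardware
    with no microscopic parallelism."""
    result = 0
    for i in range(WORD_BITS):
        bit = ((a >> i) & 1) ^ ((b >> i) & 1)
        result |= bit << i
    return result
```

Both functions produce identical results; the difference is that the first corresponds to a single parallel operation while the second takes one step per bit.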

  8. Macroscopic Parallelism The term Macroscopic parallelism is used to characterize the use of parallelism across multiple, large-scale components of a computer system.

  9. Examples of Macroscopic Parallelism: • Multiple, Identical Processors – Advertised ‘dual-processor’ PCs contain two identical CPU chips. The hardware is designed to allow both chips to function at the same time. • Multiple, Dissimilar Processors – A system that uses special-purpose coprocessors. “For example, a computer optimized for high-speed graphics might have four displays attached, with a special graphics processor running each display” ( Comer 282 ).

  10. Symmetric Vs. Asymmetric • The term symmetric parallelism is used to characterize a design that uses multiple identical elements, such as processors, that have the ability to operate simultaneously. • Asymmetric parallelism uses elements that are not identical.

  11. Symmetric parallelism An example of symmetric parallelism is a dual-processor PC: assuming the two processors are identical, the resulting PC is considered symmetric.

  12. Asymmetric parallelism A PC that has a graphics coprocessor and a math coprocessor is classified as using asymmetric parallelism: the processors can work simultaneously, but each is designed for a different task.
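A software analogy for the two slides above (a sketch only; the worker functions are invented stand-ins, not part of the source): symmetric parallelism resembles a pool of identical workers where any worker can take any task, while asymmetric parallelism resembles dissimilar workers, each specialized for one kind of task.

```python
from concurrent.futures import ThreadPoolExecutor

# Symmetric analogy: identical workers; any one can handle any task.
def worker(x):
    return x * x

with ThreadPoolExecutor(max_workers=2) as pool:
    symmetric_results = list(pool.map(worker, [1, 2, 3, 4]))

# Asymmetric analogy: dissimilar "processors", each specialized.
def graphics_task(frame):   # stands in for a graphics coprocessor
    return f"rendered {frame}"

def math_task(x):           # stands in for a math coprocessor
    return x ** 0.5

with ThreadPoolExecutor(max_workers=2) as pool:
    f1 = pool.submit(graphics_task, "frame-1")
    f2 = pool.submit(math_task, 16)
    asymmetric_results = (f1.result(), f2.result())
```

In both cases the work proceeds simultaneously; what differs is whether the parallel elements are interchangeable.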

  13. Fine-grain Vs. Coarse-grain Parallelism • Fine-grain parallelism refers to computers that provide parallelism on the level of single instructions and single data elements. • Coarse-grain parallelism refers to computers that deal with whole programs and large portions of data.
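One way to picture the grain-size distinction in software (a sketch under the assumption that threads stand in for parallel hardware; the names are invented): in the fine-grain view the unit of parallel work is a single data element, while in the coarse-grain view it is a whole function applied to a large portion of the data.

```python
from concurrent.futures import ThreadPoolExecutor

data = list(range(8))

# Fine-grain view: each element is an independent unit of work; on
# SIMD-style hardware all eight additions would happen at once.
fine = [x + 1 for x in data]

# Coarse-grain view: the unit of parallel work is an entire function
# run over a large chunk of the data.
def process_chunk(chunk):
    return [x + 1 for x in chunk]

with ThreadPoolExecutor(max_workers=2) as pool:
    halves = pool.map(process_chunk, [data[:4], data[4:]])
coarse = [x for half in halves for x in half]
```

The results are identical; only the size of the unit scheduled in parallel differs.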

  14. Explicit Vs. Implicit Parallelism • Explicit parallelism requires a programmer to assume control of the parallel unit. • Implicit parallelism does not require the programmer to take control. • An example of explicit parallelism is a hardware lock on a section of data.
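The lock example above can be sketched at the software level (with `threading.Lock` standing in for a hardware lock; the shared counter is invented for illustration): the programmer explicitly takes control of the parallel unit by acquiring the lock before touching shared data. Implicit parallelism, by contrast, happens without any such programmer action.

```python
import threading

counter = 0
lock = threading.Lock()  # the programmer explicitly manages this lock

def increment(n):
    global counter
    for _ in range(n):
        with lock:       # explicit: acquire before updating shared data
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# With the explicit lock, the four threads' updates never interleave
# mid-update, so the final count is exactly 4 * 10_000.
```

Without the lock the threads could interleave their read-modify-write sequences and lose updates; the explicit lock is what makes the parallelism safe.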

  15. Questions? • Microscopic Vs. Macroscopic • Symmetric Vs. Asymmetric • Fine-grain Vs. Coarse-grain • Explicit Vs. Implicit

  16. Works Cited: Comer, Douglas E. Essentials of Computer Architecture. Upper Saddle River, NJ: Pearson Education, Inc., 2005.
