
Principles of Computer Architecture Miles Murdocca and Vincent Heuring Chapter 7: Memory

This chapter explores the memory hierarchy, including RAM, chip organization, commercial memory modules, ROM, cache memory, virtual memory, advanced topics, and case studies.


Presentation Transcript


  1. Principles of Computer Architecture, Miles Murdocca and Vincent Heuring, Chapter 7: Memory

  2. Chapter Contents • 7.1 The Memory Hierarchy • 7.2 Random Access Memory • 7.3 Chip Organization • 7.4 Commercial Memory Modules • 7.5 Read-Only Memory • 7.6 Cache Memory • 7.7 Virtual Memory • 7.8 Advanced Topics • 7.9 Case Study: Rambus Memory • 7.10 Case Study: The Intel Pentium Memory System

  3. The Memory Hierarchy

  4. Functional Behavior of a RAM Cell

  5. Simplified RAM Chip Pinout

  6. A Four-Word Memory with Four Bits per Word in a 2D Organization

  7. A Simplified Representation of the Four-Word by Four-Bit RAM

  8. 2-1/2D Organization of a 64-Word by One-Bit RAM

  9. Two Four-Word by Four-Bit RAMs are Used in Creating a Four-Word by Eight-Bit RAM

  10. Two Four-Word by Four-Bit RAMs Make up an Eight-Word by Four-Bit RAM

  11. Single-In-Line Memory Module • Adapted from (Texas Instruments, MOS Memory: Commercial and Military Specifications Data Book, Texas Instruments, Literature Response Center, P.O. Box 172228, Denver, Colorado, 1991.)

  12. A ROM Stores Four Four-Bit Words

  13. A Lookup Table (LUT) Implements an Eight-Bit ALU

  14. Placement of Cache in a Computer System • The locality principle: a recently referenced memory location is likely to be referenced again (temporal locality); a neighbor of a recently referenced memory location is likely to be referenced (spatial locality).

  15. An Associative Mapping Scheme for a Cache Memory

  16. Associative Mapping Example • Consider how an access to memory location (A035F014)₁₆ is mapped to the cache for a 2³²-word memory. The memory is divided into 2²⁷ blocks of 2⁵ = 32 words per block, and the cache consists of 2¹⁴ slots. • If the addressed word is in the cache, it will be found in word (14)₁₆ of a slot that has tag (501AF80)₁₆, which is made up of the 27 most significant bits of the address. If the addressed word is not in the cache, then the block corresponding to tag field (501AF80)₁₆ is brought into an available slot in the cache from the main memory, and the memory reference is then satisfied from the cache.
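The address split in this example can be sketched in Python. The function name below is invented for illustration; the field widths (27-bit tag, 5-bit word) come from the slide:

```python
# Associative mapping: a 32-bit address splits into a 27-bit tag
# and a 5-bit word field (2^5 = 32 words per block).
ADDR_BITS = 32
WORD_BITS = 5                       # 32 words per block
TAG_BITS = ADDR_BITS - WORD_BITS    # 27 tag bits

def split_associative(addr):
    """Return the (tag, word) fields of a 32-bit memory address."""
    word = addr & ((1 << WORD_BITS) - 1)   # low 5 bits
    tag = addr >> WORD_BITS                # high 27 bits
    return tag, word

tag, word = split_associative(0xA035F014)
print(hex(tag), hex(word))          # 0x501af80 0x14
```

Running this on the example address reproduces the tag (501AF80)₁₆ and word (14)₁₆ from the slide.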

  17. Replacement Policies • When there are no available slots in which to place a block, a replacement policy is implemented. The replacement policy governs the choice of which slot is freed up for the new block. • Replacement policies are used for associative and set-associative mapping schemes, and also for virtual memory. • Least recently used (LRU) • First-in/first-out (FIFO) • Least frequently used (LFU) • Random • Optimal (used for analysis only – look backward in time and reverse-engineer the best possible strategy for a particular sequence of memory references.)
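As a software sketch of the most widely used of these policies, the fragment below (class name and access sequence invented for illustration) models a small fully associative cache with LRU replacement:

```python
from collections import OrderedDict

# A minimal model of a fully associative cache with LRU replacement.
class LRUCache:
    def __init__(self, num_slots):
        self.num_slots = num_slots
        self.slots = OrderedDict()          # block number -> contents

    def access(self, block):
        """Access a block; return True on a hit, False on a miss."""
        if block in self.slots:
            self.slots.move_to_end(block)   # mark most recently used
            return True
        if len(self.slots) >= self.num_slots:
            self.slots.popitem(last=False)  # evict least recently used
        self.slots[block] = None            # load block from main memory
        return False

cache = LRUCache(4)
hits = [cache.access(b) for b in [0, 2, 0, 3, 1, 5]]
# block 5 evicts block 2, the least recently used of {0, 2, 3, 1}
```

The `OrderedDict` keeps blocks in recency order, so the least recently used block is always at the front.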

  18. A Direct Mapping Scheme for Cache Memory

  19. Direct Mapping Example • For a direct mapped cache, each main memory block can be mapped to only one slot, but each slot can receive more than one block. Consider how an access to memory location (A035F014)₁₆ is mapped to the cache for a 2³²-word memory. The memory is divided into 2²⁷ blocks of 2⁵ = 32 words per block, and the cache consists of 2¹⁴ slots. • If the addressed word is in the cache, it will be found in word (14)₁₆ of slot (2F80)₁₆, which will have a tag of (1406)₁₆.
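The direct-mapped split adds a 14-bit slot field between the tag and the word field. A sketch (function name invented for illustration):

```python
# Direct mapping: 13-bit tag, 14-bit slot field, 5-bit word field.
WORD_BITS = 5      # 32 words per block
SLOT_BITS = 14     # 2^14 cache slots
TAG_BITS = 32 - SLOT_BITS - WORD_BITS   # 13 tag bits

def split_direct(addr):
    """Return the (tag, slot, word) fields of a 32-bit address."""
    word = addr & ((1 << WORD_BITS) - 1)
    slot = (addr >> WORD_BITS) & ((1 << SLOT_BITS) - 1)
    tag = addr >> (WORD_BITS + SLOT_BITS)
    return tag, slot, word

print([hex(f) for f in split_direct(0xA035F014)])
# ['0x1406', '0x2f80', '0x14']
```

This reproduces slot (2F80)₁₆, tag (1406)₁₆, and word (14)₁₆ from the slide.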

  20. A Set Associative Mapping Scheme for a Cache Memory

  21. Set-Associative Mapping Example • Consider how an access to memory location (A035F014)₁₆ is mapped to the cache for a 2³²-word memory. The memory is divided into 2²⁷ blocks of 2⁵ = 32 words per block, there are two blocks per set, and the cache consists of 2¹⁴ slots. • The leftmost 14 bits form the tag field, followed by 13 bits for the set field, followed by five bits for the word field.
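With two blocks per set, the 2¹⁴ slots form 2¹³ sets, so the set field shrinks to 13 bits and the tag grows to 14. A sketch of the split (function name invented for illustration):

```python
# Set-associative mapping: 14-bit tag, 13-bit set field, 5-bit word field.
WORD_BITS = 5      # 32 words per block
SET_BITS = 13      # 2^14 slots / 2 blocks per set = 2^13 sets
TAG_BITS = 32 - SET_BITS - WORD_BITS    # 14 tag bits

def split_set_associative(addr):
    """Return the (tag, set, word) fields of a 32-bit address."""
    word = addr & ((1 << WORD_BITS) - 1)
    set_index = (addr >> WORD_BITS) & ((1 << SET_BITS) - 1)
    tag = addr >> (WORD_BITS + SET_BITS)
    return tag, set_index, word

print([hex(f) for f in split_set_associative(0xA035F014)])
# ['0x280d', '0xf80', '0x14']
```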

  22. Cache Read and Write Policies

  23. Hit Ratios and Effective Access Times • Hit ratio and effective access time for a single-level cache: • Hit ratios and effective access time for a multi-level cache:
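The equations on this slide do not survive in the transcript, but the standard forms can be sketched as follows (the two-level version shown is one common formulation, in which L1 misses are tried in L2 before going to main memory):

```python
def effective_access_time(h, t_hit, t_miss):
    """Single-level cache: T_eff = h * T_hit + (1 - h) * T_miss."""
    return h * t_hit + (1 - h) * t_miss

def effective_access_time_2level(h1, t1, h2, t2, t_mem):
    """Two-level cache (one common form): misses in L1 are tried
    in L2, and misses in L2 go to main memory."""
    return h1 * t1 + (1 - h1) * (h2 * t2 + (1 - h2) * t_mem)

print(effective_access_time(0.5, 80, 2500))   # 1290.0
```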

  24. Direct Mapped Cache Example • Compute the hit ratio and effective access time for a program that executes from memory locations 48 to 95, and then loops 10 times from 15 to 31. • The direct mapped cache has four 16-word slots, a hit time of 80 ns, and a miss time of 2500 ns. Load-through is used. The cache is initially empty.
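A minimal simulation of this example, assuming that any access to a word whose block is already resident counts as a hit (with load-through, the full miss time is charged only on the access that triggers the block load):

```python
# Direct-mapped cache: four 16-word slots; a block maps to
# slot (block number) mod 4.
SLOTS, WORDS_PER_BLOCK = 4, 16
HIT_TIME, MISS_TIME = 80, 2500      # ns

def simulate(addresses):
    cache = [None] * SLOTS          # block number held by each slot
    hits = misses = 0
    for a in addresses:
        block = a // WORDS_PER_BLOCK
        slot = block % SLOTS
        if cache[slot] == block:
            hits += 1
        else:
            cache[slot] = block     # load the block from main memory
            misses += 1
    return hits, misses

# Locations 48..95 once, then 10 passes over 15..31.
trace = list(range(48, 96)) + 10 * list(range(15, 32))
hits, misses = simulate(trace)
hit_ratio = hits / len(trace)
eff_time = (hits * HIT_TIME + misses * MISS_TIME) / len(trace)
```

Under these counting assumptions the trace of 218 accesses produces 5 misses (blocks 3, 4, 5 on the first pass, then blocks 0 and 1 on the first loop iteration), since later loop iterations hit in blocks 0 and 1.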

  25. Table of Events for Example Program

  26. Calculation of Hit Ratio and Effective Access Time for Example Program

  27. Neat Little LRU Algorithm • A sequence is shown for the Neat Little LRU Algorithm for a cache with four slots. Main memory blocks are accessed in the sequence: 0, 2, 3, 1, 5, 4.
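The bookkeeping behind this algorithm can be sketched in software: keep an n × n bit matrix; on an access to slot i, set row i to all 1s and then clear column i. The least recently used slot is then the one whose row is all 0s (row sums give the full recency order). A sketch for four slots:

```python
# Matrix-based LRU bookkeeping for a 4-slot cache.
N = 4
matrix = [[0] * N for _ in range(N)]

def touch(slot):
    """Record an access to a slot."""
    for j in range(N):
        matrix[slot][j] = 1     # set the slot's row to all 1s
    for i in range(N):
        matrix[i][slot] = 0     # then clear the slot's column

def lru_slot():
    """The row with the fewest 1s (all 0s) is least recently used."""
    return min(range(N), key=lambda i: sum(matrix[i]))

for s in [0, 1, 2, 3]:
    touch(s)
print(lru_slot())   # 0 -- slot 0 is now least recently used
```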

  28. Overlays • A partition graph for a program with a main routine and three subroutines:

  29. Virtual Memory • Virtual memory is stored in a hard disk image. The physical memory holds a small number of virtual pages in physical page frames. • A mapping between a virtual and a physical memory:

  30. Page Table • The page table maps between virtual memory and physical memory.

  31. Using the Page Table • A virtual address is translated into a physical address:
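The translation step can be sketched as follows; the page size and page-table contents below are invented for illustration:

```python
# Hypothetical virtual-to-physical translation with a page table.
PAGE_BITS = 12                   # assume 4096-word pages
page_table = {0: 3, 1: 7, 4: 2}  # resident virtual page -> page frame

def translate(vaddr):
    """Split a virtual address into (page, offset) and rebuild it
    with the physical frame number from the page table."""
    vpage = vaddr >> PAGE_BITS
    offset = vaddr & ((1 << PAGE_BITS) - 1)
    if vpage not in page_table:
        raise LookupError("page fault: virtual page %d not resident" % vpage)
    return (page_table[vpage] << PAGE_BITS) | offset

print(hex(translate(0x1234)))    # virtual page 1 -> frame 7: 0x7234
```

A reference to a page absent from the table raises a page fault, at which point the operating system would bring the page in from disk.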

  32. Using the Page Table (cont'd) • The configuration of a page table changes as a program executes. • Initially, the page table is empty. In the final configuration, four pages are in physical memory.

  33. Segmentation • A segmented memory allows two users to share the same word processor code, with different data spaces:

  34. Fragmentation • (a) Free area of memory after initialization; (b) after fragmentation; (c) after coalescing.

  35. Translation Lookaside Buffer • An example TLB holds 8 entries for a system with 32 virtual pages and 16 page frames.
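A sketch of a TLB sitting in front of the page table, using the sizes from the slide (8 TLB entries, 32 virtual pages, 16 page frames); the page-table mapping and eviction policy here are invented for illustration:

```python
# Hypothetical TLB in front of a page table.
TLB_SIZE = 8
tlb = {}                                         # virtual page -> frame
page_table = {vp: vp % 16 for vp in range(32)}   # invented mapping

def lookup(vpage):
    """Return (frame, hit): hit is True when the TLB already holds
    the translation, False when the page table must be consulted."""
    if vpage in tlb:
        return tlb[vpage], True          # fast path: TLB hit
    frame = page_table[vpage]            # slow path: walk the page table
    if len(tlb) >= TLB_SIZE:
        tlb.pop(next(iter(tlb)))         # evict the oldest entry (FIFO)
    tlb[vpage] = frame
    return frame, False

frame, hit = lookup(5)      # first access: TLB miss, frame from table
frame, hit = lookup(5)      # second access: TLB hit
```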

  36. 3-Variable Decoder • A conventional decoder is not extensible to large sizes because the fan-out doubles with each added address line: every address line must drive twice as many logic gates.

  37. Tree Decoder – 3 Variables • A tree decoder is more easily extended to large sizes because fan-in and fan-out are managed by adding deeper levels.

  38. Tree Decoding – One Level at a Time • A decoding tree for a 16-word random access memory:

  39. Content Addressable Memory – Addressing • Relationships between random access memory and content addressable memory:

  40. Overview of CAM • Source: (Foster, C. C., Content Addressable Parallel Processors, Van Nostrand Reinhold Company, 1976.)

  41. Addressing Subtrees for a CAM

  42. Block Diagram of Dual-Read RAM

  43. Rambus Memory • Rambus technology on the Nintendo 64 motherboard (top left and bottom right) enables cost savings over the conventional Sega Saturn motherboard design (bottom left). (Photo source: Rambus, Inc.)

  44. The Intel Pentium Memory System
