
15-213 Recitation 6 Greg Reshko

This document provides a quick review of cache parameters and addressing, followed by detailed examples of direct-mapped, set associative, and fully associative caches. It includes step-by-step explanations and addresses common cache miss scenarios.


Presentation Transcript


  1. 15-213 Recitation 6 Greg Reshko Office Hours: Wed 2:00-3:00PM March 10th, 2003

  2. Outline • Exam 1 • Cache • Quick Review • Detailed Examples

  3. Exam 1 • Regrades are done • Any questions?

  4. Review: Addressing • Address A (bits m–1 down to 0) is split into three fields: <tag> (t bits), <set index> (s bits), <line offset> (b bits). • Address A is in the cache if its tag matches the tag of one of the valid lines in the set selected by the set index of A. [Figure: each set holds lines consisting of a valid bit, a tag, and bytes 0 … B–1.]

  5. Review: Parameters • B = 2^b = line size in bytes • E = associativity (lines per set) • S = 2^s = number of sets • Cache size C = B × E × S • s = number of set index bits • b = number of byte offset bits • t = number of tag bits • m = address size in bits • t + s + b = m

  6. Simple example: cache parameters • 8 KB direct-mapped cache with 64-byte lines • word size is 32 bits • A direct-mapped cache has an associativity of 1 • Determine t, s, and b • B = 2^b = 64, so b = 6 • B × E × S = C = 8192 (8 KB), and we know E = 1 • S = 2^s = C / B = 128, so s = 7 • t = m – s – b = 32 – 7 – 6 = 19 • Address layout: tag = bits 31–13, set index = bits 12–6, byte offset = bits 5–0
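For concreteness, here is a small C sketch (my illustration, not part of the original recitation) that extracts the tag, set index, and byte offset from a 32-bit address using the b = 6 and s = 7 computed above; the example address is arbitrary:

```c
#include <stdint.h>
#include <stdio.h>

/* Field widths for the 8 KB direct-mapped cache with 64-byte lines (slide 6). */
#define B_BITS 6   /* b: byte offset bits, B = 2^6 = 64  */
#define S_BITS 7   /* s: set index bits,   S = 2^7 = 128 */

int main(void) {
    uint32_t addr   = 0x12345678;                              /* arbitrary example address */
    uint32_t offset = addr & ((1u << B_BITS) - 1);             /* bits 5..0   */
    uint32_t set    = (addr >> B_BITS) & ((1u << S_BITS) - 1); /* bits 12..6  */
    uint32_t tag    = addr >> (B_BITS + S_BITS);               /* bits 31..13 */
    printf("tag = 0x%x, set = %u, offset = %u\n",
           (unsigned)tag, (unsigned)set, (unsigned)offset);
    return 0;
}
```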

  7. Direct-Mapped Cache • Three steps to extract data from cache: • Set Selection • Use set index bits. • Line Matching • Easy, since one line per set. • Word Selection • Use block offset bits. What this means should be clear in a moment…
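A minimal C sketch of those three steps for a direct-mapped lookup; this is an illustration under assumed parameters (S = 128 sets, B = 64-byte lines, as in slide 6), not code from the recitation:

```c
#include <stdint.h>

#define S 128   /* number of sets (assumed, matches slide 6) */
#define B 64    /* bytes per line (assumed, matches slide 6) */

struct line { int valid; uint32_t tag; uint8_t data[B]; };
struct line cache[S];   /* direct-mapped: exactly one line per set */

/* Returns 1 on a hit and copies the requested byte to *out; returns 0 on a miss. */
int lookup(uint32_t addr, uint8_t *out) {
    uint32_t offset = addr % B;        /* word (byte) selection                   */
    uint32_t set    = (addr / B) % S;  /* set selection                           */
    uint32_t tag    = addr / (B * S);  /* remaining high bits form the tag        */
    struct line *l = &cache[set];      /* line matching: only one candidate line  */
    if (l->valid && l->tag == tag) {
        *out = l->data[offset];
        return 1;
    }
    return 0;                          /* miss: caller must fetch the block from memory */
}
```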

  8. Initially empty cache • Determine cache parameters first… • Cache parameters: S = 4, i.e. 4 sets; E = 1, i.e. 1 line per set; B = 2, i.e. 2 bytes per block; m = 4, i.e. 4-bit addresses • S = 2^2 => s = 2 • B = 2^1 => b = 1 • t = m – s – b = 4 – 2 – 1 = 1 => t = 1
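One possible way (my illustration, not the slides' code) to model this tiny cache in C and print the tag/set/offset split for every 4-bit address:

```c
#include <stdint.h>
#include <stdio.h>

/* Tiny example cache: S = 4 sets, E = 1 line per set, B = 2 bytes per block, m = 4 bits. */
struct line { int valid; unsigned tag; uint8_t block[2]; };
struct line cache[4];   /* file-scope, so every valid bit starts at 0: the cache is empty */

int main(void) {
    /* Split each 4-bit address into tag (1 bit), set index (2 bits), block offset (1 bit). */
    for (unsigned a = 0; a < 16; a++)
        printf("addr %2u -> tag %u, set %u, offset %u\n",
               a, (a >> 3) & 1, (a >> 1) & 3, a & 1);
    return 0;
}
```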

  9. Read word at address 0 • Address 0 = 0b0000 = <tag 0 | set 00 | offset 0> • Therefore: Tag = 0, Set = 00, Block offset = 0 • This is a cache miss, since the valid bit for this set is 0. • Fetch m[0] and m[1] and store them in the line.

  10. Read word at address 1 • Address 1 = 0b0001 = <tag 0 | set 00 | offset 1> • Therefore: Tag = 0, Set = 00, Block offset = 1 • This is a cache hit, since the valid bit for this set is 1 and the tags match. • Nothing changes.

  11. Read word at address 13 • Address 13 = 0b1101 = <tag 1 | set 10 | offset 1> • Therefore: Tag = 1, Set = 10, Block offset = 1 • This is a cache miss, since the valid bit for this set is 0. • Fetch m[12] and m[13] and store them in the line.

  12. Read word at address 8 • Address 8 = 0b1000 = <tag 1 | set 00 | offset 0> • Therefore: Tag = 1, Set = 00, Block offset = 0 • This is a cache miss: the set is valid, but its stored tag (0) does not match the address tag (1). • Since m[8] maps to the same set as m[0], we overwrite m[0] and m[1] (an eviction). • Fetch m[8] and m[9] and store them in the line.

  13. Read word at address 0 • Address 0 = 0b0000 = <tag 0 | set 00 | offset 0> • Therefore: Tag = 0, Set = 00, Block offset = 0 • This is a cache miss, since the tags do not match. • Since m[0] maps to the same set as m[8], we overwrite m[8], which we just stored there. Repeatedly evicting data we are about to use again is called thrashing. • Fetch m[0] and m[1] and store them in the line.
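A self-contained driver (again, an illustrative sketch rather than the recitation's code) that replays the access sequence from slides 9-13 on the tiny cache and reports each hit, cold miss, and eviction:

```c
#include <stdint.h>
#include <stdio.h>

/* Same tiny cache as slide 8: 4 sets, 1 line per set, 2-byte blocks, 4-bit addresses. */
struct line { int valid; unsigned tag; uint8_t block[2]; };
static struct line cache[4];
static uint8_t mem[16];   /* stand-in for main memory m[0..15] */

static void access_addr(unsigned a) {
    unsigned s = (a >> 1) & 0x3;   /* set index: middle two bits */
    unsigned t = (a >> 3) & 0x1;   /* tag: top bit               */
    if (cache[s].valid && cache[s].tag == t) {
        printf("addr %2u: hit\n", a);
        return;
    }
    printf("addr %2u: miss%s\n", a,
           cache[s].valid ? " (evicts the line already in this set)" : " (cold)");
    cache[s].valid = 1;
    cache[s].tag   = t;
    cache[s].block[0] = mem[a & ~1u];         /* fetch both bytes of the block */
    cache[s].block[1] = mem[(a & ~1u) | 1u];
}

int main(void) {
    unsigned trace[] = {0, 1, 13, 8, 0};      /* the accesses from slides 9-13 */
    for (int i = 0; i < 5; i++)
        access_addr(trace[i]);
    return 0;   /* prints: cold miss, hit, cold miss, miss+evict, miss+evict (thrashing) */
}
```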

  14. Set Associative Cache • Three steps to extract data from cache: • Set Selection • Same as direct-mapped. Use set index bits. • Line Matching • Check tag and valid bits for all lines in the set. • Word Selection • Same as direct-mapped. Use block offset bits. What this means should be clear in a moment…
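A sketch of a set-associative lookup, under assumed parameter values: set selection and word selection are the same as before, but line matching now loops over all E lines in the set.

```c
#include <stdint.h>

#define S 64   /* number of sets (assumed)     */
#define E 2    /* associativity: lines per set */
#define B 32   /* bytes per line (assumed)     */

struct line { int valid; uint32_t tag; uint8_t data[B]; };
struct line cache[S][E];

/* Returns 1 on a hit and writes the requested byte to *out; returns 0 on a miss. */
int lookup(uint32_t addr, uint8_t *out) {
    uint32_t offset = addr % B;
    uint32_t set    = (addr / B) % S;              /* set selection: same as direct-mapped */
    uint32_t tag    = addr / (B * S);
    for (int i = 0; i < E; i++) {                  /* line matching: check every line in the set */
        if (cache[set][i].valid && cache[set][i].tag == tag) {
            *out = cache[set][i].data[offset];     /* word selection: same as direct-mapped */
            return 1;
        }
    }
    return 0;
}
```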

  15. 1. Set Selection • The set index bits of address A select one of the S sets. [Figure: sets 0 through S–1, each holding lines made of a valid bit, a tag, and bytes 0 … B–1.]

  16. 2. Line Matching • To access a location, the tags must match and the entry must be valid… [Figure: the lines of the selected set, each with a valid bit ("Valid or not?"), a tag, and bytes 0 … B–1.]

  17. 3. Word Selection • The block offset bits of address A select the requested word within the matching line. [Figure: a line in the selected set, with a valid bit, a tag, and bytes 0 … B–1.]

  18. How to actually access it? • Same idea • Convert Address to <Tag/Set/Block> • Convert <Tag/Set/Block> to Cache Location

  19. Fully Associative Cache • Three steps to extract data from cache: • Set Selection • There is only one set and hence no set bits. • Line Matching • Same as in set associative cache, except that there are many more lines per set now. • Word Selection • Same as in set associative cache. What this means should hopefully be clear by now.
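For completeness, the same lookup with a single set: there are no set index bits, so line matching scans every line in the cache (a sketch with an assumed number of lines):

```c
#include <stdint.h>

#define E 16   /* total number of lines (assumed); they all live in one set */
#define B 32   /* bytes per line (assumed)                                  */

struct line { int valid; uint32_t tag; uint8_t data[B]; };
struct line cache[E];   /* one set holding every line */

int lookup(uint32_t addr, uint8_t *out) {
    uint32_t offset = addr % B;
    uint32_t tag    = addr / B;   /* no set index bits: everything above the offset is tag */
    for (int i = 0; i < E; i++)   /* line matching scans every line in the cache */
        if (cache[i].valid && cache[i].tag == tag) {
            *out = cache[i].data[offset];
            return 1;
        }
    return 0;
}
```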

  20. Conclusion • Good caching = faster code • Spatial locality (accessing nearby locations) • Temporal locality (accessing the same location many times) • Blocking (splitting the work into cache-sized chunks) • Etc.
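As one concrete illustration of blocking (mine, not from the recitation), a blocked matrix multiply works on one tile at a time so each tile stays in the cache while it is reused; N and the tile size BSZ are arbitrary example values:

```c
#define N   512   /* matrix dimension (example value)                            */
#define BSZ 32    /* tile size; ideally chosen so the working tiles fit in cache */

/* C += A * B, computed one BSZ x BSZ tile at a time for better cache reuse. */
void matmul_blocked(double A[N][N], double B[N][N], double C[N][N]) {
    for (int ii = 0; ii < N; ii += BSZ)
        for (int jj = 0; jj < N; jj += BSZ)
            for (int kk = 0; kk < N; kk += BSZ)
                for (int i = ii; i < ii + BSZ; i++)
                    for (int j = jj; j < jj + BSZ; j++) {
                        double sum = C[i][j];
                        for (int k = kk; k < kk + BSZ; k++)
                            sum += A[i][k] * B[k][j];
                        C[i][j] = sum;
                    }
}
```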
