
Compressed Memory Hierarchy


Presentation Transcript


  1. Compressed Memory Hierarchy Dongrui SHE Jianhua HUI

  2. The research paper: • A compressed memory hierarchy using an indirect index cache, by Erik G. Hallnor and Steven K. Reinhardt, Advanced Computer Architecture Laboratory, EECS Department, University of Michigan

  3. Outline • Introduction • Memory eXpansion Technology • Cache-compression • IIC & IIC-C • Evaluation • Summary

  4. Introduction: memory capacity and memory bandwidth • Cache capacity cannot be increased without bound • Memory bandwidth is a scarce resource

  5. Applications of data compression • First, adding a compressed main-memory system (Memory eXpansion Technology, MXT) • Second, storing compressed data in the cache, so that data is transferred in compressed form between main memory and the cache

  6. A key challenge • Management of variable-sized data blocks: after compression, a 128-byte block leaves 58 bytes of its fixed-size frame unused (see the sketch below)
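The slide's numbers imply a compressed size of 128 - 58 = 70 bytes; the following minimal C sketch just makes that arithmetic explicit, showing how much of a fixed-size frame goes to waste when the block shrinks (the 70-byte figure is inferred from the slide, not stated in the paper):

    #include <stdio.h>

    int main(void)
    {
        const int frame_size      = 128; /* conventional fixed block/frame size      */
        const int compressed_size = 70;  /* assumed size after compression (128-58)  */

        /* In a conventional cache the remaining bytes cannot be reused. */
        printf("unused bytes in frame: %d\n", frame_size - compressed_size); /* 58 */
        return 0;
    }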

  7. Outline • Introduction • Memory eXpansion Technology(MXT) • Cache-compression • IIC & IIC-C • Evaluation • Summary

  8. Memory eXpansion Technology • A server-class system with hardware-compressed main memory • Uses the LZSS compression algorithm; for most applications it achieves roughly two-to-one (2:1) compression • Hardware compression of memory has a negligible performance penalty

  9. Hardware organization • Sector translation table: each entry holds 4 physical addresses, each pointing to a 256 B sector (see the sketch below)
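A hedged C sketch of what a sector translation table entry might look like, based only on what the slide states (four physical sector pointers per entry, 256 B sectors); the field names, widths, and the used-sector count are illustrative assumptions, not the real MXT entry layout:

    #include <stdint.h>

    #define SECTORS_PER_ENTRY 4
    #define SECTOR_SIZE       256

    /* One sector translation table (STT) entry: four physical addresses,
     * each pointing at a 256 B sector of compressed data. */
    struct stt_entry {
        uint64_t sector_addr[SECTORS_PER_ENTRY]; /* physical sector pointers          */
        uint8_t  sectors_used;                   /* how many sectors actually hold data */
    };

    /* A well-compressed line needs fewer than four sectors, so the unused
     * sector pointers simply stay unallocated. */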

  10. Outline • Introduction • Memory eXpansion Technology(MXT) • Cache-compression • IIC & IIC-C • Evaluation • Summary

  11. Cache compression • Most existing designs target power savings and keep conventional cache structures, so the storage freed by compression only saves power by going unused • To actually use the space freed by compression, a new cache structure is needed

  12. Outline • Introduction • Memory eXpansion Technology(MXT) • Cache-compression • IIC & IIC-C • Evaluation • Summary

  13. Conventional Cache Structure • A tag is statically associated with one fixed-size data block • When data is compressed, the space freed within that block cannot be reused

  14. Solution: Indirect Index Cache (IIC) • A tag entry is not associated with a particular data block • A tag entry contains a pointer to its data block (see the sketch below)
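A hedged C sketch of the indirection this slide describes: each tag entry carries an explicit pointer into the data array instead of implying a fixed data location. Field names and widths are illustrative assumptions, not the paper's layout:

    #include <stdint.h>
    #include <stdbool.h>

    /* One IIC tag entry: the data_ptr field is what decouples the tag
     * from any particular data block. */
    struct iic_tag_entry {
        uint64_t tag;      /* address tag                                  */
        uint32_t data_ptr; /* index of the data block in the data array    */
        bool     valid;
        bool     dirty;
    };

    /* Lookup, conceptually: hash the address into the tag store, compare
     * tags, then follow data_ptr to reach the block -- any tag entry can
     * map to any data block. */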

  15. IIC structure • The cache can be fully associative

  16. Extending the IIC to compressed data (IIC-C) • A tag entry contains multiple pointers to smaller data blocks (see the sketch below)
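A hedged C sketch of an IIC-C tag entry with multiple sub-block pointers; a compressed block then consumes only as many sub-blocks as its compressed size requires. The 32 B sub-block size and four pointers per 128 B block are illustrative assumptions chosen to match the earlier 128-byte-block example, not figures from the paper:

    #include <stdint.h>

    #define SUBBLOCK_SIZE  32
    #define PTRS_PER_ENTRY 4   /* 4 x 32 B covers one 128 B uncompressed block */

    struct iicc_tag_entry {
        uint64_t tag;
        uint32_t subblock_ptr[PTRS_PER_ENTRY]; /* pointers into the sub-block pool   */
        uint8_t  subblocks_used;               /* ceil(compressed_size / 32)         */
    };

    /* Example: a 128 B block that compresses to 70 B needs
     * ceil(70 / 32) = 3 sub-blocks; the fourth pointer stays free for
     * other blocks. */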

  17. Generational Replacement • Software-managed • Blocks are grouped into prioritized pools based on reference frequency • The victim is chosen from the lowest-priority non-empty pool (see the sketch below)
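A hedged C sketch of the victim-selection step only: the victim comes from the lowest-priority pool that is not empty. The pool representation (singly linked lists) is an illustrative assumption, and the promotion and demotion of blocks between pools as they are referenced is omitted here:

    #include <stddef.h>

    #define NUM_POOLS 4

    struct block {
        struct block *next;
        /* tag, data pointers, reference count, ... */
    };

    /* pool_head[0] is the lowest-priority pool. */
    struct block *pool_head[NUM_POOLS];

    struct block *choose_victim(void)
    {
        for (int p = 0; p < NUM_POOLS; p++) {   /* lowest priority first */
            if (pool_head[p] != NULL) {
                struct block *victim = pool_head[p];
                pool_head[p] = victim->next;    /* unlink the victim     */
                return victim;
            }
        }
        return NULL; /* all pools empty: cache not yet full */
    }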

  18. Additional Cost • Compression/decompression engine • More space for the tag entries • Extra resources for the replacement algorithm • Area is roughly 13% larger

  19. Outline • Introduction • Memory eXpansion Technology(MXT) • Cache-compression • IIC & IIC-C • Evaluation • Summary

  20. Evaluation • Method: SPEC CPU2000 benchmarks • Main memory: 150-cycle latency, bus width 32, with MXT • L1: 1-cycle latency, split 16 KB, 4-way, 64 B block size • L2: 12-cycle latency, unified 256 KB, 8-way, 128 B block size • L3: 26-cycle latency, unified 1 MB, 8-way, 128 B block size, with IIC-C

  21. Evaluation • Over 50% gain with only 10% area overhead

  22. Evaluation

  23. Summary • Advantages: increased effective capacity and bandwidth; power saving from fewer memory accesses • Drawbacks: increased hardware complexity; power consumption of the additional hardware

  24. Future work • Overall power-consumption study • Use in embedded systems

  25. END Thank you! Question time.
