
Coding and Algorithms for Memories: WOM Codes Overview

Explore coding and algorithms for managing memories such as HDDs and flash memories. Understand the interface between the physical level and the operating system. Learn about different types of memories and their practical applications.


Presentation Transcript


  1. 236608 – Coding and Algorithms for Memories, Lecture 2 – WOM Codes

  2. Overview • Lecturer: Eitan Yaakobi, yaakobi@cs.technion.ac.il, Taub 638 • Teaching Assistant: Besart Dollma, besartdollma@cs.technion.ac.il • Lecture hours: Sundays 10:30-12:30 @ Taub 201 • Tutorial hour: Sundays 12:30-13:30 @ Taub 201 • Course website: https://webcourse.cs.technion.ac.il/236608 • Office hours: Sundays 13:30-14:30 and/or other times (please contact by email before) • Final grade: • Class participation (10%) • Homework (50%) • Take-home exam/final homework + project (40%)

  3. What is this class about? Coding and Algorithms for Memories • Memories – HDDs, flash memories, and other non-volatile memories • Coding and algorithms – how to manage the memory and handle the interface between the physical level and the operating system • Both from the theoretical and practical points of view • Q: What is the difference between theory and practice?

  4. Memories • Volatile Memories – need power to maintain the information • Ex: RAM memories, DRAM, SRAM • Non-Volatile Memories – do NOT need power to maintain the information • Ex: HDD, optical disc (CD, DVD), flash memories • Q: Examples of old non-volatile memories?

  5. Optical Storage • First generation – CD (Compact Disc), 700MB • Second generation – DVD (Digital Versatile Disc), 4.7GB, 1995 • Third generation – BD (Blu-ray Disc) • Blue laser (shorter wavelength) • A single layer can store 25GB, dual layer – 50GB • Supported by Sony, Apple, Dell, Panasonic, LG, Pioneer

  6. The Magnetic Hard Disk Drive (figure: magnetic domains representing “1” and “0”)

  7. DNA as Storage Media • Richard Feynman first proposed the use of macromolecules for storage: “There is plenty of room at the bottom” • Church et al. (Science, 2012) and Goldman et al. (Nature, 2013) stored 739 KB of data in synthetic DNA, mailed it, and recreated the original digital files

  8. Flash Memories (figure: a multi-level flash cell with levels 3, 2, 1, 0)

  9. SLC, MLC, and TLC Flash (figure: voltage ranges from low to high for each cell type) • SLC Flash: 1 bit per cell, 2 states • MLC Flash: 2 bits per cell, 4 states • TLC Flash: 3 bits per cell, 8 states

  10. Flash Memories Programming • Array of cells made from floating-gate transistors • Typical size can be 32×2^15 • The cells are programmed by pulsing electrons via hot-electron injection

  11. Flash Memories Programming • Array of cells made from floating-gate transistors • Typical size can be 32×2^15 • The cells are programmed by pulsing electrons via hot-electron injection • Each cell can have q levels, represented by different amounts of electrons • In order to reduce a cell level, the cell and its containing block must be reset to level 0 before rewriting – A VERY EXPENSIVE OPERATION

  12. Rewriting Codes • Array of cells, made of floating-gate transistors • Each cell can store q different levels • Today, q typically ranges between 2 and 16 • The levels are represented by the number of electrons • The cell’s level is increased by pulsing electrons • To reduce a cell level, all cells in its containing block must first be reset to level 0 – A VERY EXPENSIVE OPERATION

  13. Rewriting Codes • Problem: Cannot rewrite the memory without an erasure • However… it is still possible to rewrite if only cells at low levels are programmed
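The asymmetry described above can be made concrete in a few lines. The sketch below is a toy model (not any real flash controller API): programming may only raise a cell's level, and lowering any cell forces an erase of the whole block. In real hardware, live data in the block would have to be relocated before the erase.

```python
# Toy model of a flash block: levels only increase when programming;
# lowering a cell requires erasing the entire block first.

class FlashBlock:
    def __init__(self, num_cells, q=2):
        self.q = q                       # levels per cell: 0 .. q-1
        self.cells = [0] * num_cells
        self.erase_count = 0

    def erase(self):
        """Reset the whole block to level 0 - the expensive operation."""
        self.cells = [0] * len(self.cells)
        self.erase_count += 1

    def program(self, index, level):
        """Raise one cell to `level`; erase the block if that would lower it."""
        assert 0 <= level < self.q
        if level < self.cells[index]:
            self.erase()                 # all other data in the block is lost
        self.cells[index] = level

block = FlashBlock(num_cells=4, q=4)
block.program(0, 3)
block.program(1, 2)
block.program(0, 1)          # lowering cell 0 forces a block erase
print(block.cells)           # [1, 0, 0, 0]
print(block.erase_count)     # 1
```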

  14. Rewriting Codes • Rewrite codes significantly reduce the number of block erasures • (figure) Store 3 bits once vs. store 1 bit 7 times; store 4 bits once vs. store 1 bit 15 times
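One reading of the trade-off above (an assumption about what the slide's figure shows) is a single multi-level cell: a cell with q = 8 levels can store 3 bits once, or store 1 bit 7 times by bumping the level by one whenever the bit changes and decoding the bit as the level's parity.

```python
# Sketch: one q-level cell used to rewrite a single bit q-1 times.
Q = 8

def decode(level):
    return level % 2                   # stored bit = parity of the level

def write_bit(level, bit):
    """Return the new level after writing `bit` (levels may only grow)."""
    if decode(level) != bit:
        level += 1
        assert level < Q, "cell exhausted - block erase needed"
    return level

level, history = 0, []
for b in [1, 0, 1, 0, 1, 0, 1]:        # worst case: alternating bits
    level = write_bit(level, b)
    history.append(decode(level))
print(history)   # [1, 0, 1, 0, 1, 0, 1]
print(level)     # 7
```

Even in the worst case (the bit flips on every write), the 8-level cell survives 7 writes before an erase is needed.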

  15. Write-Once Memories (WOM) • Introduced by Rivest and Shamir, “How to reuse a write-once memory”, 1982 • The memory elements represent bits (2 levels) and are irreversibly programmed from ‘0’ to ‘1’ (figure: first- and second-write mappings)

  16. Write-Once Memories (WOM) • Examples (figure: first- and second-write codeword tables)

  17. Write-Once Memories (WOM) • Introduced by Rivest and Shamir, “How to reuse a write-once memory”, 1982 • The memory elements represent bits (2 levels) and are irreversibly programmed from ‘0’ to ‘1’ • Q: How many cells are required to write 100 bits twice? • P1: Is it possible to do better…? • P2: How many cells to write k bits twice? • P3: How many cells to write k bits t times? • P3’: What is the total number of bits that it is possible to write in n cells in t writes?

  18. Binary WOM Codes • k1,…,kt: the number of bits written on each write • n cells and t writes • The sum-rate of the WOM code is R = (k1+…+kt)/n • Rivest–Shamir: R = (2+2)/3 = 4/3
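The Rivest–Shamir sum-rate of 4/3 comes from their [3,2; 4,4] construction: 3 binary cells store one of 4 messages twice. The sketch below uses one standard presentation of the code's tables: the first write uses patterns of weight at most 1, the second write uses their complements, and a repeated message leaves the cells untouched.

```python
# Rivest-Shamir [3,2; 4,4] WOM code: 2 bits written twice into 3 cells.
FIRST  = {0: (0, 0, 0), 1: (1, 0, 0), 2: (0, 1, 0), 3: (0, 0, 1)}
SECOND = {0: (1, 1, 1), 1: (0, 1, 1), 2: (1, 0, 1), 3: (1, 1, 0)}

def decode(c):
    # Weight <= 1 means we are after the first write, weight >= 2 after the second.
    table = FIRST if sum(c) <= 1 else SECOND
    return next(m for m, w in table.items() if w == c)

def encode1(m):
    return FIRST[m]

def encode2(m, c):
    # If the old state already decodes to m, do nothing; otherwise the
    # complement pattern is reachable using only 0 -> 1 transitions.
    return c if decode(c) == m else SECOND[m]

# Exhaustively verify all 16 (first, second) message pairs.
for m1 in range(4):
    for m2 in range(4):
        c1 = encode1(m1)
        c2 = encode2(m2, c1)
        assert decode(c1) == m1 and decode(c2) == m2
        assert all(a <= b for a, b in zip(c1, c2))   # cells only go 0 -> 1
print("all 16 write pairs verified")
```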

  19. Definition: WOM Codes • Definition: An [n,t; M1,…,Mt] t-write WOM code is a coding scheme which consists of n cells and guarantees any t writes of alphabet sizes M1,…,Mt by programming cells from zero to one • A WOM code consists of t encoding and decoding maps Ei, Di, 1 ≤ i ≤ t • E1: {1,…,M1} → {0,1}^n • For 2 ≤ i ≤ t, Ei: {1,…,Mi}×Im(Ei-1) → {0,1}^n such that for all (m,c) ∊ {1,…,Mi}×Im(Ei-1), Ei(m,c) ≥ c • For 1 ≤ i ≤ t, Di: {0,1}^n → {1,…,Mi} such that Di(Ei(m,c)) = m for all (m,c) ∊ {1,…,Mi}×Im(Ei-1) • The sum-rate of the WOM code is R = (log M1+…+log Mt)/n • Rivest–Shamir: [3,2; 4,4], R = (log4+log4)/3 = 4/3

  20. Definition: WOM Codes • There are two cases • The individual rates on each write must all be the same: fixed-rate • The individual rates are allowed to be different: unrestricted-rate • We assume that the write number on each write is known; this knowledge does not affect the rate • Assume there exists an [n,t; M1,…,Mt] t-write WOM code where the write number is known • It is possible to construct an [Nn+t, t; M1^N,…,Mt^N] t-write WOM code where the write number is not known, so asymptotically the sum-rate is the same

  21. James Saxe’s WOM Code • [n, n/2-1; n/2, n/2-1, n/2-2, …, 2] WOM code • Partition the memory into two parts of n/2 cells each • First write: • input symbol m ∊ {1,…,n/2} • program the mth cell of the 1st group • The ith write, i ≥ 2: • input symbol m ∊ {1,…,n/2-i+1} • copy the first group to the second group • program the mth available (non-programmed) cell in the 1st group • Decoding: • There is always exactly one cell that is programmed in the 1st group but not in the 2nd • Its location, among the non-programmed cells of the 2nd group, is the message value • Sum-rate: (log(n/2)+log(n/2-1)+…+log2)/n = log((n/2)!)/n ≈ ((n/2)·log(n/2))/n ≈ (log n)/2

  22. James Saxe’s WOM Code • [n, n/2-1; n/2, n/2-1, n/2-2, …, 2] WOM code • Partition the memory into two parts of n/2 cells each • Example: n=8, [8,3; 4,3,2] • First write: 3 • Second write: 2 • Third write: 1 • State sequence: 0,0,0,0|0,0,0,0 → 0,0,1,0|0,0,0,0 → 0,1,1,0|0,0,1,0 → 1,1,1,0|0,1,1,0 • Sum-rate: (log4+log3+log2)/8 = 4.58/8 ≈ 0.57
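The n=8 example above can be mechanized. The sketch below is one way to implement the scheme as described on the slides (the message picks which free cell of group 1 to program, so that decoding recovers its rank among group 2's unprogrammed cells); it reproduces the slide's state sequence.

```python
# Saxe's WOM code for n = 8 ([8,3; 4,3,2]); messages are 1-indexed.

def write(g1, g2, i, m):
    """Perform the i-th write (i >= 1) of message m, in place."""
    if i > 1:
        for j in range(len(g2)):       # copy group 1 onto group 2,
            g2[j] |= g1[j]             # using only 0 -> 1 transitions
    zeros = [j for j, c in enumerate(g1) if c == 0]
    g1[zeros[m - 1]] = 1               # program the m-th free cell of group 1

def read(g1, g2):
    zeros = [j for j, c in enumerate(g2) if c == 0]
    diff = next(j for j in range(len(g1)) if g1[j] == 1 and g2[j] == 0)
    return zeros.index(diff) + 1       # rank among group 2's free cells

g1, g2 = [0] * 4, [0] * 4
states = []
for i, m in enumerate([3, 2, 1], start=1):     # the slide's three writes
    write(g1, g2, i, m)
    assert read(g1, g2) == m
    states.append((tuple(g1), tuple(g2)))
print(states)
# [((0,0,1,0),(0,0,0,0)), ((0,1,1,0),(0,0,1,0)), ((1,1,1,0),(0,1,1,0))]
```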

  23. WOM Codes Constructions • Rivest and Shamir ’82 • [3,2; 4,4] (R=1.33); [7,3; 8,8,8] (R=1.28); [7,5; 4,4,4,4,4] (R=1.42); [7,2; 26,26] (R=1.34) • Tabular WOM codes • “Linear” WOM codes • David Klaner: [5,3; 5,5,5] (R=1.39) • David Leavitt: [4,4; 7,7,7,7] (R=1.60) • James Saxe: [n, n/2-1; n/2, n/2-1, n/2-2, …, 2] (R ≈ 0.5·log n), [12,3; 65,81,64] (R=1.53) • Merkx ’84 – WOM codes constructed with projective geometries • [4,4; 7,7,7,7] (R=1.60), [31,10; 31,31,31,31,31,31,31,31,31,31] (R=1.598) • [7,4; 8,7,8,8] (R=1.69), [7,4; 8,7,11,8] (R=1.75) • [8,4; 8,14,11,8] (R=1.66), [7,8; 16,16,16,16,16,16,16,16] (R=1.75) • Wu and Jiang ’09 – Position modulation code for WOM codes • [172,5; 256,256,256,256,256] (R=1.63), [196,6; 256,256,256,256,256,256] (R=1.71), [238,8; 256,256,256,256,256,256,256,256] (R=1.88), [258,9; 256,256,256,256,256,256,256,256,256] (R=1.95), [278,10; 256,256,256,256,256,256,256,256,256,256] (R=2.01)

  24. Capacity Region and Achievable Rates of Two-Write WOM codes

  25. The Entropy Function • How many binary vectors of length n have at most a single 1? n+1 • How many bits is it possible to represent this way? log(n+1) • What is the rate? log(n+1)/n • How many vectors are there with at most k 1’s? Σi=0..k C(n,i) • How many bits is it possible to represent this way? log(Σi=0..k C(n,i)) • What is the rate? log(Σi=0..k C(n,i))/n • Is it possible to approximate this value? • Yes! log(Σi=0..k C(n,i))/n ≈ h(p), where p = k/n and h(p) = -p·log(p)-(1-p)·log(1-p) is the Binary Entropy Function • h(p) is the information rate that is possible to represent when bits are programmed with probability p
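The approximation on this slide is easy to check numerically: the rate of the "at most k ones out of n" code, log2(Σi≤k C(n,i))/n, approaches h(k/n) as n grows.

```python
import math

def h(p):
    """Binary entropy function, in bits."""
    if p in (0, 1):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate(n, k):
    """Rate of representing all length-n vectors with at most k ones."""
    return math.log2(sum(math.comb(n, i) for i in range(k + 1))) / n

for n in (100, 1000, 10000):
    k = n // 3
    print(n, round(rate(n, k), 4), round(h(k / n), 4))
```

For p = 1/3 the two columns converge toward h(1/3) ≈ 0.918; the gap shrinks like O(log n / n).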

  26. The Binary Symmetric Channel • When transmitting a binary vector, every bit is in error with probability p • On average pn bits will be in error • The amount of information which is lost is h(p) • Therefore, the channel capacity is C(p) = 1-h(p) • The channel capacity is an indication of the amount of rate which is lost, or how much is necessary to “pay” in order to correct the errors in the channel (figure: channel diagram – 0→0 and 1→1 with probability 1-p, crossovers with probability p)
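A quick simulation puts numbers on the slide's claim: roughly a fraction p of transmitted bits get flipped, and C(p) = 1-h(p) gives the capacity. The choice p = 0.11 below is arbitrary; it is a convenient value where the capacity happens to be almost exactly 1/2.

```python
import math
import random

def h(p):
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def capacity(p):
    """BSC capacity in bits per channel use."""
    return 1 - h(p)

random.seed(0)
p, n = 0.11, 100_000
sent = [random.randint(0, 1) for _ in range(n)]
received = [b ^ (random.random() < p) for b in sent]   # flip with prob. p
err = sum(s != r for s, r in zip(sent, received)) / n
print(round(err, 3), round(capacity(p), 3))
```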

  27. The Capacity of WOM Codes • The capacity region for two writes is C2-WOM = {(R1,R2) | ∃p ∊ [0,0.5], R1 ≤ h(p), R2 ≤ 1-p}, where h(p) is the binary entropy function h(p) = -p·log(p)-(1-p)·log(1-p) • The maximum achievable sum-rate is maxp∊[0,0.5]{h(p)+(1-p)} = log3, achieved for p = 1/3: R1 = h(1/3) = log3 - 2/3, R2 = 1 - 1/3 = 2/3 • Capacity region for t writes (Heegard ’86, Fu and Han Vinck ’99): Ct-WOM = {(R1,…,Rt) | R1 ≤ h(p1), R2 ≤ (1–p1)h(p2), …, Rt-1 ≤ (1–p1)···(1–pt–2)h(pt–1), Rt ≤ (1–p1)···(1–pt–2)(1–pt–1)} • The maximum achievable sum-rate is log(t+1)

  28. The Capacity of WOM Codes • The capacity region for two writes: C2-WOM = {(R1,R2) | ∃p ∊ [0,0.5], R1 ≤ h(p), R2 ≤ 1-p}, where h(p) is the entropy function h(p) = -p·log(p)-(1-p)·log(1-p) • The capacity region for t writes: Ct-WOM = {(R1,…,Rt) | ∃p1,p2,…,pt-1 ∊ [0,0.5], R1 ≤ h(p1), R2 ≤ (1–p1)h(p2), …, Rt-1 ≤ (1–p1)···(1–pt–2)h(pt–1), Rt ≤ (1–p1)···(1–pt–2)(1–pt–1)} • p1 – prob. to program a cell on the 1st write: R1 ≤ h(p1) • p2 – prob. to program a cell on the 2nd write (from the remainder): R2 ≤ (1–p1)h(p2) • pt-1 – prob. to program a cell on the (t-1)th write (from the remainder): Rt-1 ≤ (1–p1)···(1–pt–2)h(pt–1) • Rt ≤ (1–p1)···(1–pt–2)(1–pt–1) because a fraction (1–p1)···(1–pt–2)(1–pt–1) of the cells weren’t programmed • The maximum achievable sum-rate is log(t+1)
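For t = 2 the optimization on this slide can be checked by a simple grid search: maximizing h(p) + (1-p) over p should yield log2(3) ≈ 1.585 at p = 1/3.

```python
import math

def h(p):
    return 0.0 if p in (0, 1) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def two_write_sum_rate(p):
    """R1 + R2 on the boundary of the two-write capacity region."""
    return h(p) + (1 - p)

# Grid search over p in [0, 0.5].
best_p = max((i / 10000 for i in range(5001)), key=two_write_sum_rate)
print(round(best_p, 3),
      round(two_write_sum_rate(best_p), 4),
      round(math.log2(3), 4))
```

The same pattern generalizes: for t writes the optimized sum-rate is log(t+1), so the gain over a single write grows (slowly) with the number of allowed rewrites.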

  29. The Capacity of WOM Codes • Theorem: The Capacity Region for two writes is C2-WOM={(R1,R2)|∃p∊[0,0.5],R1≤h(p), R2≤1-p}
