Asymptotically good binary code with efficient encoding & Justesen code
Presentation Transcript

  1. Asymptotically good binary code with efficient encoding & Justesen code Tomer Levinboim, Error Correcting Codes Seminar (2008)

  2. Outline • Intro • Codes • Singleton Bound • Linear Codes • Bounds • Gilbert-Varshamov • Hamming • RS Codes • Code Concatenation • Examples • Wozencraft Ensemble • Justesen Codes

  3. Hamming Distance • Hamming Distance between x, y ∈ Σ^n: Δ(x, y) = |{i : x_i ≠ y_i}| • The Hamming Distance is a metric • Non-negative • Symmetric • Triangle inequality

  4. Weight • The weight of x is wt(x) = Δ(x, 0^n), i.e., the number of nonzero coordinates of x • Example (on board)
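
A minimal Python sketch of these two definitions (illustrative only; the function names are mine, not from the slides):

```python
def hamming_distance(x, y):
    """Number of coordinates where x and y differ."""
    assert len(x) == len(y)
    return sum(1 for a, b in zip(x, y) if a != b)

def weight(x):
    """Number of nonzero coordinates = distance from the all-zero word."""
    return sum(1 for a in x if a != 0)

# Example: distance 2, weight 3
print(hamming_distance([1, 0, 1, 1], [1, 1, 0, 1]))  # 2
print(weight([1, 0, 1, 1]))                          # 3
```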

  5. Code • An (n,k,d)q code C is a function C : Σ^k → Σ^n (with |Σ| = q) such that: • For every x ≠ y, Δ(C(x), C(y)) ≥ d

  6. Code (parameters) • (n,k,d)q • Parameters • n – block length • k – information length • d – minimum distance (actually, a lower bound) • q – size of alphabet • |C| = q^k, i.e., k = log_q |C|

  7. Code (parameters div n) • Asymptotic view of parameters as n → ∞: • The rate R = k/n • Relative minimum distance δ = d/n • Thus an (n,k,d)q code can be written as (1, R, δ)q • Notation: (n,k,d)q vs. [n,k,d]q – the latter is reserved for linear codes (soon)

  8. Trivial Code Example • FEC3 = write each bit three times • R = ? • d = ? • How many errors can we • Detect? (d-1) • Correct? t, where d = 2t+1
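
A toy implementation of FEC3 (the triple-repetition code) that makes the answers concrete: R = 1/3, d = 3, so it detects up to 2 errors and corrects 1 per block. The function names and the majority-vote decoder are my own illustration, not part of the deck:

```python
def fec3_encode(bits):
    """Repeat each bit three times: rate R = 1/3, minimum distance d = 3."""
    return [b for b in bits for _ in range(3)]

def fec3_decode(received):
    """Majority vote on each block of 3; corrects t = 1 error per block (d = 2t + 1)."""
    out = []
    for i in range(0, len(received), 3):
        block = received[i:i + 3]
        out.append(1 if sum(block) >= 2 else 0)
    return out

codeword = fec3_encode([1, 0])      # [1, 1, 1, 0, 0, 0]
corrupted = [1, 0, 1, 0, 0, 0]      # one flipped bit in the first block
print(fec3_decode(corrupted))       # [1, 0] -- the error is corrected
```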

  9. Goal • Would like to: • Maximize δ – correct more errors • Maximize R – send more information * these are conflicting goals – we would like to construct an [n,k,d]q code s.t. δ > 0, R > 0 and both are constant • Minimize q – for practical reasons • Maximize the number of codewords while minimizing n and keeping d large

  10. Singleton Bound • Let C be an [n,k,d]q code then • k ≤ n – d + 1 equivalently • R ≤ 1 – δ + o(1) • Proof: project C to first k-1 coordinates • On Board
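
Written out, the projection argument sketched on the board amounts to one chain of inequalities:

```latex
% Project every codeword onto its first k-1 coordinates: there are q^k codewords
% but only q^{k-1} possible projections, so some two codewords x \ne y agree there
% and can differ only in the remaining n-(k-1) positions.
d \;\le\; \Delta\bigl(C(x), C(y)\bigr) \;\le\; n - (k-1)
\quad\Longrightarrow\quad k \;\le\; n - d + 1 .
```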

  11. Visual intuition • On board... • Ball_q(x,r) = {y : Δ(x,y) ≤ r} • r := d • r := t (where d = 2t+1) • Vol_q(n,r) = |Ball_q(x,r)| (independent of x)
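
A small helper for the volume of a Hamming ball, using the standard formula Vol_q(n, r) = Σ_{i≤r} C(n, i)·(q-1)^i (the function name is my own):

```python
from math import comb

def vol(q, n, r):
    """Vol_q(n, r): number of words within Hamming distance r of a fixed word,
    i.e. sum over i <= r of C(n, i) * (q - 1)^i."""
    return sum(comb(n, i) * (q - 1) ** i for i in range(r + 1))

print(vol(2, 10, 2))   # 1 + 10 + 45 = 56
```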

  12. Linear Codes

  13. Linear Codes • An [n,k,d]q code C : F_q^k → F_q^n is linear when: • F_q is a field • C is a linear function (i.e., given by a matrix) • Linearity implies: • C(ax+by) = aC(x) + bC(y) • 0^n is a codeword of C
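
A quick sketch of "linear code = matrix" over F_2, using the repetition code's generator matrix as the example (illustrative only; the helper is not from the deck):

```python
def encode(G, x):
    """Encode message x (length k) with a k x n generator matrix G over F_2:
    codeword_j = sum_i x_i * G[i][j]  (mod 2)."""
    n = len(G[0])
    return [sum(xi * row[j] for xi, row in zip(x, G)) % 2 for j in range(n)]

# Generator matrix of the [3,1,3]_2 repetition code (FEC3)
G_rep = [[1, 1, 1]]
print(encode(G_rep, [1]))  # [1, 1, 1]
print(encode(G_rep, [0]))  # [0, 0, 0] -- the zero message maps to 0^n
```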

  14. Linear Codes (example) • FEC3 • [3,1,3]2 • Hadamard – longest linear code • [n, log n, n/2]2 • e.g., [8,3,4]2 • (H – matrix representation on board) • Dimensions • Asymptotic behavior

  15. Linear Codes – minimum distance • Lemma: if C : F_q^k → F_q^n is linear then d(C) = min over x ≠ 0 of wt(Cx). Note: for clarity Cx means C(x) • Proof: • ≤ – trivial • ≥ – follows from linearity (on board)
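
Spelled out, the "≥" direction uses linearity to turn the distance between two codewords into the weight of a single codeword:

```latex
% For a linear code, the difference of two codewords is again a codeword:
\Delta(Cx, Cy) \;=\; \mathrm{wt}(Cx - Cy) \;=\; \mathrm{wt}\bigl(C(x-y)\bigr),
\qquad x \ne y \iff x - y \ne 0,
% so minimizing the distance over pairs x \ne y is the same as
% minimizing \mathrm{wt}(Cz) over nonzero messages z.
```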

  16. Reed-Solomon code • Idea: oversample a polynomial • Let q be a prime power and F_q a finite field of size q • Let k < n ≤ q and fix n distinct elements of F_q: x_1, x_2, ..., x_n • Given a message m = (c_0, ..., c_{k-1}), interpret it as the coefficients of the polynomial p(X) = c_0 + c_1 X + ... + c_{k-1} X^{k-1}

  17. RS Codes • Thus (c_0, ..., c_{k-1}) is mapped to (p(x_1), ..., p(x_n)) • Linear mapping (Vandermonde matrix) • Using linearity: for x ≠ 0, wt(C(x)) ≥ n - (k-1), so RS codes meet the Singleton bound • Proof: on board • (a nonzero polynomial of degree ≤ k-1 has at most k-1 roots) • Encoding time: O(nk) field operations (naïve evaluation)
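
A minimal Reed-Solomon encoder over a prime field F_p (a sketch under simplifying assumptions of my own: a prime p rather than a general prime power, naïve Horner evaluation, and arbitrarily chosen parameters):

```python
def rs_encode(msg, xs, p):
    """Reed-Solomon over F_p: msg = (c_0, ..., c_{k-1}) are polynomial
    coefficients, xs are n distinct evaluation points in F_p."""
    def poly_eval(coeffs, x):
        # Horner's rule mod p
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % p
        return acc
    return [poly_eval(msg, x) for x in xs]

# A [7,3]_11 RS code: two distinct messages agree on at most k-1 = 2 points,
# so their codewords differ in at least n - k + 1 = 5 positions (Singleton met).
p, xs = 11, [1, 2, 3, 4, 5, 6, 7]
print(rs_encode([5, 0, 3], xs, p))   # evaluations of 5 + 3x^2 at xs, mod 11
```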

  18. Bounds

  19. Gilbert-Varshamov Bound Preliminaries • Binary Entropy: H(p) = -p·log2(p) - (1-p)·log2(1-p) • Stirling's approximation: n! ≈ √(2πn)·(n/e)^n, implying that: C(n, pn) = 2^{nH(p)(1 + o(1))}

  20. Gilbert-Varshamov Bound Preliminaries • Using the binary entropy we obtain Vol_2(n, pn) ≤ 2^{nH(p)} for p ≤ 1/2 • On board
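
A quick numeric sanity check of the entropy estimate Vol_2(n, pn) ≤ 2^{nH(p)} (not part of the deck; n and p are arbitrary choices of mine):

```python
from math import comb, log2

def H(p):
    """Binary entropy H(p) = -p log2 p - (1-p) log2(1-p)."""
    if p in (0, 1):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

n, p = 100, 0.3
vol = sum(comb(n, i) for i in range(int(p * n) + 1))   # Vol_2(n, pn)
print(log2(vol), "<=", n * H(p))                        # left side is smaller
```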

  21. Gilbert-Varshamov Bound (statement) • For every n and d < n/2 there is an (n,k,d)q (not necessarily linear) code such that: k ≥ n - log_q Vol_q(n, d-1) • In terms of rate and relative min-distance: R ≥ 1 - H(δ) - o(1) (for q = 2)

  22. Gilbert-Varshamov Bound Proof • On Board • Sketch of proof: • If C is maximal then every word of F_q^n is within distance d-1 of some codeword • And so |C| · Vol_q(n, d-1) ≥ q^n • Now use the union bound and entropy to obtain the result (we show it for q = 2, using binary entropy)

  23. GV-Bound • Gilbert proved this with a greedy construction (a toy version is sketched below) • Varshamov proved it for linear codes • Can also be proved using random generator matrices – most matrices are good error-correcting codes
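
A toy version of Gilbert's greedy construction mentioned above. It is exponential-time and only meant to illustrate why a maximal code satisfies the GV bound; the parameters are my own choice:

```python
from itertools import product
from math import comb

def gilbert_greedy(n, d):
    """Gilbert's greedy construction: scan all words of F_2^n and keep a word
    if it is at distance >= d from every word kept so far.
    Exponential time -- for illustration only."""
    def dist(x, y):
        return sum(a != b for a, b in zip(x, y))
    code = []
    for w in product((0, 1), repeat=n):
        if all(dist(w, c) >= d for c in code):
            code.append(w)
    return code

n, d = 7, 3
C = gilbert_greedy(n, d)
# The greedy code is maximal, so |C| * Vol_2(n, d-1) >= 2^n, i.e. the GV bound:
gv_lower = 2 ** n / sum(comb(n, i) for i in range(d))
print(len(C), ">=", gv_lower)
```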

  24. Singleton / GV Plot: rate R vs. relative distance δ, with the Singleton bound as the upper curve and the Gilbert-Varshamov bound as the lower curve.

  25. Hamming Bound (Upper) • With similar reasoning to the GV bound, but using disjoint balls of radius (d-1)/2 around codewords: |C| · Vol_q(n, (d-1)/2) ≤ q^n • For q = 2 this shows that R ≤ 1 - H(δ/2) + o(1)

  26. Bounds plot *Madhu Sudan (Lecture 5, 2001)

  27. Code Concatenation

  28. Code Concatenation - Motivation • RS codes imply we can construct good [n,k,d]q codes for any prime power q = p^k (with q ≥ n) • Practically we would like to work with small q (2, 2^8) • Consider the “obvious” way to get a binary code from C – simply write each symbol of a codeword in Σ^n as a block of log_2 q bits • What’s the problem with this approach? (write out the new code!)

  29. Code Concatenation • Due to Forney (1966) • Two codes: • Outer: Cout = [N,K,D]Q • Inner: Cin = [n,k,d]q • The inner code should encode each symbol of the outer code ⇒ k = log_q Q

  30. Code Concatenation • How does it work ? * Luca Trevisan (Lecture 2)

  31. Code Concatenation • What is the new code? Cout ∘ Cin is an [nN, kK, d_con]q code (a mechanical encoding sketch appears below) • d_con ≥ dD • Proof: • On board
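
The encoding mechanism of concatenation, as a sketch: each outer symbol is written as k = log_2 Q bits and then encoded by the inner code. The toy inner code (bitwise repetition) and the hard-coded "outer codeword" are placeholders of my own, not the actual construction from the deck:

```python
def concatenate(outer_codeword, inner_encode, k):
    """Concatenation mechanism: write each outer symbol as k = log2(Q) bits,
    then encode that block with the inner code."""
    out = []
    for sym in outer_codeword:
        bits = [(sym >> i) & 1 for i in reversed(range(k))]   # symbol -> k bits
        out.extend(inner_encode(bits))
    return out

# Toy inner code: bitwise triple repetition ([3k, k, 3]_2)
def rep3(bits):
    return [b for b in bits for _ in range(3)]

# Pretend (c_1, ..., c_N) is an outer codeword over an alphabet of size Q = 4 (k = 2)
outer_codeword = [2, 0, 3, 1]
print(concatenate(outer_codeword, rep3, k=2))
```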

  32. Code Concatenation (Examples) • Asymptotically • δ = 1/4 (good) • R = log n / (2n) → 0 (bad)

  33. Good Codes • Can we “explicitly” build asymptotically good (linear) codes? • Asymptotically good = constant R, δ > 0 as n → ∞ • Explicit = polytime constructible / logspace constructible

  34. Asymptotically Good Codes

  35. Asymptotically Good Codes • GV tells us that most linear functions of a certain size are good error-correcting codes • Can find a good code by brute force • Use brute force on the inner code, where the alphabet is exponentially smaller! • Do we really need to search?

  36. Wozencraft Ensemble • Consider the following set of codes: for each α ∈ F_{q^k} \ {0}, define C_α : F_{q^k} → F_{q^k}^2 by C_α(x) = (x, α·x), so each code has rate R = 1/2 • Notice that two different codes C_α, C_β share only the zero codeword (on board)

  37. Wozencraft Ensemble • Lemma: There exists an ensemble of codes C_1, ..., C_N of rate 1/2, where N = q^k - 1, such that for at least (1-ε)N values of i, the code C_i has distance d_i ≥ d, where d is chosen so that Vol(n, d) ≤ εN (i.e., most of the codes meet the GV bound for rate 1/2) • Proof (on board), outline: • Different codes have only 0^n in common • Let y = C_α(x); if wt(y) < d then y ∈ Ball(0^n, d), so there are at most Vol(n,d) “bad” codes • For large enough n = 2k, we have Vol(n,d) ≤ εN
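
A toy instantiation of the Wozencraft ensemble over GF(2^4), with a field size and irreducible polynomial of my own choosing, purely to illustrate the lemma: it brute-forces the minimum distance of every code C_α(x) = (x, α·x) and shows that only a few values of α are bad (e.g. α = 1, whose distance is 2):

```python
def gf_mul(a, b, k=4, poly=0b10011):
    """Multiplication in GF(2^k) with a fixed irreducible polynomial
    (here x^4 + x + 1 for k = 4); elements are k-bit integers."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a >> k:          # reduce modulo the irreducible polynomial
            a ^= poly
        b >>= 1
    return r

def wt(v):
    """Hamming weight of a k-bit integer."""
    return bin(v).count("1")

k = 4
# Wozencraft ensemble: C_alpha(x) = (x, alpha * x), one code per alpha != 0.
# Rate 1/2: k message bits in, n = 2k code bits out.
distances = {}
for alpha in range(1, 2 ** k):
    d = min(wt(x) + wt(gf_mul(alpha, x)) for x in range(1, 2 ** k))
    distances[alpha] = d
print(distances)   # most alphas give a comparatively large minimum distance
```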

  38. Wozencraft Ensemble • Implications: • Can construct the entire ensemble in O(2^k) = O(2^{n/2}) time (exponential in the block length n = 2k, but only about N = 2^k - 1 codes) • There are many such good codes, but which one do we use?

  39. Justesen Code • Concatenation of: • Cout – an RS code over F_{2^k} • a set of inner codes C_1, ..., C_N from the Wozencraft ensemble • Justesen Code: C* = Cout(C_1, C_2, ..., C_N) • Each symbol of Cout is encoded using a different inner code C_j • If RS has rate R ⇒ C* has rate R/2

  40. Justesen Code - δ • Denote the outer RS code [N,K,D]Q • Claim: C* has relative distance δ* ≥ (1 - R - ε)·d/n, where d is the distance guaranteed for the good inner codes

  41. Justesen Code Proof • Intuition: like regular concatenation, but with up to εN bad inner codes. • For x ≠ y, the outer code induces S = {j | x_j ≠ y_j}, with |S| ≥ D • There are at most εN indices j such that C_j is bad, and therefore at least |S| - εN ≥ D - εN ≥ (1-R-ε)N positions use good inner codes • since RS implies D = N - (K-1) • Each good inner code has distance ≥ d • d* ≥ (1-R-ε)Nd
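
Collected into one chain (using D = N - K + 1 and R = K/N), the inequalities from the proof read:

```latex
d^{*} \;\ge\; \bigl(|S| - \varepsilon N\bigr)\,d
      \;\ge\; (D - \varepsilon N)\,d
      \;=\; (N - K + 1 - \varepsilon N)\,d
      \;\ge\; (1 - R - \varepsilon)\,N d .
```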

  42. Justesen Code • The concatenated code C* is an asymptotically good code and has a “super” explicit construction • Can take q=2 to get such a binary code