
# Asymptotically good binary code with efficient encoding





Presentation Transcript

### Asymptotically good binary code with efficient encoding & Justesen code

Tomer Levinboim

Error Correcting Codes Seminar (2008)

• Intro

• codes

• Singleton Bound

• Linear Codes

• Bounds

• Gilbert-Varshamov

• Hamming

• RS codes

• Code Concatenation

• Examples

• Wozencraft Ensemble

• Justesen Codes

• The Hamming distance between x, y ∈ Σ^n: Δ(x, y) = |{i : xi ≠ yi}|

• The Hamming Distance is a metric

• Non negative

• Symmetric

• Triangle inequality

• The weight of x: wt(x) = Δ(x, 0^n), the number of nonzero coordinates

• Example (on board)
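The two definitions above can be sketched in a few lines of Python (function names are my own):

```python
# Hamming distance and weight, directly from the definitions above
def hamming_distance(x, y):
    # number of coordinates where x and y differ
    assert len(x) == len(y)
    return sum(a != b for a, b in zip(x, y))

def weight(x):
    # wt(x) = distance from the all-zero word
    return hamming_distance(x, [0] * len(x))

print(hamming_distance([1, 0, 1, 1], [1, 1, 0, 1]))  # 2
print(weight([1, 0, 1, 1]))                          # 3
```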

• An (n,k,d)q code C is a function C: Σ^k  Σ^n such that:

• For every x ≠ y: Δ(C(x), C(y)) ≥ d

• (n,k,d)q

• Parameters

• n – block length

• k – information length

• d – minimum distance (actually, a lower bound)

• q – size of alphabet

• |C| = q^k or k = log_q|C|

• Asymptotic view of parameters as n → ∞:

• The rate R = k/n

• Relative minimum distance δ = d/n

• Thus an (n,k,d)q code can be written as (1, R, δ)q

• Notation: (n,k,d)q vs. [n,k,d]q – latter reserved for linear code (soon)

• FEC3 = write each bit three times

• R = ?

• d = ?

• how many errors can we

• Detect ? (d-1)

• Correct ? t, where d=2t+1
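The FEC3 example above makes these counts concrete: d = 3, so it detects 2 errors and corrects t = 1 per block. A minimal sketch (the decoder is my own, using majority vote):

```python
from collections import Counter

def fec3_encode(bits):
    # repetition code: rate 1/3, distance 3
    return [b for b in bits for _ in range(3)]

def fec3_decode(received):
    # majority vote over each block of three corrects t = 1 error per block
    return [Counter(received[i:i + 3]).most_common(1)[0][0]
            for i in range(0, len(received), 3)]

msg = [1, 0, 1]
noisy = fec3_encode(msg)
noisy[1] ^= 1                      # flip one bit in the first block
assert fec3_decode(noisy) == msg   # the single error is corrected
```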

• Would like to:

• Maximize δ – correct more

* conflicting goals - would like to be able to construct an [n,k,d]q code s.t. δ>0, R>0 and both are constant.

• Minimize q – for practical reasons

• Maximize number of codewords while minimizing n and keeping d large.

• Let C be an [n,k,d]q code then

• k ≤ n – d + 1

equivalently

• R ≤ 1 – δ + o(1)

• Proof: project C to first k-1 coordinates

• On Board

• On board...

• Ballq(x,r)

• r:=d

• r:=t (where d=2t+1)

• Volq(n,r) = |Ballq(x,r)|
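The ball volume above is easy to compute directly; as a sanity check, this sketch (my own) uses the binary Hamming [7,4,3] code, whose radius-1 balls tile the space exactly:

```python
from math import comb

def vol(q, n, r):
    # Vol_q(n, r): choose up to r coordinates to change,
    # each to one of q - 1 other symbols
    return sum(comb(n, i) * (q - 1) ** i for i in range(r + 1))

# For the Hamming [7,4,3]_2 code (t = 1), balls of radius 1 around the
# 16 codewords cover F_2^7 with no overlap: 16 * Vol_2(7, 1) = 2^7.
print(vol(2, 7, 1), 16 * vol(2, 7, 1))  # 8 128
```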

• An [n,k,d]q code C:FqKFqn is linear when:

• Fq is a field

• C is linear function (e.g., matrix)

• Linearity implies:

• C(ax+by) = aC(x) + bC(y)

• 0^n is a member of C

• FEC3

• [3,1,3]2

• Hadamard – longest linear code

• [n,logn, n/2]2

• e.g., - [8,3,4]2

• (H - Matrix representation on board)
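Since the matrix is on the board, here is a small sketch (my own) of the Hadamard encoding for k = 3: the codeword for message m lists the inner products ⟨m, x⟩ mod 2 over all x:

```python
from itertools import product

def hadamard_encode(m):
    # codeword = <m, x> mod 2 for every x in F_2^k, so n = 2^k
    return [sum(a * b for a, b in zip(m, x)) % 2
            for x in product([0, 1], repeat=len(m))]

codewords = [hadamard_encode(list(m)) for m in product([0, 1], repeat=3)]
min_dist = min(sum(a != b for a, b in zip(c1, c2))
               for i, c1 in enumerate(codewords) for c2 in codewords[i + 1:])
print(len(codewords[0]), min_dist)  # 8 4 -> an [8, 3, 4]_2 code
```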

• Dimensions

• Asymptotic behavior

• Lemma: if C:FqKFqn is linear then

Note: for clarity Cx means C(x)

• Proof:

• ≤ - trivial

• ≥ - follows from linearity (on board)

• Idea: oversample a polynomial

• Let q be prime power and Fq a finite field of size q.

• Let k<n and fix n elements of Fq,

• x1,x2,..xn

• Given a message m = (c0,…,ck−1), interpret it as the coefficients of the polynomial p(X) = c0 + c1·X + … + ck−1·X^(k−1)

• Thus (c0,…,ck−1) is mapped to (p(x1),…,p(xn))

• Linear mapping (Vandermonde)

• Using linearity, can show for x ≠ 0: wt(Cx) ≥ n − (k − 1)

 RS codes meet the Singleton bound: d = n − k + 1

• Proof: on board

• (# of roots of a k-1 degree poly)

• Encoding time: O(nk) field operations (evaluate a degree-(k−1) polynomial at n points)
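A toy RS encoder over the prime field F_7 (my own illustrative parameters: n = 6 evaluation points, k = 3) confirms the Singleton-bound claim by brute force:

```python
from itertools import product

p = 7                      # a prime, so arithmetic is just mod p
xs = list(range(1, 7))     # n = 6 distinct elements of F_7
k = 3

def rs_encode(msg):
    # interpret msg = (c0, ..., c_{k-1}) as p(X) = sum_i c_i X^i,
    # output (p(x1), ..., p(xn))
    return [sum(c * pow(x, i, p) for i, c in enumerate(msg)) % p for x in xs]

# A nonzero degree-(k-1) polynomial has at most k-1 roots, so every nonzero
# codeword has weight at least n - (k-1) = n - k + 1 = 4.
min_wt = min(sum(s != 0 for s in rs_encode(list(m)))
             for m in product(range(p), repeat=k) if any(m))
print(min_wt)  # 4
```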

Gilbert-Varshamov Bound Preliminaries

• Binary entropy: H(p) = −p·log2(p) − (1−p)·log2(1−p)

• Stirling's approximation

Implying that: Vol2(n, pn) = 2^(H(p)·n + o(n)) for 0 < p ≤ 1/2

Gilbert-Varshamov Bound Preliminaries

• Using the binary entropy we obtain

• On board
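The entropy/volume approximation can be checked numerically (parameters n = 200, p = 0.3 are my own choice):

```python
from math import comb, log2

def H(x):
    # binary entropy H(p) = -p log2 p - (1-p) log2 (1-p)
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

def vol2(n, r):
    # Vol_2(n, r) = sum_{i <= r} C(n, i)
    return sum(comb(n, i) for i in range(r + 1))

# Stirling gives Vol_2(n, pn) = 2^(H(p) n + o(n)) for p <= 1/2;
# already at n = 200 the two exponents nearly agree:
n, frac = 200, 0.3
print(log2(vol2(n, int(frac * n))) / n, H(frac))
```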

Gilbert-Varshamov Bound – statement

• For every n and d < n/2 there is an (n,k,d)q (not necessarily linear) code such that: |C| ≥ q^n / Volq(n, d−1)

• In terms of rate and relative min-distance: R ≥ 1 − H(δ) − o(1) (shown here for q = 2)

• On Board

• Sketch of proof:

• if C is maximal then: every word of Σ^n is within distance d−1 of some codeword

• And so |C| · Volq(n, d−1) ≥ q^n

• Now use union bound and entropy to obtain result (we show for q=2, using binary entropy)

• Gilbert proved this with a greedy construction

• Varshamov proved for linear codes

• proved using random generator matrices – most matrices are good error correcting codes
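Gilbert's greedy construction mentioned above can be run directly at toy size (parameters n = 8, d = 3 are my own):

```python
from itertools import product
from math import comb

def gilbert_greedy(n, d):
    # scan all words in lexicographic order, keep any word at
    # distance >= d from everything kept so far
    code = []
    for w in product([0, 1], repeat=n):
        if all(sum(a != b for a, b in zip(w, c)) >= d for c in code):
            code.append(w)
    return code

n, d = 8, 3
C = gilbert_greedy(n, d)
vol = sum(comb(n, i) for i in range(d))   # Vol_2(n, d - 1)
# maximality of the greedy code forces |C| * Vol_2(n, d-1) >= 2^n
assert len(C) * vol >= 2 ** n
print(len(C))
```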

[Plot: rate R vs. relative distance δ for q = 2 – Singleton bound (upper) and Gilbert-Varshamov bound (lower); both axes run from 0 to 1, the GV curve reaching δ = 0.5 at R = 0.]

• With similar reasoning to the GV bound, but packing disjoint balls of radius t (where d = 2t+1)

• For q = 2 can show that R ≤ 1 − H(δ/2) + o(1)

• RS codes imply we can construct good [n,k,d]q codes for any prime power q = p^s

• Practically would like to work with small q (2, 2^8)

• Consider the “obvious” idea for a binary code generated from C – simply expand each symbol of Σ into its log2(q)-bit binary representation

• What’s the problem with this approach? (write the new code!)
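The problem can be seen concretely (q = 256 here is my own example): two distinct q-ary symbols can differ in a single bit of their binary expansions, so a q-ary distance of d may shrink to a binary distance of only d out of n·log2(q) bits:

```python
# two distinct symbols of F_256 whose 8-bit expansions differ in one bit
bits = lambda s: format(s, '08b')
a, b = 128, 129
diff = sum(x != y for x, y in zip(bits(a), bits(b)))
print(diff)  # 1 -> relative distance can drop by a factor of log2(q) = 8
```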

• Due to Forney (1966)

• Two codes:

• Outer: Cout = [N,K,D]Q

• Inner: Cin = [n,k,d]q

• Inner code should encode each symbol of the outer code  k = log_q Q

• How does it work ?

* Luca Trevisan (Lecture 2)

• What is the new code ?

• dcon ≥ dD. Proof:

• On board
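A toy instance of concatenation (the specific codes are my own illustrative choices, not the slides' construction): outer RS [7,3,5] over GF(8), inner Hadamard [8,3,4]_2, giving a binary [56, 9, 20] code with d = dD:

```python
from itertools import product

def gf8_mul(a, b):                     # GF(8) = F_2[x]/(x^3 + x + 1)
    r = 0
    for i in range(3):
        if (b >> i) & 1:
            r ^= a << i
    for i in (4, 3):                   # reduce modulo x^3 + x + 1
        if (r >> i) & 1:
            r ^= 0b1011 << (i - 3)
    return r

def gf8_pow(a, e):
    r = 1
    for _ in range(e):
        r = gf8_mul(r, a)
    return r

def rs_encode(msg):                    # evaluate sum_i c_i X^i at all x != 0
    out = []
    for x in range(1, 8):
        s = 0
        for i, c in enumerate(msg):
            s ^= gf8_mul(c, gf8_pow(x, i))   # GF(8) addition is XOR
        out.append(s)
    return out

def hadamard_encode(m):                # inner [8,3,4]_2 code
    return [sum(a * b for a, b in zip(m, x)) % 2
            for x in product([0, 1], repeat=3)]

def concat_encode(msg):
    to_bits = lambda s: [(s >> j) & 1 for j in range(3)]
    return [b for sym in rs_encode(msg) for b in hadamard_encode(to_bits(sym))]

# every map involved is F_2-linear, so d = min weight = d_inner * D = 4 * 5
min_wt = min(sum(concat_encode(list(m)))
             for m in product(range(8), repeat=3) if any(m))
print(min_wt)  # 20
```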

• Asymptotically

• δ = ¼ (good)

• R = log(n)/2n  0 (bad)

• Can we “explicitly” build asymptotically good (linear) codes ?

• asymptotically good = constant R, δ > 0 as n → ∞

• Explicit = polytime constructible / logspace constructible

• GV tells us that most linear functions of a certain size are good error-correcting codes

• Can find a good code in brute-force

• Use brute force on inner-code, where the alphabet is exponentially smaller!

• Do we really need to search ?

• Consider the following set of codes over the field Fq^k: for each α ≠ 0,

Cα(x) = (x, α·x), so that Cα: Fq^k  Fq^2k as q-ary vectors (R = 1/2)

• Notice that (on board)

• Lemma: There exists an ensemble of codes C1,…,CN of rate ½, where N = q^k − 1, such that for at least (1−ε)N values of i, the code Ci has distance di ≥ d, for d chosen s.t. Volq(n, d) ≤ εN

• Proof (on board), outline:

• Different codes have only 0n in common

• Let y = Cα(x); then, if wt(y) < d

 y in Ball(0n, d)

 there are at most Vol(n,d) “bad” codes

• For large enough n=2k, we have Vol(n,d) ≤ εN
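A tiny instance of the ensemble for q = 2, k = 3 (so n = 2k = 6), using GF(8) = F_2[x]/(x^3 + x + 1); a field this small is for illustration only, since the lemma's guarantee needs large k:

```python
def gf8_mul(a, b):                     # multiply in GF(8)
    r = 0
    for i in range(3):
        if (b >> i) & 1:
            r ^= a << i
    for i in (4, 3):                   # reduce modulo x^3 + x + 1
        if (r >> i) & 1:
            r ^= 0b1011 << (i - 3)
    return r

wt = lambda s: bin(s).count('1')       # Hamming weight of a bit-packed word

def min_distance(alpha):
    # C_alpha(x) = (x, alpha*x) is linear, so distance = min nonzero weight
    return min(wt(x) + wt(gf8_mul(alpha, x)) for x in range(1, 8))

dists = {a: min_distance(a) for a in range(1, 8)}
print(dists)

# Distinct codes share only 0^n: (x, ax) = (x, bx) with x != 0 forces a = b,
# which is what limits each low-weight word to "killing" at most one code.
assert all(gf8_mul(a, x) != gf8_mul(b, x)
           for a in range(1, 8) for b in range(a + 1, 8) for x in range(1, 8))
```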

• Implications:

• Can construct entire ensemble in O(2^k) = O(2^n)

• There are many such good codes, but which one do we use ?

• Concatenation of:

• Cout - an RS code over the field of size 2^k (so Q = 2^k)

• a set of inner codes - the Wozencraft ensemble C1,…,CN

• Justesen Code: C* = Cout(C1, C2, .. CN)

• Each symbol of Cout is encoded using a different inner code Cj

• If RS has rate R  C* has rate R/2

• Denote the outer RS code [N,K,D]Q

• Claim: C* has relative distance δ* ≥ (1 − R − ε)·δ, where δ = d/n is the relative distance guaranteed for the good inner codes

• Intuition: like regular concatenation, but with up to εN bad inner codes.

• for x≠y, the outer code induces S={j | xj≠yj},

• |S| ≥D

• There are at most εN j’s such that Cj is bad and therefore at least |S|- εN ≥ D- εN ≥ (1-R- ε)N good codes

• since RS implies D=N-(K-1)

• Each good code has relative distance ≥ d

• d* ≥ (1-R- ε)Nd

• The concatenated code C* is an asymptotically good code and has a “super” explicit construction

• Can take q=2 to get such a binary code