
Iterative Soft-Decision Decoding of Algebraic-Geometric Codes

Iterative Soft-Decision Decoding of Algebraic-Geometric Codes. Li Chen, Associate Professor, School of Information Science and Technology, Sun Yat-sen University, Guangzhou, China. Email: chenli55@mail.sysu.edu.cn; website: sist.sysu.edu.cn/~chenli


Presentation Transcript


  1. Iterative Soft-Decision Decoding of Algebraic-Geometric Codes • Li Chen • Associate Professor • School of Information Science and Technology, Sun Yat-sen University, Guangzhou, China • chenli55@mail.sysu.edu.cn • website: sist.sysu.edu.cn/~chenli • Institute of Network Coding and Department of Information Engineering, the Chinese University of Hong Kong • 1st of Aug, 2012

  2. Outline • Introduction (How to construct an algebraic-geometric code?) • Review on Koetter-Vardy list decoding (Challenges in the decoding) • Iterative soft-decision decoding (An iterative solution) • Geometric interpretation of the iterative decoding (An insight into the solution) • Complexity reduction decoding approaches (Some implementation advice) • Performance analysis (Advantage and cost) • Conclusions (An end & a beginning)

  3. I. Introduction • The construction of an algebraic-geometric (AG) code • Based on an algebraic curve χ(x, y, z); • Identify its point at infinity p∞ and define the pole basis Φ; • Pick one of the affine components, e.g., χ(x, y, 1), and find the affine points pj; • The Reed-Solomon (RS) code is the simplest AG code • Constructed based on the curve y = 0; • Its pole basis Φ = {1, x, x^2, x^3, x^4, …}; • Its affine points are the nonzero field elements {x1, x2, x3, …, xn}; • Note: the length of an RS code cannot exceed the size of the finite field. Evaluating the pole basis functions at the affine points yields the generator matrix G and, from it, the parity-check matrix H.

  4. I. Introduction • The Hermitian curve: Hw(x, y, z) = x^(w+1) + y^w z + y z^w, defined over GF(q) with q = w^2; • The point at infinity p∞ = (0, 1, 0) defines the pole basis Φ: the bivariate monomials x^a y^b (a ≥ 0, 0 ≤ b ≤ w − 1) ordered by their pole orders aw + b(w + 1); • Based on one of its affine components Hw(x, y, 1), determine the affine points pj = (xj, yj, 1) where xj^(w+1) + yj^w + yj = 0 and j = 1, 2, …, n; • Encoding of an (n, k) Hermitian code • Given the message vector m = (m0, m1, …, m(k−1)) ∈ GF(q)^k; • The codeword is generated by evaluating f = Σ ma φa (φa ∈ Φ) at the affine points: c = (f(p1), f(p2), …, f(pn)); • Note n = w^3 > q: the length of the code can exceed the size of the finite field!

  5. I. Introduction • Example: construction of an (8, 4) Hermitian code • Defined over GF(4) = {0, 1, α, α^2}; • The Hermitian curve H2(x, y, z) = x^3 + y^2 z + y z^2, with point at infinity p∞ = (0, 1, 0); one of its affine components: H2(x, y, 1) = x^3 + y^2 + y; • Its pole basis Φ = {1, x, y, x^2, xy, x^3, x^2 y, …}; • Its affine points: p1 = (0, 0), p2 = (0, 1), p3 = (1, α), p4 = (1, α^2), p5 = (α, α), p6 = (α, α^2), p7 = (α^2, α), p8 = (α^2, α^2)
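The eight affine points above can be checked mechanically. A minimal sketch in Python, encoding GF(4) as the integers {0, 1, 2, 3} where 2 stands for α and 3 for α^2 (addition is bitwise XOR of polynomial coefficients; the multiplication table follows from α^2 = α + 1):

```python
# GF(4) arithmetic: 0, 1, 2, 3 encode 0, 1, alpha, alpha^2.
# Addition is XOR; the multiplication table follows from alpha^2 = alpha + 1.
MUL = [
    [0, 0, 0, 0],
    [0, 1, 2, 3],
    [0, 2, 3, 1],
    [0, 3, 1, 2],
]

def hermitian(x, y):
    """Evaluate H2(x, y, 1) = x^3 + y^2 + y over GF(4)."""
    x3 = MUL[MUL[x][x]][x]
    y2 = MUL[y][y]
    return x3 ^ y2 ^ y

# The eight affine points listed on the slide (2 = alpha, 3 = alpha^2)
points = [(0, 0), (0, 1), (1, 2), (1, 3), (2, 2), (2, 3), (3, 2), (3, 3)]
assert all(hermitian(x, y) == 0 for x, y in points)

# A brute-force search confirms these are ALL the affine points: n = w^3 = 8
solutions = [(x, y) for x in range(4) for y in range(4) if hermitian(x, y) == 0]
print(len(solutions))  # 8
```

The brute-force count matching n = w^3 = 8 illustrates why the code length exceeds the field size q = 4.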

  6. I. Introduction • Advantage: over the same finite field, Hermitian codes are longer than RS codes; • Code length vs. field size: n = w^3 = q^(3/2) for a Hermitian code, while n ≤ q for an RS code; • Disadvantage: it is not a Maximum Distance Separable (MDS) code, and the error-correction capability of a very high rate code almost vanishes.
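The length advantage can be tabulated directly from n = w^3 = q^(3/2) versus the RS bound n ≤ q, as a quick sketch:

```python
# Code length comparison: RS length is bounded by the field size q,
# while the Hermitian code over GF(q), q = w^2, has n = w^3 affine points.
for q in [4, 16, 64, 256]:
    w = round(q ** 0.5)
    print(f"q = {q:3d}: RS length <= {q:3d}, Hermitian length = {w ** 3}")
```

For GF(64), for example, the table gives a Hermitian length of 512 against an RS length of at most 64, which is why the talk's example codes of length 64 all sit comfortably inside one Hermitian curve.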

  7. II. Review on KV list decoding • Decoding philosophy evolution: unique decoding (the Sakata algorithm with majority voting) → list decoding (the Guruswami-Sudan (GS) algorithm → the Koetter-Vardy (KV) algorithm)

  8. II. Review on KV list decoding • Key processes: reliability transform (Π → M), interpolation (construct Q(x, y, z)), factorisation (find out the z-roots of Q) • Reliability transform and knowledge of M (example: the (8, 4) Hermitian code): the codeword symbols C1, …, C8 at the affine points p1, …, p8 are received through the channel as R1, …, R8, which yield a reliability matrix Π (one row per field element 0, 1, α, α^2, one column per position); Π is then transformed into the multiplicity matrix M. E.g., interpolation will be performed w.r.t. (p5, 1) with a multiplicity of 2. • The number of interpolation constraints is C = Σ mij(mij + 1)/2, summed over all entries of M.
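The constraint count is a simple sum over the multiplicity matrix: each entry m contributes m(m + 1)/2 constraints. A sketch with a hypothetical multiplicity matrix (the slide's actual M is not reproduced here):

```python
def interpolation_cost(M):
    """C = sum of m(m+1)/2 over all entries of the multiplicity matrix M."""
    return sum(m * (m + 1) // 2 for row in M for m in row)

# Hypothetical 4 x 8 multiplicity matrix for the (8, 4) code over GF(4):
# one row per field element (0, 1, alpha, alpha^2), one column per point.
M = [
    [0, 0, 1, 0, 0, 0, 0, 1],
    [0, 1, 0, 0, 2, 0, 0, 0],  # e.g., (p5, 1) gets multiplicity 2
    [1, 0, 0, 1, 0, 1, 0, 0],
    [0, 0, 0, 0, 0, 0, 1, 0],
]
print(interpolation_cost(M))  # each 1 contributes 1, the single 2 contributes 3
```

The quadratic growth m(m + 1)/2 is what makes large multiplicities, and hence near-optimal KV decoding, expensive.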

  9. II. Review on KV list decoding • Reliability-based codeword score SΠ(c) and multiplicity-based codeword score SM(c) • Given a codeword c, each score sums the entries of Π (resp. M) indexed by the codeword symbols; in the example, SΠ(c) = 2.14 and SM(c) = 5; • Theorem 1: if the reliability-based score SΠ(c) is large enough relative to the interpolation cost, c can be found by determining the z-roots of Q; • Theorem 2: if the multiplicity-based score SM(c) is large enough, c can be found by determining the z-roots of Q; • The optimal decoding performance of the KV algorithm is dictated by Π.
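Both scores are indexed sums over the same matrices. A sketch with a hypothetical 4 × 4 reliability matrix (the slide's actual Π is not reproduced here):

```python
FIELD = ['0', '1', 'a', 'a2']  # GF(4) symbols indexing the matrix rows

def codeword_score(matrix, codeword):
    """Sum matrix[row of c_j][j] over all positions j of the codeword.

    With a reliability matrix this is S_Pi(c); with a multiplicity
    matrix it is S_M(c).
    """
    return sum(matrix[FIELD.index(c)][j] for j, c in enumerate(codeword))

# Hypothetical reliability matrix: each column sums to 1
PI = [
    [0.70, 0.10, 0.05, 0.20],
    [0.10, 0.60, 0.05, 0.30],
    [0.10, 0.20, 0.80, 0.25],
    [0.10, 0.10, 0.10, 0.25],
]
print(codeword_score(PI, ['0', '1', 'a', 'a2']))
```

A larger score means the matrix "agrees" more with the codeword, which is exactly what the ABP stage later tries to improve.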

  10. II. Review on KV list decoding • KV decoding performance of the (64, 39) Hermitian code (performance figure omitted) • Challenge: can we further improve the KV decoding performance for Hermitian codes, given that it is dictated by Π?

  11. III. Iterative Soft-Decision Decoding • Decoding stages: • ABP: adaptive belief propagation, to improve the reliability of Π; • KV: Koetter-Vardy list decoding, to find out the message vector; • Decoding block diagram: Π → ABP → Π' → KV

  12. III. Iterative Soft-Decision Decoding • Binary image of the parity-check matrix H • Let σ(x) = σ0 + σ1x + ∙∙∙ + σβx^β be the primitive polynomial of GF(2^β); • Each symbol entry of H is replaced by a β × β binary matrix built from the companion matrix of σ(x), yielding the binary image Hb; • Example: in GF(4), σ(x) = 1 + x + x^2 and the companion matrix is C = [[0, 1], [1, 1]], with α ↦ C, α^2 ↦ C^2 and 1 ↦ I.
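The expansion can be sketched as follows: the companion matrix C of σ(x) = 1 + x + x^2 acts as the binary image of α, and powers of C give the images of the other nonzero symbols (entry conventions for companion matrices vary, but for this σ(x) the matrix is symmetric, so both conventions agree):

```python
# Companion matrix of sigma(x) = 1 + x + x^2 over GF(2):
# the binary image of alpha in GF(4)
C = [[0, 1],
     [1, 1]]

def gf2_matmul(A, B):
    """Multiply two binary matrices modulo 2."""
    return [[sum(A[i][k] & B[k][j] for k in range(len(B))) % 2
             for j in range(len(B[0]))] for i in range(len(A))]

I = [[1, 0], [0, 1]]
C2 = gf2_matmul(C, C)      # binary image of alpha^2
C3 = gf2_matmul(C2, C)
print(C2)                  # [[1, 1], [1, 0]]
print(C3 == I)             # alpha^3 = 1, so C^3 must be the identity -> True
```

Replacing each GF(4) entry of H by its 2 × 2 image doubles both matrix dimensions, which is why the (8, 4) symbol-level code has a 8 × 16 binary image Hb in the later slides.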

  13. III. Iterative Soft-Decision Decoding • Is Hb suitable to be used for BP decoding? • Density of the matrix: 53.125%; • The number of short cycles: 279; • We will have to reduce the density and eliminate part of the short cycles!

  14. III. Iterative Soft-Decision Decoding • Bit-reliability-oriented Gaussian elimination on Hb; • Assume each coded bit cj is BPSK modulated into symbol sj (j = 1, 2, …, N); • Given y = (y1, y2, …, yN) as the received vector; • The bit log-likelihood ratio (LLR) value is L(cj) = ln(Pr[cj = 0 | yj] / Pr[cj = 1 | yj]), and the LLR vector is L = (L(c1), L(c2), …, L(cN)); • Reliability of bit cj is determined by |L(cj)|. E.g., Pr[c1 = 0 | y1] = 0.49 and Pr[c1 = 1 | y1] = 0.51 give |L(c1)| = 0.04, while Pr[c2 = 0 | y2] = 0.93 and Pr[c2 = 1 | y2] = 0.07 give |L(c2)| = 2.59: bit c2 is more reliable!
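The two numerical reliabilities above follow directly from the LLR definition; a quick check:

```python
import math

def bit_llr(p0, p1):
    """LLR of a bit: ln(Pr[c = 0 | y] / Pr[c = 1 | y])."""
    return math.log(p0 / p1)

L1 = bit_llr(0.49, 0.51)   # near-equiprobable -> small magnitude
L2 = bit_llr(0.93, 0.07)   # confident decision -> large magnitude
print(round(abs(L1), 2), round(abs(L2), 2))  # 0.04 2.59
```

The sign of the LLR carries the hard decision (L ≥ 0 → c = 0) while the magnitude carries the confidence, which is exactly the quantity sorted in the next step.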

  15. III. Iterative Soft-Decision Decoding • Bit reliability sorting: sort the bits in terms of their reliabilities |L(cj)|; • Refresh the bit indices so that they indicate the ascending reliability order; • Let B be the set of bit indices that collects all the N − K least reliable bit indices, and Bc its complement; • The LLR vector can then be sorted accordingly.
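The sorting step reduces to an argsort on |L(cj)|, with B collecting the first N − K indices. A minimal sketch:

```python
def sort_by_reliability(llrs):
    """Return bit indices sorted in ascending order of reliability |L(c_j)|."""
    return sorted(range(len(llrs)), key=lambda j: abs(llrs[j]))

def unreliable_set(llrs, n_minus_k):
    """B: the N - K least reliable bit indices."""
    return sort_by_reliability(llrs)[:n_minus_k]

llrs = [-0.04, 2.59, 0.5, -1.2, 0.9, -3.1]
print(unreliable_set(llrs, 3))  # [0, 2, 4]
```

Only the magnitudes matter here; the signs (the hard decisions) are untouched by the sorting.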

  16. III. Iterative Soft-Decision Decoding • Perform Gaussian elimination w.r.t. the N − K columns indicated by B, i.e., reduce column j1 to [1 0 0 ∙∙∙ 0]T; reduce column j2 to [0 1 0 ∙∙∙ 0]T; …; reduce column jN−K to [0 0 0 ∙∙∙ 1]T; • Gaussian elimination: Hb → Hb', with reduced density and fewer short cycles; • Matrix Hb' is more suitable to be used in the BP decoding.
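The column reduction is plain GF(2) row elimination targeted at the chosen columns. A sketch (in practice, a linearly dependent column would be replaced by the next least reliable index):

```python
def adapt(H, cols):
    """Reduce the columns listed in cols to distinct unit vectors
    by GF(2) row operations (row swap and row XOR)."""
    H = [row[:] for row in H]
    rows = len(H)
    for r, c in enumerate(cols):
        # find a pivot row at or below position r with a 1 in column c
        pivot = next((i for i in range(r, rows) if H[i][c]), None)
        if pivot is None:
            continue  # dependent column; a different index would be chosen
        H[r], H[pivot] = H[pivot], H[r]
        for i in range(rows):
            if i != r and H[i][c]:
                H[i] = [a ^ b for a, b in zip(H[i], H[r])]
    return H

H = [[1, 1, 0],
     [0, 1, 1]]
H2 = adapt(H, [1, 2])
print(H2)  # columns 1 and 2 are now [1, 0]^T and [0, 1]^T
```

Because the B columns become unit vectors, each unreliable bit ends up attached to exactly one check, which is the structural property the next slides exploit.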

  17. III. Iterative Soft-Decision Decoding • The conventional BP decoding based on Hb' • Let hij denote the entry of matrix Hb'; • Define the bit-to-check message matrix V (entries vij) and the check-to-bit message matrix U (entries uij); • Initialization: vij = L(cj) for hij = 1; • For each BP iteration: horizontal step (V → U), then vertical step (U → V); • After a number of BP iterations, update the bit LLR values as L'(cj) = L(cj) + η·Lext(cj), where Lext(cj) is the extrinsic LLR gathered from the check nodes and η is the damping factor.
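The two message-passing steps can be sketched with the standard sum-product (tanh) rule; the slide's exact update equations are not reproduced here, so this is a generic sketch with the damping factor η applied to the extrinsic LLR:

```python
import math

def bp_update(H, L, eta, iters):
    """Generic sum-product BP on a binary parity-check matrix H.

    Returns the damped LLR update L'(c_j) = L(c_j) + eta * L_ext(c_j).
    """
    m, n = len(H), len(L)
    # v[i][j]: bit-to-check message; u[i][j]: check-to-bit message
    v = [[L[j] if H[i][j] else 0.0 for j in range(n)] for i in range(m)]
    u = [[0.0] * n for _ in range(m)]
    for _ in range(iters):
        # horizontal step (V -> U): tanh rule over the other bits of each check
        for i in range(m):
            idx = [j for j in range(n) if H[i][j]]
            for j in idx:
                prod = 1.0
                for jp in idx:
                    if jp != j:
                        prod *= math.tanh(v[i][jp] / 2.0)
                prod = max(min(prod, 0.999999), -0.999999)
                u[i][j] = 2.0 * math.atanh(prod)
        # vertical step (U -> V): channel LLR plus the other checks' messages
        for j in range(n):
            checks = [i for i in range(m) if H[i][j]]
            for i in checks:
                v[i][j] = L[j] + sum(u[ip][j] for ip in checks if ip != i)
    # damped LLR update with the extrinsic information
    return [L[j] + eta * sum(u[i][j] for i in range(m) if H[i][j])
            for j in range(n)]

# Toy single-parity-check example: the third bit has a small wrong-sign LLR
out = bp_update([[1, 1, 1]], [2.0, 2.0, -0.5], eta=0.5, iters=1)
print(out)  # the third LLR is pulled back to the positive (c = 0) side
```

In the toy example the two confident bits vote through the parity check and flip the sign of the weak third LLR, which is precisely the correction effect the ABP stage relies on.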

  18. III. Iterative Soft-Decision Decoding • The updated LLR vector can be formed as L' = (L'(c1), L'(c2), …, L'(cN)); • The updated bit LLR values are converted back into a posteriori probability (APP) values by Pr[cj = 0 | yj] = 1/(1 + e^(−L'(cj))); • They can then be used to generate the improved reliability matrix Π', which then feeds the KV chain: reliability transform → M → interpolation → factorization.
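The LLR-to-APP conversion simply inverts the LLR definition: Pr[cj = 0 | yj] = 1/(1 + e^(−L)). A round-trip check against the slide-14 example values:

```python
import math

def app_from_llr(L):
    """Convert an LLR back to the pair (Pr[c = 0], Pr[c = 1])."""
    p0 = 1.0 / (1.0 + math.exp(-L))
    return p0, 1.0 - p0

p0, p1 = app_from_llr(2.59)
print(round(p0, 2), round(p1, 2))  # 0.93 0.07
```

Multiplying the bit-level APPs of the bits that make up each symbol would then populate one column of the symbol-level reliability matrix Π'.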

  19. III. Iterative Soft-Decision Decoding • A worked example: iterative decoding of the (8, 4) Hermitian code • The codeword is shown symbol wise and bit wise; • The received LLR vector contains some LLR values that give a wrong estimation of the bits (L(cj) ≥ 0 → cj = 0; L(cj) < 0 → cj = 1); • With the original reliability matrix Π, the codeword score is 3.969 while the successful-decoding threshold is 3.993: based on Theorem 1, KV decoding will fail!

  20. III. Iterative Soft-Decision Decoding • Sort the bits in ascending order in terms of |L(cj)|, yielding the sorted indices j0, j1, …, j15 = 7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 15, 4, 1, 6, 9, 5; • B collects the first N − K = 8 of them, and Bc the rest; • Perform Gaussian elimination on those columns implied by B: the density drops from 53.125% to 37.5%, and the number of short cycles from 279 to 112.

  21. III. Iterative Soft-Decision Decoding • Based on Hb', perform 3 BP iterations, obtaining the updated LLR vector; • For the 'wrong' LLR values: we would like to change their signs, or reduce their magnitudes; • For the 'right' LLR values: we would like to leave the signs unchanged and increase their magnitudes; • With the updated reliability matrix Π', the codeword score rises to 4.478 while the threshold becomes 4.037: based on Theorem 1, KV decoding will succeed!

  22. III. Iterative Soft-Decision Decoding • Why should Gaussian elimination be bit reliability oriented? • After the adaptation, the columns of the unreliable bits form an identity submatrix of Hb', so in the Tanner graph each unreliable bit is updated (e.g., L'(c5), L'(c7)) using extrinsic information that comes mainly from the reliable bits.

  23. III. Iterative Soft-Decision Decoding • How to improve the iterative decoding performance? • It is possible that reliable bits are wrongly estimated by their LLR values; • We can create different sets of bit indices B and let more bits' corresponding columns also fall into the identity submatrix of Hb'; • Example with the sorted bit indices being {7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 15, 4, 1, 6, 9, 5}: Gaussian elimination with B(1); reorder the indices to {3, 11, 13, 2, 14, 7, 10, 0, 12, 8, 15, 4, 1, 6, 9, 5} for Gaussian elimination with B(2); reorder to {15, 4, 1, 6, 9, 7, 10, 0, 12, 8, 3, 11, 13, 2, 14, 5} for Gaussian elimination with B(3).

  24. III. Iterative Soft-Decision Decoding • A revisit of the decoding block diagram • Note: if there are multiple matrix adaptations, the next bit reliability sorting will be performed based on the updated LLR vector; • Multiple attempts of KV decoding result in an output list that contains all the message candidates. The Maximum Likelihood (ML) criterion is used to select one from the list.
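The ML selection can be sketched as a nearest-neighbour search: among the candidate codewords from the output list, pick the one whose modulated image is closest to the received vector (a sketch, assuming the BPSK mapping 0 → +1, 1 → −1, consistent with the sign convention L(cj) ≥ 0 → cj = 0):

```python
def ml_select(candidates, y):
    """Among candidate binary codewords, pick the one whose BPSK image
    (bit 0 -> +1, bit 1 -> -1) is closest in Euclidean distance to y."""
    def dist2(c):
        return sum((yj - (1.0 - 2.0 * bj)) ** 2 for yj, bj in zip(y, c))
    return min(candidates, key=dist2)

# Two candidates from the KV output list and a noisy received vector
candidates = [[0, 0, 1], [1, 0, 1]]
y = [0.8, 1.1, -0.9]
print(ml_select(candidates, y))  # [0, 0, 1]
```

Under AWGN, minimizing Euclidean distance to the received vector is equivalent to maximizing the likelihood, so this simple search implements the ML criterion over the list.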

  25. IV. Geometric Interpretation • Insight into why we need matrix adaptations before the BP decoding • Normalize the LLR vector to a vector T = (T1, T2, …, TN); • Normalize L(cj) to Tj by a mapping function; • A graphical look into the LLR vector and the vector T.

  26. IV. Geometric Interpretation • (Figure: T-value plots for the two cases, when the codeword is not found and when the codeword is found) • When a codeword is found, Tj = 1 for j = 1, 2, …, N.

  27. IV. Geometric Interpretation • Objective of the BP decoding: finding the vector T that minimizes the potential function • The LLR update in the BP decoding can be seen as the T value update • Finding the estimated codeword using the BP algorithm can be seen as identifying the vertex at which the potential function is minimized.

  28. IV. Geometric Interpretation • The convergence behavior of the potential function of the (64, 39) Hermitian code (convergence figure omitted; a value of −100 is marked on the axis)

  29. V. Complexity Reduction • Decoding parameters: • the number of groups of unreliable bit indices; • the number of matrix adaptations (Gaussian eliminations); • the number of BP iterations; • There are three types of computations required by the decoding: • binary operations (Gaussian eliminations); • floating point operations (BP iterations); • finite field arithmetic operations (KV decodings); • The chosen iterative decoding parameters determine how many operations of each type are required.

  30. V. Complexity Reduction • Reduce the deployment of the KV decoding steps • ABP-KV decoding block diagram: Π → ABP → Π' → M → interpolation → factorization; • We could try to assess the quality of matrices Π' and M. If they are not good enough to result in a possibly successful decoding, the following KV decoding process will NOT be carried out.

  31. V. Complexity Reduction • Reliability-based received word score and multiplicity-based received word score (upper bounds on the corresponding codeword scores) • Example for the (8, 4) Hermitian code: the reliability-based received word score is 6.7 and the multiplicity-based received word score is 15.

  32. V. Complexity Reduction • Recall the two theorems for successful KV decoding: • Theorem 1: if the reliability-based codeword score is above its threshold, KV can succeed; • Theorem 2: if the multiplicity-based codeword score is above its threshold, KV can succeed; • Lemma 3: if the reliability-based received word score is below the threshold, KV cannot succeed; • Lemma 4: if the multiplicity-based received word score is below the threshold, KV cannot succeed; • Whenever Lemma 3 or Lemma 4 holds, the interpolation and factorization steps can be skipped.

  33. V. Complexity Reduction • Complexity reduction for ABP-KV decoding of the (64, 52) Hermitian code • Decoding parameters = (10, 5, 2) • There are 50 KV decoding processes for each codeword frame

  34. V. Complexity Reduction • Other facilitated decoding approaches: • Parallel decoding: run the decodings for the different index groups in parallel; • Output validation: once the estimated message is found, the iterative decoding will be terminated.

  35. VI. Performance Analysis • Decoding parameters: the KV decoding output list size (l) and the iterative decoding parameters • The (8, 4) Hermitian code over the AWGN channel (performance figure omitted)

  36. VI. Performance Analysis • The (64, 39) Hermitian code over the AWGN channel

  37. VI. Performance Analysis • The (64, 47) Hermitian code over the AWGN channel

  38. VI. Performance Analysis • The (64, 47) Hermitian code over the fast Rayleigh fading channel • Coherent detection with the knowledge of CSI

  39. VI. Performance Analysis • Herm. (64, 47) vs. RS (15, 11), over the AWGN channel

  40. VII. Conclusions • Revisit the construction of AG codes: pole basis + affine points; • Review the KV soft-decision list decoding algorithm: Π dependent; • Introduce an iterative soft-decision decoding algorithm for Hermitian codes: adaptive belief propagation (ABP) + KV list decoding; • The ABP algorithm is bit reliability oriented → BP is also good for AG (RS) codes; • Geometric interpretation → necessity of performing parity-check matrix adaptation; • Complexity reduction: successive criteria to assess Π' and M; parallel decoding; output validation; • Performance analysis shows a significant performance gain can be achieved over the conventional algorithms and over comparable RS codes.

  41. Acknowledgement • Project: Advanced coding technology for future storage devices; ID: 61001094; from Jan 2011 to Dec 2013; National Natural Science Foundation of China. Thank you!
