
Fountain Codes, LT Codes and Raptor Codes


Presentation Transcript


  1. Fountain Codes, LT Codes and Raptor Codes Susmita Adhikari, Eduard Mustafin, Gökhan Gül

  2. Outline 1. Motivation 2. Fountain Codes 3. Degree Distribution 4. LT Codes 5. Raptor Codes 6. Conclusion Kiel, February 2008

  3. Motivation • Binary Erasure Channel (BEC): each input bit (0 or 1) is delivered intact with probability 1 - p and replaced by an erasure e with probability p. Capacity = (1 - p)
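The BEC model above is easy to check empirically. A minimal Python sketch (all names illustrative, not from the slides): erase each bit with probability p and count what survives; the fraction of delivered symbols approaches the capacity 1 - p.

```python
import random

def bec(bits, p, rng):
    """Binary erasure channel: each symbol is erased (None) with probability p."""
    return [None if rng.random() < p else b for b in bits]

rng = random.Random(42)
p = 0.3
bits = [rng.randint(0, 1) for _ in range(100_000)]
received = bec(bits, p, rng)
delivered_fraction = sum(r is not None for r in received) / len(bits)
print(delivered_fraction)   # close to the capacity 1 - p = 0.7
```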

  4. Motivation • Automatic Repeat Request (ARQ) • Wasteful use of bandwidth, network overload and intolerable delays. • Forward Error Correcting (FEC) codes • Reed-Solomon codes, LDPC codes, Tornado codes. • The rate must be chosen in compliance with the erasure probability p. • High computational cost of overall encoding and decoding.

  5. Fountain Codes • Rateless: the number of code symbols that can be generated from the input info symbols is potentially unlimited. • Capacity achieving: the decoder can recover the info symbols from any set of code symbols only slightly longer than the input message. • Universal: Fountain Codes are near optimal for any BEC.

  6. Fountain Codes For info symbols s1, s2, …, sK, code symbols t1, t2, … and random generator matrix G, • Encoding: tn = s1·G1n + s2·G2n + … + sK·GKn (mod 2), i.e. t = s·G; each code symbol is the modulo-2 sum of the info symbols selected by one random column of G.

  7. Fountain Codes For info symbols s1, s2, …, sK, code symbols t1, t2, … and random generator matrix G, • Decoding: collect any K linearly independent code symbols and solve t = s·G for s by Gaussian elimination over GF(2). Easier to calculate using a Tanner graph.
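The encoding and decoding rules above can be sketched in Python: each code symbol is the inner product of the info vector with a fresh random binary column, and the decoder runs Gaussian elimination over GF(2) once enough independent symbols have arrived. This is an illustrative sketch under those assumptions, not the slides' implementation; `gf2_solve` is a helper name chosen here.

```python
import numpy as np

def gf2_solve(A, b):
    """Solve A x = b over GF(2) by Gaussian elimination.
    Returns x, or None if A does not yet have full column rank."""
    A = A.copy() % 2
    b = b.copy() % 2
    n_rows, n_cols = A.shape
    piv_cols = []
    r = 0
    for c in range(n_cols):
        pivot = next((i for i in range(r, n_rows) if A[i, c]), None)
        if pivot is None:
            continue
        A[[r, pivot]] = A[[pivot, r]]        # swap rows r and pivot
        b[[r, pivot]] = b[[pivot, r]]
        for i in range(n_rows):
            if i != r and A[i, c]:
                A[i] ^= A[r]                 # XOR-eliminate column c
                b[i] ^= b[r]
        piv_cols.append(c)
        r += 1
    if r < n_cols:
        return None                          # not enough independent symbols yet
    x = np.zeros(n_cols, dtype=np.uint8)
    for row, c in enumerate(piv_cols):
        x[c] = b[row]
    return x

rng = np.random.default_rng(1)
K = 4
s = np.array([1, 0, 1, 1], dtype=np.uint8)   # info symbols (bits)
rows, t = [], []
x = None
while x is None:                              # keep drawing code symbols
    g = rng.integers(0, 2, size=K, dtype=np.uint8)  # one random column of G
    rows.append(g)
    t.append(int(g @ s) % 2)                  # t_n = g · s (mod 2)
    x = gf2_solve(np.array(rows, dtype=np.uint8), np.array(t, dtype=np.uint8))
```

The decoder succeeds as soon as the received columns span GF(2)^K, which typically needs only slightly more than K symbols.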

  8. Fountain Codes [slide graphic: generator matrix example with a received code word and erased (?) info symbols]

  9. Fountain Codes • Encoding complexity of order K² and decoding complexity of order K³ (Gaussian elimination), which makes them impractical.

  10. Degree Distribution • Degree is the number of edges connecting an encoded symbol to the input symbols. • Degree distribution is the probability density function of all degrees used for encoding. • Average degree of the encoded symbols should be at least of order ln(K) for reliable decoding. • Degree distribution affects the encoding and decoding costs.
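One concrete distribution meeting the ln(K) requirement is Luby's ideal soliton distribution, ρ(1) = 1/K and ρ(d) = 1/(d(d-1)) for d ≥ 2, whose average degree is roughly ln(K). The slides do not name this distribution; the check below is shown only to illustrate the average-degree claim.

```python
import math

def ideal_soliton(K):
    """Luby's ideal soliton distribution over degrees d = 1..K."""
    rho = [0.0] * (K + 1)                 # rho[0] unused
    rho[1] = 1.0 / K
    for d in range(2, K + 1):
        rho[d] = 1.0 / (d * (d - 1))
    return rho

K = 1000
rho = ideal_soliton(K)
total = sum(rho)                           # probabilities sum to 1 (telescoping)
avg_degree = sum(d * rho[d] for d in range(1, K + 1))
print(f"total = {total:.6f}, average degree = {avg_degree:.2f}, ln(K) = {math.log(K):.2f}")
```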

  11. LT Codes • First practical implementation of Fountain Codes, proposed by Michael Luby. • LT codes are rateless and universal. • Computational complexity of both the encoding and the decoding process increases logarithmically with the data length. • Therefore, compared to Fountain Codes, LT codes provide a considerably reduced computational cost while still achieving capacity.

  12. LT Codes -“Encoding” • Encoding Algorithm • Divide the message M into K equal-length parts (input symbols) of k bits each.

  13. LT Codes -“Encoding” • Encoding Algorithm • Randomly choose the degree d of the encoding symbol from a degree distribution (example: d = 3).

  14. LT Codes -“Encoding” • Encoding Algorithm • Choose d distinct input symbols uniformly at random as the neighbors (edges) of the encoding symbol in the Tanner graph (example: d = 3).

  15. LT Codes -“Encoding” • Encoding Algorithm • Determine the encoding symbol as the bitwise modulo-2 sum of its neighbor symbols.

  16. LT Codes -“Encoding” [slide graphic: Tanner graph with input symbols 1–6 and encoding symbols formed by modulo-2 addition of their neighbors]
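The four encoding steps on slides 12–15 can be sketched as follows; the degree distribution is supplied by the caller and all names are illustrative:

```python
import random

def lt_encode_symbol(symbols, degree_weights, rng):
    """Produce one LT encoding symbol from K input bit symbols.
    degree_weights[d-1] is the (unnormalized) weight of degree d."""
    K = len(symbols)
    d = rng.choices(range(1, K + 1), weights=degree_weights)[0]  # draw degree
    neighbors = rng.sample(range(K), d)       # d distinct input symbols
    value = 0
    for i in neighbors:                       # bitwise modulo-2 sum
        value ^= symbols[i]
    return neighbors, value

rng = random.Random(7)
symbols = [1, 0, 1, 1, 0, 1]                  # K = 6 input symbols
nbrs, val = lt_encode_symbol(symbols, [1] * 6, rng)  # toy uniform distribution
```

In a real LT code the weights would come from a soliton-style distribution rather than the uniform toy weights used here.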

  17. LT Codes -“Decoding” • Decoding Algorithm • Find a check node with degree one and assign its value to the corresponding input symbol. If no such node exists, halt and report a decoding failure. • Add this value (modulo 2) to all check nodes connected to this input symbol. • Remove all edges connected to this input symbol from the graph. • Repeat the first three steps until all input symbols are recovered.
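The steps above are the classic peeling decoder. A Python sketch, storing each check node as a (neighbor set, value) pair (names illustrative):

```python
def lt_decode(K, code_symbols):
    """Peeling decoder for LT codes on the BEC.
    code_symbols: (neighbor_indices, value) pairs; returns the K input
    symbols, or None if no degree-one check node is left (failure)."""
    recovered = [None] * K
    nodes = [[set(nbrs), val] for nbrs, val in code_symbols]
    progress = True
    while progress and any(r is None for r in recovered):
        progress = False
        for node in nodes:
            if len(node[0]) == 1:             # degree-one check node found
                i = next(iter(node[0]))
                recovered[i] = node[1]
                progress = True
                # XOR the recovered value into every check node using it
                # and remove the corresponding edges (incl. this node's own).
                for other in nodes:
                    if i in other[0]:
                        other[0].remove(i)
                        other[1] ^= recovered[i]
    return None if any(r is None for r in recovered) else recovered

# Input symbols s = (1, 0, 1); three received code symbols with neighbors:
print(lt_decode(3, [([0], 1), ([0, 1], 1), ([1, 2], 1)]))   # → [1, 0, 1]
```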

  18. LT Codes -“Decoding” [slide graphic: peeling example that stalls with no degree-one check node] Decoding Failure!

  19. LT Codes -“Decoding” [slide graphic: peeling example that completes] Decoding Successful!

  20. Raptor Codes • An extension of LT codes, introduced by Shokrollahi. • Core idea - “To relax the condition of recovering all input symbols and to require only a constant fraction of input symbols be recoverable.” • Idea achieved by concatenation of an LT code and a precode. • The LT code recovers a large proportion of the input symbols. • The precode recovers the fraction left unrecovered by the LT code. • Encoding and decoding complexity increases linearly with K.

  21. Raptor Codes “Encoding” [slide diagram: message → pre-coding (Low-Density Parity-Check coding) → LT coding]
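The pre-coding + LT-coding pipeline can be sketched as below. For brevity a single overall parity bit stands in for the LDPC precode, purely for illustration; a real Raptor code uses a proper high-rate LDPC precode, and `toy_raptor_encode` is a name invented here.

```python
import random

def toy_raptor_encode(info_bits, n_code_symbols, rng):
    """Raptor encoding sketch: precode the message, then LT-encode the
    intermediate symbols. One parity bit stands in for the LDPC precode."""
    parity = 0
    for b in info_bits:
        parity ^= b
    intermediate = list(info_bits) + [parity]   # precoded (intermediate) symbols
    K = len(intermediate)
    out = []
    for _ in range(n_code_symbols):             # LT stage with toy degree choice
        d = rng.randint(1, K)
        nbrs = rng.sample(range(K), d)
        value = 0
        for i in nbrs:                          # modulo-2 sum of neighbors
            value ^= intermediate[i]
        out.append((nbrs, value))
    return out

rng = random.Random(3)
stream = toy_raptor_encode([1, 0, 1, 1], 10, rng)
```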

  22. Raptor Codes “Decoding” [slide diagram: received symbols with an erased symbol → LT decoding leaves some symbols unrecovered → LDPC decoding recovers the message symbols. Decoding Successful!]

  23. Raptor Codes on Noisy Channels • Raptor Codes can be used efficiently over noisy channels with the same encoding scheme described above and a BP algorithm using soft inputs. • AWGN channel with Es/N0 = -2.83 dB and K = 9500 info bits, Nav = 20737 code bits, Cap = 0.5 bit/symbol, Rav = 0.458 bit/symbol. • They tend to approach capacity as the message length increases, on both the AWGN channel and the fading channel with Rayleigh distribution.

  24. Raptor Codes on Noisy Channels • Capacity-achieving Raptor Codes have not yet been proven for other symmetric channels. However, it is proven that Raptor Codes are not universal at all rates over symmetric channels other than the BEC. • Generalized Raptor Codes outperform ordinary Raptor Codes using a rate-compatible distribution arrangement on BSC and AWGN channels.

  25. Conclusion • Fountain Codes • Advantage: rateless, universal and capacity achieving. • Disadvantage: higher encoding and decoding complexity. • LT Codes • Advantage: lower complexity than Fountain Codes. • Disadvantage: complexity increases logarithmically with the message length. • Raptor Codes • Advantage: the lowest achievable complexity. • Can be applied to arbitrary channels efficiently.

  26. Thank you!
