
An improved LT encoding scheme with extended chain lengths

Kuo-Kuang Yen, Chih-Lung Chen and Hsie-Chia Chang. ISITA 2012, Honolulu, Hawaii, USA, October 28-31, 2012.



  1. An improved LT encoding scheme with extended chain lengths Kuo-Kuang Yen, Chih-Lung Chen and Hsie-Chia Chang ISITA 2012, Honolulu, Hawaii, USA, October 28-31, 2012

  2. Outline
  • Introduction
  • Background
  • Analysis of early decoding termination
    • Decoding chain
    • Average chain length of decoding termination over different ranges
  • Proposed encoding scheme
  • Simulation results

  3. Introduction
  • Fountain code
    • Randomly generates encoding symbols without a fixed code rate
    • Can be designed without considering the channel condition
  • LT code
    • [Figure: original data as k input symbols 1, 2, 3, 4, …, k; an encoding symbol is the XOR of its neighbors — here degree 3 with neighbors 1, 3, 4]
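The XOR construction above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation; the degree distribution is passed in as an assumed list of (degree, probability) pairs, and input symbols are equal-length byte strings.

```python
import random

def lt_encode(data, degree_dist, rng=random):
    """Generate one LT encoding symbol from k input symbols.

    data        : list of equal-length byte strings (the k input symbols)
    degree_dist : list of (degree, probability) pairs summing to 1
    Returns (neighbors, payload): the chosen neighbor indices and their XOR.
    """
    k = len(data)
    # Sample a degree from the distribution by inverse transform.
    r, acc, degree = rng.random(), 0.0, 1
    for d, p in degree_dist:
        acc += p
        if r <= acc:
            degree = d
            break
    neighbors = rng.sample(range(k), degree)   # d distinct neighbors, uniform
    payload = bytes(data[neighbors[0]])
    for idx in neighbors[1:]:                  # XOR the chosen input symbols
        payload = bytes(a ^ b for a, b in zip(payload, data[idx]))
    return neighbors, payload
```

A degree-1 symbol is simply a copy of one input symbol, which is why received degree-1 symbols seed the ripple during decoding.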

  4. Introduction
  • LT code
    • Ripple:
      • Received degree-1 encoding symbols
      • Those whose degree descends to one
    • The BP (belief-propagation) decoding terminates when these ripples are depleted.
    • When the average number of ripples is relatively small, decoding termination occurs more frequently [13][14].
  • For LT codes with the robust Soliton distribution
    • The number of ripples is initially low, so the BP decoding tends to terminate early, leaving most input symbols undecoded.
    • Decoding termination within the range 0 ≤ n ≤ k/2 generally contributes more than 90% of undecoded input symbols (n = the number of decoded input symbols).
  [13] R. Karp, M. Luby, and A. Shokrollahi, "Finite length analysis of LT codes," in Proc. IEEE Int. Symp. Information Theory (ISIT), 2004, p. 39.
  [14] G. Maatouk and A. Shokrollahi, "Analysis of the second moment of the LT decoder," in Proc. IEEE Int. Symp. Information Theory (ISIT), 2009, pp. 2326–2330.

  5. In this paper
  • Decoding chains:
    • Input symbols can be divided into decoding chains by the connectivity of received degree-2 encoding symbols.
    • The average chain length is shorter when decoding termination occurs within 0 ≤ n ≤ k/2 than within k/2+1 ≤ n ≤ k.
  • Goal:
    • Increase the average chain length
    • Reduce the number of decoding terminations within 0 ≤ n ≤ k/2, for a lower symbol loss probability
  • Symbol loss probability: the probability that an input symbol remains undecoded after the BP decoding

  6. Background : LT codes and their encoding

  7. Background : BP decoding
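The BP (peeling) decoding the slides refer to repeatedly releases degree-1 symbols: each decoded input symbol is XORed out of every encoding symbol that covers it, possibly dropping new symbols to degree one. A minimal sketch, with small integers standing in for symbol payloads (XOR works identically on byte strings):

```python
def bp_decode(k, symbols):
    """Belief-propagation (peeling) decoder for LT codes.

    symbols : list of (neighbor_set, payload) pairs; payloads are ints.
    Returns {input index: value} for every input symbol decoded before
    the ripple is depleted (decoding termination).
    """
    symbols = [[set(n), p] for n, p in symbols]
    decoded = {}
    ripple = [s for s in symbols if len(s[0]) == 1]   # initial degree-1 symbols
    while ripple:                                     # terminate when depleted
        nbrs, val = ripple.pop()
        if not nbrs:
            continue                                  # already fully peeled
        i = nbrs.pop()
        if i in decoded:
            continue
        decoded[i] = val
        for s in symbols:              # peel i out of every covering symbol
            if i in s[0]:
                s[0].discard(i)
                s[1] ^= val
                if len(s[0]) == 1:
                    ripple.append(s)   # degree dropped to one: joins the ripple
    return decoded
```

If the ripple empties before all k inputs are recovered, the remaining inputs stay undecoded — exactly the early-termination effect quantified on the following slides.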

  8. Background : degree distribution
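The robust Soliton distribution used throughout follows Luby's standard definition: the ideal Soliton ρ plus a robustness term τ with a spike near k/R, normalized by β. A sketch (rounding the spike index is a minor implementation choice here):

```python
import math

def robust_soliton(k, c=0.02, delta=0.01):
    """Robust Soliton distribution mu(1..k), returned as a list of
    probabilities where index i-1 holds the probability of degree i."""
    R = c * math.log(k / delta) * math.sqrt(k)
    # Ideal Soliton: rho(1) = 1/k, rho(i) = 1/(i(i-1)) for i >= 2.
    rho = [0.0, 1.0 / k] + [1.0 / (i * (i - 1)) for i in range(2, k + 1)]
    # Robustness term: tau(i) = R/(ik) below the spike, a log spike at k/R.
    tau = [0.0] * (k + 1)
    spike = int(round(k / R))
    for i in range(1, spike):
        tau[i] = R / (i * k)
    if 1 <= spike <= k:
        tau[spike] = R * math.log(R / delta) / k
    beta = sum(rho) + sum(tau)                 # normalization constant
    return [(rho[i] + tau[i]) / beta for i in range(1, k + 1)]
```

Note the large mass on degree 2 (ρ(2) = 1/2): this is why the connectivity of degree-2 encoding symbols, analyzed next, dominates the early decoding behavior.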

  9. Background : drawback of LT codes with RSD
  • The fraction, among all undecoded input symbols, contributed by decoding terminations within 0 ≤ n ≤ k/2
  • Legend: the probability of the decoding termination with n input symbols decoded

  10. Background : drawback of LT codes with RSD
  • Percentage of undecoded input symbols resulting from the decoding termination:
    • BP decoding terminations within 0 ≤ n ≤ k/2 generally contribute more than 90% of undecoded input symbols when encoding with the robust Soliton distribution.
  • Robust Soliton distribution parameters: ε = 0.1, c = 0.02, δ = 0.01

  11. Analysis of early decoding termination : Decoding chain
  • Decoding chains:
    • Input symbols can be divided into decoding chains by the connectivity of received degree-2 encoding symbols.
  • Definition: a length-L decoding chain is constituted of L different input symbols i_{t1}, i_{t2}, …, i_{tL} linked by shared received degree-2 encoding symbols. For example:
    • φ(i1) = {c1}, φ(i2) = ∅, φ(i3) = {c1}
    • φ(i4) = {c4, c5}, φ(i5) = {c5, c6}, φ(i6) = {c6, c7}, φ(i7) = {c4, c7}
    • Here i1 and i3 form one chain (they share c1), and i4, i5, i6, i7 form another.
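Grouping input symbols into decoding chains is a connectivity computation: two inputs are in the same chain when some sequence of received degree-2 encoding symbols links them. A sketch using union-find (my illustration of the definition above, not the paper's code), where each degree-2 symbol is given by its pair of neighbor indices:

```python
def decoding_chains(k, degree2_symbols):
    """Partition the input symbols touched by received degree-2 encoding
    symbols into decoding chains.

    degree2_symbols : list of (a, b) neighbor pairs, 0 <= a, b < k.
    Returns a list of chains, each a sorted list of input-symbol indices.
    """
    parent = list(range(k))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for a, b in degree2_symbols:            # each degree-2 symbol merges
        parent[find(a)] = find(b)           # its two neighbors' chains

    chains = {}
    touched = {i for pair in degree2_symbols for i in pair}
    for i in touched:                       # inputs unseen by any degree-2
        chains.setdefault(find(i), []).append(i)   # symbol form no chain
    return [sorted(c) for c in chains.values()]
```

On the slide's example, the pairs (i4,i7), (i4,i5), (i5,i6), (i6,i7) from c4–c7 collapse into the single chain {i4, i5, i6, i7}, while c1 links i1 and i3.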

  12. Analysis of early decoding termination : Average chain length of decoding termination
  • The number of input symbols connected by received degree-2 encoding symbols
  • The probability that an input symbol belongs to a length-L decoding chain
  • The average chain length
  • The averages of these quantities, over the two ranges 0 ≤ n ≤ k/2 and k/2+1 ≤ n ≤ k, weighted by the probability of the decoding termination with n input symbols decoded

  13. Analysis of early decoding termination : Average chain length of decoding termination
  • The average chain length in 0 ≤ n ≤ k/2 is much shorter than in k/2+1 ≤ n ≤ k.
  • We therefore want to increase the average chain length in 0 ≤ n ≤ k/2.

  14. Proposed encoding scheme
  • [Figure: the k input symbols are split into two groups of sizes k−τ and τ]
  • We separate the input symbols into two groups.
  • The two neighbors of any degree-2 encoding symbol are restricted to come from different groups of input symbols.
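The grouping restriction only changes how degree-2 neighbors are drawn; all other degrees are encoded as in plain LT. A sketch of the neighbor selection under this scheme (group boundaries are my assumed convention: group A = indices 0..k−τ−1, group B = indices k−τ..k−1):

```python
import random

def pick_neighbors(k, tau, degree, rng=random):
    """Neighbor selection under the proposed scheme.

    For a degree-2 encoding symbol, one neighbor is drawn from each of the
    two groups, so no degree-2 symbol has both neighbors in one group.
    Other degrees choose uniformly among all k inputs, as in plain LT.
    """
    if degree == 2:
        a = rng.randrange(0, k - tau)        # one neighbor from group A
        b = rng.randrange(k - tau, k)        # one neighbor from group B
        return [a, b]
    return rng.sample(range(k), degree)
```

Because every degree-2 symbol must touch the small group of size τ, degree-2 symbols are far more likely to share a neighbor, which is what stitches the inputs into longer decoding chains.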

  15. Proposed encoding scheme
  • Increasing the probability that two degree-2 encoding symbols share one of their neighbors
  • Extending the average chain length
  • When τ = 1, assume g degree-2 encoding symbols are received: each makes one connection to the single isolated input symbol and one connection to the remaining k − 1 input symbols.
  • [Figure: a degree-2 encoding symbol XORs input symbol k with one of the other k − 1 input symbols]

  16. Proposed encoding scheme
  • The probability that one of these k − 1 input symbols is connected by the received degree-2 encoding symbols
  • On average, the connected input symbols form a single decoding chain.
  • Each of the k input symbols has this probability of being in the decoding chain, which gives the average decoding chain length and the average number of redundant symbols.

  17. Proposed encoding scheme
  • For example, with k = 1000, ε = 0.1, Ω2 = 0.5 and τ = 1, on average:
    • The chain length is 179.4
    • The number of redundant symbols is 125.5
  • Although the average chain length is extended, the number of received encoding symbols remaining after removing the redundant ones is 1100 − 125.5 = 974.5, insufficient to decode all input symbols.
  • As τ grows, the number of redundant symbols is reduced.
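The τ = 1 trade-off can be explored empirically. The sketch below is a Monte-Carlo model under my own assumptions, not the paper's closed-form derivation: it assumes g = Ω2(1+ε)k received degree-2 symbols, each linking the single isolated input to a uniformly chosen one of the other k − 1 inputs, and counts duplicates (symbols repeating an already-seen neighbor) as redundant.

```python
import random

def tau1_stats(k=1000, eps=0.1, omega2=0.5, trials=20, rng=None):
    """Monte-Carlo sketch of the tau = 1 case (assumed model).

    Each of g = omega2*(1+eps)*k degree-2 symbols connects the lone
    group-B input to a uniform group-A input. Returns the average number
    of distinct inputs in the resulting single chain and the average
    number of redundant degree-2 symbols (repeated neighbor choices).
    """
    rng = rng or random.Random()
    g = int(omega2 * (1 + eps) * k)
    chain_total = redundant_total = 0
    for _ in range(trials):
        picks = [rng.randrange(k - 1) for _ in range(g)]
        distinct = len(set(picks))
        chain_total += distinct + 1          # plus the isolated input itself
        redundant_total += g - distinct      # duplicates add no new link
    return chain_total / trials, redundant_total / trials
```

Under this model, every simulated degree-2 symbol either grows the chain or is redundant, so the two averages always sum to g + 1; larger τ spreads the mandatory group-B neighbor over more inputs and so produces fewer duplicates, matching the slide's closing remark.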

  18. Simulation results
  • Robust Soliton distribution with parameters c = 0.02, δ = 0.01, overhead ε = 0.2
  • The number τ is optimized for the symbol loss probability:
    • τ = 40 for k = 250
    • τ = 100 for k = 500
    • τ = 210 for k = 1000
    • τ = 420 for k = 2000
  • Encoding symbols are transmitted without erasure.

  19. Simulation results

  20. Simulation results

  21. Simulation results
