
OPTIMIZATION of GENERALIZED LT CODES for PROGRESSIVE IMAGE TRANSFER


Presentation Transcript


  1. OPTIMIZATION of GENERALIZED LT CODES for PROGRESSIVE IMAGE TRANSFER Suayb S. Arslan, Pamela C. Cosman and Laurence B. Milstein Department of Electrical and Computer Engineering University of California, San Diego

  2. Outline • Transmission problem and motivation • Background: fountain (rateless) codes; encoding and decoding; previous unequal error protection (UEP) rateless codes • Proposed algorithms: generalized UEP LT codes • Selection of distributions • Progressive transmission system • Simulation: performance comparisons • References

  3. Transmission problem: • Point-to-point transmission: transmit information from one sender to one receiver, where the erasure channel between them has a time-varying and unknown erasure probability. OBJECTIVE: a transmission rate close to the capacity of the channel. • Multicast transmission: transmit information from one sender to multiple receivers, where the channel between the sender and each receiver is an erasure channel with unknown erasure probability. OBJECTIVE: a transmission rate close to capacity on all the transmission channels simultaneously.

  4. Motivation • Erasure channels: in many communication scenarios, data files sent over the Internet are chopped into fixed- or variable-size packets, and each packet is either received without error or corrupted and therefore considered erased during transmission. • Solution 1: use forward error correction. This may lead to inefficient use of network resources when channel information is missing. • Solution 2: receivers acknowledge each received packet and senders retransmit the lost packets. This results in low efficiency: capacity is wasted on feedback messages and retransmissions, and the waste is exacerbated in a multicast scenario. (Figure: binary erasure channel.)

  5. A solution: the “Digital Fountain” idea • A paradigm for data transmission that needs almost no feedback messages. • Which packets are received or lost is of no importance; it only matters whether enough are received.

  6. Luby Transform (LT) codes (Luby ’98) • The first known fountain code design, and the basis for other fountain codes such as Raptor codes and Online codes. • Send k information symbols; receiving any n = (1 + ε)k coded symbols is enough to recover all k information symbols, where ε is the overhead. • Asymptotically capacity achieving: a BEC has erasure probability δ and capacity C = 1 − δ. If N symbols are transmitted, the expected number of reliably received symbols is (1 − δ)N, so to collect (1 + ε)k symbols we need N = (1 + ε)k / (1 − δ) on average, and the rate of the transmission is R = k/N = (1 − δ)/(1 + ε). As k gets larger, ε goes to 0 and R → 1 − δ = C; thus LT codes achieve capacity asymptotically. • Low encoding/decoding complexity. [1] M. Luby, “LT Codes,” Proc. 43rd Annual IEEE Symposium on Foundations of Computer Science, pp. 271–280, 2002.
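As a quick numeric check of the rate expression on this slide, a minimal sketch (names are illustrative; it only assumes the receiver needs (1 + ε)k unerased symbols, so on average N = (1 + ε)k / (1 − δ) symbols must be sent):

```python
# Minimal sketch: transmission rate of an LT code over a BEC, assuming the
# receiver needs (1 + eps) * k unerased coded symbols to decode.
def lt_rate(delta: float, eps: float) -> float:
    """Rate k / N when N = (1 + eps) * k / (1 - delta) symbols are sent."""
    return (1.0 - delta) / (1.0 + eps)

delta = 0.1                     # channel erasure probability (example value)
capacity = 1.0 - delta          # BEC capacity
for eps in (0.20, 0.05, 0.01):  # the overhead shrinks as k grows
    print(f"eps={eps:.2f}  rate={lt_rate(delta, eps):.3f}  capacity={capacity:.2f}")
```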

  7–12. Encoding (example stepped through over several slides) • To generate a coded symbol: first draw a degree d from the degree distribution (DD); then select d information symbols of x according to the selection distribution (SD); finally, XOR the selected information symbols to produce the coded symbol. • This process is repeated every time a new coded symbol is desired. • Example (figure): information symbols, coded symbols, and the received/unerased coded symbols.

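A minimal Python sketch of this encoding rule, assuming one-byte symbols and the uniform selection distribution of the original LT design (function and variable names are illustrative, not from the talk):

```python
import random

def lt_encode_symbol(source, degree_dist, rng=random):
    """Generate one LT-coded symbol from a list of equal-length byte strings.

    degree_dist is a list of (degree, probability) pairs -- the degree
    distribution (DD).  The selection distribution (SD) here is uniform.
    """
    degrees, probs = zip(*degree_dist)
    d = rng.choices(degrees, weights=probs, k=1)[0]  # 1) draw a degree from the DD
    neighbors = rng.sample(range(len(source)), d)    # 2) pick d distinct info symbols (uniform SD)
    coded = bytearray(source[neighbors[0]])
    for i in neighbors[1:]:                          # 3) XOR the selected symbols together
        for j, b in enumerate(source[i]):
            coded[j] ^= b
    return set(neighbors), bytes(coded)

# Example: 8 one-byte information symbols and a toy degree distribution.
src = [bytes([v]) for v in range(8)]
dd = [(1, 0.1), (2, 0.5), (3, 0.4)]
print(lt_encode_symbol(src, dd))
```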

  13. Decoding • Coded symbols are sent over a binary erasure channel; the decoder uses the belief propagation (BP) algorithm. (Figure: decoding example with n = 5.)
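Over the erasure channel, BP reduces to the peeling decoder; here is a minimal sketch consistent with the encoder sketch above (illustrative code, not the authors' implementation):

```python
def lt_decode(k, received, max_iters=None):
    """Peeling (BP) decoder over the BEC.

    received: list of (neighbor_index_set, coded_bytes) pairs that were not erased.
    Returns the recovered information symbols (None where undecoded).
    """
    info = [None] * k
    checks = [(set(nbrs), bytearray(val)) for nbrs, val in received]
    iters = 0
    while max_iters is None or iters < max_iters:
        ripple = [c for c in checks if len(c[0]) == 1]      # degree-1 check nodes
        if not ripple:
            break                                           # ripple empty: BP terminates
        for nbrs, val in ripple:
            (idx,) = nbrs
            if info[idx] is None:
                info[idx] = bytes(val)                      # decode one information symbol
        for nbrs, val in checks:                            # peel decoded symbols off every check
            for idx in [i for i in nbrs if info[i] is not None]:
                for j, b in enumerate(info[idx]):
                    val[j] ^= b
                nbrs.discard(idx)
        checks = [c for c in checks if c[0]]
        iters += 1
    return info
```

Feeding enough unerased (neighbor set, coded bytes) pairs from the encoder sketch into lt_decode recovers all k symbols with high probability; limiting max_iters mimics stopping BP after a fixed number of iterations.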

  14. How to choose the SD and DD? • The ripple is defined as the set of degree-1 check nodes at each iteration of BP. • Thus, for BP not to terminate prematurely, the ripple must contain at least one element at every iteration. • Luby proposed the Soliton distribution, which achieves an expected ripple size of 1 at each iteration of BP (poor in practice), and the Robust Soliton distribution (good in practice: expected ripple size > 1). • In the original LT coding, the SD is assumed to be the uniform distribution. • OBJECTIVE of the original design: given the uniform SD, find the DD that minimizes the number of unerased coded symbols needed to decode the whole information block with negligible failure probability. • The original design does not provide unequal error protection (UEP); in multimedia communications, the objective is not necessarily that of the original design.
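For concreteness, a sketch of the Robust Soliton distribution referred to here, using Luby's standard definition (ideal Soliton ρ, correction term τ with parameter S = c·ln(k/δ)·√k, normalization β); in this sketch the spike at d ≈ k/S is simply skipped if it falls beyond k:

```python
from math import log, sqrt

def robust_soliton(k: int, c: float = 0.01, delta: float = 0.01):
    """Robust Soliton degree distribution mu(1..k), following Luby's definition."""
    # Ideal Soliton component: rho(1) = 1/k, rho(d) = 1/(d(d-1)) for d >= 2
    rho = [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    # Correction term tau(d), with S the expected-ripple-size parameter
    S = c * log(k / delta) * sqrt(k)
    pivot = int(round(k / S))
    tau = [0.0] * (k + 1)
    for d in range(1, min(pivot, k + 1)):
        tau[d] = S / (d * k)
    if pivot <= k:
        tau[pivot] = S * log(S / delta) / k
    beta = sum(rho) + sum(tau)                  # normalization constant
    return [(rho[d] + tau[d]) / beta for d in range(1, k + 1)]

mu = robust_soliton(k=100)
print(round(sum(mu), 6))              # 1.0
print([round(p, 4) for p in mu[:4]])  # probabilities of degrees 1..4
```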

  15. Previous UEP rateless codes • There are two major studies in the literature. • (1) Weighted approach: a modification of the SD (a skewed SD). Example: r = 2 chunks.
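A toy sketch of the skewed selection for r = 2 (the parameter names alpha, the fraction of symbols in the important chunk, and w, its total selection weight, are illustrative and not taken from the talk):

```python
import random

def weighted_select(k, alpha, w, d, rng=random):
    """Pick d distinct information symbols with a skewed (weighted) SD.

    The first alpha*k symbols (the more important chunk) carry total selection
    probability w; the remaining symbols share probability 1 - w.
    """
    k1 = int(alpha * k)
    weights = [w / k1] * k1 + [(1 - w) / (k - k1)] * (k - k1)
    chosen = set()
    while len(chosen) < d:                      # sample without replacement
        chosen.add(rng.choices(range(k), weights=weights, k=1)[0])
    return sorted(chosen)

print(weighted_select(k=20, alpha=0.2, w=0.6, d=3))
```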

  16. Previous UEP rateless codes • (2) Expanding Window Fountain (EWF) codes: each window has a window-specific DD, i.e., r different DDs can be used (more flexible than the weighted approach). Example: r = 2 windows.

  17. An observation • Decoding stage 1: a degree-1 check node decodes an information symbol. • Decoding stage 2: some of the degree-2 check nodes decode two information symbols. • Decoding stage 3: a degree-3 check node decodes an information symbol. • Conclusion: low-degree coded symbols decode information symbols earlier (in the early iterations) of BP. • This can be used for prioritized decoding.

  18. Proposed algorithms: Generalized Unequal Error Protection LT codes. Idea: degree-dependent selection distributions • Degree-dependent selection, weighted approach: after selecting the degree d for a coded symbol, select the edge connections to the information chunks based on that degree. Since the chunk-selection probabilities must sum to 1 for each degree d ∈ {1, …, k}, there are (r − 1)k parameters subject to optimization. • Degree-dependent selection, EWF approach: after selecting the degree d for a coded symbol, select the windows based on that degree.
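An illustrative sketch of the degree-dependent selection idea for r = 2 chunks (a simplification for illustration only: p1_of_d(d) stands for the probability that an edge of a degree-d coded symbol lands in the first chunk; the actual distributions are what the following slides optimize):

```python
import random

def degree_dependent_select(k, alpha, p1_of_d, d, rng=random):
    """Degree-dependent selection for r = 2 information chunks (sketch).

    p1_of_d(d): probability that an edge of a degree-d coded symbol connects
    to the first (more important) chunk of size alpha*k.
    """
    k1 = int(alpha * k)
    chosen = set()
    while len(chosen) < d:
        if rng.random() < p1_of_d(d):
            chosen.add(rng.randrange(0, k1))     # edge into the important chunk
        else:
            chosen.add(rng.randrange(k1, k))     # edge into the remaining symbols
    return sorted(chosen)

# Low-degree coded symbols favour the important chunk, exploiting the
# observation (slide 17) that they are resolved in early BP iterations.
p1 = lambda d: 0.9 if d <= 2 else 0.3            # hypothetical choice
print(degree_dependent_select(k=20, alpha=0.2, p1_of_d=p1, d=2))
```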

  19. Selection of distributions • Instead of designing all (r − 1)k parameters, we introduce a functional dependence on the degree to reduce the parameter count. • The number of parameters is reduced to 3(r − 1).
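One hypothetical functional dependence with three parameters per chunk boundary, matching the 3(r − 1) count above (the exact form used in the talk is not restated here; an exponential shape is assumed only because the scheme is later called GLTexp):

```python
from math import exp

def p1_of_d(d, a, b, c):
    """Hypothetical 3-parameter dependence of the chunk-1 selection probability
    on the coded-symbol degree d, clipped to [0, 1] (illustrative assumption)."""
    return min(1.0, max(0.0, a + b * exp(-c * d)))

print([round(p1_of_d(d, a=0.2, b=0.7, c=0.5), 3) for d in range(1, 6)])
```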

  20. Selection of distributions. (Figure: the distributions, labeled “Luby”, “Luby”, and “Proposed”.)

  21. Progressive transmission system • The reason for using such a demultiplexing methodology is that the proposed coding scheme is most powerful when the source bits within each segment have unequal importance. • With demultiplexing, for example, the information bits in the first block (the most important information block) are shared equally among the segments.

  22. Optimization of the rateless code • Let the BP algorithm iterate M times. The optimization problem is to choose the degree and selection distributions of the proposed LT code that minimize the expected distortion; i.e., we optimize both distributions for a minimum-distortion criterion. • The maximum value of M is fixed in this study. • The minimization can be done at any iteration index m ≤ M; this way, we can present performance as a function of the iteration index. This property may be particularly important for portable devices that are constrained to low-complexity receiver structures. We call this property the Unequal Iteration Time (UIT) property.
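A skeleton of the distortion-driven search this slide describes, kept deliberately generic (the grid, the parameter names, and the toy distortion function are placeholders; a real run would wrap the encoder/decoder sketches and SPIHT reconstruction in a Monte-Carlo estimate):

```python
from itertools import product

def optimize_code(distortion_fn, m, w_grid, k1_grid):
    """Grid search for the design parameters minimizing expected distortion
    after m BP iterations.  distortion_fn(w, k1, m) would average the image
    distortion over many simulated transmissions."""
    return min(product(w_grid, k1_grid), key=lambda p: distortion_fn(*p, m))

# Toy stand-in distortion, only to make the sketch executable.
toy = lambda w, k1, m: (w - 0.6) ** 2 + (k1 - 100) ** 2 / 1e4 + 1.0 / m
print(optimize_code(toy, m=5, w_grid=[0.4, 0.5, 0.6, 0.7], k1_grid=[50, 100, 150]))
```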

  23–25. Packetization methodology: fixed packet size. (Figure, stepped over three slides: k information symbols arranged into packets; packets are either received or lost.) Therefore, each LT codeword experiences the same erasure pattern.

  26–27. Alternative methodology: variable packet size. (Figure, stepped over two slides: k information symbols arranged into variable-size packets; packets are either received or lost.)

  28–29. Alternative methodology: overhead allocation. (Figure, stepped over two slides: k information symbols.)

  30. Numerical Results: Comparisons with the “weighted approach” • Simulation set-up/parameters: take B = 50000 source bits (512 × 512 Lena image coded with SPIHT); chop them into k equal segments, each containing B/k bits; treat the segments as information symbols and encode them with the proposed codes to produce coded symbols. • We first compare the weighted approach with the proposed UEP GLT: Robust Soliton distribution with c = 0.01, δ = 0.01; r = 2, with a fraction of the k information symbols treated as the first information chunk and the rest as the second. • Both systems are optimized over their design parameters: the weighted approach has only one parameter, which we optimize; the proposed UEP GLT (GLTexp) has three parameters, one of which is fixed for simplicity, so we optimize the remaining two.
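A tiny sketch of the segmentation described here (the value of k is chosen only for illustration; the bitstream is a placeholder rather than an actual SPIHT encoding):

```python
B, k = 50000, 1000                      # B source bits, k segments (k illustrative)
bitstream = "0" * B                     # placeholder for the SPIHT bitstream of Lena
seg_len = B // k
segments = [bitstream[i * seg_len:(i + 1) * seg_len] for i in range(k)]
print(len(segments), len(segments[0]))  # k segments of B/k bits each
```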

  31. Numerical Results: Comparisons with “weighted approach”

  32. Numerical Results: Comparisons with “weighted approach”. (Figure: illustrates the URT property.)

  33. Numerical Results: Comparisons with “weighted approach”. (Figure: illustrates the UEP and URT properties.)

  34. Numerical Results: Comparisons with “weighted approach”

  35. Numerical Results: Comparisons with “weighted approach”

  36. Numerical Results: Comparisons with “weighted approach”. (Figure: illustrates the UIT property.)

  37. Numerical Results: Comparisons with EWF codes • We compare EWF codes with the proposed UEP GLT; number of chunks/windows: r = 2. Both systems are optimized over their design parameters. • EWF codes: two Robust Soliton distributions (RSDs) with c = 0.01, δ = 0.01; since the window sizes are different, the two RSDs are different. Two parameters to optimize: the window selection probability and the window (chunk) size parameter; their optimal values serve as the EWF baseline. • UEP GLT: uses the compound degree distribution obtained as a convex combination of the two RSDs used for the EWF codes above. GLTexp: the most constrained variant, reusing both optimal EWF parameters. GLTexpOpt: reuses only one of them and optimizes the rest. GLTexpFullOpt: no constraints; five parameters to optimize.

  38. Numerical Results: Comparisons with EWF codes

  39. Numerical Results: Comparisons with EWF codes. (Figure: illustrates the UEP and URT properties.)

  40. Numerical Results: Optimal parameters

  41. References: [1] M. Luby, “LT Codes,” Proc. 43rd Annual IEEE Symposium on Foundations of Computer Science, pp. 271–280, 2002. [2] A. Shokrollahi, “Raptor Codes,” IEEE Trans. Inf. Theory, vol. 52, no. 6, pp. 2410–2423, Jun. 2006. [3] N. Rahnavard and F. Fekri, “Finite-Length Unequal Error Protection Rateless Codes: Design and Analysis,” Proc. IEEE Globecom, 2005. [4] N. Rahnavard, B. N. Vellambi, and F. Fekri, “Rateless Codes with Unequal Error Protection Property,” IEEE Trans. Inf. Theory, vol. 53, no. 4, pp. 1521–1532, Apr. 2007. [5] D. Sejdinovic, D. Vukobratovic, A. Doufexi, V. Senk, and R. Piechocki, “Expanding Window Fountain Codes for Unequal Error Protection,” Proc. 41st Asilomar Conf. on Signals, Systems and Computers, Pacific Grove, pp. 1020–1024, 2007. [6] D. Vukobratovic, V. Stankovic, D. Sejdinovic, L. Stankovic, and Z. Xiong, “Scalable Video Multicast Using Expanding Window Fountain Codes,” IEEE Trans. Multimedia, vol. 11, no. 6, pp. 1094–1104, Oct. 2009. [7] M. Luby, M. Mitzenmacher, and A. Shokrollahi, “Analysis of Random Processes via And-Or Tree Evaluation,” Proc. 9th Ann. ACM-SIAM Symp. on Discrete Algorithms, pp. 364–373, 1998.

  42. Questions?
