
  1. Iterative Joint Source-Channel Soft-Decision Sequential Decoding Algorithms for Parallel Concatenated Variable Length Code and Convolutional Code. Reporter: 林煜星; Advisor: Prof. Y.M. Huang

  2. Outline
  • Introduction
  • Related Research
  • Transmission Model for BCJR
  • Simulation for BCJR Algorithm
  • Proposed Methodology
  • Transmission Model for Sequential Decoding
  • Simulation for Soft-Decision Sequential Algorithm
  • Conclusion

  3. Introduction
  [Block diagram: Discrete source → Source Encoder (data compression) → Channel Encoder (error correction) → Modulator → Channel → Demodulator → Channel Decoder → Source Decoder → User; a Joint Decoder combines the channel and source decoding stages]

  4. Related Research
  • [1] L. Guivarch, J.-C. Carlach and P. Siohan
  • [2] M. Jeanne, J.-C. Carlach, P. Siohan and L. Guivarch
  • [3] M. Jeanne, J.-C. Carlach and P. Siohan

  5. Transmission Model for BCJR
  [System diagram: Independent Source or first-order Markov Source → Huffman Coding (P symbols → K bits) → Turbo Coding (parallel concatenation) → Additive White Gaussian Noise Channel → Turbo decoding using the SUBMAP (with a priori information) → Huffman Decoding (K bits → P symbols)]

  6. Transmission Model for BCJR-Independent Source or first-order Markov Source
  [Same system diagram as slide 5, highlighting the source block]

  7. Transmission Model for BCJR-Independent Source or first-order Markov Source (1)

  8. Transmission Model for BCJR-Independent Source or first-order Markov Source (2)

  9. Example: Transmission Model for BCJR-Independent Source or first-order Markov Source (3)
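The equations on slides 7-9 were lost in the transcript. As a minimal sketch of the source a priori terms such a model typically feeds to the decoder (standard notation, an assumption rather than the slides' exact formulas):

```latex
% A priori LLR for an independent source:
L_a(u_k) = \ln\frac{P(u_k = 1)}{P(u_k = 0)}

% A priori LLR for a first-order Markov source, conditioned on the
% previous bit:
L_a(u_k \mid u_{k-1}) = \ln\frac{P(u_k = 1 \mid u_{k-1})}{P(u_k = 0 \mid u_{k-1})}
```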

  10. Transmission Model for BCJR-Huffman Coding
  [Same system diagram as slide 5, highlighting the Huffman Coding block (P symbols → K bits)]

  11. Transmission Model for BCJR-Huffman Coding
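To make the Huffman step (P symbols in, K bits out) concrete, here is a minimal Python sketch; the alphabet and probabilities are illustrative assumptions, not values from the slides.

```python
# Minimal Huffman coding sketch: builds a prefix code from symbol
# probabilities and encodes a symbol sequence into a bit string.
import heapq

def huffman_code(probs):
    """Return a code table {symbol: bitstring} for {symbol: probability}."""
    # Heap entries: (probability, unique tiebreaker, partial code table).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c0.items()}
        merged.update({s: "1" + c for s, c in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

table = huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
bits = "".join(table[s] for s in "abacad")   # P symbols -> K bits
print(table, bits)
```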

  12. Transmission Model for BCJR-Turbo Coding parallel concatenation
  [Same system diagram as slide 5, highlighting the Turbo Coding (parallel concatenation) block]

  13. Transmission Model for BCJR-Turbo Coding parallel concatenation (1): Non-Systematic Convolutional (NSC) code
  [Encoder diagram: input u, d = (11101), output v]
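A runnable sketch of a rate-1/2 non-systematic convolutional encoder applied to d = (11101); the generator polynomials (7, 5) in octal and the constraint length are assumptions, since the slide's encoder diagram was not recoverable.

```python
# Rate-1/2 non-systematic convolutional (NSC) encoder sketch.
def nsc_encode(bits, g=(0b111, 0b101)):
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b111                   # 3-bit register
        out += [bin(state & gi).count("1") % 2 for gi in g]  # parity per generator
    return out

d = [1, 1, 1, 0, 1]
print(nsc_encode(d))   # two code bits v per input bit u
```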

  14. Transmission Model for BCJR-Turbo Coding parallel concatenation (2): d = (11101)
  [Trellis diagram: nodes (time, state) from (0,0) to (5,3) with branch output labels 00, 01, 10, 11]

  15. Transmission Model for BCJR-Turbo Coding parallel concatenation (3): Recursive Systematic Convolutional (RSC) code, Rate = 1/2
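For comparison with the NSC sketch above, a rate-1/2 RSC encoder with the common (1, 5/7) octal polynomials; the polynomials are again an assumption.

```python
# Recursive systematic convolutional (RSC) encoder sketch, rate 1/2.
def rsc_encode(bits):
    s1 = s2 = 0                  # two memory cells of the shift register
    out = []
    for u in bits:
        a = u ^ s1 ^ s2          # feedback taps 1 + D + D^2 (octal 7)
        out += [u, a ^ s2]       # systematic bit, parity via 1 + D^2 (octal 5)
        s1, s2 = a, s1
    return out

print(rsc_encode([1, 1, 1, 0, 1]))  # [u0, p0, u1, p1, ...]
```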

  16. Transmission Model for BCJR-Turbo Coding parallel concatenation (4): Rate = 1/4

  17. Transmission Model for BCJR-Turbo Coding parallel concatenation (5): Interleaver
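In the usual notation (assumed here, since the figure was lost), the interleaver is simply a permutation π of the K information bits, and the second constituent encoder operates on the permuted sequence:

```latex
d'_k = d_{\pi(k)}, \qquad k = 1, \dots, K
```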

  18. Transmission Model for BCJR-Turbo Coding parallel concatenation (6): Turbo Code rate = 1/3
  [Diagram: built from the Rate = 1/4 parallel concatenation]

  19. Transmission Model for BCJR-Turbo Coding parallel concatenation (7): Turbo Code rate = 1/2
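A sketch of one common way the rate-1/2 turbo code is obtained: the systematic bits are sent once while the two parity streams are alternately punctured. The interleaver, the RSC polynomials, and the puncturing pattern are illustrative assumptions, not the slides' exact choices.

```python
# Turbo encoding sketch: parallel concatenation of two RSC encoders
# with alternate puncturing of the parity streams (overall rate 1/2).
import random

def rsc_parity(bits):
    """Parity stream of a rate-1/2 RSC encoder with polynomials (1, 5/7)."""
    s1 = s2 = 0
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2          # feedback 1 + D + D^2
        parity.append(a ^ s2)    # feedforward 1 + D^2
        s1, s2 = a, s1
    return parity

def turbo_encode(bits, perm):
    par1 = rsc_parity(bits)                      # first encoder, natural order
    par2 = rsc_parity([bits[i] for i in perm])   # second encoder, interleaved
    # Puncture: keep par1 at even indices, par2 at odd indices.
    parity = [par1[k] if k % 2 == 0 else par2[k] for k in range(len(bits))]
    return list(bits) + parity                   # systematic bits sent once

d = [1, 1, 1, 0, 1, 0, 0, 1]
perm = list(range(len(d)))
random.shuffle(perm)             # toy pseudo-random interleaver
print(turbo_encode(d, perm))
```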

  20. Transmission Model for BCJR-AWGN
  [Same system diagram as slide 5, highlighting the Additive White Gaussian Noise Channel block]

  21. Transmission Model for BCJR-AWGN (1)
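The equations of this slide were lost; as a sketch of the standard BPSK-over-AWGN model such slides use, with x_k = ±1 and noise variance σ², the channel LLR takes the familiar form:

```latex
r_k = x_k + n_k, \qquad n_k \sim \mathcal{N}(0, \sigma^2), \qquad
L_c(r_k) = \ln\frac{p(r_k \mid x_k = +1)}{p(r_k \mid x_k = -1)} = \frac{2\,r_k}{\sigma^2}
```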

  22. Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP
  [Same system diagram as slide 5, highlighting the Turbo decoding (SUBMAP) block]

  23. Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP
  [Diagram: iterative decoder with two constituent decoders, BCJR1 and BCJR2, exchanging a priori information]

  24. Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP (1): MAP Decoder definitions

  25. Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP (2): Log-Likelihood Ratio (LLR)
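The defining equations of slides 24-25 were not recoverable; a sketch of the standard BCJR/MAP quantities they almost certainly contained, in conventional notation:

```latex
% Forward and backward recursions over trellis states s', s:
\alpha_k(s) = \sum_{s'} \alpha_{k-1}(s')\,\gamma_k(s', s), \qquad
\beta_{k-1}(s') = \sum_{s} \beta_k(s)\,\gamma_k(s', s)

% Log-likelihood ratio of information bit u_k:
L(u_k) = \ln\frac{\sum_{(s',s):\,u_k = 1} \alpha_{k-1}(s')\,\gamma_k(s',s)\,\beta_k(s)}
                 {\sum_{(s',s):\,u_k = 0} \alpha_{k-1}(s')\,\gamma_k(s',s)\,\beta_k(s)}
```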

  26. Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP (3)
  [Trellis excerpt: states (4,1), (4,3), (5,1), (5,2), (5,3) with branch labels 00, 01, 10, 11]

  27. Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP (4)
  [Trellis excerpt: states (0,0), (2,3), (3,3), (5,0)]

  28. Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP (5)
  [Diagram: BCJR1 and BCJR2 exchanging a priori information, as on slide 23]

  29. Transmission Model for BCJR-Turbo decoding Utilization of the SUBMAP (6)
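The exchange between BCJR1 and BCJR2 sketched on slides 23-29 follows the usual turbo-decoding convention (standard decomposition, assumed here rather than taken from the slides): each output LLR splits into a channel term, an a priori term, and an extrinsic term,

```latex
L(u_k) = L_c\, r_k^{(s)} + L_a(u_k) + L_e(u_k)
```

and the extrinsic term L_e produced by one decoder is interleaved (or deinterleaved) and used as the a priori term L_a of the other decoder on the next half-iteration.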

  30. Simulation for BCJR Algorithm
  • Transmission ends when either the maximum bit-error count, fixed at 1000, or the maximum of 10,000,000 transmitted bits is reached.
  • Input data is divided into blocks of 4096 bits.

  31. Simulation for BCJR Algorithm (1)
  • 1NP: 1 iteration, independent source, no a priori probability used
  • 1NP: 1 iteration, independent source, a priori probability used

  32. Simulation for BCJR Algorithm (2)
  • 1NP: 1 iteration, Markov source, no a priori probability used
  • 1MP: 1 iteration, Markov source, Markov a priori probability used

  33. Simulation for BCJR Algorithm (3)
  • 12D: 1 iteration, independent source, a priori probability used; bit time (level) × convolutional state
  • 13D: 1 iteration, independent source, a priori probability used; bit time (level) × tree state × convolutional state

  34. Proposed Methodology
  • [4] C. Lamy and L. Perros-Meilhac

  35. Transmission Model for Sequential-Sequential Decoding
  [Diagram: the iterative decoder with the Sequential decoder and BCJR2 exchanging a priori information]

  36. Transmission Model for Sequential-Sequential Decoding (1)
  Branch metric indicator: 1 if the candidate codeword bit differs from the hard-decision bit; 0 otherwise.

  37. Transmission Model for Sequential-Sequential Decoding (2)
  Example: r = (-1, 3, 2, 1, -2, -1, -3, -1, 1, 2), hard decisions y = (1, 0, 0, 0, 1, 1, 1, 1, 0, 0); the search starts from the origin node (0,0).
  [Tree-search figure: Open and Close lists of (time, state) nodes, branch labels y = (00), (00), (10), (11), (11) with weights |r| = (12), (21), (13), (31), (21), and accumulated path costs]
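A runnable sketch of the stack search traced in this example: the path cost accumulates |r_i| at each position where the candidate code bits disagree with the hard decisions y_i, and the cheapest node on the Open list is extended first. The rate-1/2 (7, 5) NSC encoder below is an assumption, since the slide's actual code was not recoverable.

```python
# Stack (sequential) decoding sketch with the |r|-weighted
# disagreement metric from the slide's example.
import heapq

def branch(state, b, g=(0b111, 0b101)):
    """Extend a trellis state by input bit b; return (next_state, code_bits)."""
    s = ((state << 1) | b) & 0b111
    return s & 0b11, [bin(s & gi).count("1") % 2 for gi in g]

def stack_decode(r, n_info):
    y = [1 if ri < 0 else 0 for ri in r]   # hard decisions from the signs of r
    open_list = [(0.0, 0, 0, [])]          # (cost, depth, state, info bits)
    while open_list:
        cost, depth, state, path = heapq.heappop(open_list)  # cheapest node
        if depth == n_info:
            return path, cost              # first full-length path wins
        for b in (0, 1):
            ns, code = branch(state, b)
            i0 = 2 * depth                 # two code bits per info bit
            add = sum(abs(r[i0 + j]) for j in range(2) if code[j] != y[i0 + j])
            heapq.heappush(open_list, (cost + add, depth + 1, ns, path + [b]))

r = [-1, 3, 2, 1, -2, -1, -3, -1, 1, 2]    # received values from the slide
print(stack_decode(r, 5))                  # decoded info bits and final cost
```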

  38. Transmission Model for Sequential-Sequential Decoding (3)

  39. Simulation for Sequential Algorithm
  • 2D1: 1 iteration, independent source, a priori probability used; bit time (level) × convolutional state
  • 3D1: 1 iteration, independent source, a priori probability used; bit time (level) × convolutional state × tree state

  40. Conclusion
  • Using a heuristic method to obtain the sequential decoder's soft-output values in the iterative decoding architecture reduces errors and saves computation time, but the decoding performance cannot match that of the turbo decoder. Future work will investigate better ways of computing the sequential decoder's soft-output values so that the decoding performance comes closer to the turbo decoder's.
