
Iterative Joint Source-Channel Soft-Decision Sequential Decoding Algorithms for Parallel Concatenated Variable Length Code and Convolutional Code

Reporter: 林煜星

Advisor: Prof. Y.M. Huang

Outline
  • Introduction
  • Related Research
    • Transmission Model for BCJR
    • Simulation for BCJR Algorithm
  • Proposed Methodology
    • Transmission Model for Sequential
    • Simulation for Soft-Decision Sequential Algorithm
  • Conclusion
Introduction

[Block diagram of the transmission system: a discrete source feeds a source encoder (data compression) and a channel encoder (error-correcting code), followed by a modulator and the channel; at the receiver, a demodulator, channel decoder, and source decoder deliver the data to the user. The channel decoder and source decoder can be combined into a joint decoder.]
Related Research
  • [1] L. Guivarch, J.C. Carlach and P. Siohan
  • [2] M. Jeanne, J.C. Carlach, P. Siohan and L. Guivarch
  • [3] M. Jeanne, J.C. Carlach and P. Siohan
Transmission Model for BCJR

[Block diagram: P source symbols (from an independent source or a first-order Markov source) are Huffman coded into K bits, turbo coded by parallel concatenation, and transmitted over an additive white Gaussian noise channel; turbo decoding with utilization of the SUBMAP algorithm, aided by a priori source information, recovers the K bits, and Huffman decoding returns the P symbols.]
Transmission Model for BCJR - Independent Source or First-Order Markov Source

[Same block diagram, with the source stage highlighted: the P input symbols come either from an independent (memoryless) source or from a first-order Markov source, whose statistics provide the a priori information used by the decoder.]
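For a first-order Markov source, the a priori information exploited by the joint decoder is essentially the symbol transition probabilities. A minimal sketch with an assumed 3-symbol alphabet and transition matrix (purely illustrative, not the source used in the presentation):

```python
# Minimal sketch: a priori statistics of a first-order Markov symbol source.
# The 3-symbol alphabet and transition matrix are illustrative assumptions.
import random

SYMBOLS = ["a", "b", "c"]
TRANS = {"a": [0.6, 0.3, 0.1],   # P(next symbol | current = "a"), ordered as SYMBOLS
         "b": [0.2, 0.5, 0.3],
         "c": [0.3, 0.3, 0.4]}

def generate(n):
    """Draw n symbols from the Markov chain (first symbol chosen uniformly)."""
    seq = [random.choice(SYMBOLS)]
    while len(seq) < n:
        seq.append(random.choices(SYMBOLS, weights=TRANS[seq[-1]])[0])
    return seq

def estimate_transitions(seq):
    """Estimate P(next | current): the a priori knowledge a joint decoder can use."""
    counts = {s: {t: 0 for t in SYMBOLS} for s in SYMBOLS}
    for cur, nxt in zip(seq, seq[1:]):
        counts[cur][nxt] += 1
    probs = {}
    for s in SYMBOLS:
        total = sum(counts[s].values()) or 1
        probs[s] = {t: round(counts[s][t] / total, 3) for t in SYMBOLS}
    return probs

if __name__ == "__main__":
    print(estimate_transitions(generate(20000)))
```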
Transmission Model for BCJR - Huffman Coding

[Same block diagram, with the Huffman coding stage highlighted: the P source symbols are mapped to variable-length codewords, giving a total of K bits.]
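A minimal Huffman code construction for the variable-length source coding stage; the symbol probabilities are illustrative assumptions, not the source statistics of the presentation:

```python
# Minimal sketch of Huffman coding for the variable-length source code stage.
import heapq

def huffman_code(probs):
    """Return a prefix-free code {symbol: bitstring} from symbol probabilities."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)      # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, tie, merged))
        tie += 1
    return heap[0][2]

if __name__ == "__main__":
    code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
    print(code)                               # e.g. {'a': '0', 'b': '10', ...}
    print("".join(code[s] for s in "abad"))   # P symbols -> K bits
```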
Transmission Model for BCJR - Turbo Coding (Parallel Concatenation)

[Same block diagram, with the turbo coding stage highlighted: the K Huffman-coded bits are protected by a turbo code built as a parallel concatenation of convolutional encoders.]
Transmission Model for BCJR - Turbo Coding Parallel Concatenation (1)

[Encoder diagram: a non-systematic convolutional code maps the input sequence d = (11101) to the two output sequences u and v.]
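A minimal sketch of a rate-1/2 non-systematic convolutional encoder producing the two streams u and v from d; the generators (octal 7 and 5, constraint length 3) are an assumption for illustration, since the slide's exact polynomials are not recoverable from the transcript:

```python
# Minimal sketch of a rate-1/2 non-systematic convolutional encoder.
# Generators below (octal 7 and 5) are assumed for illustration only.

def conv_encode(d, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Encode bit sequence d; return the two output streams (u, v)."""
    m = len(g1) - 1                 # number of memory elements
    state = [0] * m                 # shift-register contents, most recent first
    u, v = [], []
    for bit in d + [0] * m:         # zero tail bits terminate the trellis
        window = [bit] + state      # current input followed by past inputs
        u.append(sum(b * g for b, g in zip(window, g1)) % 2)
        v.append(sum(b * g for b, g in zip(window, g2)) % 2)
        state = [bit] + state[:-1]  # shift the register
    return u, v

if __name__ == "__main__":
    d = [1, 1, 1, 0, 1]             # the information sequence from the slide
    u, v = conv_encode(d)
    print("u =", u)
    print("v =", v)
```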

Transmission Model for BCJR - Turbo Coding Parallel Concatenation (2)

[Trellis diagram for the encoder: nodes are labelled (time, state) pairs from (0,0) up to (5,3), branches are labelled with their coded output pairs 00, 01, 10, 11, and the path produced by the input d = (11101) is traced through the trellis.]

Transmission Model for BCJR - Turbo Coding Parallel Concatenation (3)

[Encoder diagram: a recursive systematic convolutional (RSC) code of rate 1/2; each input bit is output unchanged together with one parity bit produced by a feedback shift register.]
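In generator-matrix form, a rate-1/2 RSC encoder has a systematic branch and one rational parity branch; the polynomials below are a common textbook choice, not necessarily those of the presentation:

```latex
G(D) = \left[\,1,\ \frac{g_2(D)}{g_1(D)}\,\right]
     = \left[\,1,\ \frac{1 + D^2}{1 + D + D^2}\,\right],
\qquad R = \frac{1}{2}.
```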

Transmission Model for BCJR - Turbo Coding Parallel Concatenation (7)

[Encoder diagram: the complete turbo encoder, with overall code rate 1/2.]
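Two parallel rate-1/2 RSC encoders that both keep the systematic bit would by themselves give an overall rate of 1/3; reaching the rate 1/2 stated on the slide is normally done by sending the systematic bits once and puncturing the two parity streams alternately (a standard construction, assumed here):

```latex
R_{\text{no puncturing}} = \frac{k}{k + k + k} = \frac{1}{3},
\qquad
R_{\text{punctured}} = \frac{k}{\,k + \tfrac{k}{2} + \tfrac{k}{2}\,} = \frac{1}{2}.
```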

Transmission Model for BCJR - AWGN

[Same block diagram, with the channel stage highlighted: the coded bits are transmitted over an additive white Gaussian noise (AWGN) channel.]
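Assuming BPSK mapping of the coded bits (the modulation is not spelled out in the transcript), the AWGN channel produces, for each coded bit c_k:

```latex
r_k = (2c_k - 1) + n_k,\qquad n_k \sim \mathcal{N}(0,\sigma^2),\qquad
\sigma^2 = \frac{N_0}{2} = \frac{1}{2R\,E_b/N_0},
```

where R is the overall code rate.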
Transmission Model for BCJR - Turbo Decoding, Utilization of the SUBMAP

[Same block diagram, with the decoding stage highlighted: the turbo decoder applies the SUBMAP (MAP/BCJR) algorithm and uses the a priori probabilities of the source.]
Transmission Model for BCJR - Turbo Decoding, Utilization of the SUBMAP (2)

Recall: the Logarithm of the Likelihood Ratio (LLR).
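The slide's own formulas are not reproduced in the transcript; for reference, the standard log-likelihood ratio computed by the MAP/BCJR algorithm, with alpha, beta and gamma denoting the forward, backward and branch metrics, is:

```latex
L(d_k) \;=\; \ln\frac{P(d_k = 1 \mid \mathbf{r})}{P(d_k = 0 \mid \mathbf{r})}
       \;=\; \ln\frac{\sum_{(s',s):\,d_k=1}\alpha_{k-1}(s')\,\gamma_k(s',s)\,\beta_k(s)}
                     {\sum_{(s',s):\,d_k=0}\alpha_{k-1}(s')\,\gamma_k(s',s)\,\beta_k(s)},
\qquad
\hat{d}_k \;=\; \begin{cases}1, & L(d_k)\ge 0\\ 0, & L(d_k)<0.\end{cases}
```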

Transmission Model for BCJR - Turbo Decoding, Utilization of the SUBMAP (3)

[Trellis section: nodes such as (4,1), (4,3), (5,1), (5,2), (5,3) with branch labels 00, 01, 10, 11 and input bits 0/1, illustrating the branch metrics used in the SUBMAP computation.]
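For context, a minimal sketch of the forward-backward (MAP/BCJR) recursion on which the SUBMAP algorithm is based, written for the rate-1/2 convolutional code assumed above; the noise variance, the uniform a priori probability, and the absence of trellis termination are simplifying assumptions:

```python
# Minimal sketch of the forward-backward (MAP/BCJR) recursion for a rate-1/2
# convolutional code. sigma2, the uniform a priori p1 and the missing trellis
# termination are simplifying assumptions.
import math

G1, G2 = (1, 1, 1), (1, 0, 1)                      # assumed generators (octal 7, 5)
STATES = [(a, b) for a in (0, 1) for b in (0, 1)]  # 4 shift-register states

def step(bit, state):
    """Next state and coded output pair for input `bit` taken in `state`."""
    w = (bit,) + state
    out = (sum(x * g for x, g in zip(w, G1)) % 2,
           sum(x * g for x, g in zip(w, G2)) % 2)
    return (bit, state[0]), out

def bcjr_llrs(r, sigma2=1.0, p1=0.5):
    n = len(r) // 2
    # Branch metrics gamma[k][(state, bit)] proportional to P(bit) * p(r_k | coded pair).
    gamma = []
    for k in range(n):
        g = {}
        for s in STATES:
            for b in (0, 1):
                _, (c1, c2) = step(b, s)
                corr = (r[2*k] * (2*c1 - 1) + r[2*k+1] * (2*c2 - 1)) / sigma2
                g[(s, b)] = (p1 if b else 1 - p1) * math.exp(corr)
        gamma.append(g)
    # Forward recursion (alpha), normalised at every step.
    alpha = [{s: (1.0 if s == (0, 0) else 0.0) for s in STATES}]
    for k in range(n):
        a = {s: 0.0 for s in STATES}
        for s in STATES:
            for b in (0, 1):
                ns, _ = step(b, s)
                a[ns] += alpha[k][s] * gamma[k][(s, b)]
        norm = sum(a.values()) or 1.0
        alpha.append({s: v / norm for s, v in a.items()})
    # Backward recursion (beta); beta_n(s) = 1 since no termination is assumed.
    beta = [{s: 1.0 for s in STATES} for _ in range(n + 1)]
    for k in range(n - 1, -1, -1):
        b_k = {s: 0.0 for s in STATES}
        for s in STATES:
            for b in (0, 1):
                ns, _ = step(b, s)
                b_k[s] += gamma[k][(s, b)] * beta[k + 1][ns]
        norm = sum(b_k.values()) or 1.0
        beta[k] = {s: v / norm for s, v in b_k.items()}
    # A posteriori log-likelihood ratios.
    llrs = []
    for k in range(n):
        num = den = 1e-300                          # avoid log(0)
        for s in STATES:
            for b in (0, 1):
                ns, _ = step(b, s)
                p = alpha[k][s] * gamma[k][(s, b)] * beta[k + 1][ns]
                if b:
                    num += p
                else:
                    den += p
        llrs.append(math.log(num / den))
    return llrs

if __name__ == "__main__":
    r = [-1, 3, 2, 1, -2, -1, -3, -1, 1, 2]         # soft values, 2 per input bit
    print([round(L, 2) for L in bcjr_llrs(r)])
```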
Simulation for BCJR Algorithm
  • Transmission ends when either the maximum number of bit errors, fixed at 1000, or the maximum number of transmitted bits, 10,000,000, is reached.
  • The input data is split into blocks of 4096 bits.
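A minimal sketch of this stopping rule; for brevity the transmission chain is replaced by uncoded BPSK over AWGN, so only the loop structure (block size, error and bit limits) mirrors the simulation set-up described above:

```python
# Sketch of the stopping rule: stop after 1000 bit errors or 10,000,000 transmitted
# bits, whichever comes first. The "system" here is plain BPSK over AWGN (no coding);
# it only illustrates the loop structure used in the simulations.
import random, math

def simulate_ber(ebn0_db, block_size=4096, max_errors=1000, max_bits=10_000_000):
    sigma = math.sqrt(1.0 / (2.0 * 10 ** (ebn0_db / 10.0)))  # noise std for Eb/N0
    errors = bits = 0
    while errors < max_errors and bits < max_bits:
        for _ in range(block_size):
            tx = random.randint(0, 1)
            r = (2 * tx - 1) + random.gauss(0.0, sigma)       # BPSK symbol + noise
            rx = 1 if r >= 0 else 0
            errors += (rx != tx)
            bits += 1
    return errors / bits

if __name__ == "__main__":
    for ebn0 in (0, 2, 4):
        print(ebn0, "dB ->", simulate_ber(ebn0))
```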
Simulation for BCJR Algorithm (1)

[Simulation results. Legend:
  • 1NP: 1 iteration, independent source, a priori probability not used
  • 1NP: 1 iteration, independent source, a priori probability used]

Simulation for BCJR Algorithm (2)

[Simulation results. Legend:
  • 1NP: 1 iteration, Markov source, a priori probability not used
  • 1MP: 1 iteration, Markov source, Markov a priori probability used]

Simulation for BCJR Algorithm (3)

[Simulation results. Legend:
  • 12D: 1 iteration, independent source, a priori probability used; decoder state defined by bit time (level) and convolutional state
  • 13D: 1 iteration, independent source, a priori probability used; decoder state defined by bit time (level), tree state, and convolutional state]

Proposed Methodology
  • [4] Catherine Lamy, Lisa Perros-Meilhac
Transmission Model for Sequential - Sequential Decoding (2)

Example: received sequence r = (-1, 3, 2, 1, -2, -1, -3, -1, 1, 2), hard-decision sequence y = (1,0,0,0,1,1,1,1,0,0).

[Search-tree diagram: starting from the origin node (0,0), nodes labelled (time, state) are expanded branch by branch; each branch is labelled with its coded output pair (00, 01, 10, 11), the corresponding hard decisions y and reliabilities |r| of the received pair, and the accumulated path metric. Nodes still to be expanded are kept on the Open list, already-expanded nodes on the Close list.]
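A sequential decoder explores this tree with a best-first search instead of visiting every trellis branch. Below is a minimal stack-algorithm sketch for a rate-1/2 convolutional code; the generators and the simple correlation metric are illustrative assumptions (the presentation uses its own code and a Fano-type metric), and the Open list is realised as a priority queue:

```python
# Minimal sketch of a stack (sequential) decoder for a rate-1/2 convolutional code.
# The generators and the correlation metric are illustrative assumptions.
import heapq

G1, G2 = (1, 1, 1), (1, 0, 1)          # assumed generators (octal 7, 5)

def branch_bits(bit, state):
    """Coded output pair (c1, c2) for input `bit` leaving shift-register `state`."""
    w = (bit,) + state
    return (sum(a * b for a, b in zip(w, G1)) % 2,
            sum(a * b for a, b in zip(w, G2)) % 2)

def stack_decode(r, n_bits):
    """Decode n_bits information bits from soft values r (2 per bit)."""
    # Stack entries: (-metric, depth, state, decoded bits so far). heapq is a
    # min-heap, so the negative metric pops the best path first (the Open list).
    stack = [(0.0, 0, (0, 0), ())]
    while stack:
        neg_m, depth, state, bits = heapq.heappop(stack)
        if depth == n_bits:
            return list(bits)           # best path has reached full length
        for b in (0, 1):
            c1, c2 = branch_bits(b, state)
            r1, r2 = r[2 * depth], r[2 * depth + 1]
            # Correlation metric: reward agreement between r and the BPSK symbols.
            m = -neg_m + r1 * (2 * c1 - 1) + r2 * (2 * c2 - 1)
            heapq.heappush(stack, (-m, depth + 1, (b, state[0]), bits + (b,)))
    return []

if __name__ == "__main__":
    r = [-1, 3, 2, 1, -2, -1, -3, -1, 1, 2]   # received values from the slide
    print("decoded bits:", stack_decode(r, n_bits=len(r) // 2))
```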

Simulation for Sequential Algorithm

[Simulation results. Legend:
  • 2D1: 1 iteration, independent source, a priori probability used; decoder state defined by bit time (level) and convolutional state
  • 3D1: 1 iteration, independent source, a priori probability used; decoder state defined by bit time (level), convolutional state, and tree state]

Conclusion
  • A heuristic method is used to obtain soft-output values from the sequential decoder and feed them into the iterative decoding architecture. Although this lowers the error rate and saves computation time, the decoding performance cannot approach that of the turbo decoder. Future work will investigate better ways of computing the sequential decoder soft-output values so that its performance comes closer to that of the turbo decoder.