Department of Electrical Engineering
École Polytechnique de Montréal
David Haccoun, Eng., Ph.D.
Professor of Electrical Engineering
Life Fellow of IEEE
Fellow, Engineering Institute of Canada
Engineering training in Canada
36 schools/faculties
(Figure: map of Canada showing the distribution of the schools by region, with city labels including Vancouver, Montréal and Toronto.)
Undergraduate students
Canada: 55,000
Québec: 14,600
The oldest engineering school in Canada.
The third largest in Canada for teaching and research.
The first in Québec for the student body size.
Operating budget $85 million Canadian Dollars (C$).
Annual research budget $60.5 million C$.
Annual grants and research contracts $38 million C$.
15 Industrial Research Chairs.
24 Canada Research Chairs.
7863 scientific publications over the last decade.
220 professors, and 1,100 employees.
1,000 graduates per year, and 30,000 since 1873.
11 engineering programs
Polytechnique
Novel Iterative Decoding Using Convolutional Doubly Orthogonal Codes
A simple approach to capacity
David Haccoun
Éric Roy, Christian Cardinal
Modern Error Control Coding Techniques
Based on Difference Families
Notation:
A – set of connection positions (exponents of the terms D, D², …, D^m of the encoder connection polynomial)
J – number of connection positions
m – memory length, equal to the coding span (the largest element of A)
One-Dimensional NCDO Codes
(Figure: systematic rate-1/2 encoder — the information sequence feeds a shift register of length m (stages 0, 1, 2, …, m−1, m); the J taps at the connection positions of A are summed modulo 2 to form the parity sequence, and both sequences are transmitted over the AWGN channel.)
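The encoder above can be sketched in a few lines of Python (a minimal illustration, not the authors' implementation; the connection set A = {0, 1, 4, 6} is a hypothetical J = 4 example whose simple differences happen to be all distinct):

```python
def csoc_encode(u, A):
    """Systematic rate-1/2 CSOC encoder sketch: alongside each
    information bit u[t], emit the parity bit formed by summing
    modulo 2 the register taps at the connection positions in A
    (the register starts in the all-zero state)."""
    return [sum(u[t - a] for a in A if t - a >= 0) % 2
            for t in range(len(u))]

# Hypothetical J = 4 connection set with distinct simple differences.
A = [0, 1, 4, 6]          # memory length m = max(A) = 6
u = [1, 0, 1, 1, 0, 0, 1, 0]
print(csoc_encode(u, A))  # -> [1, 1, 1, 0, 0, 0, 1, 0]
```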
Differences are distinct.
Example of a Convolutional Self-Orthogonal Code (CSOC): R = 1/2, J = 4, m = 15.
Example of CSOC, J = 4: distinct simple differences.
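The self-orthogonality condition on the connection set can be checked mechanically (a standalone sketch; the two test sets below are hypothetical examples, not codes from the slides):

```python
from itertools import permutations

def is_self_orthogonal(A):
    """CSOC condition sketch: every positive simple difference
    a_j - a_k (j != k) of the connection set must be distinct."""
    d = [a - b for a, b in permutations(A, 2) if a > b]
    return len(d) == len(set(d))

print(is_self_orthogonal([0, 1, 4, 6]))  # True: 1, 4, 3, 6, 5, 2 all distinct
print(is_self_orthogonal([0, 1, 2, 4]))  # False: differences 1 and 2 repeat
```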
Threshold (TH) Decoding of CSOC
(Figure: threshold decoder — a syndrome former built from delay elements feeds a threshold comparator; the annotated simple differences (2−1) = 1, (3−0) = 3, (3−1) = 2 identify the orthogonal checks; the decoder produces soft outputs in LLR form and the decoded bits ûi.)
The combining operator is either the tanh/tanh⁻¹ (sum-product) or the add-min (min-sum) operator; its operands are the LLR values representing the received symbols.
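The two combining operators can be written directly from their definitions (a standalone sketch of the pairwise operation only; the decoder itself is not reproduced here):

```python
import math

def box_plus(l1, l2):
    """Exact sum-product combination of two LLRs:
    2 * atanh(tanh(l1/2) * tanh(l2/2))."""
    return 2.0 * math.atanh(math.tanh(l1 / 2.0) * math.tanh(l2 / 2.0))

def add_min(l1, l2):
    """Min-sum (add-min) approximation: the sign is the product of
    the two signs, the magnitude is the smaller of the two magnitudes."""
    sign = math.copysign(1.0, l1) * math.copysign(1.0, l2)
    return sign * min(abs(l1), abs(l2))

print(add_min(2.0, -3.0))                 # -> -2.0
print(round(box_plus(2.0, -3.0), 3))      # -> -1.693
```

The min-sum value always over-estimates the magnitude of the exact sum-product result, which is why practical decoders sometimes scale it down.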
Novel Iterative Error Control Coding Schemes
Issues: search and determination of new CSO2Cs — an extension of the (unsolved) Golomb rulers problem.
Example of CSO2C, J = 4: differences of differences.
(Table: the differences of differences (αj − αk) − (αl − αi) of the connection set, computed from its simple differences. The signs of the entries were lost in extraction; with signs restored, the entries are distinct, each magnitude appearing once with each sign.)
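As an illustration of the double-orthogonality check, the sketch below enumerates differences of differences of a connection set and counts repeated values. It is an approximation only: the formal CSO2C definition restricts which index tuples are compared and discounts the unavoidable repetitions, which this sketch does not; the set [0, 1, 4, 6] is a hypothetical example.

```python
from itertools import permutations
from collections import Counter

def differences_of_differences(A):
    """Multiset of d1 - d2 over ordered pairs of distinct positive
    simple differences of A; a loose stand-in for the CSO2C
    differences of differences, ignoring the 'unavoidable' cases."""
    d = [a - b for a, b in permutations(A, 2) if a > b]
    return [d1 - d2
            for i, d1 in enumerate(d)
            for j, d2 in enumerate(d) if i != j]

def repeated_values(values):
    """Number of excess occurrences beyond the first of each value."""
    return sum(c - 1 for c in Counter(values).values())

dd = differences_of_differences([0, 1, 4, 6])
print(len(dd), repeated_values(dd))  # -> 30 20
```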
Spans of some best known CSO2C encoders
Non-Iterative Threshold Decoding for CSOCs

Approximate MAP value ℓi:
ℓi = (received information symbol, as an LLR) + (extrinsic information, combined with the add-min operator)

Decision rule: ûi = 1 if and only if ℓi ≥ 0; otherwise ûi = 0.

For a CSOC, each orthogonal check on symbol i is an equation of independent variables.
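The soft rule above refines Massey's classical hard-decision majority-logic decoder, which can be sketched as follows (a definite-decoding illustration without syndrome feedback, using a hypothetical connection set; not the soft iterative decoder of the slides):

```python
def threshold_decode(r_u, r_p, A):
    """Hard-decision majority-logic (threshold) decoding sketch for a
    systematic rate-1/2 CSOC. The J syndrome bits s[t + a], a in A,
    are orthogonal checks on information bit u[t]; flip the bit when
    a majority of them fail. No syndrome feedback (definite decoding)."""
    J, m, n = len(A), max(A), len(r_u)
    # Re-encode the received information bits and add the received parity.
    s = [(r_p[t] + sum(r_u[t - a] for a in A if t - a >= 0)) % 2
         for t in range(n)]
    u_hat = list(r_u)
    for t in range(n - m):
        if 2 * sum(s[t + a] for a in A) > J:
            u_hat[t] ^= 1
    return u_hat

# Hypothetical J = 4 set; a single channel error on u[2] is corrected.
A = [0, 1, 4, 6]
u = [1, 0, 1, 1, 0, 0, 1, 0] + [0] * 6     # padded to flush the register
p = [sum(u[t - a] for a in A if t - a >= 0) % 2 for t in range(len(u))]
r_u = list(u); r_u[2] ^= 1                 # one transmission error
print(threshold_decode(r_u, p, A)[:8])     # -> [1, 0, 1, 1, 0, 0, 1, 0]
```

Because the simple differences of A are distinct, the single error triggers all J checks on bit 2 but at most one check on any other bit, so only bit 2 is flipped.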
Iterative Threshold Decoding for CSO2Cs

Estimation of the extrinsic information at a given iteration:
– the feedforward term for future symbols depends on the simple differences;
– the feedback term for past symbols depends on the simple differences and on the differences of differences.
(Equations: general expression and iterative expressions of the estimate.)
Iterative Threshold Decoder Structure for CSO2Cs
(Figure: forward-only iterative decoder — a cascade of threshold decoders, one per iteration i = 1, 2, …, M, each preceded by a delay of m bits on the information and parity-check symbols from the channel; soft outputs are passed from stage to stage, and a hard decision at the last iteration yields the decoded information symbols.)
Block Diagram of the Iterative Threshold Decoder (CSO2Cs)
(Figure: M cascaded decoding stages between input and output, each with a latency of m bits; total latency M·m bits.)
Iterative Belief Propagation (BP) Decoder for CSO2Cs
(Figure: the threshold (TH) decoder and the BP decoder compared — each is a cascade of stages DEC 1, DEC 2, …, DEC M with a latency of m bits per stage.)

M(BP) ≈ ½ M(TH)
BP latency ≈ ½ TH latency
One-step BP complexity ≈ J × one-step TH complexity
Error Performance Behaviors of CSO2Cs
J = 9, A = {0, 9, 21, 395, 584, 767, 871, 899, 912}
(Figure: BER curves for TH decoding (8 iterations) and BP decoding (4 and 8 iterations), each exhibiting a waterfall region followed by an error-floor region.)
Both BP and TH decoding approach the asymptotic error performance in the error-floor region.
Analysis Results of CSO2Cs
Definition of SCSO2Cs

Normalized simplification factor: the ratio of the number of repeated differences of differences to the maximal number of distinct differences of differences, both counts excluding the unavoidable repetitions. A factor of 0 yields a CSO2C.
Comparison of Spans of CSO2Cs and SCSO2Cs
Performance Comparison for a J = 10 SCSO2C
(Figure: BER curves versus uncoded BPSK, with the coding gain and the asymptotic coding gain indicated.)
Performance Comparison for J = 8 Codes (BP Decoding)
CSO2C: A = {0, 43, 139, 322, 422, 430, 441, 459}
SCSO2C: A = {0, 9, 22, 55, 95, 124, 127, 129}
Performance Comparison CSO2Cs / SCSO2Cs (TH Decoding)
(Figure: BER versus latency at Eb/N0 = 3.5 dB, 8th iteration — the SCSO2C curve reaches the same BER at a far smaller latency than the CSO2C, about 3,000 against about 14,000 on the latency axis.)
Analysis of Orthogonality Properties (span)
(Diagram: orthogonality properties of the set A —
Convolutional Self-Orthogonal Codes (CSOC): simple orthogonality, small span;
→ extension →
Convolutional Self-Doubly-Orthogonal Codes (CSO2C): double orthogonality, large span;
→ relaxed conditions →
Simplified CSO2Cs (SCSO2C): relaxed double orthogonality, substantial span reduction.)
Analysis of Orthogonality Properties (computational tree)
(Figure: computational tree rooted at the decoded symbol, expanded over iterations 1 and 2 down to leaves with no descendant nodes — independence versus short cycles; the root LLR is used for the final hard decision.)
Analysis of Orthogonality Properties (cycles)

Codes    Conditions on the associated sets                            Cycles on graphs
CSOC     Distinct differences                                         No 4-cycles
CSO2C    Distinct differences, distinct from the differences          Minimization of the number of 6-cycles
         of differences, uniformly distributed
         Distinct differences of differences, uniformly distributed   Minimization of the number of 8-cycles
SCSO2C   A number of repetitions of the differences of differences,   A number of additional 8-cycles
         approximately uniformly distributed
Summary of Single-Register CSO2Cs
Relaxing the doubly orthogonal conditions of CSO2Cs adds some 8-cycles, leading to codes (SCSO2Cs) with substantially reduced coding spans at comparable Eb/N0 values.
In order to improve the error performance of the iterative decoding algorithm, the degree of the parity symbols must be increased.

Extension: Recursive Convolutional Doubly-Orthogonal Codes (RCDO)
Solution: use recursive convolutional encoders.
(Figure: RCDO encoder with three shift registers (1st, 2nd, 3rd), showing the forward and feedback connections.)
The parity-check matrix HT(D) completely defines an RCDO code.
The memory m of the RCDO encoder is the length of its largest shift register.
Each row of HT(D) represents one output symbol of the encoder.
Each column of HT(D) represents one constraint equation.
The protograph representation of an RCDO code is defined by HT(D).
The degree distributions of the nodes in the protograph govern the convergence behavior of the decoding algorithm.
Regular RCDO (dv, dc): dv = degree of the variable nodes (rows of HT(D)), dc = degree of the constraint nodes (columns), i.e. the numbers of nonzero elements per row and per column.
Irregular RCDO: defined by an irregular protograph.
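A (dv, dc)-regular protograph can be verified by counting the nonzero entries of HT(D) per row and per column. The sketch below uses a hypothetical 6 × 3 mask (6 output symbols, 3 constraints, i.e. rate 1/2) rather than the 30 × 15 matrix of the actual rate-15/30 code:

```python
def protograph_degrees(H_mask):
    """Row and column degrees of the protograph defined by the nonzero
    polynomial entries of HT(D) (1 = nonzero entry). For a regular
    RCDO (dv, dc), every row (output symbol) has degree dv and every
    column (constraint equation) has degree dc."""
    row_deg = [sum(row) for row in H_mask]
    col_deg = [sum(col) for col in zip(*H_mask)]
    return row_deg, col_deg

# Hypothetical mask: 6 output symbols x 3 constraints, all entries nonzero.
H_mask = [[1, 1, 1]] * 6
rows, cols = protograph_degrees(H_mask)
print(rows, cols)  # -> [3, 3, 3, 3, 3, 3] [6, 6, 6], i.e. (3, 6)-regular
```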
(Figure: BER curves at the 50th iteration — an RCDO (3,6) code against an LDPC code of block length n = 1008 and the iterative 'decoder limit'; a 1.10 dB gap is annotated; performance improves with an increasing number of shift registers.)

Characteristics:
Coding rate 15/30
15 registers
m = 149
Regular HT(D) (3,6)
40th iteration

Close-to-optimal convergence behavior of the iterative decoder: after 40 iterations, a 0.4 dB gap and a low error floor.
At Pb = 10⁻⁵:
CSO2C — good error performance at moderate SNR.
RCDO — good error performance at low SNR.
Block length N, number of iterations M.
Merci
THANK YOU