Department of Electrical Engineering
École Polytechnique de Montréal
David Haccoun, Eng., Ph.D.
Professor of Electrical Engineering
Life Fellow of IEEE
Fellow, Engineering Institute of Canada
Engineering training in Canada
[Map: 36 engineering schools/faculties across Canada, from Vancouver to Toronto and Montréal.]
Undergraduate students: Canada 55,000; Québec 14,600.
The oldest engineering school in Canada.
The third-largest in Canada for teaching and research.
The largest in Québec by student-body size.
Operating budget: C$85 million.
Annual research budget: C$60.5 million.
Annual grants and research contracts: C$38 million.
15 Industrial Research Chairs.
24 Canada Research Chairs.
7,863 scientific publications over the last decade.
220 professors and 1,100 employees.
1,000 graduates per year, and 30,000 since 1873.
Polytechnique
A simple approach to capacity
David Haccoun
Éric Roy, Christian Cardinal
Based on Difference Families
Notation:
A : set of connection positions
J : number of connection positions
m : memory length
D : coding span
One-Dimensional NCDO Codes
[Encoder diagram: the information sequence feeds a shift register of length m (stages 0, 1, 2, ..., m-1, m); the J taps at the connection positions, from position 0 to position m, are modulo-2 added to form the parity sequence, and both sequences are sent over the AWGN channel.]
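As a sketch of the encoder above in Python (the connection set A = {0, 1, 3} is the small example used later in these slides; the function name is hypothetical):

```python
def encode_csoc(u, A=(0, 1, 3)):
    """Rate-1/2 systematic CSOC encoder sketch: at each time t the parity
    bit is the modulo-2 sum of the information bits at the connection
    positions, p_t = XOR of u_{t-a} for a in A (bits before time 0 are 0)."""
    p = []
    for t in range(len(u)):
        bit = 0
        for a in A:
            if t - a >= 0:
                bit ^= u[t - a]
        p.append(bit)
    return p
```

The transmitted stream interleaves the information and parity sequences; only the parity sequence is produced here.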
Differences are distinct.
Example of a Convolutional Self-Orthogonal Code (CSOC): R = 1/2, J = 4, m = 15.
Distinct Simple Differences
Example of a One-Step Threshold Decoder: J = 3, A = {0, 1, 3}, dmin = 4, soft outputs in LLR.
[Worked diagram: received LLRs are combined through the check-node operator to produce the decoded bits ûi.]
⊞ = tanh/tanh⁻¹ (sum-product) or add-min (min-sum) operator.
The inputs are LLR values representing the received symbols.
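The two check-node operators mentioned above can be sketched as follows (minimal illustrations; function names are hypothetical):

```python
import math

def box_plus_exact(l1, l2):
    # Sum-product (tanh) rule: L1 [+] L2 = 2 * atanh(tanh(L1/2) * tanh(L2/2))
    return 2.0 * math.atanh(math.tanh(l1 / 2.0) * math.tanh(l2 / 2.0))

def box_plus_minsum(l1, l2):
    # Add-min (min-sum) approximation: product of signs times smaller magnitude
    sign = (1.0 if l1 >= 0 else -1.0) * (1.0 if l2 >= 0 else -1.0)
    return sign * min(abs(l1), abs(l2))
```

For example, combining LLRs 2 and 1 gives 1 under min-sum; the exact tanh rule gives a slightly smaller magnitude, which is why min-sum is the cheap upper-bounding approximation.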
Issues: search and determination of new CSO2Cs.
An extension of the Golomb ruler problem (unsolved).
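As a minimal illustration of the underlying search condition (the CSOC requirement that all simple differences of A be distinct, i.e. that A be a Golomb-ruler-like set; the function name is hypothetical):

```python
from itertools import combinations

def has_distinct_differences(A):
    """CSOC condition: every simple difference a_j - a_i (j > i) occurs once."""
    diffs = [b - a for a, b in combinations(sorted(A), 2)]
    return len(diffs) == len(set(diffs))
```

{0, 1, 3} passes (it is a Golomb ruler), while {0, 1, 2} fails because the difference 1 repeats. The CSO2C search additionally constrains the differences of differences, which is what makes it an open extension of the Golomb ruler problem.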
Differences of Differences for the example code A = {0, 3, 13, 15}, where (i, j, k, l) denotes (αi - αj) + (αk - αl):
(1,3,1,3) = (-12) + (-12) = -24
(2,0,1,0) = (13) + (3) = 16
(2,0,2,0) = (13) + (13) = 26
(2,1,0,1) = (10) + (-3) = 7
(2,1,2,0) = (10) + (13) = 23
(2,1,2,1) = (10) + (10) = 20
(2,3,0,1) = (-2) + (-3) = -5
(2,3,0,3) = (-2) + (-15) = -17
(2,3,1,0) = (-2) + (3) = 1
(2,3,1,3) = (-2) + (-12) = -14
(2,3,2,0) = (-2) + (13) = 11
(2,3,2,1) = (-2) + (10) = 8
(2,3,2,3) = (-2) + (-2) = -4
(3,0,1,0) = (15) + (3) = 18
(0,1,0,1) = (-3) + (-3) = -6
(0,2,0,1) = (-13) + (-3) = -16
(0,2,0,2) = (-13) + (-13) = -26
(0,3,0,1) = (-15) + (-3) = -18
(0,3,0,2) = (-15) + (-13) = -28
(0,3,0,3) = (-15) + (-15) = -30
(1,0,1,0) = (3) + (3) = 6
(1,2,0,2) = (-10) + (-13) = -23
(1,2,1,0) = (-10) + (3) = -7
(1,2,1,2) = (-10) + (-10) = -20
(1,3,0,2) = (-12) + (-13) = -25
(1,3,0,3) = (-12) + (-15) = -27
(1,3,1,0) = (-12) + (3) = -9
(1,3,1,2) = (-12) + (-10) = -22
(3,0,2,0) = (15) + (13) = 28
(3,0,3,0) = (15) + (15) = 30
(3,1,0,1) = (12) + (-3) = 9
(3,1,2,0) = (12) + (13) = 25
(3,1,2,1) = (12) + (10) = 22
(3,1,3,0) = (12) + (15) = 27
(3,1,3,1) = (12) + (12) = 24
(3,2,0,1) = (2) + (-3) = -1
(3,2,0,2) = (2) + (-13) = -11
(3,2,1,0) = (2) + (3) = 5
(3,2,1,2) = (2) + (-10) = -8
(3,2,3,0) = (2) + (15) = 17
(3,2,3,1) = (2) + (12) = 14
(3,2,3,2) = (2) + (2) = 4
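The tabulated values can be reproduced as sums of two simple differences; a minimal sketch (the indexing convention is inferred from the tabulated magnitudes, and the function name is hypothetical):

```python
def diff_of_diffs(A, i, j, k, l):
    """Difference-of-differences value for connection positions A:
    (a_i - a_j) + (a_k - a_l)."""
    return (A[i] - A[j]) + (A[k] - A[l])

A = (0, 3, 13, 15)  # connection positions of the J = 4 example code
```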
Approximate MAP value l_i:
l_i = (LLR of the received information symbol) + (extrinsic information),
where ⊞ denotes the add-min operator.
Decision rule: û_i = 1 if and only if l_i < 0; otherwise û_i = 0.
For a CSOC, each parity-check equation is an equation of independent variables.
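A minimal sketch of one-step soft threshold decoding along these lines, for the R = 1/2 code with A = {0, 1, 3} (a positive LLR is taken to favor '0'; all names are hypothetical):

```python
import math

def add_min(*llrs):
    """Add-min (min-sum) combination: product of signs times smallest magnitude."""
    sign, mag = 1.0, float("inf")
    for l in llrs:
        sign *= math.copysign(1.0, l)
        mag = min(mag, abs(l))
    return sign * mag

def threshold_decode(u_llr, p_llr, A=(0, 1, 3)):
    """One-step threshold decoding of a rate-1/2 CSOC.
    The parity at time t checks u_{t-a} for a in A; for each information
    symbol u_i, the J orthogonal checks are those at times i + a, a in A.
    l_i = channel LLR + sum of extrinsic terms; decide u_i = 1 iff l_i < 0."""
    n = len(u_llr)
    decoded = []
    for i in range(n):
        l = u_llr[i]  # intrinsic (received) LLR
        for a in A:
            t = i + a
            if t >= n:
                continue  # checks beyond the received block are unavailable
            others = [u_llr[t - b] for b in A if b != a and 0 <= t - b < n]
            l += add_min(p_llr[t], *others)  # extrinsic information
        decoded.append(0 if l >= 0 else 1)
    return decoded
```

On noiseless LLRs (+2 for a transmitted '0', -2 for a '1') this recovers the single information '1' in a short test sequence.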
Iterative Threshold Decoding for CSO2Cs
Estimation of the extrinsic information at each iteration uses feedforward terms for future symbols and feedback terms for past symbols; both depend on the simple differences and on the differences of differences.
[General expression and iterative update expressions (equations).]
Forward-Only Iterative Decoder
[Block diagram: M cascaded threshold decoders, one per iteration (iteration 1, 2, ..., M). The information and parity-check symbols from the channel enter the first decoder; each stage passes its soft outputs, delayed by m bits, to the next stage; the last iteration makes the hard decision that yields the decoded information symbols.]
Features: each stage adds a latency of m bits, for a total latency of M·m bits.
Threshold Decoder (TH) vs. BP Decoder
[Block diagrams: each decoder is a cascade of stages DEC 1, DEC 2, ..., DEC M, and each stage contributes a latency of m bits.]
M(BP) ~ ½ M(TH)
BP latency ~ ½ TH latency
One-step BP complexity ~ J × one-step TH complexity
J = 9, A = {0, 9, 21, 395, 584, 767, 871, 899, 912}
[BER curves for TH decoding (8 iterations) and BP decoding (4 and 8 iterations), each showing a waterfall region and an error-floor region.]
Both BP and TH decoding approach the asymptotic error performance in the error-floor region.
Normalized simplification factor: the number of repeated differences of differences divided by the maximal number of distinct differences of differences (both counts excluding the unavoidable repetitions).
CSO2C: A = {0, 43, 139, 322, 422, 430, 441, 459}
SCSO2C: A = {0, 9, 22, 55, 95, 124, 127, 129}
Performance Comparison of CSO2Cs vs. SCSO2Cs (TH Decoding)
[BER curves at the 8th iteration vs. latency (×10^4 bits); annotated latencies: CSO2C about 14,000 bits, SCSO2C about 3,000 bits.]
Analysis of Orthogonality Properties (span)
Orthogonal properties of the set A:
- Convolutional Self-Orthogonal Codes (CSOC): simple orthogonality.
- Extension: Convolutional Self-Doubly-Orthogonal Codes (CSO2C): double orthogonality, large span.
- Relaxed conditions: Simplified CSO2C (SCSO2C): relaxed double orthogonality, substantial span reduction.
Independence vs. short cycles
[Computation-tree diagram over iterations 1 and 2, rooted at the decoded symbol (no descendant nodes); the root LLR provides the final hard decision.]
Codes, cycles on graphs, and conditions on the associated sets:
- CSOC: no 4-cycles; distinct differences.
- CSO2C: minimization of the number of 6-cycles and of 8-cycles; uniformly distributed distinct differences and distinct differences of differences.
- SCSO2C: a number of additional 8-cycles; a number of repetitions of differences of differences, approximately uniformly distributed.
Relaxing the double-orthogonality conditions of CSO2Cs adds some 8-cycles, yielding codes (SCSO2Cs) with substantially reduced coding spans at comparable Eb/N0 values.
Extension: Recursive Convolutional Doubly-Orthogonal (RCDO) Codes
Solution: use recursive convolutional encoders.
RCDO codes
[Encoder diagram: first and second shift registers, with forward and feedback connections.]
The memory m of the RCDO encoder is defined by its largest shift register.
Each row of HT(D) represents one output symbol of the encoder.
Each column of HT(D) represents one constraint equation.
The protograph representation of an RCDO code is defined by HT(D).
The degree distributions of the protograph nodes are important for the convergence behavior of the decoding algorithm.
Regular RCDO (dv, dc): dv = variable-node degree (rows), dc = constraint-node degree (columns), i.e. the same number of nonzero elements in every row and every column of HT(D).
RCDO protograph structure; irregular RCDO protograph.
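The (dv, dc) regularity condition can be checked by representing each polynomial entry of HT(D) as a set of exponents of D; a minimal sketch with a hypothetical example matrix:

```python
def node_degrees(HT):
    """HT: list of rows, each entry a set of exponents of D (empty = zero entry).
    Variable-node degree dv = nonzero monomials per row;
    constraint-node degree dc = nonzero monomials per column."""
    dv = [sum(len(poly) for poly in row) for row in HT]
    dc = [sum(len(row[j]) for row in HT) for j in range(len(HT[0]))]
    return dv, dc

# Hypothetical rate-1/2 transposed parity-check matrix HT(D) with one
# constraint column: each variable row has 3 taps, so the column has 6.
HT = [
    [{0, 2, 5}],   # 1 + D^2 + D^5
    [{0, 1, 6}],   # 1 + D + D^6
]
```

Here node_degrees(HT) reports a regular (3, 6) structure, and the encoder memory m is the largest exponent appearing in HT(D), here 6, consistent with the largest-shift-register definition above.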
RCDO code error performance vs. number of shift registers
[BER curves at the 40th iteration: regular (3,6) HT(D), coding rate 15/30, 15 registers, m = 149; compared with an LDPC code (n = 1008) and the decoder limit; annotated gaps of 1.10 dB and, after 40 iterations, 0.4 dB; performance improves as the number of shift registers increases.]
Close to optimal convergence behavior of the iterative decoder; low error floor.
RCDO code error performance: at Pb = 10^-5, CSO2Cs achieve good error performance at moderate SNR, while RCDO codes achieve good error performance at low SNR.