TURBO CODES Michelle Stoll
A Milestone in ECCs • Based on convolutional codes: • multiple encoders used serially to create a codeword • defined as the triple (n, k, m) • n encoded bits are generated for every k data bits received, where m is the number of memory registers used
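As an illustration of the (n, k, m) triple, here is a minimal sketch (not taken from the slides) of an (n = 2, k = 1, m = 2) convolutional encoder; the generator taps (7, 5 in octal) are an illustrative assumption.

```python
# Minimal sketch of an (n=2, k=1, m=2) convolutional encoder: two output bits
# per input bit, two memory registers. Generator taps (7, 5 octal) are an
# illustrative choice, not specified by the slides.

def conv_encode(bits, g1=0b111, g2=0b101, m=2):
    state = 0                                      # contents of the m registers
    out = []
    for b in bits:
        reg = (b << m) | state                     # current bit + register bits
        out.append(bin(reg & g1).count("1") % 2)   # parity from generator g1
        out.append(bin(reg & g2).count("1") % 2)   # parity from generator g2
        state = reg >> 1                           # shift new bit into registers
    return out

print(conv_encode([0, 1, 1, 0, 1]))   # 5 data bits in, 10 coded bits out
```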
Enhancements • Added features include: • concatenated recursive systematic encoders • pseudo-random interleavers • soft input/soft output (SISO) iterative decoding
Accolades • TCs nearly achieve Shannon's channel capacity limit – the first codes to get within 0.7 dB • Do not require high transmission power to deliver a low bit error rate • Considered the most powerful class of ECCs to date
Sidebar: Shannon Limit • Defines the fundamental transmission capacity of a communication channel • Claude Shannon of Bell Labs proved mathematically that totally random sets of codewords could achieve channel capacity, theoretically permitting error-free transmission
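For reference (a standard result, not shown on the slide), the Shannon–Hartley theorem gives the capacity C of a band-limited channel with bandwidth B and signal-to-noise ratio S/N:

```latex
C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits per second}
```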
Shannon Limit, cont'd • Using random sets of codewords is not a practical solution • channel capacity can only be attained as the number of k data bits mapped to n code symbols approaches infinity • The cost of a code, in terms of the computation required to decode it, increases the closer it gets to the Shannon limit • Coding paradox: find good codewords that deliver BERs close to the Shannon limit, but are not overly complex • ECCs addressing both requirements were elusive for years • until the advent of TCs, the best codes were more than 2 dB from the Shannon limit • "All codes are good, except the ones we can think of." – folk theorem
Performance Bounds The performance floor is in the vicinity of a BER of 10⁻⁵
Turbo Code History • Claude Berrou, Alain Glavieux, and Punya Thitimajshima presented their paper "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo Codes" in 1993 • Their results were received with great skepticism • in fact, the paper was initially rejected • independent researchers later verified their simulated BER performance
Anatomy: Encoder • Two encoders, a parallel concatenation of codes • can use the same clock, decreasing delay • data blocks of length n bits are sent to each encoder • encoder 1 receives the bits as-is, encodes the parity bits y1, and concatenates them with the original data bits • encoder 2 receives the pre-shuffled bit string from the interleaver and encodes the parity bits y2 • the multiplexer receives a string of size 3n: the original data bits and parity bits from encoder 1, and the parity bits from encoder 2
Turbo Encoder Schematic Example: original data = 01101 Encoder 1 creates parity bits 10110 and appends the original 01101 Encoder 2 receives the pre-shuffled bit string and creates parity bits 11100 The multiplexer receives 011011011011100
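A minimal sketch of this parallel concatenation. The rsc_parity constituent encoder and the permutation below are illustrative stand-ins (the slides do not specify the constituent encoder), so the parity bits will not match the 10110/11100 of the schematic example above.

```python
# Sketch of a rate-1/3 turbo encoder: the multiplexer output is
# [systematic data | parity from encoder 1 | parity from encoder 2].
# The toy recursive systematic convolutional (RSC) encoder below uses
# generators (1, 5/7 octal) -- an assumption, not the slides' encoder.

def rsc_parity(bits):
    s1 = s2 = 0                      # two memory registers
    parity = []
    for d in bits:
        a = d ^ s1 ^ s2              # recursive (feedback) part: 1 + D + D^2
        parity.append(a ^ s2)        # feedforward part: 1 + D^2
        s2, s1 = s1, a               # shift the registers
    return parity

def turbo_encode(data, perm):
    y1 = rsc_parity(data)                        # encoder 1: data as-is
    y2 = rsc_parity([data[i] for i in perm])     # encoder 2: interleaved data
    return data + y1 + y2                        # 3n bits to the multiplexer

data = [0, 1, 1, 0, 1]
perm = [3, 0, 4, 1, 2]               # illustrative interleaver permutation
print(turbo_encode(data, perm))      # 15 bits for 5 data bits (rate 1/3)
```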
Non-Uniform Interleaver • An irregular permutation map is used to produce a pseudo-random interleaver – no block interleaving • Non-uniform interleaving ensures maximum scattering of the data, introducing quasi-random behavior into the code • recall Shannon's observation • Operates between the modular encoders to permute poor input sequences (those yielding low-weight codewords) into good input sequences that produce large-weight output codewords
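A minimal sketch of a seeded pseudo-random interleaver and its inverse. The plain shuffle is an illustrative assumption; as the slide notes, practical turbo interleavers use carefully designed irregular permutation maps.

```python
# Pseudo-random interleaver sketch: a seeded shuffle of bit positions plus the
# matching de-interleaver. Illustrative only.
import random

def make_interleaver(n, seed=42):
    perm = list(range(n))
    random.Random(seed).shuffle(perm)      # pseudo-random permutation of 0..n-1
    return perm

def interleave(bits, perm):
    return [bits[i] for i in perm]

def deinterleave(bits, perm):
    out = [0] * len(perm)
    for out_pos, in_pos in enumerate(perm):
        out[in_pos] = bits[out_pos]        # undo the permutation
    return out

perm = make_interleaver(8)
x = [0, 1, 1, 0, 1, 0, 0, 1]
assert deinterleave(interleave(x, perm), perm) == x
```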
Anatomy: Decoder • The decoder is the most complex aspect of turbo codes • It also imposes the greatest latency in the process, as it is serial and iterative • Two constituent decoders try to solve the same problem from different perspectives • The decoders make soft decisions about data integrity, passing extrinsic bit-reliability information back and forth • Hence the name 'turbo', in reference to a turbo engine
Decoder, cont'd • Inspects the analog signal level of the received bits • then turns the signal into integers that express confidence in what each value should actually be • Next, examines the parity bits and assigns a reliability to each bit • Bit reliabilities are expressed as log-likelihood ratios (LLRs) that vary between a negative and a positive bound • in practice this range is quite large, e.g. -127 to +127 • the closer the LLR is to either bound, the greater the confidence assigned one way or the other.
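A minimal sketch of this "analog signal to integer confidence" step, clipping to the ±127 bound mentioned above; the scale factor is an illustrative assumption (a real receiver derives it from the channel signal-to-noise ratio).

```python
# Map a received analog sample (nominally +/-1 per transmitted bit) to an
# integer reliability clipped to [-127, +127]. Scale factor is an assumption.

def quantize_reliability(sample, scale=64.0):
    value = int(round(sample * scale))
    return max(-127, min(127, value))           # clip to the stated bound

print([quantize_reliability(s) for s in (0.95, -1.3, 0.1, -0.02)])
```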
Decoder, cont'd: Log-Likelihood Ratio (LLR) • The reliability of a data bit d, given the received sequence, is expressed as: L(d) = ln [ Pr{d = 1} / (1 − Pr{d = 1}) ] • What is passed from one decoder to the other are bit reliabilities • each decoder's computation with respect to the estimation of d, without taking its own input into account • The input related to d is thus a single shared piece of information
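The LLR definition above and its inverse, as a small sketch:

```python
# The slide's LLR definition and its inverse; purely illustrative.
import math

def llr(p_d_is_1):
    """L(d) = ln( Pr{d=1} / (1 - Pr{d=1}) )."""
    return math.log(p_d_is_1 / (1.0 - p_d_is_1))

def prob_from_llr(l):
    """Inverse: Pr{d=1} = 1 / (1 + e^(-L))."""
    return 1.0 / (1.0 + math.exp(-l))

print(llr(0.9))              # positive LLR: d is probably 1
print(prob_from_llr(-2.0))   # negative LLR: d is probably 0
```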
Decoder, cont'd • Decoder modules dec1 and dec2 receive input • dec1 passes its bit-reliability estimate through the interleaver to dec2 • if dec1 is successful, it will have passed few or no errors to dec2 • Decoder module dec2 processes its input as well as the bit reliabilities from dec1 • refines the confidence estimate, then passes it to the de-interleaver • This completes the first iteration • If no further refinement is needed (i.e., acceptable confidence), the data is decoded and passed to the upper layer • Otherwise, the output is passed back to dec1 for another iteration
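A structural sketch of this dec1 → interleaver → dec2 → de-interleaver loop. The siso_decode stand-in is a toy placeholder (not a real BCJR/SOVA constituent decoder); it exists only to show how the extrinsic reliabilities circulate.

```python
# Structural sketch of turbo iterations. siso_decode is a TOY stand-in for a
# real soft-input/soft-output constituent decoder; only the data flow between
# dec1 and dec2 is meant to be illustrative.

def siso_decode(channel_llrs, prior_llrs):
    """Placeholder: blend channel and a-priori reliabilities into 'extrinsic' values."""
    return [0.5 * (c + p) for c, p in zip(channel_llrs, prior_llrs)]

def turbo_decode(sys_llrs, perm, iterations=10):
    n = len(sys_llrs)
    extrinsic = [0.0] * n                         # feedback from dec2 to dec1
    for _ in range(iterations):
        ext1 = siso_decode(sys_llrs, extrinsic)   # dec1 refines its estimate
        ext1_i = [ext1[i] for i in perm]          # interleave for dec2
        sys_i = [sys_llrs[i] for i in perm]
        ext2_i = siso_decode(sys_i, ext1_i)       # dec2 refines further
        extrinsic = [0.0] * n
        for pos, idx in enumerate(perm):          # de-interleave back to dec1
            extrinsic[idx] = ext2_i[pos]
    # hard decision: positive total LLR -> 1, negative -> 0
    return [1 if s + e > 0 else 0 for s, e in zip(sys_llrs, extrinsic)]

print(turbo_decode([2.1, -1.3, 0.4, -0.2, 1.7], [3, 0, 4, 1, 2]))
```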
Turbo Decoder Schematic
Decoding Drawbacks • To achieve near-optimum results, a relatively large number of decoding iterations is required (on the order of 10 to 20) • This increases computational complexity and output delay • one way to mitigate the delay is to use a stop rule • Select some pre-determined number of iterations to perform • if convergence is detected before that number is reached, stop
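One possible stop rule, sketched under the assumption that "convergence" means the hard decisions no longer change between consecutive iterations:

```python
# Illustrative convergence check for a stop rule: end the loop early once the
# hard decisions are identical across two consecutive iterations.

def hard_decisions(llrs):
    return [1 if l > 0 else 0 for l in llrs]

def converged(prev_llrs, curr_llrs):
    return prev_llrs is not None and hard_decisions(prev_llrs) == hard_decisions(curr_llrs)

# Inside the iteration loop:  if converged(prev, curr): break
```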
Puncturing • Another way to address latency is through code puncturing • puncturing changes the code rate, k/n, without otherwise changing the underlying encoder or decoder • certain redundant values in the codeword are simply not transmitted • e.g. a rate-1/2 code can be increased to a rate-2/3 code by dropping every other output bit from the parity stream
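A minimal sketch of the rate-1/2 → rate-2/3 example: keep every systematic bit but transmit only every other parity bit.

```python
# Puncture a rate-1/2 stream (k data bits + k parity bits) up to rate 2/3 by
# transmitting only every other parity bit; illustrative only.

def puncture(systematic, parity):
    kept_parity = parity[::2]          # drop every other parity bit
    return systematic + kept_parity

data   = [0, 1, 1, 0, 1, 0]
parity = [1, 0, 1, 1, 0, 0]
sent = puncture(data, parity)
print(len(data), "data bits in", len(sent), "sent bits")   # 6 in 9 -> rate 2/3
```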
Complexity • Because the decoder comprises two constituent decoders, it is twice as complex as a conventional decoder when performing a single iteration • two iterations require twice that computation, rendering it four times as complex as a conventional decoder
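The slide's reasoning, stated as simple arithmetic:

```latex
\text{relative complexity} \approx \underbrace{2}_{\text{constituent decoders}} \times \underbrace{I}_{\text{iterations}}
\qquad\Rightarrow\qquad I = 2 \;\text{gives}\; 4\times \text{a conventional decoder}
```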
Latency • Latency on the decoding side is the biggest drawback of turbo codes • Decoding performance is influenced by three broad factors: interleaver size, number of iterations, and the choice of decoding algorithm • each of these can be manipulated, with consequences for performance and delay
Ongoing Research • Turbo coding is responsible for a renaissance in coding research • Turbo codes and turbo-code hybrids are being applied to numerous problems • Multipath propagation • Low-density parity-check (LDPC) codes • Software implementation! • turbo decoding at 300 kbit/s using 10 iterations per frame; with a stopping rule in place, the speed can be doubled or tripled
Turbo Codes in Practice • Turbo codes have made steady inroads into a variety of practical applications • deep space • mobile radio • digital video • long-haul terrestrial wireless • satellite communications • Not practical for real-time voice
More Information • Excellent high-level overview: Guizzo, Erico. "Closing in on the Perfect Code", IEEE Spectrum, March 2004. • Very informative four-part series on various aspects of TCs: Gumas, Charles Constantine. "Turbo codes rev up error-correcting performance" (part I in the series), EE Times Network, at http://archive.chipcenter.com/dsp/DSP000419F1.html • The paper that started it all: Berrou, Glavieux, and Thitimajshima. "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo Codes", Ecole Superieure des Telecommunications de Bretagne, France, 1993. Complete bibliography soon available on my CS522 page