
Compression with Side Information using Turbo Codes


Presentation Transcript


  1. Compression with Side Information using Turbo Codes Anne Aaron and Bernd Girod Information Systems Laboratory Stanford University Data Compression Conference April 3, 2002

  2. Overview • Introduction • Turbo Coder and Decoder • Compression of Binary Sequences • Extension to Continuous-valued Sequences • Joint Source-Channel Coding • Conclusion

  3. Slepian-Wolf Theorem: Distributed Source Coding [Diagram: two statistically dependent sources are compressed by separate encoders and reconstructed by a joint decoder]

  4. Research Problem • Motivation • Slepian-Wolf theorem: It is possible to compress statistically dependent signals in a distributed manner to the same rate as with a system where the signals are compressed jointly. • Objective • Design practical codes which achieve compression close to the Slepian-Wolf bound
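For reference, the Slepian-Wolf theorem can be stated as an achievable rate region (a standard textbook formulation, not reproduced from the slide itself): lossless distributed coding of the pair (X, Y) is possible whenever the individual rates satisfy

```latex
R_X \ge H(X \mid Y), \qquad
R_Y \ge H(Y \mid X), \qquad
R_X + R_Y \ge H(X, Y)
```

The "asymmetric scenario" pursued in this talk operates at the corner point of this region: Y is sent at rate H(Y) and X at rate close to H(X|Y).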

  5. Asymmetric Scenario – Compression with Side Information [Diagram: X is encoded; the statistically dependent Y is available at the decoder] • Compression techniques that send Y at a rate close to H(Y) are well known • Switching the roles of the two sources can achieve more symmetric rates

  6. Our Approach: Turbo Codes • Turbo Codes • Developed for channel coding • Perform close to the Shannon channel capacity limit (Berrou et al., 1993) • Similar work • Garcia-Frias and Zhao, 2001 (Univ. of Delaware) • Bajcsy and Mitran, 2001 (McGill Univ.)

  7. System Set-up [Diagram: X is encoded; the statistically dependent Y is available at the decoder] • X and Y are i.i.d. binary sequences X1X2…XL and Y1Y2…YL with equally probable ones and zeros. Xi is independent of Yj for i ≠ j, but dependent on Yi. The dependency between X and Y is described by the pmf P(x|y). • Y is sent at rate RY ≥ H(Y) and is available as side information at the decoder
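The slide-7 source model can be sketched in a few lines. This is an illustrative simulation of the statistics only (the function name and sequence length are my own, not from the talk): Y is fair-coin i.i.d., and X differs from Y in each position independently with probability p.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlated_binary_pair(L, p, rng):
    """Draw Y ~ Bernoulli(1/2) i.i.d. and X = Y xor E with P(E=1) = p,
    so P(Xi != Yi) = p, matching the slide-7 dependency model."""
    y = rng.integers(0, 2, size=L)
    e = (rng.random(L) < p).astype(int)
    x = (y + e) % 2
    return x, y

x, y = correlated_binary_pair(100_000, 0.05, rng)
print((x != y).mean())  # empirical mismatch rate, close to p = 0.05
```

Because Y is equiprobable, X is also equiprobable, and the pair behaves like a binary symmetric "virtual channel" between X and Y, which is what lets a channel code play the role of a compressor here.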

  8. Turbo Coder [Diagram: the L input bits feed two systematic convolutional encoders, the second through a length-L interleaver; the systematic bits are discarded and only the parity bits of each encoder are sent]

  9. Turbo Decoder [Diagram: two SISO decoders exchange extrinsic information (Pextrinsic) as a priori probabilities (Pa priori) through length-L interleavers and deinterleavers; channel probability calculations supply Pchannel to each SISO decoder, and a decision on the a posteriori probabilities (Pa posteriori) yields the L output bits]

  10. Simulation: Binary Sequences • X-Y relationship • P(Xi = Yi) = 1 - p and P(Xi ≠ Yi) = p • System • 16-state, rate 4/5 constituent convolutional codes • RX = 0.5 bit per input bit with no puncturing • Theoretically, it must be possible to send X without error when H(X|Y) ≤ 0.5
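Under the slide-10 model, H(X|Y) equals the binary entropy h(p), so the theoretical claim says error-free decoding at RX = 0.5 is possible as long as h(p) ≤ 0.5. A quick bisection (my own illustrative check, not from the talk) recovers the corresponding crossover threshold:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# h2 is increasing on [0, 0.5]; bisect for the largest p with h2(p) <= 0.5.
lo, hi = 0.0, 0.5
for _ in range(60):
    mid = (lo + hi) / 2
    if h2(mid) <= 0.5:
        lo = mid
    else:
        hi = mid
print(round(lo, 3))  # prints 0.11
```

So the Slepian-Wolf limit at RX = 0.5 corresponds to a crossover probability of about 0.11, which is the reference point against which the 0.15-bit gap on the next slide is measured.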

  11. Results: Compression of Binary Sequences [Plot: at RX = 0.5, the system performs within about 0.15 bit of the Slepian-Wolf bound H(X|Y)]

  12. Results for different rates • Punctured the parity bits to achieve lower rates
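Puncturing means transmitting only a subset of the parity bits according to a periodic pattern. The sketch below is a generic illustration of the idea (the pattern and function are mine; the talk does not specify its puncturing schedule):

```python
import numpy as np

def puncture(parity, pattern):
    """Keep only the parity positions where the periodic pattern is 1.
    E.g. pattern [1, 0] drops every other parity bit, halving the rate."""
    pattern = np.asarray(pattern, dtype=bool)
    keep = np.resize(pattern, parity.shape[0])  # tile pattern to full length
    return parity[keep]

parity = np.arange(10)           # stand-in for 10 parity bits
print(puncture(parity, [1, 0]))  # prints [0 2 4 6 8]
```

The decoder treats the punctured positions as erasures (uninformative Pchannel values), so the same code structure serves a range of rates.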

  13. Extension to Continuous-Valued Sequences [Diagram: the L continuous values are quantized to 2^M levels and converted to ML bits, which feed the turbo coder (the second constituent encoder through a length-L interleaver); the parity bits go to the decoder] • X and Y are sequences of i.i.d. continuous-valued random variables X1X2…XL and Y1Y2…YL. Xi is independent of Yj for i ≠ j, but dependent on Yi. The dependency between X and Y is described by the pdf f(x|y). • Y is known as side information at the decoder

  14. Simulation: Gaussian Sequences • X-Y relationship • X is a sequence of i.i.d. Gaussian random variables • Yi = Xi + Zi, where Z is also a sequence of i.i.d. Gaussian random variables, independent of X. f(x|y) is a Gaussian probability density function • System • 4-level Lloyd-Max scalar quantizer • 16-state, rate 4/5 constituent convolutional codes • No puncturing, so the rate is 1 bit/source sample
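A Lloyd-Max scalar quantizer minimizes mean squared error by alternating nearest-level assignment with replacing each level by its cell mean. The sample-based training sketch below is my own illustration (the talk presumably used the known closed-form design); for a unit-variance Gaussian the 4-level optimum is approximately ±0.4528 and ±1.510.

```python
import numpy as np

def lloyd_max(samples, levels, iters=100):
    """Train a scalar Lloyd-Max quantizer on samples: alternate between
    nearest-level assignment and replacing each level by its cell mean."""
    q = np.quantile(samples, np.linspace(0.1, 0.9, levels))  # initial levels
    for _ in range(iters):
        idx = np.argmin(np.abs(samples[:, None] - q[None, :]), axis=1)
        for k in range(levels):
            if np.any(idx == k):
                q[k] = samples[idx == k].mean()
    return np.sort(q)

rng = np.random.default_rng(1)
x = rng.standard_normal(200_000)
q = lloyd_max(x, 4)
print(q)  # near the known N(0,1) optimum: +/-0.4528, +/-1.510
```

With 4 levels each sample maps to M = 2 bits, matching the 1 bit/sample transmitted rate on slide 14 once the rate-1/2 turbo compression is applied.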

  15. Results: Compression of Gaussian Sequences [Plot: at RX = 1 bit/sample, performance is within about 2.8 dB of the theoretical bound; CSNR = ratio of the variance of X to the variance of Z]

  16. Joint Source-Channel Coding • Assume that the parity bits pass through a memoryless channel with capacity C • We can include the channel statistics in the decoder calculations for Pchannel • From the Slepian-Wolf theorem and the definition of channel capacity, error-free decoding requires RX ≥ H(X|Y)/C
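Plugging in the parameters of the next slide makes the bound concrete. For a binary symmetric channel with crossover probability q, the capacity is C = 1 - h(q); the arithmetic below is mine, using the q = 0.03 value quoted on slide 17:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

q = 0.03                 # BSC crossover probability from slide 17
C = 1 - h2(q)            # BSC capacity in bits per channel use
Rx = 0.5                 # transmitted bits per source bit

# Error-free decoding requires Rx * C >= H(X|Y), so at Rx = 0.5 the
# largest decodable conditional entropy is Rx * C.
print(round(C, 4), round(Rx * C, 4))  # prints 0.8056 0.4028
```

So the noisy channel shrinks the decodable region from H(X|Y) ≤ 0.5 to roughly H(X|Y) ≤ 0.4, which is the theoretical limit the slide-17 gaps are measured against.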

  17. Results: Joint Source-Channel Coding [Plot: RX = 0.5 over a BSC with crossover probability q = 0.03; gaps of about 0.15 bit and 0.12 bit from the theoretical limits are marked]

  18. Conclusion • Turbo codes can be used for compression of binary sequences, performing close to the Slepian-Wolf bound for lossless distributed source coding. • The system can be applied to compression of distributed continuous-valued sequences, where it performs better than previous techniques. • It extends easily to joint source-channel coding
