
Towards Practical Distributed Coding



Presentation Transcript


  1. Towards Practical Distributed Coding Bernd Girod Information Systems Laboratory Stanford University

  2. Outline
  • Distributed lossless compression
    • Simple examples
    • Slepian-Wolf Theorem
    • Slepian-Wolf coding vs. systematic channel coding
    • Turbo codes for compression
  • Lossy compression with side information
    • Wyner-Ziv Theorem
    • Optimal quantizer design with Lloyd algorithm
  • Selected applications
    • Sensor networks
    • Low-complexity video coding
    • Error-resilient video transmission

  3. Simple Example
  • Source A, Source B
  • Source statistics exploited in the encoder
  • Different statistics → different code

  4. Simple Example – Revisited
  • Source A, Source B
  • Different statistics → same code
  • Source statistics exploited in the decoder
  • “Lossless” compression with residual error rate

  5. Compression with Side Information
  • Diagram: Source A/B → Encoder → Decoder

  6. Distributed Compression of Dependent Sources
  • Diagram: Source X → Encoder X and Source Y → Encoder Y, both feeding a Joint Decoder that outputs X and Y

  7. Achievable Rates for Distributed Coding
  • Separate encoding and decoding of X and Y
  • Separate encoding and joint decoding of X and Y
  • Example

  8. General Dependent i.i.d. Sequences [Slepian, Wolf, 1973]
  • No errors / vanishing error probability for long sequences
  • Achievable rates: R_X ≥ H(X|Y), R_Y ≥ H(Y|X), R_X + R_Y ≥ H(X,Y)
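The Slepian-Wolf bounds (R_X ≥ H(X|Y), R_Y ≥ H(Y|X), R_X + R_Y ≥ H(X,Y)) are easy to evaluate for a binary model like the one used in the turbo-coding slides below. A minimal sketch, assuming X is a uniform binary source and Y differs from X with an illustrative crossover probability p = 0.11:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Model: X uniform Bernoulli(1/2); Y = X xor Z with P(Z=1) = p (BSC correlation).
p = 0.11                      # assumed crossover probability (illustrative)
H_X = 1.0                     # H(X) for a uniform binary source
H_X_given_Y = h2(p)           # H(X|Y) = H(p) under this symmetric model
H_XY = H_X + H_X_given_Y      # H(X,Y) = H(Y) + H(X|Y), with H(Y) = 1 here

print(f"R_X >= H(X|Y) = {H_X_given_Y:.3f} bit")
print(f"R_X + R_Y >= H(X,Y) = {H_XY:.3f} bit")
```

With joint decoding, X can thus be sent at roughly 0.5 bit per symbol instead of 1 bit, which is the operating point of the turbo-coding results later in the deck.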

  9. Distributed Compression and Channel Coding
  • Idea: interpret Y as a “noisy” version of X with “channel errors”
  • Encoder generates “parity bits” P to protect against errors
  • Decoder concatenates Y and P and performs error-correcting decoding
  • Diagram: Source X → Encoder → P → Decoder (side information Y at the decoder)

  10. Practical Slepian-Wolf Encoding
  • Coset codes [Pradhan and Ramchandran, 1999]
  • Trellis codes [Wang and Orchard, 2001]
  • Turbo codes [Garcia-Frias and Zhao, 2001] [Bajcsy and Mitran, 2001] [Aaron and Girod, 2002]
  • LDPC codes [Liveris, Xiong, and Georghiades, 2002]
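The coset-code approach can be illustrated with a toy syndrome code: transmit only the syndrome of a 7-bit block of X under a (7,4) Hamming parity-check matrix (7 bits → 3 bits), and let the decoder pick the member of that coset closest to the side information Y. This is a sketch in the spirit of the coset codes listed above, not the turbo scheme of the next slides; the matrix choice and the brute-force coset search are purely for illustration.

```python
import itertools

# Parity-check matrix of the (7,4) Hamming code
# (columns are the binary representations of 1..7).
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(x):
    return tuple(sum(h * b for h, b in zip(row, x)) % 2 for row in H)

def encode(x):
    # Slepian-Wolf "encoding": send only the 3-bit syndrome of the
    # 7-bit block, i.e. compress 7 bits down to 3.
    return syndrome(x)

def decode(s, y):
    # Pick the member of the coset with syndrome s that is closest
    # (in Hamming distance) to the side information y.
    best = None
    for cand in itertools.product([0, 1], repeat=7):
        if syndrome(cand) == s:
            d = sum(a != b for a, b in zip(cand, y))
            if best is None or d < best[0]:
                best = (d, cand)
    return list(best[1])

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 1, 1, 0, 1]          # side information: differs from x in one bit
assert decode(encode(x), y) == x   # x recovered exactly from 3 bits plus y
```

Decoding succeeds whenever x and y differ in at most one position, exactly as a Hamming code corrects one channel error; this is the "Y as a noisy version of X" interpretation from the previous slide.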

  11. Compression by Turbo Coding [Aaron, Girod, DCC 2002]
  • Diagram: L input bits feed a systematic convolutional encoder and, through an interleaver of length L, a second systematic convolutional encoder; the parity bits form the compressed representation

  12. Turbo Decoder [Aaron, Girod, DCC 2002]
  • Diagram: “channel” probability calculations feed P_channel to two SISO decoders, which exchange extrinsic information (P_extrinsic ↔ P_a priori) through interleavers/deinterleavers of length L; a final decision stage on the a-posteriori probabilities outputs L bits

  13. Results for Compression of Binary Sequences [Aaron, Girod, DCC 2002]
  • X, Y dependent binary sequences with symmetric cross-over probabilities
  • Rate 4/5 constituent convolutional codes; R_X = 0.5 bit per input bit
  • (plot annotation: 0.15 bit)

  14. Results with Puncturing [Aaron, Girod, DCC 2002]

  15. Channel Coding vs. Slepian-Wolf Coding
  • Systematic channel coding:
    • Systematic bits and parity bits subject to bit errors
    • Mostly memoryless BSC channel
    • Low bit-error rate of channel
    • Rate savings relative to systematic bits
  • Slepian-Wolf coding:
    • No bit errors in parity bits
    • General statistics, incl. memory
    • Whole range of “error” probabilities
    • Rate savings relative to parity bits
    • Might have to compete with conventional compression (e.g., arithmetic coding)

  16. Distributed Lossy Compression of Dependent Sources
  • Diagram: Source X → Encoder X and Source Y → Encoder Y, feeding a Joint Decoder that outputs X’ and Y’
  • Achievable rate region

  17. Lossy Compression with Side Information [Wyner, Ziv, 1976] [Zamir, 1996]
  • Diagram: Source → Encoder → Decoder (side information at the decoder)

  18. Practical Wyner-Ziv Encoder and Decoder
  • Wyner-Ziv encoder: Quantizer → Slepian-Wolf Encoder
  • Wyner-Ziv decoder: Slepian-Wolf Decoder → Minimum-Distortion Reconstruction

  19. Non-Connected Quantization Regions
  • Example: non-connected intervals for scalar quantization
  • Decoder: minimum mean-squared error reconstruction with side information
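The decoder's minimum-MSE reconstruction is the centroid of the quantization cell under the conditional density of X given the side information. A numerical sketch, assuming an illustrative model X = y + N with Gaussian N and a uniform scalar quantizer (sigma, step and the cell layout are assumptions, not the slide's parameters):

```python
import math

sigma = 1.0   # assumed noise standard deviation
step = 2.0    # assumed quantizer step size

def cell(q):
    # Quantization cell [q*step, (q+1)*step) of a uniform quantizer.
    return q * step, (q + 1) * step

def pdf(x, y):
    # Unnormalised conditional density of X given Y = y (model X = y + N).
    return math.exp(-0.5 * ((x - y) / sigma) ** 2)

def mmse_reconstruction(q, y, n=10_000):
    # Centroid E[X | X in cell(q), Y = y], computed by midpoint integration.
    lo, hi = cell(q)
    xs = [lo + (i + 0.5) * (hi - lo) / n for i in range(n)]
    w = [pdf(x, y) for x in xs]
    return sum(x * wi for x, wi in zip(xs, w)) / sum(w)

# For cell [2, 4) with side information y = 0, the reconstruction is
# pulled toward y rather than pinned to the cell midpoint 3.0:
x_hat = mmse_reconstruction(q=1, y=0.0)
```

The same mechanism makes non-connected cells usable: the side information selects which of the reused intervals the source value actually came from.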

  20. Quantizer Reconstruction Function

  21. Finding Quantization Regions
  • (figure: index assignment q ∈ {1, 2, 3, 4} along the x-axis, with indices reused across non-connected intervals)

  22. Lloyd Algorithm for Wyner-Ziv Quantizers [Fleming, Zhao, Effros, unpublished] [Rebollo-Monedero, Girod, DCC 2003]
  • Choose initial quantizers
  • Find best reconstruction functions for current quantizers
  • Update rate measure for current quantizers
  • Evaluate Lagrangian cost for current quantizers, reconstructor, and rate measure
  • Convergence? Y: end; N: find best quantizers for current reconstruction and rate measure, and iterate
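The Lloyd iteration above can be sketched on training data. Everything below is an illustrative toy, not the design from the cited papers: synthetic (x, y) pairs, a coarsely binned Y, and ad-hoc constants NQ, NY, NX, LAM. It only shows the alternation between the reconstruction update, the rate-measure update, and the quantizer update under a Lagrangian cost.

```python
import math
import random
from collections import defaultdict

random.seed(0)
# Synthetic training pairs: x is a noisy version of the side information y.
pairs = [(y + random.gauss(0, 0.3), y)
         for y in (random.uniform(0, 4) for _ in range(4000))]

NQ, NY, NX, LAM = 4, 8, 64, 0.05          # indices, y-bins, fine x-bins, lambda
xbin = lambda x: min(NX - 1, max(0, int(x * NX / 5)))
ybin = lambda y: min(NY - 1, max(0, int(y * NY / 4)))

# Quantizer: a mapping from fine x-bins to indices (encoder sees only x).
q = [(b * NQ) // NX for b in range(NX)]    # initial uniform quantizer

for _ in range(15):
    # 1) Best reconstruction: centroid of x given (index, y-bin).
    s, n = defaultdict(float), defaultdict(int)
    for x, y in pairs:
        k = (q[xbin(x)], ybin(y)); s[k] += x; n[k] += 1
    recon = {k: s[k] / n[k] for k in s}
    # 2) Rate measure: -log2 p(index | y-bin), i.e. a conditional entropy coder.
    tot = defaultdict(int)
    for x, y in pairs:
        tot[ybin(y)] += 1
    rate = {k: -math.log2(n[k] / tot[k[1]]) for k in n}
    # 3) Best quantizer: each x-bin picks the index minimising the average
    #    Lagrangian cost (distortion + LAM * rate) over the y's it co-occurs with.
    cost = defaultdict(float)
    for x, y in pairs:
        b = ybin(y)
        for qi in range(NQ):
            r = recon.get((qi, b))
            cost[(xbin(x), qi)] += ((x - r) ** 2 + LAM * rate[(qi, b)]
                                    if r is not None else 25.0)
    for b in range(NX):
        q[b] = min(range(NQ), key=lambda qi: cost[(b, qi)])
```

Note that step 3 averages over Y for each x-bin: the encoder cannot see the side information, so the quantizer must be a function of x alone, while the reconstruction in step 1 may depend on both the index and y.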

  23. Which Rate Measure? [Rebollo-Monedero, Girod, DCC 2003]
  • Conditional entropy coder: rate H(Q|Y)
  • Diagram: Quantizer → Slepian-Wolf Encoder (Wyner-Ziv encoder); Slepian-Wolf Decoder → Minimum-Distortion Reconstruction (Wyner-Ziv decoder)
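The difference between the two rate measures can be made concrete with a toy joint histogram of the quantizer index Q and a discretised side information Y (the counts below are invented for illustration): when Q and Y are strongly dependent, conditioning cuts the coding rate from H(Q) down to H(Q|Y).

```python
import math
from collections import Counter

def H(counts):
    """Entropy in bits of an unnormalised histogram."""
    tot = sum(counts)
    return -sum(c / tot * math.log2(c / tot) for c in counts if c)

# Illustrative joint histogram of (Q, Y): each index co-occurs mostly
# with one side-information value.
joint = {(0, 0): 40, (1, 0): 5, (1, 1): 40, (2, 1): 5,
         (2, 2): 40, (3, 2): 5, (3, 3): 40, (0, 3): 5}

q_marg, y_marg = Counter(), Counter()
for (qi, yi), c in joint.items():
    q_marg[qi] += c
    y_marg[yi] += c

H_Q = H(q_marg.values())                              # rate of an ordinary entropy coder
H_Q_given_Y = H(joint.values()) - H(y_marg.values())  # H(Q|Y) = H(Q,Y) - H(Y)

print(f"H(Q)   = {H_Q:.2f} bit")
print(f"H(Q|Y) = {H_Q_given_Y:.2f} bit")
```

Here H(Q) is 2 bits (the index marginal is uniform) while H(Q|Y) is only about 0.5 bit, mirroring the Carphone numbers on the next slides, where H(Q) = 1.87-3.05 bit but H(Q|Y) = 0.54 bit.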

  24. Example
  • Data set: video sequence Carphone, 100 luminance frames, QCIF
  • X: pixel values in even frames
  • Y: motion-compensated interpolation from two adjacent odd frames

  25. Example (cont.)
  • Quantizer w/ rate constraint H(Q): PSNR = 37.4 dB, H(Q) = 1.87 bit, H(Q|Y) = 0.54 bit
  • Quantizer w/ rate constraint H(Q|Y): PSNR = 39 dB, H(Q) = 3.05 bit, H(Q|Y) = 0.54 bit

  26. Example (cont.)
  • (plots: PSNR [dB] vs. rate [bit] for the quantizers with rate constraints H(Q) and H(Q|Y))

  27. Wyner-Ziv Quantizers: Lessons Learnt
  • Typically no quantizer index reuse for rate constraint H(Q|Y) and high rates: the Slepian-Wolf code provides a more efficient many-to-one mapping in a very high-dimensional space
  • Uniform quantizers are close to minimum m.s.e. when combined with an efficient Slepian-Wolf code
  • Quantizer index reuse is required for rate constraint H(Q) and for fixed-length coding
  • Important to decouple the dimension of the quantizer (i.e., scalar) and of the Slepian-Wolf code (very large)

  28. Sensor Networks
  • Diagram: remote sensors and a local sensor (side information) report to a central unit
  • [Pradhan, Ramchandran, DCC 2000] [Kusuma, Doherty, Ramchandran, ICIP 2001] [Pradhan, Kusuma, Ramchandran, SP Mag., 2002] [Chou, Petrovic, Ramchandran, Asilomar 2002]

  29. Video Compression with Simple Encoder [Aaron, Zhang, Girod, Asilomar 2002] [Aaron, Rane, Zhang, Girod, DCC 2003]
  • Intraframe encoder: video frame X → scalar quantizer → turbo encoder → buffer; parity bits sent on request
  • Interframe decoder: side information Y interpolated from the previous and next key frames; turbo decoder requests bits until reconstruction X’ succeeds
  • Slepian-Wolf codec: turbo encoder/decoder pair
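The pipeline can be sketched end to end at the pixel level. This is a deliberately idealised toy: plain averaging stands in for motion-compensated interpolation, clipping into the quantization interval stands in for minimum-MSE reconstruction, and the turbo-coded Slepian-Wolf stage is assumed to deliver the quantizer indices error-free.

```python
# Sketch of a pixel-domain Wyner-Ziv video codec on 8-bit pixels.
LEVELS = 16              # 16-level quantization, as in the example frames
STEP = 256 // LEVELS

def encode_pixel(x):
    # Intraframe encoder: coarse scalar quantisation only (no motion search).
    return x // STEP

def side_information(prev_key, next_key):
    # Interframe decoder: interpolate between adjacent key frames
    # (averaging stands in for motion-compensated interpolation).
    return [(a + b) // 2 for a, b in zip(prev_key, next_key)]

def reconstruct(q, y):
    # Clip the side information into the received quantisation interval:
    # a simple stand-in for minimum-MSE reconstruction.
    lo, hi = q * STEP, (q + 1) * STEP - 1
    return min(max(y, lo), hi)

prev_key = [100, 120, 140]   # toy 3-pixel "frames"
next_key = [110, 130, 150]
wz_frame = [104, 126, 180]   # Wyner-Ziv frame between the key frames

y = side_information(prev_key, next_key)   # -> [105, 125, 145]
rec = [reconstruct(encode_pixel(x), yi) for x, yi in zip(wz_frame, y)]
```

Where the interpolation is accurate (first two pixels) the side information is kept almost unchanged; where it fails (last pixel) the quantizer index bounds the error to one quantization step, which is what keeps the encoder simple.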

  30. Video Compression with Simple Encoder
  • (frames: decoder side information vs. after Wyner-Ziv decoding; 16-level quantization, ~1 bpp)

  31. Video Compression with Simple Encoder
  • (frames: decoder side information vs. after Wyner-Ziv decoding; 16-level quantization, ~1 bpp)

  32. Video Compression with Simple Encoder
  • (frames: decoder side information vs. after Wyner-Ziv decoding; 16-level quantization, ~1 bpp)

  33. Performance of Simple Wyner-Ziv Video Coder
  • (plot annotation: 7 dB)

  34. Digitally Enhanced Analog Transmission
  • Forward error protection of the signal waveform
  • Information-theoretic bounds [Shamai, Verdu, Zamir, 1998]
  • “Systematic lossy source-channel coding”
  • Diagram: the waveform travels over an analog channel; a Wyner-Ziv encoder/decoder pair over a digital channel uses the analog reception as side information

  35. Forward Error Protection for MPEG Video Broadcasting
  • Diagram: MPEG encoder → error-prone channel → MPEG decoder with error concealment (S → S’)
  • Wyner-Ziv encoder/decoder A (output S*) and B (output S**) provide successively stronger protection
  • Graceful degradation without a layered signal representation

  36. Error-Resilient Video Transmission with Embedded Wyner-Ziv Codec [Aaron, Rane, Rebollo-Monedero, Girod, ICIP 2003]
  • Carphone CIF, 50 frames @ 30 fps, 1 Mbps, 1% random macroblock loss

  37. Towards Practical Distributed Coding: Why Should We Care?
  • Chance to reinvent compression from scratch
    • Entropy coding
    • Quantization
    • Signal transforms
    • Adaptive coding
    • Rate control
    • . . .
  • Enables new compression applications
    • Sensor networks
    • Very low complexity encoders
    • Error-resilient transmission of signal waveforms
    • Digitally enhanced analog transmission
    • Unequal error protection without layered coding
    • . . .

  38. The End
