
Distributed Source Coding


Presentation Transcript


  1. Distributed Source Coding Trial Lecture, Fredrik Hekland, 1 June 2007

  2. Outline • Concept of DSC • Slepian-Wolf coding (lossless) • Wyner-Ziv coding (lossy) • Application areas

  3. Distributed Source Coding - Sensor Networks

  4. Correlated Sources [figure: diagram relating entropy H(X), conditional entropy H(Y|X), mutual information I(X;Y), and joint entropy H(X,Y)]

  5. Co-located, Correlated Observations • X and Y correlated • Both encoder and decoder know the correlation • Rate: R = H(X,Y) = H(Y) + H(X|Y) < H(X) + H(Y)

  6. Distributed, but Correlated Observations • X and Y spatially separated, but still correlated • Informed encoders: rate R = R_X + R_Y = H(X,Y) = H(Y) + H(X|Y) • Uninformed, naive encoders: rate R = R_X + R_Y = H(X) + H(Y) > H(X,Y) (a small numerical example follows below)
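To make the rate comparison concrete, here is a minimal Python sketch (my illustration; the joint distribution below is an assumed toy example, not taken from the lecture) comparing the joint rate H(X,Y) with the naive rate H(X) + H(Y):

import math

# Assumed toy joint pmf p(x, y) for two correlated binary sources
# (illustrative values only).
p_xy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

def entropy(probs):
    # Entropy in bits; zero-probability entries are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x = [sum(p for (x, _), p in p_xy.items() if x == v) for v in (0, 1)]
p_y = [sum(p for (_, y), p in p_xy.items() if y == v) for v in (0, 1)]

H_xy = entropy(p_xy.values())    # joint entropy H(X,Y)
H_x, H_y = entropy(p_x), entropy(p_y)
H_x_given_y = H_xy - H_y         # chain rule: H(X|Y) = H(X,Y) - H(Y)

print(f"H(X)={H_x:.2f}, H(Y)={H_y:.2f}, H(X,Y)={H_xy:.2f}, H(X|Y)={H_x_given_y:.2f}")
# Informed/joint coding needs about 1.47 bits per symbol pair,
# naive separate coding needs H(X) + H(Y) = 2 bits.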

  7. Slepian-Wolf Coding (SWC) • X and Y spatially separated, but still correlated • Encoder/decoder designed w.r.t. p(X,Y) • No communication between encoders! R = R_X + R_Y = H(X,Y) = H(Y) + H(X|Y) still possible!!

  8. Achievable Rate Region - SWC [figure: the Slepian-Wolf rate region, showing the corner points "code X with Y as side-information" and "code Y with X as side-information", the time-sharing / source-splitting / code-partitioning points between them, and the no-errors vs. vanishing-error-probability (for long sequences) distinction; the region's defining inequalities are given below] Slepian & Wolf, "Noiseless Coding of Correlated Information Sources," IEEE Trans. Inf. Theory, Jul. 1973
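The slide gives the region only graphically; the standard statement from the cited 1973 paper is that lossless recovery (with vanishing error probability as the block length grows) is possible exactly for rate pairs satisfying

R_X ≥ H(X|Y),   R_Y ≥ H(Y|X),   R_X + R_Y ≥ H(X,Y).

The two corner points correspond to coding X with Y as side-information (and vice versa); time-sharing reaches the points in between.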

  9. Principle - SWC [figure: random binning] • Assign one of 2^(nH(X|Y)) colors (bins) to each X^n sequence at random and send only the bin index: R_X = nH(X|Y) • Code Y^n conventionally with 2^(nH(Y)) codewords: R_Y = nH(Y)

  10. Coset Toy Example – Binary Source • X and Y are each 3 bits • X and Y differ in at most one bit • Partition the X's into sets with Hamming distance 3: X: {000,111}, {100,011}, {010,101}, {001,110} • Send the index of X's set (requires 2 bits) • Send Y (requires 3 bits) • Decode X as the element of the indexed set closest to Y • Declare an error if no element has d_H ≤ 1 (a small decoder sketch follows below)
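A minimal Python sketch of this toy example (variable names and the test pair are mine, not from the slides):

cosets = [("000", "111"), ("100", "011"), ("010", "101"), ("001", "110")]

def hamming(a, b):
    return sum(ca != cb for ca, cb in zip(a, b))

def encode_x(x):
    # Encoder for X: send only the 2-bit coset index.
    return next(i for i, c in enumerate(cosets) if x in c)

def decode_x(index, y):
    # Decoder: pick the word in the indexed coset closest to the side information Y;
    # declare an error (None) if nothing lies within Hamming distance 1.
    d, w = min((hamming(word, y), word) for word in cosets[index])
    return w if d <= 1 else None

idx = encode_x("011")              # 2 bits instead of 3
print(idx, decode_x(idx, "001"))   # X = 011 is recovered from Y = 001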

  11. SWC Design • The proof in Slepian & Wolf's article is non-constructive • Important realization: SWC is a channel coding problem • "Virtual" correlation channel between X and Y • A good channel code for this channel can provide a good SW code by using coset codes as bins

  12. Wyner's Scheme • Use a linear block code, send the syndrome • (n,k) block code: 2^(n-k) syndromes, each corresponding to a set of 2^k words of length n • Each set is a coset code • Compression ratio of n:(n-k) (a syndrome-computation sketch follows below) A. Wyner, "Recent Results in the Shannon Theory," IEEE Trans. Inf. Theory, Jan. 1974
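A sketch of the syndrome view (my construction, consistent with the toy example above): the four sets on slide 10 are exactly the cosets of the (3,1) repetition code {000, 111}, and the 2-bit set index can be computed as the syndrome s = Hx (mod 2) for a parity-check matrix H of that code:

import numpy as np

# Parity-check matrix of the (3,1) binary repetition code {000, 111}.
H = np.array([[1, 1, 0],
              [1, 0, 1]])

def syndrome(bits):
    # 2-bit syndrome s = H x^T (mod 2); this is the coset/bin index that is transmitted.
    return tuple(H.dot(bits) % 2)

for word in ("000", "111", "100", "011", "010", "101", "001", "110"):
    print(word, syndrome(np.array([int(b) for b in word])))
# Words sharing a syndrome form one coset from the toy example:
# n = 3, k = 1 gives 2^(n-k) = 4 syndromes and a 3:2 compression ratio.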

  13. Practical SWC Design • Use more powerful channel codes • LDPC / turbo codes • Send parity bits: Zhao & Garcia-Frias, "Data compression of correlated non-binary sources using punctured turbo codes," DCC'02 • Or send the syndrome: Liveris et al., "Compression of binary sources with side information at the decoder using LDPC codes," IEEE Commun. Lett., vol. 6, no. 10, 2002

  14. SWC using LDPC codes Xiong et al., "Distributed Source Coding for Sensor Networks," IEEE Sig. Proc. Mag., Sept. 2004

  15. Continuous Sources – Wyner-Ziv Coding (WZC) • Generalizes SWC by introducing a fidelity criterion • A joint source-channel coding problem • We need • A good source coder to achieve the source coding gains (e.g. TCQ) • A good channel code which approaches the Slepian-Wolf limit (LDPC) (a minimal nested-quantization sketch follows below)
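As a feel for how quantization and binning combine, here is a minimal nested scalar-quantization sketch of the Wyner-Ziv idea (my illustration with assumed parameters, not a design from the lecture): X is quantized finely, only the coset index of the quantizer cell is sent, and the decoder resolves the remaining ambiguity with the side information Y.

import numpy as np

rng = np.random.default_rng(0)
step, M = 0.5, 4                         # fine quantizer step and number of cosets (2 bits sent)
x = rng.normal(size=10)                  # source samples
y = x + 0.1 * rng.normal(size=10)        # correlated side information, known only at the decoder

q = np.round(x / step).astype(int)       # fine quantization index (never transmitted in full)
coset = q % M                            # encoder sends only the coset index: log2(M) = 2 bits/sample

# Decoder: among all reconstruction levels in the received coset,
# pick the one closest to the side information Y.
k = np.round((y / step - coset) / M)
x_hat = (coset + k * M) * step

print(np.max(np.abs(x - x_hat)))         # at most step/2, as long as |X - Y| stays well below M*step/2

Decoding fails once X and Y drift farther apart than the coset spacing, which is why the correlation model matters.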

  16. Wyner-Ziv Rate-Distortion Function [the rate-distortion expression and the conditions under which it holds were shown as equation images on the slide; a standard statement is given below]
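The equations on the original slide are not preserved in this transcript; the standard form of the Wyner-Ziv rate-distortion function (from the Wyner & Ziv 1976 paper cited on the last slide) is

R_WZ(D) = min I(X;U|Y),

where the minimum is taken over auxiliary variables U that are conditionally independent of Y given X, and over reconstruction functions X̂ = f(U,Y) satisfying E[d(X,X̂)] ≤ D. For the quadratic Gaussian case there is no rate loss compared with having the side information at both encoder and decoder.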

  17. Distributed Source Coding Using Syndromes (DISCUS) • First constructive design approach for WZC • Trellis-based quantization and coset construction • 2-5 dB away from the WZ bound • [Yang et al. '03]: SWC-TCVQ • Irregular LDPC, n = 10^6 • 2-D TCVQ • Quadratic Gaussian: 0.47 dB away at 3.3 bits/symbol Pradhan & Ramchandran, "Distributed Source Coding Using Syndromes (DISCUS): Design and Construction," Data Compression Conf. (DCC), 1999

  18. Other Approaches to Lossy DSC • Distributed Karhunen-Loève transform • Drawback: local minima • Distributed scalar quantizers optimized for noisy channels • Advantage: simpler encoder • Drawback: local minima

  19. Application Areas • Sensor networks • Multimedia transmission • Robust coding for co-located sources • Digitally enhanced analog TV • Multiple description coding • Data hiding / watermarking • Coding for multiple access channels • MIMO broadcast channel • Searchable compression (…)

  20. Sensor Networks • Possible rate savings with WZC • But the correlation model is hard to find • It can be determined through training • But what about time-varying correlation?

  21. Wyner-Ziv for Video Compression (1/3) • MPEG: High encoder complexity • Portables: Less powerful hardware • Solution: Wyner-Ziv video coding • Shifts complexity to the decoder • Transcoding to MPEG provides simple decoder for receiver

  22. Wyner-Ziv for Video Compression (2/3)

  23. Wyner-Ziv for Video Compression (3/3) Girod et al., "Distributed Video Coding," Proc. IEEE, Jan. 2005

  24. Digitally Enhanced Analog TV

  25. Watermarking • "Hide" a message W inside a host X • A dual problem to DSC: channel coding with side-information at the encoder • An attacker tries to remove/destroy the watermark W • The host X must be preserved • For an AWGN attack, knowledge of X only at the encoder is as good as knowing X at both encoder and decoder (Costa's capacity expression is recalled below) Costa, "Writing on Dirty Paper," IEEE Trans. Inf. Theory, May 1983
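For reference, a compact restatement of the cited result (the symbols P and N are my notation, not from the slide): with the host X acting as interference known non-causally at the encoder, the embedded watermark signal constrained to power P, and an AWGN attack of power N, the achievable rate is

C = (1/2) log2(1 + P/N) bits per channel use,

independent of the power of the host X, i.e. exactly what could be achieved if the decoder also knew X.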

  26. MIMO Broadcast Channel [figure: complexity at the transmitter vs. complexity at the receiver] • Non-degraded broadcast channel • Cannot use superposition coding with successive decoding • Related to watermarking: dirty paper coding! • Costa's "writing on dirty paper" scheme • Adapt to the interference, don't try to cancel it • User 1's signal is the host; insert the "watermark" as the message to User 2

  27. Summary • Distributed Source Coding • Enables compression of correlated, spatially separated sources • Slepian-Wolf Coding: Lossless • Wyner-Ziv Coding: Lossy • Other uses • Multimedia • Watermarking • Multiple access / broadcast channels / MIMO

  28. Further Reading
• Slepian & Wolf, "Noiseless Coding of Correlated Information Sources," IEEE Trans. Inf. Theory, Jul. 1973
• Wyner & Ziv, "The Rate-Distortion Function for Source Coding with Side Information at the Decoder," IEEE Trans. Inf. Theory, Jan. 1976
• Pradhan & Ramchandran, "Distributed Source Coding Using Syndromes (DISCUS): Design and Construction," IEEE Trans. Inf. Theory, Mar. 2003
• Pradhan et al., "Distributed Compression in a Dense Microsensor Network," IEEE Sig. Proc. Mag., Mar. 2002
• Xiong et al., "Distributed Source Coding for Sensor Networks," IEEE Sig. Proc. Mag., Sept. 2004
• Yang et al., "Wyner-Ziv Coding Based on TCQ and LDPC Codes," 37th Asilomar Conf. on Sig., Sys. and Comp., 2003
• Girod et al., "Distributed Video Coding," Proc. IEEE, Jan. 2005
• Cox et al., "Watermarking as Communications with Side Information," Proc. IEEE, Jul. 1999
