
Generalized Coset Codes for Symmetric/Asymmetric Distributed Source Coding

This paper proposes a constructive approach for distributed source coding using generalized coset codes, providing efficient encoding and decoding algorithms. Simulation results demonstrate the effectiveness of the proposed approach.


Presentation Transcript


  1. Generalized Coset Codes for Symmetric/Asymmetric Distributed Source Coding. S. Sandeep Pradhan and Kannan Ramchandran, {pradhan5, kannanr}@eecs.berkeley.edu, BASiCS Group, University of California at Berkeley

  2. Outline • Introduction and motivation • Preliminaries • Generalized coset codes for distributed source coding • Simulation results • Conclusions University of California, Berkeley

  3. Application: Sensor Networks. Sensors 1, 2, and 3 each observe the scene and feed a separate encoder; the encoded streams are jointly decoded. The channels are bandwidth- or rate-constrained.

  4. Introduction and motivation. Distributed source coding: • Information-theoretic results (Slepian-Wolf '73, Wyner-Ziv '76) • Little is known about practical systems based on these elegant concepts • Applications: distributed sensor networks, web caching, ad-hoc networks, interactive communication. Goal: propose a constructive approach, DISCUS (Pradhan and Ramchandran, 1999)

  5. System 1: Source Coding with Side Information at the Receiver (illustration). X passes through the encoder to the decoder; Y is available at both the encoder and the decoder, and X and Y are correlated. • X and Y are length-3 binary words (all values equally likely). • Correlation: the Hamming distance between X and Y is at most 1. Example: when X = [0 1 0], Y is one of [0 1 0], [0 1 1], [0 0 0], [1 1 0]. The difference X+Y is therefore one of 000, 001, 010, 100, so 2 bits suffice to index it.
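The counting on slide 5 is easy to verify mechanically. The following sketch (not part of the original deck) enumerates all length-3 pairs within Hamming distance 1 and shows that the modulo-2 sum X+Y takes only four values:

```python
from itertools import product

# All length-3 binary words
words = [tuple(w) for w in product((0, 1), repeat=3)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# For correlated pairs (distance <= 1), X+Y (mod 2) takes only 4 values:
# the all-zero word and the three weight-1 words.
sums = {tuple((x + y) % 2 for x, y in zip(X, Y))
        for X in words for Y in words if hamming(X, Y) <= 1}

print(sorted(sums))  # [(0,0,0), (0,0,1), (0,1,0), (1,0,0)] -> 2 bits suffice
```

Four possible differences means 2 bits are enough when the encoder also sees Y.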

  6. System 2: the same setup, but Y is available only at the decoder. • X and Y correlated. • What is the best that one can do? • The answer is still 2 bits! How? Partition the 8 words {000, 001, 010, 100, 111, 110, 101, 011} into cosets, with Coset-1 = {000, 111}.

  7. The 8 words form four cosets (Coset-1 through Coset-4). • The encoder sends the index of the coset containing X. • The decoder recovers X from the given coset. • Note: • Coset-1 is the repetition code itself. • Each coset has a unique "syndrome". • Hence the name: DIStributed Source Coding Using Syndromes (DISCUS)
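A minimal sketch of the scheme on slides 6-7, assuming a brute-force coset search (function names are illustrative, not from the paper): the encoder sends a 2-bit coset index, and the decoder picks the coset member closest to its side information Y.

```python
from itertools import product

code = {(0, 0, 0), (1, 1, 1)}  # length-3 repetition code
words = [tuple(w) for w in product((0, 1), repeat=3)]

def xor(a, b): return tuple(x ^ y for x, y in zip(a, b))
def hamming(a, b): return sum(x != y for x, y in zip(a, b))

# Partition all 8 words into 4 cosets of the repetition code
cosets = {}
for w in words:
    leader = min(xor(w, c) for c in code)  # coset identified by its leader
    cosets.setdefault(leader, set()).add(w)

def encode(X):             # encoder: 2-bit coset index instead of 3 bits
    return min(xor(X, c) for c in code)

def decode(index, Y):      # decoder: coset member closest to Y
    return min(cosets[index], key=lambda w: hamming(w, Y))

# Recovery is exact whenever d_H(X, Y) <= 1
ok = all(decode(encode(X), Y) == X
         for X in words for Y in words if hamming(X, Y) <= 1)
print(ok)  # True
```

Recovery works because the two members of each coset are at Hamming distance 3, so only one of them can be within distance 1 of Y.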

  8. Symmetric coding: X and Y both encode partial information. • Example: • X and Y are length-7 equally likely binary words. • The Hamming distance between X and Y is at most 1. • There are 1024 valid (X, Y) pairs. • Solution 1 (asymmetric): • Y sends its data with 7 bits. • X sends its syndrome with 3 bits, using the (7,4) Hamming code. • Total of 10 bits. • Can correct decoding be done if X and Y send 5 bits each?
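Solution 1 can be sketched with the standard (7,4) Hamming parity-check matrix (the matrix and variable names below are the textbook construction, assumed here for illustration). X's encoder sends only the 3-bit syndrome; the decoder combines it with the fully transmitted Y:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code: column j is j+1 in binary
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def syndrome(v):
    return tuple(H @ v % 2)

# Map each weight-<=1 error pattern to its syndrome (all 8 are distinct)
patterns = [np.zeros(7, dtype=int)] + [np.eye(7, dtype=int)[i] for i in range(7)]
table = {syndrome(p): p for p in patterns}

rng = np.random.default_rng(0)
X = rng.integers(0, 2, 7)
e = np.eye(7, dtype=int)[rng.integers(0, 7)]   # flip one position
Y = (X + e) % 2                                # correlated side information

s = syndrome(X)                    # X's encoder sends only these 3 bits
diff = tuple((np.array(s) + np.array(syndrome(Y))) % 2)  # = syndrome of X+Y
X_hat = (Y + table[diff]) % 2      # decoder: shift Y into X's coset
print(np.array_equal(X_hat, X))    # True
```

Since H(X+Y) = HX + HY and X+Y has weight at most 1, the table lookup recovers the difference exactly, giving the 7 + 3 = 10-bit total of Solution 1.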

  9. • Solution 2: map the valid (X, Y) pairs into a 32 x 32 coset matrix, with X indexing one axis and Y the other. • Construct 2 codes and assign them to the encoders. • Each encoder sends the index of the coset of its code containing its outcome.

  10. Theorem 1: with an (n, k, 2t+1) code, X and Y can achieve rate pairs (R1, R2) with R1 + R2 = 2n - k and n - k <= R1, R2 <= n. Example: split the generator matrix of the (7,4) code, G = [1 0 1 1 0 1 0; 0 1 0 0 1 0 1; 0 1 1 0 0 1 0; 1 1 1 0 0 0 1], into G1 = [1 0 1 1 0 1 0; 0 1 0 0 1 0 1] and G2 = [0 1 1 0 0 1 0; 1 1 1 0 0 0 1], giving each encoder a 5-bit syndrome. This concept can be generalized to Euclidean-space codes.
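A sketch of the symmetric 5 + 5 scheme, assuming one reading of the slide's matrices: the (7,4) generator rows are divided two-and-two into G1 and G2, each encoder sends the 5-bit index of its subcode coset, and the joint decoder searches the two cosets for the unique pair within Hamming distance 1.

```python
import numpy as np
from itertools import product

# One reading of slide 10: rows of a (7,4,3) generator split between encoders
G1 = np.array([[1, 0, 1, 1, 0, 1, 0],
               [0, 1, 0, 0, 1, 0, 1]])
G2 = np.array([[0, 1, 1, 0, 0, 1, 0],
               [1, 1, 1, 0, 0, 0, 1]])

def span(G):  # all codewords of the subcode generated by G's rows
    return {tuple(np.array(m) @ G % 2) for m in product((0, 1), repeat=len(G))}

C1, C2 = span(G1), span(G2)

def coset(v, C):  # coset of subcode C containing v (32 cosets -> 5 bits)
    return frozenset(tuple((np.array(v) + np.array(c)) % 2) for c in C)

rng = np.random.default_rng(1)
X = rng.integers(0, 2, 7)
Y = X.copy()
Y[rng.integers(0, 7)] ^= 1       # Hamming distance exactly 1

# Joint decoder: the unique pair (x, y) in the two cosets with d_H <= 1
pairs = [(x, y) for x in coset(X, C1) for y in coset(Y, C2)
         if sum(a != b for a, b in zip(x, y)) <= 1]
print(len(pairs) == 1 and pairs[0] == (tuple(X), tuple(Y)))  # True
```

Uniqueness follows because any two valid pairs would differ by a nonzero codeword of the full (7,4,3) code with weight at most 2, which cannot exist.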

  11. Achievable rate region for the problem. The rate region is {(R1, R2) : R1 + R2 >= 10, R1 >= 3, R2 >= 3} (both axes run from 3 to 7 bits). • All 5 optimal points can be constructively achieved with the same complexity. • An alternative to the source-splitting approach (Rimoldi '97)

  12. Generalized coset codes (Forney '88). Example: • S = lattice {..., -1.5, -0.5, 0.5, 1.5, ...} (spacing 1) • S' = sublattice {..., -2.5, -0.5, 1.5, 3.5, ...} (spacing 2) • Construct sequences of cosets of S' in S in n dimensions.

  13. Example: let n = 4 (a 4-d Euclidean-space code). C is a binary code containing the codewords 0000 and 1011. Each bit of a codeword selects one of the two cosets of S' in S: bit 0 selects {-5.5, -3.5, -1.5, 0.5, 2.5, 4.5} and bit 1 selects {-4.5, -2.5, -0.5, 1.5, 3.5, 5.5}. For c = 1011, any sequence drawn from the selected sets, e.g. (-2.5, 2.5, -0.5, -4.5), is a valid codeword sequence.
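The bit-to-coset mapping of slide 13 can be spelled out directly. This is a small illustrative sketch: `coset0`/`coset1` are the two truncated cosets of the sublattice visible on the slide, and `valid_sequences` is an assumed helper name.

```python
from itertools import product

# Two cosets of the sublattice S' (spacing 2) inside S (spacing 1)
coset0 = [-5.5, -3.5, -1.5, 0.5, 2.5, 4.5]   # selected by bit 0
coset1 = [-4.5, -2.5, -0.5, 1.5, 3.5, 5.5]   # selected by bit 1

def valid_sequences(codeword):
    """All Euclidean sequences consistent with a binary codeword."""
    choices = [coset1 if b else coset0 for b in codeword]
    return product(*choices)

# Slide 13: c = 1011 admits sequences such as (-2.5, 2.5, -0.5, -4.5)
seq = (-2.5, 2.5, -0.5, -4.5)
print(seq in set(valid_sequences((1, 0, 1, 1))))  # True
```

Each binary codeword thus labels a whole set of Euclidean sequences, which is what makes the construction a coset code in n dimensions.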

  14. Generalized coset codes for distributed source coding. Two-level hierarchy of subcode construction: from the points {1, 7, -5, 13, -17, -11, 19, 25}, the subset {1, -17, 19} is assigned to encoder 1 and the subset {1, 7, 13} to encoder 2.

  15. Example 2: [equations rendered as images; not captured in the transcript]

  16. [figure not captured in the transcript]

  17. [...] is a sublattice of [...]

  18. [...] is the set of coset representatives of [...] in [...]

  19. Each encoder sends the index of the subset in the dense lattice L that contains its quantized codeword.

  20. Encoding: • Each encoder quantizes with the main lattice. • The index of the coset of its subset in the main lattice is sent. Decoding: • The decoder finds the pair of codewords in the given coset pair. • It then estimates the source. A similar subcode construction applies to generalized coset codes, giving computationally efficient encoding and decoding. Theorem 2: the decoding complexity equals that of decoding a single codeword in the underlying code.
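The encode/decode steps above can be sketched in one dimension. This is a minimal illustration under stated assumptions, not the paper's algorithm: `step` (main-lattice spacing), `M` (number of sublattice cosets), and the nearest-member search are all chosen for the sketch.

```python
# 1-D sketch: quantize with the main lattice, transmit only the coset
# index of the sublattice, resolve the ambiguity with side information.
step, M = 1.0, 4          # main-lattice spacing; M cosets of the sublattice

def encode(x):
    q = round(x / step)   # quantize with the main lattice
    return q % M          # send only the coset index (log2 M bits)

def decode(idx, y):
    # pick the coset member nearest the side information y
    q = round(y / step)
    base = q + ((idx - q) % M)
    candidates = [base, base - M]
    return min(candidates, key=lambda c: abs(c * step - y)) * step

x = 3.2
y = x + 0.3               # correlated side information
print(decode(encode(x), y))  # 3.0 (x recovered to main-lattice precision)
```

Decoding succeeds whenever y lies within roughly M*step/2 of the quantized x, which is the 1-D analogue of the correlation-distance condition on the next slide.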

  21. Correlation distance: • dc = the second-minimum distance between two codevectors in the coset pair (i, j). • A decoding error occurs only if the distance between the quantized codewords exceeds dc. Theorem 3 bounds dc in terms of dmin, the minimum distance of the code.

  22. Simulation results: trellis codes. Model: source X ~ i.i.d. Gaussian; observations Y_i = X + N_i, where N_i ~ i.i.d. Gaussian. Correlation SNR = ratio of the variances of X and N. Effective source-coding rate = 2 bits/sample/encoder. Quantizers: fixed-length scalar quantizers with 8 levels. Trellis codes with 16 states based on an 8-level root scalar quantizer.
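The correlation model of slide 22 is easy to instantiate. This sketch assumes unit source variance and illustrative variable names; it only generates the correlated observations and checks the correlation SNR, not the full trellis-coded system.

```python
import numpy as np

# Model: X ~ N(0, sx2), Y_i = X + N_i with N_i ~ N(0, sn2)
rng = np.random.default_rng(7)
c_snr_db = 22.0
sx2 = 1.0
sn2 = sx2 / 10 ** (c_snr_db / 10)   # correlation SNR fixes the noise variance

n = 100_000
X = rng.normal(0.0, np.sqrt(sx2), n)
Y = X + rng.normal(0.0, np.sqrt(sn2), n)

# Empirical correlation SNR should match the 22 dB target
est = 10 * np.log10(np.var(X) / np.var(Y - X))
print(round(est, 1))  # close to 22.0
```

At 22 dB the noise standard deviation is about 0.08, so Y tracks X closely, which is what makes the 2-bit/sample coset coding of the quantized values workable.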

  23. Results: probability of decoding error. The probability of decoding error is the same for all the rate pairs.

  24. Distortion performance versus the attainable bound: at C-SNR = 22 dB, normalized distortion = -15.5 dB.

  25. Special cases: 2. Lattice codes. The hexagonal lattice is partitioned between Encoder-1 and Encoder-2.

  26. Conclusions • Proposed a constructive framework for distributed source coding at arbitrary achievable rates • Generalized coset codes instantiate the framework • Distance properties and complexity are the same for all achievable rate points • Trellis and lattice codes are special cases • Simulation results support the design
