
Distributed Compression for Binary Symmetric Channels

Distributed Compression for Binary Symmetric Channels. Kivanc Ozonat. Introduction. Description of the Problem. Slepian-Wolf Theorem. Prior Work. Basic Encoder-Decoder Scheme. Methodology. Results.


Presentation Transcript


  1. Distributed Compression for Binary Symmetric Channels. Kivanc Ozonat.

  2. Introduction • Description of the Problem • Slepian-Wolf Theorem • Prior Work • Basic Encoder-Decoder Scheme • Methodology • Results

  3. Problem Description • Given two correlated data sets, a noisy version X at the decoder and the original Y at the encoder, how can Y be transmitted with the best coding efficiency? • X is not available at the encoder (no communication between the two sides). [Diagram: Y → Encoder → Decoder ← X]

  4. Slepian-Wolf Theorem • Slepian-Wolf: X and Y are encoded separately, at rates R1 and R2 respectively, and decoded jointly. [Diagram: X → encoder at rate R1, Y → encoder at rate R2, joint decoder outputs (X,Y)]

  5. Slepian-Wolf Theorem • Can transmit X and Y if: R1 > H(X|Y), R2 > H(Y|X), and R1 + R2 > H(X,Y). [Figure: the achievable rate region in the (R1, R2) plane, with corner points (H(X), H(Y|X)) and (H(X|Y), H(Y))]
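The three rate constraints can be checked numerically for a BSC-correlated source like the one used in the rest of the slides. A minimal sketch, assuming X is uniform and Y differs from X independently in each bit with crossover probability p (these modeling assumptions are mine, not stated on the slide):

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Assumed model: X ~ Bernoulli(0.5), Y = X xor N with N ~ Bernoulli(p)
p = 0.06
H_X = h2(0.5)              # 1 bit, since X is uniform
H_Y = h2(0.5)              # Y is also uniform under this model
H_Y_given_X = h2(p)        # only the channel noise remains once X is known
H_XY = H_X + H_Y_given_X   # chain rule: H(X,Y) = H(X) + H(Y|X)
H_X_given_Y = H_XY - H_Y   # equals h2(p) by symmetry

# At the corner point (R1, R2) = (H(X), H(Y|X)) the sum rate meets
# the joint bound R1 + R2 = H(X,Y) exactly.
assert abs(H_X + H_Y_given_X - H_XY) < 1e-12
```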

  6. Slepian-Wolf Theorem • Our problem is a special case of this: X is available at the decoder, so it suffices to operate at the corner point R1 = H(X), R2 = H(Y|X). [Figure: the rate region with this corner point marked]

  7. Prior Work • Bin the length-3 sequences so that the members of each bin are maximally separated: Bin 1: [0 0 0], [1 1 1]. Bin 2: [0 0 1], [1 1 0]. Bin 3: [0 1 0], [1 0 1]. Bin 4: [0 1 1], [1 0 0]. • To send Y = [0 1 1], the encoder transmits only the bin number (Bin 4); the decoder selects between [0 1 1] and [1 0 0] using the channel output.
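The binning above can be sketched as syndrome binning: the four bins are the cosets of the (3,1) repetition code, indexed by their 2-bit syndrome. This is an assumption consistent with the listed bins (the two members of each bin differ in all three positions), not something the slide states explicitly:

```python
from itertools import product

# Parity-check matrix of the (3,1) binary repetition code
H = ((1, 1, 0), (1, 0, 1))

def syndrome(word):
    """2-bit syndrome of a length-3 word; equal syndromes share a bin."""
    return tuple(sum(h * x for h, x in zip(row, word)) % 2 for row in H)

bins = {}
for word in product((0, 1), repeat=3):
    bins.setdefault(syndrome(word), []).append(word)

# The encoder sends the 2-bit syndrome of Y instead of all 3 bits;
# each bin holds exactly two words at Hamming distance 3.
```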

  8. Prior Work • How to maximally separate “very long” input sequences? Use error-correcting codes.

  9. Prior Work • A binary symmetric channel with crossover probability p (each bit is preserved with probability 1-p and flipped with probability p), with EQUAL input probabilities of 0 and 1, studied by Ramchandran and Pradhan. [Diagram: the BSC transition probabilities]

  10. Prior Work • What if the input probabilities are NOT EQUAL?

  11. Methodology [Diagram: Huffman-code the input sequence X; split the codewords into bit planes 1 through N; form the bins for each bit plane using error-correcting codes; the decoder recovers the data using the Y sequence as side information]

  12. Encoder • Inputs: 0 (with probability 0.7) and 1 (with probability 0.3). • Huffman-code length-2 sequences: 00  0 (probability 0.49), 01  10 (probability 0.21), 10  110 (probability 0.21), 11  111 (probability 0.09). • Bit plane 1: 0, 1, 1, 1. Bit plane 2: -, 0, 1, 1. Bit plane 3: -, -, 0, 1.
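The bit-plane construction above can be sketched directly. The codebook is the slide's own (written out rather than rebuilt by the Huffman algorithm); plane k simply collects the k-th bit of every codeword, with '-' marking codewords that are too short:

```python
# The slide's Huffman code for 2-bit blocks under P(0)=0.7, P(1)=0.3
codebook = {'00': '0', '01': '10', '10': '110', '11': '111'}

def bit_planes(codewords):
    """Plane k holds the k-th bit of each codeword ('-' where absent)."""
    depth = max(len(c) for c in codewords)
    return [''.join(c[k] if k < len(c) else '-' for c in codewords)
            for k in range(depth)]

planes = bit_planes([codebook[b] for b in ('00', '01', '10', '11')])
# planes == ['0111', '-011', '--01'], matching the three planes above

# Expected code length: 0.49*1 + 0.21*2 + 0.21*3 + 0.09*3 = 1.81 bits
# per 2-bit block, i.e. 0.905 bits per source bit.
```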

  13. Encoder • [0 0 1 0 0 1] is parsed into [00], [10], [01], Huffman-coded to [0], [110], [10], and split into the bit planes 011, -10, -0-. • Error-control coding is then applied to form the bins.
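The whole encoder pipeline on the slide's input can be sketched in a few lines, parsing the sequence into 2-bit blocks, coding each block, and splitting the variable-length codewords into planes (the codebook is the one from the previous slide):

```python
codebook = {'00': '0', '01': '10', '10': '110', '11': '111'}

def encode(bits):
    """Parse -> Huffman-code -> split into bit planes."""
    blocks = [bits[i:i + 2] for i in range(0, len(bits), 2)]
    codewords = [codebook[b] for b in blocks]
    depth = max(len(c) for c in codewords)
    planes = [''.join(c[k] if k < len(c) else '-' for c in codewords)
              for k in range(depth)]
    return codewords, planes

codewords, planes = encode('001001')
# codewords == ['0', '110', '10'], planes == ['011', '-10', '-0-'];
# each plane would then be binned with an error-control code.
```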

  14. Decoder • The decoder receives a BIN NUMBER, which corresponds to MULTIPLE CODEWORDS. • How to select the “correct” codeword out of these multiple codewords? • Use MAXIMUM LIKELIHOOD detection.

  15. Decoder • The decoder receives the bin number: Bin 4, containing [0 1 1] and [1 1 0], where these are Huffman codes for length-2 sequences. • Assume Y = [01, 11, 10], giving the side-information bits [z1 z2 z3]. • Compute the probability of [z1 z2 z3] given 01, 11, 10, using the channel error probability, and select the maximum-likelihood candidate.
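The maximum-likelihood step can be sketched as follows: score every codeword in the received bin against the side-information bits under a memoryless BSC, and keep the highest-scoring one. The bin contents and side bits below are illustrative values taken from the example above:

```python
p = 0.06  # channel error probability (the value fixed in the slides' plots)

def likelihood(candidate, side_bits):
    """P(side_bits | candidate) when each bit flips independently w.p. p."""
    prob = 1.0
    for c, z in zip(candidate, side_bits):
        prob *= p if c != z else 1.0 - p
    return prob

def ml_select(bin_codewords, side_bits):
    """Pick the bin member that best explains the side information."""
    return max(bin_codewords, key=lambda c: likelihood(c, side_bits))

# With bin {011, 110} and side bits 011, the first candidate wins,
# since it requires zero channel errors.
best = ml_select(['011', '110'], '011')
```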

  16. Parameters [Diagram: the same scheme as in the Methodology slide, with length-4 input blocks for the Huffman code and BCH(15, k) codes to form the bins; the decoder uses the Y sequence]
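A hedged sketch of what the BCH(15, k) choice implies for the rate: if each 15-bit segment of a bit plane is binned by the cosets of a BCH(15, k) code, the encoder sends n - k syndrome bits per segment. The (n, k) pairs below are the standard binary BCH codes of length 15; which k the slides actually used is not stated:

```python
n = 15
# Standard binary BCH(15, k) dimensions, correcting 1, 2, 3, and 7 errors
rates = {k: (n - k) / n for k in (11, 7, 5, 1)}
for k, r in sorted(rates.items()):
    print(f"BCH(15,{k:2d}): sends {n - k:2d} syndrome bits "
          f"-> rate {r:.3f} bits per plane bit")
```

A stronger code (smaller k) tolerates more channel errors per segment but sends more syndrome bits, so the BCH parameter trades robustness against compression.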

  17. Bit Rate vs. Probability of Occurrence of 0s (at the fixed error rate p of 0.06) [Figure]

  18. Difference between the Actual Bit Rate and the Slepian-Wolf Bound vs. Error Probability (p) [Figure]

  19. Conclusions • The Huffman code is not a very good choice. • Better error-correcting codes can be selected. • The scheme gives good results for low error probabilities (p) and for cases in which the Huffman code yields a nearly equal distribution of 0s and 1s.
