
Unexplored Connections in Information Theory and Coding

This presentation explores connections between multiple description (MD) source coding and distributed source coding, presenting a new rate region and code constructions that improve coding performance.



Presentation Transcript


  1. Multiple Description Coding and Distributed Source Coding: Unexplored Connections in Information Theory and Coding Theory. S. Sandeep Pradhan, Department of EECS, University of Michigan, Ann Arbor (joint work with R. Puri and K. Ramchandran of the University of California, Berkeley).

  2. Transmission of sources over packet networks
[Figure: encoder sends packets 1, 2, …, n over a packet erasure network; the decoder reconstructs X̂ from the packets that arrive]
• Best-effort networks: modeled as packet erasure channels.
• User Datagram Protocol (UDP); example: the Internet.
• Multimedia traffic over the Internet is growing fast.

  3. Multiple Description Source Coding
[Figure: the MD encoder produces Description 1 (rate R1) and Description 2 (rate R2); Side Decoder 1 achieves distortion D1, Side Decoder 2 achieves D2, and the Central Decoder, receiving both descriptions, achieves D0]
Find the set of all achievable tuples (R1, R2, D1, D2, D0).

  4. Prior Work
Information theory (incomplete list):
• Cover-El Gamal '80: achievable rate region for 2-channel MD.
• Ozarow '81: converse for Gaussian sources.
• Berger, Ahlswede, Zhang '80-'90.
• Venkataramani et al. '01: extension of the Cover-El Gamal region to n channels.
Finite-block-length codes (incomplete list):
• Vaishampayan '93: MD scalar and vector quantizers.
• Wang-Orchard-Reibman '97: MD transform codes.
• Goyal-Kovacevic '98: frames for MD.
• Puri-Ramchandran '01: FEC for MD.

  5. Main idea in random codes for 2-channel MD (Cover-El Gamal)
Fix p(x1) and p(x2), and draw the two codebooks independently. Encoding: find a pair of codewords that is jointly typical with the source word with respect to p(x, x1, x2).
Possible if R1 ≥ I(X; X1), R2 ≥ I(X; X2), and R1 + R2 ≥ I(X; X1, X2) + I(X1; X2) (the El Gamal-Cover conditions).

  6. Possible ideas for n-channel MD?
• Extend Cover-El Gamal random codes from 2 to n channels (Venkataramani et al.).
• Use maximum distance separable (MDS) erasure codes (Albanese et al. '95).

  7. Erasure codes
[Figure: distortion vs. number of received packets, dropping sharply at k and flat thereafter (the "cliff" effect); the encoder sends source packets through the channel and the decoder sees a subset]
• Erasure codes (n, k, d): add (n − k) parity symbols to k source symbols.
• MDS codes: d = n − k + 1.
• MDS ⇒ any k channel symbols recover the k source symbols. A minimal numeric sketch follows.
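
To make the "any k of n" property concrete, here is a minimal, hypothetical sketch of an (n, k) MDS erasure code over a prime field, built from a Vandermonde (polynomial-evaluation) generator matrix. Real systems typically use Reed-Solomon codes over GF(2^8); the field size, the helper names encode/decode, and the toy message are all illustrative assumptions.

```python
# Sketch of an (n, k) MDS erasure code over GF(p): any k of the n
# channel symbols determine the k source symbols.
p = 257  # prime field size; must exceed n

def encode(msg, n):
    """Evaluate the message polynomial at points 1..n (Vandermonde encoding)."""
    return [sum(m * pow(x, j, p) for j, m in enumerate(msg)) % p
            for x in range(1, n + 1)]

def decode(received, k):
    """Recover the k source symbols from any k (point, value) pairs by
    solving the Vandermonde system with Gaussian elimination mod p."""
    pts = received[:k]
    A = [[pow(x, j, p) for j in range(k)] for x, _ in pts]
    b = [v for _, v in pts]
    for col in range(k):                       # forward + backward elimination
        piv = next(r for r in range(col, k) if A[r][col])
        A[col], A[piv], b[col], b[piv] = A[piv], A[col], b[piv], b[col]
        inv = pow(A[col][col], p - 2, p)       # Fermat inverse of the pivot
        A[col] = [a * inv % p for a in A[col]]
        b[col] = b[col] * inv % p
        for r in range(k):
            if r != col and A[r][col]:
                f = A[r][col]
                A[r] = [(a - f * ac) % p for a, ac in zip(A[r], A[col])]
                b[r] = (b[r] - f * b[col]) % p
    return b

msg = [42, 7]                                   # k = 2 source symbols
chan = encode(msg, 3)                           # n = 3 channel symbols
assert decode([(1, chan[0]), (3, chan[2])], 2) == msg  # any 2 of 3 suffice
```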

  8. Fix: use many MDS codes
Example for 3 channels (Albanese et al. '95, Puri-Ramchandran '99): a successively refinable source-encoded bitstream is split into layers, and the layers are protected by (3,1), (3,2), and (3,3) MDS erasure codes respectively to form Description 1, Description 2, and Description 3. A few-line sketch follows.
[Figure: distortion decreasing in steps as packets 1, 2, 3 arrive]
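
The layered scheme can be sketched in a few lines. The layer contents, the packet layout, and the byte-wise XOR parity used for the (3,2) code below are my own illustrative assumptions, not the exact construction from the cited papers.

```python
# Three descriptions from a successively refinable bitstream: layer j is
# protected by a (3, j) MDS code, so receiving any j packets decodes
# layers 1..j of the stream.
def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def make_descriptions(layer1, layer2, layer3):
    """layer2 and layer3 must split into 2 and 3 equal parts, respectively."""
    h, t = len(layer2) // 2, len(layer3) // 3
    l2 = [layer2[:h], layer2[h:], xor(layer2[:h], layer2[h:])]  # (3,2): parity
    l3 = [layer3[i * t:(i + 1) * t] for i in range(3)]          # (3,3): none
    return [layer1 + l2[i] + l3[i] for i in range(3)]           # (3,1): repeat

# Decoding with any 2 packets: layer 1 from either copy; layer 2 from the
# two received (3,2) symbols (XOR back if a data part is missing);
# layer 3 becomes fully available only when all 3 packets arrive.
d = make_descriptions(b"base", b"midpart!", b"refine")
got = [d[0], d[2]]                        # packets 1 and 3 received
part1, parity = got[0][4:8], got[1][4:8]  # the (3,2) symbols in those packets
part2 = xor(part1, parity)                # recover the erased middle symbol
assert part1 + part2 == b"midpart!"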

  9. Outline of our strategy:
• Start by looking at MDS erasure codes from a different perspective.
• Connect this code to a distributed source coding problem.
• Construct random codes based on this connection: (n,k) source-channel erasure codes.
• New rate region for MD: a concatenation of these (n,k) source-channel erasure codes.
What is new in our work?
• Symmetric problem, # of descriptions > 2.
• A fundamental connection between MD coding and distributed source coding.
• New rate region for MD: random binning inspired by distributed source coding.
• Constructions for MD: an extension of our earlier work (DISCUS) on coset codes for distributed source coding.

  10. Idea #1: A new look at (n,1,n) MDS codes
[Figure: n identical packets 1, 2, 3, …, n−1, n]
• (n, 1, n) "bit" code: all packets are identical (repetition).
• Reception of any one packet enables reconstruction.
• Reception of more than one packet does not improve quality.
• The parity bits are wasted…

  11. Idea #1 (contd.): (n,1) source-channel erasure code
[Figure: packets 1, 2, 3, …, n−1, n, each carrying a different quantized version of X]
• Put independently quantized versions of X on every packet.
• Reception of any one packet enables reconstruction.
• Reception of more packets enables better reconstruction (estimation gains due to multiple looks!). A toy simulation of this effect follows.
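
Here is a toy Monte-Carlo illustration of those estimation gains, under my own assumption of subtractively dithered uniform scalar quantizers (the slides do not specify the quantizers); the step size and sample values are arbitrary.

```python
# Each packet carries an independently dithered quantization of the same
# sample X. With subtractive dither the quantization errors are independent
# and uniform, so averaging m received looks cuts the MSE roughly as 1/m.
import random

def send(x, n, step=1.0):
    """n independently dithered scalar quantizations of x."""
    packets = []
    for _ in range(n):
        d = random.uniform(-step / 2, step / 2)   # subtractive dither
        q = step * round((x + d) / step)          # uniform quantizer
        packets.append((q, d))
    return packets

def reconstruct(received):
    """Average the dither-corrected looks."""
    return sum(q - d for q, d in received) / len(received)

random.seed(0)
x, trials = 3.14159, 2000
for m in range(1, 6):                             # number of received packets
    mse = sum((reconstruct(send(x, 5)[:m]) - x) ** 2
              for _ in range(trials)) / trials
    print(f"{m} packet(s): MSE ~ {mse:.4f}")      # about step^2 / (12 * m)
```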

  12. Extensions to (n,k) source-channel codes
• Can we generalize this to (n,k) source-channel codes?
• Yes: a random binning (coset code) approach!
• Using the Slepian-Wolf and Wyner-Ziv theorems.
• A conceptual leap using binning takes us from the (n,1) code to the (n,k) code.

  13. Idea #2: Consider a (3,2,2) MDS code
There is inherent uncertainty at the encoder about which packets will be received by the decoder. This calls for a coding strategy in which the decoder has access to some information that the encoder does not: distributed source coding.

  14. Background: distributed source coding (Slepian-Wolf '73, Wyner-Ziv '76, Berger '77)
[Figure: two separate encoders compress correlated sources X and Y; a single decoder reconstructs (X, Y)]
• Exploiting correlation without direct communication between the encoders.
• X and Y are correlated sources.
• Optimal rate region: Slepian-Wolf 1973.

  15. Distributed source coding (contd.)
Rate region (Slepian-Wolf): RX ≥ H(X|Y), RY ≥ H(Y|X), RX + RY ≥ H(X,Y).
Achievability: random partitions (bins) of the typical sets; each encoder sends only its bin index, and the decoder picks the jointly typical pair across the received bins. A constructive, linear-code version of this binning is sketched below.
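
As a concrete stand-in for the random bins, here is a classic syndrome-based sketch in the spirit of DISCUS (which the talk turns to later). The setup is my assumption: X is a 7-bit word, the decoder's side information Y differs from X in at most one bit, and the encoder sends only X's 3-bit syndrome under the (7,4) Hamming code, i.e., the index of the coset (bin) containing X.

```python
# Binning made constructive: the bins are the cosets of the (7,4) Hamming
# code. Since the code has minimum distance 3, each coset contains at most
# one word within Hamming distance 1 of Y, so the bin index plus Y pins
# down X exactly: 3 bits sent instead of 7.
import random

H = [0b1010101, 0b0110011, 0b0001111]            # parity-check rows

def syndrome(x):
    return tuple(bin(x & h).count("1") % 2 for h in H)

def decode(y, s):
    """Find the unique word in bin s within Hamming distance 1 of y."""
    for e in [0] + [1 << i for i in range(7)]:   # error patterns of weight <= 1
        if syndrome(y ^ e) == s:
            return y ^ e
    raise ValueError("side information too noisy for this bin size")

random.seed(1)
x = random.getrandbits(7)
y = x ^ (1 << random.randrange(7))               # Y = X with one bit flipped
assert decode(y, syndrome(x)) == x
# Note: 3 = log2(8) matches H(X|Y) when the flip pattern is uniform over
# the 8 possibilities (no flip, or one of 7 positions).
```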

  16. Idea #2 (contd.): Are there any telltale signs of symmetric overcomplete partitioning in (3,2,2) MDS codes?
[Figure: tables of the two-bit words 00, 01, 10, 11 arranged into overlapping partitions, with each channel bit 0/1 indexing one partition]

  17. Idea #2 (contd.): Instead of a single codebook, build 3 different codebooks (quantizers) and then partition them (overcompletely).
[Figure: distortion vs. number of received packets (0, 1, …, k, …, n)]

  18. Problem formulation: (n,k) source-channel erasure code
[Figure: encoders 1 through n each send one packet describing X through the packet erasure channel; the decoder outputs X̂]
• The decoder starts reconstruction once m ≥ k packets arrive.
• Every packet is transmitted at the same rate.
• Distortion is a function only of the number of received packets.
• Symmetric formulation, n > 2.

  19. Problem formulation: notation
• Source X ~ q(x), alphabet 𝒳, blocklength L.
• Bounded distortion measure d(·,·).
• Encoder i: maps the source block in 𝒳^L to an index in {1, …, exp2(LR)}.
• Decoder: for each subset of received packets, maps the received indices to a reconstruction in 𝒳̂^L.
• Distortion with h packets = the expected distortion of the reconstruction built from any h received indices.

  20. Problem statement (contd.): What is the best distortion tuple (Dk, …, Dn) for a rate of R bits/sample/packet?
Main result: a distortion tuple is achievable if, for some p.m.f. on auxiliary random variables (Y1, …, Yn) given X and a set of reconstruction functions, the matching single-letter rate and distortion conditions hold; the (3,2) example on the next slide illustrates the rate condition.

  21. Example: (3,2) code
• (3,2) code: the Yi have the same p.d.f.
• 3 codebooks, each of rate I(X;Yi), are constructed randomly.
• Each codebook is partitioned into exp2(LR) bins.
• # of codewords in a bin is exponential in L·(1/2)·I(Y1;Y2).
• Thus 2R = I(X;Y1) + I(X;Y2) − I(Y1;Y2).

  22. Example of a Gaussian source: (3,2,2) code
[Figure: distortion vs. number of received packets at 1 bit/sample/packet]
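
A hedged numeric companion to the Gaussian example: assume additive-noise test channels Yi = X + Ni with i.i.d. noise (an assumption for illustration; the paper optimizes the noise correlation, so these numbers are only indicative) and evaluate the rate formula from the previous slide together with the MMSE distortions. The variances sx2 and sn2 are arbitrary.

```python
# Evaluate 2R = I(X;Y1) + I(X;Y2) - I(Y1;Y2) for Gaussian test channels
# Yi = X + Ni, plus the MMSE distortion from m received looks at X.
from math import log2

sx2, sn2 = 1.0, 0.25                 # Var(X), Var(Ni): illustrative values

I_XY = 0.5 * log2(1 + sx2 / sn2)     # I(X; Yi), one look
rho = sx2 / (sx2 + sn2)              # correlation coefficient of Y1, Y2
I_Y1Y2 = -0.5 * log2(1 - rho ** 2)   # I(Y1; Y2), the binning discount

R = 0.5 * (2 * I_XY - I_Y1Y2)        # rate per packet for the (3,2) code
print(f"rate per description: {R:.3f} bits/sample")

for m in (2, 3):                     # a (3,2) code decodes once m >= 2 arrive
    D = 1 / (1 / sx2 + m / sn2)      # Var(X | Y1..Ym): MMSE from m looks
    print(f"{m} packet(s): distortion {D:.4f}")
```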

  23. Idea #3: n-channel symmetric MD: concatenation of (n,1), (n,2), …, (n,n) source-channel erasure codes
[Figure: for 3 channels, packet i carries Y1i from the (3,1) code, Y2i from the (3,2) code, and Y3 from the (3,3) code]
• Base layer: reception of any one packet ⇒ decode the (3,1) code.
• Middle layer: (3,2) code ⇒ the side information includes one other part of the middle layer (source-channel erasure codes!) and two parts of the base layer.
• Final layer: (3,3) code ⇒ refine everything.
• Every part of the bitstream contributes to the source reconstruction.

  24. Key concepts:
• Multiple quantizers that can introduce correlated quantization noise: MD lattice VQ (Vaishampayan, Sloane, Diggavi '01).
• Computationally efficient multiple binning schemes: symmetric distributed source coding using coset codes (Pradhan-Ramchandran '00; Schonberg, Pradhan, Ramchandran '03).
• Note: different from single binning schemes (Zamir-Shamai '98, Pradhan-Ramchandran '99).

  25. A (3,2) Source Channel Lattice Code
• Y1, Y2, Y3 are correlated quantized versions of the source X.
• d(Yi, Yj) ≤ 2.

  26. A (3,2) Source Channel Lattice Code
• A code of distance 5 overcomes correlation noise of 2 (distance 5 > 2·2, so decoding under noise of magnitude 2 is unambiguous).

  27. A (3,2) Source Channel Lattice Code

  28. A (3,2) Source Channel Lattice Code • Partitioning through cosets: constructive counterpart of “random bins”.

  29. A (3,2) Source Channel Lattice Code
Suppose 2 observations, Y1 and Y2. Asymmetric case: Y2 is available at the decoder. A code that combats the correlation noise ensures correct decoding of Y1 from its coset index. A scalar sketch of this step follows.
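
Here is a one-dimensional sketch of the asymmetric decoding step, with the integer quantizer lattice Z and the coarse code 5Z standing in for the lattice pair in the figures; the parameters and helper names are illustrative.

```python
# Y1, Y2 are integer quantizer outputs with |Y1 - Y2| <= 2 (the correlation
# noise). The code 5Z has distance 5 > 2*2, so each coset of 5Z contains at
# most one point within distance 2 of Y2. Encoder 1 therefore sends only
# the coset index Y1 mod 5, and the decoder resolves it using Y2.
def encode(y1, step=5):
    return y1 % step                     # bin index: ~log2(5) bits, not the full range

def decode(coset, y2, step=5):
    """Pick the coset element nearest to y2; unique winner when |Y1 - Y2| <= 2."""
    cands = [c for c in range(y2 - step, y2 + step + 1) if c % step == coset]
    return min(cands, key=lambda c: abs(c - y2))

y1, y2 = 103, 101                        # correlated looks, |Y1 - Y2| <= 2
assert decode(encode(y1), y2) == y1      # decoder recovers Y1 exactly
```

The symmetric case on the next two slides splits the generator of the coarse code between the two encoders instead of giving one encoder the whole coset index.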

  30. A (3,2) Source Channel Lattice Code
Suppose 2 observations, Y1 and Y2. Symmetric case: split the generator vectors of the code; encoder 1 gets the rows, encoder 2 gets the columns.

  31. A (3,2) Source Channel Lattice Code
[Figure: the split lattice, with encoder 1's row generator and encoder 2's column generator]
Suppose 2 observations, Y1 and Y2. Symmetric case: split the generator vectors of the code; encoder 1 gets the rows, encoder 2 gets the columns.

  32. A (3,2) Source Channel Lattice Code
• Y1, Y2, Y3 are independently quantized versions of the source X.
• d(Yi, Yj) ≤ 2.

  33. A (3,2) Source Channel Lattice Code
• Find 3 generator vectors such that any two generate the code.
• Encoder 1 gets the rows, encoder 2 the columns, encoder 3 the diagonal.

  34. A (3,2) Source Channel Lattice Code
[Figure: the three sub-generators assigned to encoders 1, 2, and 3]
• Find 3 generator vectors such that any two are linearly independent.
• Encoder 1 gets the rows, encoder 2 the columns, encoder 3 the diagonal.

  35. Constructions for general n and k
• Choose a code (generator matrix G) that combats the correlation noise, e.g., G = [5 0; 0 5], the lattice 5Z².
• Split the rows of G into k submatrices (k generator sets S1, …, Sk), e.g., G1 = [5 0] and G2 = [0 5].
• Need a way to generate n generator sets out of the k such that any k of them are equivalent to G.
• Choose a generator matrix M (dim. k × n) of an (n,k) MDS block code; it has the property that any k columns are independent, e.g., M = [1 0 1; 0 1 1].

  36. Constructions for general n and k (contd.)
• Using the weights from the n columns of M, one column at a time, linearly combine the k generator sets (S1, …, Sk) to come up with n encoding matrices, e.g., G1 = [5 0], G2 = [0 5], G3 = [5 5].
• Efficient algorithms for encoding and decoding exist in the coset code framework (Forney 1991). A small sanity check of the construction follows.
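
A minimal check of this recipe in Python, under the assumption (my reading of the running example) that G = [5 0; 0 5] and that M is the single-parity (3,2) MDS matrix with columns (1,0), (0,1), (1,1):

```python
# Combine the generator sets G1, G2 with the columns of M to get three
# generator vectors, then verify that any k = 2 of the n = 3 vectors
# generate a lattice with the same determinant as G. Since each vector
# already lies in 5Z^2 and |det| = 25 = det(5*I2), each pair generates
# exactly the original code 5Z^2.
G1, G2 = (5, 0), (0, 5)
M = [(1, 0), (0, 1), (1, 1)]                  # columns of a (3,2) MDS code

gens = [tuple(a * G1[j] + b * G2[j] for j in range(2)) for a, b in M]
print(gens)                                   # [(5, 0), (0, 5), (5, 5)]

def det(u, v):
    return u[0] * v[1] - u[1] * v[0]

for i in range(3):
    for j in range(i + 1, 3):
        assert abs(det(gens[i], gens[j])) == 25
```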

  37. Conclusions
• New rate region for the n-channel MD problem.
• A new connection between the MD problem and the distributed source coding problem.
• A new application of multiple binning schemes.
• Constructions based on coset codes.
• A nice synergy between quantization and MDS erasure codes.
