
Distributed Joint Source-Channel Coding on a Multiple Access Channel with Side Information



  1. Distributed Joint Source-Channel Coding on a Multiple Access Channel with Side Information Vinod Sharma

  2. Outline • Model • General Theorem • Special cases • Example • GMAC • Discrete Sources • Gaussian Sources • Orthogonal channels • Fading Channel • Hierarchical Networks • Conclusions

  3. Joint Source-Channel Coding on a Multiple Access Channel [Block diagram: source outputs U1n and U2n, with encoder side information Z1n and Z2n, enter Encoder 1 and Encoder 2; the channel inputs X1n and X2n pass through the MAC to give Yn, which the decoder uses together with side information Zn.]

  4. Joint Source-Channel Coding on a Multiple Access Channel (contd) • {Uin, n ≥ 1}: iid data generated by user i. • (U1n, U2n) are correlated. • Zin: side information at encoder i. • Zn: side information at the decoder. • Xin: channel input from the i-th encoder at time n. • Yn: channel output at time n. • The MAC is memoryless: p(yn | x1^n, x2^n, y^(n-1)) = p(yn | x1n, x2n). • Aim: to transmit data over the MAC so that the decoder can reconstruct it within a given distortion. • Source-channel separation does not hold for this system, so joint source-channel coding is needed.

  5. Transmission of data from Sensor nodes to Fusion Node [Figure: sensor nodes in a cluster communicate over a multiple access channel to their cluster head, which forwards the data (with side information) to the fusion node.]

  6. Transmission of data from Sensor nodes to Fusion Node • Sensor nodes transmit data to their cluster heads. • Cluster heads are more powerful nodes. • Cluster heads transmit data directly to the fusion node. • Within a cluster, the uplink to the cluster head is a multiple access channel. • Due to the proximity of the sensor nodes, the data they generate is correlated. • The MAC is also a building block for general sensor networks.

  7. Theorem (R Rajesh, V K Varshneya, V Sharma, “Distributed joint source channel coding on a multiple access channel with side information”, ISIT 08) The sources can be transmitted over the multiple access channel with distortions (D1, D2) if there exist random variables (W1, W2, X1, X2) such that 1) the joint distribution factors as p(u1, u2, z1, z2, z) p(w1 | u1, z1) p(w2 | u2, z2) p(x1 | w1) p(x2 | w2), 2) there exists a function fD : W1 x W2 x Z → (Û1, Û2) such that E[di(Ui, Ûi)] ≤ Di, i = 1, 2, and 3) the rate constraints on the next slide hold.

  8. Theorem (contd) 3) • I(U1, Z1; W1 | W2, Z) < I(X1; Y | X2, W2, Z) • I(U2, Z2; W2 | W1, Z) < I(X2; Y | X1, W1, Z) • I(U1, U2, Z1, Z2; W1, W2 | Z) < I(X1, X2; Y | Z) where Wi denotes the set in which Wi takes values.

  9. Theorem (contd) The proof of the theorem uses: • Vector quantization. • Joint source-channel coding. • Decoding by joint weak typicality. • Estimation of the sources. Comment: correlation in (U1, U2) helps by reducing the left-hand sides and increasing the right-hand sides of the constraints.

  10. Generalizations The above theorem can be generalized to • the multiple-user case, • recovering functions of the sources, • discrete/continuous source and channel alphabets, including the practically important Gaussian MAC, • unbounded distortion measures, including mean squared error (MSE).

  11. Special Cases • Lossless MAC with correlated sources • (Z1, Z2, Z) independent of (U1, U2), Wi = Ui, i = 1, 2. • Then our result reduces to: • H(U1|U2) < I(X1; Y | X2, U2), • H(U2|U1) < I(X2; Y | X1, U1), • H(U1, U2) < I(X1, X2; Y). • Recovers Cover, El Gamal, Salehi (1980).

  12. Special Cases 2. Lossy multiple access communication: take (Z1, Z2, Z) independent of (U1, U2, W1, W2). Then our result reduces to I(U1; W1 | W2) < I(X1; Y | X2, W2), I(U2; W2 | W1) < I(X2; Y | X1, W1), I(U1, U2; W1, W2) < I(X1, X2; Y). Generalizes Cover, El Gamal, Salehi (1980) to the lossy case.

  13. Special Cases 3. Lossy distributed source coding with side information: the MAC is taken as a dummy channel with Y = (X1, X2). R1 > I(U1, Z1; W1 | W2, Z), R2 > I(U2, Z2; W2 | W1, Z), R1 + R2 > I(U1, U2, Z1, Z2; W1, W2 | Z). Generalizes Slepian-Wolf (1973), Wyner and Ziv (1976), Gastpar (2004). • 4. Lossy GMAC • Y = X1 + X2 + N • Generalizes Lapidoth and Tinguely (2006)

  14. Special Cases 5. Compound MAC and interference channel with side information. Two decoders: decoder i has access to Yi and Zi. Take Ui = Wi, i = 1, 2. Applying the Theorem twice (receiver-only side information) for i = 1, 2: • H(U1 | U2, Zi) < I(X1; Yi | X2, U2, Zi), • H(U2 | U1, Zi) < I(X2; Yi | X1, U1, Zi), • H(U1, U2 | Zi) < I(X1, X2; Yi | Zi). This gives sufficient conditions for interference channels in the strong interference case. Recovers the results of D. Gunduz and E. Erkip (ISIT 07).

  15. Special Cases Also recovers the following results: • Lossless transmission over a MAC with receiver side information – D. Gunduz, E. Erkip, UCSD ITA Workshop 07 • Mixed side information – M. Fleming, M. Effros, IEEE TIT 06 • Lossless multiple access communication with common information – Slepian and Wolf, 1973 • Correlated sources over orthogonal channels – J. Barros and S. Servetto, ISIT 03

  16. Example 1 (U1, U2) have joint distribution p(0, 0) = p(1, 1) = 1/3, p(1, 0) = p(0, 1) = 1/6, so H(U1) = H(U2) = 1. For lossless transmission using independent coding we need rates R1 ≥ H(U1), R2 ≥ H(U2). Exploiting the correlation via Slepian-Wolf coding we need R1 ≥ H(U1|U2) = 0.918, R2 ≥ 0.918 and R1 + R2 ≥ H(U1, U2) = 1.918.
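A quick numerical check of the entropies quoted above (a minimal sketch in Python; the joint pmf is the one on the slide):

```python
import numpy as np

# Joint pmf of (U1, U2) from Example 1:
# p(0,0) = p(1,1) = 1/3, p(0,1) = p(1,0) = 1/6
p = np.array([[1/3, 1/6],
              [1/6, 1/3]])

def H(probs):
    """Entropy in bits, ignoring zero-probability entries."""
    probs = np.asarray(probs, dtype=float).ravel()
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

H_joint = H(p)                      # H(U1, U2)
H_U1 = H(p.sum(axis=1))             # marginal entropy of U1
H_U2 = H(p.sum(axis=0))             # marginal entropy of U2
H_U1_given_U2 = H_joint - H_U2      # chain rule: H(U1|U2) = H(U1,U2) - H(U2)

print(H_U1, H_U2)                   # 1.0, 1.0
print(H_U1_given_U2)                # ~0.918
print(H_joint)                      # ~1.918
```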

  17. Example (contd) MAC: Y = X1 + X2, where X1, X2 take values in {0, 1} and Y in {0, 1, 2}. This MAC does not satisfy the source-channel separation conditions. • The sum capacity of this channel with independent X1, X2 is 1.5 bits per symbol interval. • So even Slepian-Wolf coding will not provide lossless transmission.

  18. Example (contd) Joint source-channel coding: • Take Xi = Ui, i = 1, 2. • Then the channel can carry a sum rate of 1.585 bits. • Still we cannot transmit losslessly. With distortion: • Consider Hamming distortion. • Allowable distortion = 4%. • Then we need R1 ≥ H(U1) − h(0.04) = 0.758. • Thus with independent coding we cannot achieve this distortion, but with correlated (X1, X2) it is possible.
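The channel-side numbers can be checked the same way. Since the adder MAC is noiseless, the achievable sum rate equals H(Y); the sketch below computes it for independent and for correlated inputs (Xi = Ui), together with the rate bound 1 − h(0.04) for a uniform binary source under Hamming distortion:

```python
import numpy as np

def H(probs):
    probs = np.asarray(probs, dtype=float)
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

# Noiseless adder MAC: Y = X1 + X2, so I(X1, X2; Y) = H(Y).
# Independent, uniform X1, X2 in {0, 1}:
pY_indep = [0.25, 0.5, 0.25]            # P(Y = 0, 1, 2)
print(H(pY_indep))                      # 1.5 bits: sum capacity with independent inputs

# Correlated inputs X_i = U_i with the pmf of Example 1:
p = np.array([[1/3, 1/6],
              [1/6, 1/3]])              # p(u1, u2)
pY_corr = [p[0, 0], p[0, 1] + p[1, 0], p[1, 1]]
print(H(pY_corr))                       # log2(3) ~ 1.585 bits, still < H(U1, U2) = 1.918

# Rate needed for Hamming distortion d = 0.04 on a uniform binary source: R(d) = 1 - h(d)
def hb(x):
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

print(1 - hb(0.04))                     # ~0.758 bits
```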

  19. Example (contd) With side information: let Z1 be obtained from U2 via a binary symmetric channel with crossover probability 0.3; similarly for Z2. Let Z = (Z1, Z2, V) with V = U1.U2.N, where N is independent of (U1, U2) and P[N = 0] = P[N = 1] = 0.5. The sum rate needed for lossless transmission is ≥ 1.8 bits with Z1 only, ≥ 1.683 bits with Z1 and Z2, ≥ 1.606 bits with Z only, and ≥ 1.412 bits with Z1, Z2 and Z. Thus with Z1, Z2, Z we can transmit losslessly with independent coding.

  20. Gaussian MAC (GMAC) Discrete alphabet sources over a GMAC: Y = X1 + X2 + N, where N is independent of (X1, X2) and N ~ N(0, σN²). Initially we consider no side information. Power constraint: E[Xi²] ≤ Pi, i = 1, 2. For lossless transmission we need H(U1|U2) < I(X1; Y | X2, U2), H(U2|U1) < I(X2; Y | X1, U1), (1) H(U1, U2) < I(X1, X2; Y), where X1 - U1 - U2 - X2 forms a Markov chain. Comment: the above inequalities are not explicit enough; we make them more explicit below.

  21. Lemma H(U1|U2) < I(X1; Y | X2, U2) ≤ I(X1; Y | X2), H(U2|U1) < I(X2; Y | X1, U1) ≤ I(X2; Y | X1), (1a) H(U1, U2) < I(X1, X2; Y). Lemma 1: I(X1; Y | X2), I(X2; Y | X1) and I(X1, X2; Y) are maximized by jointly Gaussian (X1, X2) with mean zero and correlation ρ.

  22. Comments • The conditions obtained by relaxing the RHS in (1) are more explicit. • They are used to obtain efficient coding schemes. • For a candidate coding scheme, check whether (1) is satisfied; otherwise change ρ appropriately. We develop a distributed coding scheme that maps Ui to Xi such that (X1, X2) is jointly Gaussian with a given correlation ρ.

  23. Lemma Lemma 2: If X1 - U1 - U2 - X2 is a Markov chain and (X1, X2) is jointly Gaussian, then the correlation of (X1, X2) is bounded above by a quantity determined by the joint distribution of (U1, U2). Lemma 3: Any two-dimensional jointly Gaussian density can be approximated arbitrarily closely by a weighted sum of products of marginal Gaussian densities.

  24. A joint source-channel coding scheme (R Rajesh, V Sharma, “A joint source-channel coding scheme for transmission of discrete correlated sources over a Gaussian MAC”, ISITA 08) • The weights pi, qi in Lemma 3 can be negative. • Thus, for approximating f(x1, x2), the desired jointly Gaussian density with correlation ρ, • find the g from Lemma 3 that minimizes the approximation error subject to pi ≥ 0, qi ≥ 0, ΣΣ pi qi = 1 and zero mean.
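A minimal numerical sketch of this approximation step. The grid, the component means, the per-component variances (P1, P2) and the nonnegative least-squares objective are illustrative assumptions, not the exact construction in the ISITA 08 paper, and the zero-mean constraint is not enforced here:

```python
import numpy as np
from scipy.optimize import nnls
from scipy.stats import norm, multivariate_normal

rho, P1, P2 = 0.3, 3.0, 4.0                 # assumed target correlation and powers
target = multivariate_normal(mean=[0, 0],
                             cov=[[P1, rho * np.sqrt(P1 * P2)],
                                  [rho * np.sqrt(P1 * P2), P2]])

# Hypothetical basis: products of 1-D Gaussians with a few shifted means.
means = np.linspace(-3, 3, 7)
x = np.linspace(-6, 6, 61)
X1, X2 = np.meshgrid(x, x)
f = target.pdf(np.dstack([X1, X2])).ravel()  # target density sampled on the grid

# Each column is one product density N(x1; m1, P1) * N(x2; m2, P2).
basis = []
for m1 in means:
    for m2 in means:
        col = (norm.pdf(X1, loc=m1, scale=np.sqrt(P1)) *
               norm.pdf(X2, loc=m2, scale=np.sqrt(P2))).ravel()
        basis.append(col)
A = np.column_stack(basis)

w, _ = nnls(A, f)                            # nonnegative mixture weights
w /= w.sum()                                 # renormalise so the mixture integrates to 1
approx = (A @ w).reshape(X1.shape)
print("max pointwise error:", np.abs(approx - target.pdf(np.dstack([X1, X2]))).max())
```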

  25. Discrete Alphabet sources over a GMAC (contd) Thus taking (X1, X2) jointly Gaussian with mean zero, correlation ρ and var(Xi) = Pi provides the conditions for lossless transmission: H(U1|U2) < 0.5 log2(1 + P1(1 − ρ²)/σN²), H(U2|U1) < 0.5 log2(1 + P2(1 − ρ²)/σN²), (2) H(U1, U2) < 0.5 log2(1 + (P1 + P2 + 2ρ√(P1P2))/σN²).

  26. Example 2 (U1, U2) has distribution p(0, 0) = p(1, 1) = p(0, 1) = 1/3, p(1, 0) = 0. Power constraints: P1 = 3, P2 = 4, σN² = 1, H(U1, U2) = 1.585. • For independent (X1, X2), the RHS of the third inequality in (2) is 1.5. • Thus (U1, U2) cannot be transmitted on the GMAC with independent (X1, X2). • Any ρ ∈ [0.144, 0.7024] satisfies all three constraints.
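The feasible range of ρ quoted above can be reproduced by sweeping ρ over the three conditions in (2); a small sketch:

```python
import numpy as np

# Example 2: p(0,0) = p(1,1) = p(0,1) = 1/3, p(1,0) = 0, indexed as p[u1, u2]
p = np.array([[1/3, 1/3],
              [0.0, 1/3]])
P1, P2, sigma2 = 3.0, 4.0, 1.0

def H(q):
    q = np.asarray(q, dtype=float).ravel()
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

H12 = H(p)                              # H(U1,U2) = log2(3) ~ 1.585
H1g2 = H12 - H(p.sum(axis=0))           # H(U1|U2) = 2/3
H2g1 = H12 - H(p.sum(axis=1))           # H(U2|U1) = 2/3

def feasible(rho):
    """Check the three conditions in (2) for jointly Gaussian (X1, X2) with correlation rho."""
    c1 = 0.5 * np.log2(1 + P1 * (1 - rho**2) / sigma2)
    c2 = 0.5 * np.log2(1 + P2 * (1 - rho**2) / sigma2)
    c12 = 0.5 * np.log2(1 + (P1 + P2 + 2 * rho * np.sqrt(P1 * P2)) / sigma2)
    return H1g2 < c1 and H2g1 < c2 and H12 < c12

rhos = np.linspace(0, 1, 100001)
ok = np.array([feasible(r) for r in rhos])
print(rhos[ok][0], rhos[ok][-1])        # ~0.144 and ~0.702, matching the slide
```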

  27. Example 2 (contd) For ρ = 0.3 and ρ = 0.6 we obtained the approximating density from the above optimization problem. The upper bound on ρ from Lemma 2 is 0.546.

  28. Example 2 (contd)

  29. GAUSSIAN SOURCES OVER A GAUSSIAN MAC (R Rajesh and V Sharma, “Source-channel coding for Gaussian sources over a Gaussian multiple access channel”, Allerton 07) • (U1, U2): zero mean, jointly Gaussian, with correlation ρ. • Y = X1 + X2 + N, N ~ N(0, σN²) independent of (X1, X2). • Pi is the average power constraint for user i. • There is no side information. • A particularly relevant model for the change detection problem in sensor networks. • Source-channel separation does not hold.

  30. GAUSSIAN SOURCES OVER A GMAC We compare three transmission schemes: (i) Amplify and Forward (AF), (ii) Separation-Based (SB), (iii) Lapidoth-Tinguely (LT).

  31. Amplify and Forward scheme (AF) [Block diagram: U1 and U2 are scaled to X1 and X2, sent over the GMAC, and the decoder estimates the sources from Y.] The Xi are scaled versions of Ui such that the average power constraints Pi, i = 1, 2, are satisfied; the decoder then estimates (U1, U2) from Y, giving the resulting distortions. For the two-user symmetric case AF is optimal for SNR ≤ ρ/(1 − ρ²) (Lapidoth and Tinguely (2006)).
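A sketch of the AF distortion computation under the stated model: each Xi is a scaled source and the decoder forms the linear MMSE estimate of (U1, U2) from Y. This is an illustrative calculation, not necessarily the exact expressions used in the paper:

```python
import numpy as np

def af_distortion(P1, P2, var_u1, var_u2, rho, sigma_n2):
    """MSE distortions when X_i = sqrt(P_i / var_ui) * U_i and the decoder forms
    the linear MMSE estimate of (U1, U2) from Y = X1 + X2 + N."""
    a1 = np.sqrt(P1 / var_u1)
    a2 = np.sqrt(P2 / var_u2)
    # Source covariance matrix
    K = np.array([[var_u1, rho * np.sqrt(var_u1 * var_u2)],
                  [rho * np.sqrt(var_u1 * var_u2), var_u2]])
    h = np.array([a1, a2])               # Y = h^T U + N
    var_y = h @ K @ h + sigma_n2
    c = K @ h                             # cross-covariance E[U Y]
    D = K - np.outer(c, c) / var_y        # error covariance of the LMMSE estimate
    return D[0, 0], D[1, 1]

# Symmetric example with unit-variance sources:
print(af_distortion(P1=1.0, P2=1.0, var_u1=1.0, var_u2=1.0, rho=0.75, sigma_n2=1.0))
```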

  32. Separation-Based approach (SB) • Vector quantization followed by Slepian-Wolf coding. • The rate region for a given (D1, D2) is given in Wagner, Tavildar and Viswanath (IEEE TIT 08). • X1, X2 independent. • Capacity region of the GMAC: • R1 ≤ I(X1; Y | X2) = 0.5 log(1 + P1/σN²) • R2 ≤ I(X2; Y | X1) = 0.5 log(1 + P2/σN²) • R1 + R2 ≤ I(X1, X2; Y) = 0.5 log(1 + (P1 + P2)/σN²)

  33. Joint source-channel coding (of Lapidoth and Tinguely) • Vector quantize the sources Ui, i = 1, 2. • Generate 2^(nRi) iid Gaussian codewords with variance 1. • Encode Uin by mapping it to the nearest codeword. • Scale codeword i to average power Pi. • The correlation between the scaled codewords is then ρ̃ = ρ √((1 − 2^(−2R1))(1 − 2^(−2R2))).

  34. Joint source-channel coding (LT) If R1, R2 satisfy the GMAC constraints with the codeword correlation ρ̃, then we obtain the corresponding distortions (D1, D2). Recovers the result of Lapidoth and Tinguely (2006).
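An illustrative feasibility check for the LT scheme, assuming the codeword-correlation expression above and the standard Gaussian MAC mutual-information formulas; the exact constraints are those of Lapidoth and Tinguely (2006):

```python
import numpy as np

def lt_feasible(R1, R2, rho, P1, P2, sigma_n2):
    """Check whether quantization rates (R1, R2) satisfy the GMAC constraints when the
    scaled codewords have correlation rho_tilde (sketch under the stated assumptions)."""
    rho_t = rho * np.sqrt((1 - 2**(-2 * R1)) * (1 - 2**(-2 * R2)))
    c1 = 0.5 * np.log2(1 + P1 * (1 - rho_t**2) / sigma_n2)
    c2 = 0.5 * np.log2(1 + P2 * (1 - rho_t**2) / sigma_n2)
    c12 = 0.5 * np.log2(1 + (P1 + P2 + 2 * rho_t * np.sqrt(P1 * P2)) / sigma_n2)
    return (R1 < c1 and R2 < c2 and R1 + R2 < c12), rho_t

print(lt_feasible(R1=0.6, R2=0.6, rho=0.75, P1=1.0, P2=1.0, sigma_n2=1.0))
```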

  35. Comparison of the three schemes: Ui ~ N(0, 1), i = 1, 2, ρ = 0.1

  36. SNR vs distortion performance, ρ = 0.75

  37. Conclusions from the above plots • AF is close to the necessary conditions, and hence optimal, at low SNR; the other two schemes perform worse at low SNR. • The SB and LT schemes perform better than AF at high SNR. • The LT scheme performs better than the SB scheme. • The performance of SB and LT is close for low ρ at all SNR, and for high ρ at low SNR. • In the asymmetric case, investing the full power Pi in AF is suboptimal; we have developed optimal power allocation schemes.

  38. Conclusions with side information (R Rajesh and V Sharma, “Joint source-channel coding for Gaussian sources over a Gaussian multiple access channel with side information”, NCC 09) • Decoder-only side information is much more useful than encoder-only side information. • The reduction in distortion is proportional to the side-information quality. • AF is optimal at low SNR with or without side information. • The distortions in AF do not go to zero as the channel SNR increases, with or without side information. • LT is always better than SB in the no-side-information case, but with side information SB is sometimes better than LT. • In the asymmetric case, when the difference between the powers is large, encoder side information is more useful at lower ρ and at higher side-channel SNR.

  39. Transmission of correlated sources over orthogonal MAC (R Rajesh and V Sharma, “Correlated Gaussian sources over orthogonal Gaussian channels”, ISITA 08) • Y = (Y1, Y2) • P(y1, y2 | x1, x2) = P(y1 | x1) P(y2 | x2) • Source-channel separation holds, even with side information, for lossless transmission (we have derived the exact region). • Source coding of (U1, U2) via Slepian-Wolf (vector quantize first in the case of continuous sources). • Optimal signaling uses independent (X1, X2) and does not depend on the sources (U1, U2).

  40. Correlated Gaussian sources over orthogonal GMAC • (U1, U2): zero mean with correlation ρ, Var(Ui) = σi². • Yi = Xi + Ni, with Ni independent of Xi. • (N1, N2): zero mean and independent. • The optimal scheme is SB: send X1 independent of X2. • AF performs close to the optimal scheme (see the sketch below).
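A sketch of AF over the orthogonal channels: each user scales its source, and the decoder forms a joint linear MMSE estimate of (U1, U2) from (Y1, Y2), which is how the source correlation is still exploited even though the channels are orthogonal. An illustrative computation only:

```python
import numpy as np

def af_orthogonal_distortion(P1, P2, var1, var2, rho, n1, n2):
    """MSE of the joint LMMSE estimate of (U1, U2) from Y_i = sqrt(P_i/var_i) U_i + N_i."""
    A = np.diag([np.sqrt(P1 / var1), np.sqrt(P2 / var2)])    # per-user scaling
    K = np.array([[var1, rho * np.sqrt(var1 * var2)],
                  [rho * np.sqrt(var1 * var2), var2]])        # source covariance
    Kn = np.diag([n1, n2])                                    # independent channel noises
    Ky = A @ K @ A.T + Kn                                     # covariance of (Y1, Y2)
    Kuy = K @ A.T                                             # cross-covariance E[U Y^T]
    D = K - Kuy @ np.linalg.inv(Ky) @ Kuy.T                   # LMMSE error covariance
    return D[0, 0], D[1, 1]

print(af_orthogonal_distortion(1.0, 1.0, 1.0, 1.0, rho=0.7, n1=1.0, n2=1.0))
```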

  41. Comparison of AF and SB: SNR vs distortion performance, ρ = 0.7

  42. Comparison of AF and SB: SNR vs distortion performance, ρ = 0.3

  43. MAC with Fading (R Rajesh and V Sharma, “Transmission of correlated sources over a fading Multiple Access Channel”, Allerton 08) • {Hin, n ≥ 1}: iid fading for sensor i, known at the transmitter and the receiver. • Aim: to transmit data over the fading MAC so that the decoder can decode within a given distortion. Theorem: the constraints of the general theorem must hold, with the channel mutual-information terms evaluated for the fading channel (conditions (3)).

  44. Gaussian Fading MAC • Yn = H1n X1n + H2n X2n + Nn • N ~ N(0, σN²), independent of (X1, X2). • Pi is the average power constraint for user i. • Distortion measure: MSE. • Source-channel separation does not hold.

  45. Gaussian Fading MAC If (X1, X2) has correlation ρ, then we obtain the corresponding constraints, the fading analogues of (2).
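The fading constraints involve averages of Gaussian MAC rate terms over the fading states. The sketch below shows one way such an average can be evaluated by Monte Carlo, assuming Rayleigh-distributed gains and jointly Gaussian inputs with correlation ρ; the exact constraint expressions are those in the Allerton 08 paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def ergodic_sum_rate(P1, P2, rho, sigma_n2, n_samples=200_000):
    """Monte Carlo estimate of
    E_H[0.5 * log2(1 + (H1^2 P1 + H2^2 P2 + 2 rho H1 H2 sqrt(P1 P2)) / sigma_n2)]
    for Rayleigh-distributed gain magnitudes with E[H_i^2] = 1 (illustrative only)."""
    h1 = rng.rayleigh(scale=1 / np.sqrt(2), size=n_samples)
    h2 = rng.rayleigh(scale=1 / np.sqrt(2), size=n_samples)
    snr = (h1**2 * P1 + h2**2 * P2 + 2 * rho * h1 * h2 * np.sqrt(P1 * P2)) / sigma_n2
    return np.mean(0.5 * np.log2(1 + snr))

print(ergodic_sum_rate(P1=3.0, P2=4.0, rho=0.3, sigma_n2=1.0))
```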

  46. Power Allocation • Maximize the RHS of the third inequality subject to satisfying all the conditions. • Compare with RTDMA, MRTDMA, UPA.

  47. Generalizations • Partial channel state information at the transmitter and partial state information at the decoder. • Partial CSIT, perfect CSIR.

  48. Special Cases • Lossless transmission of independent sources with partial CSIT, perfect CSIR – G. Keshet, Y. Steinberg, and N. Merhav (NOW Publications) • Transmission of independent sources with perfect CSIT, no CSIR – S. Sigurjonsson and Y. H. Kim (ISIT 05) • Transmission of independent sources over a Gaussian MAC with partial CSIT, perfect CSIR, lossless transmission – M. Mecking (ISIT 02)

  49. Hierarchical Network (R Rajesh and V Sharma, “Amplify and Forward for correlated data gathering over hierarchical sensor networks”, WCNC 09 (to appear)) • MAC with side information as the building block. • Identify the network components: sensor nodes and relay nodes. • Multiple-user behaviour of AF and SB-TDMA. [Plot: AF with multiple users.]

  50. AF vs SB-TDMA Comparison of AF and SB-TDMA (Slepian-Wolf-type source coding followed by transmission over TDMA links). AF performs well for both sensor nodes and relay nodes as the number of nodes increases!
