
Cooperative Communication in Sensor Networks: Relay Channels with Correlated Sources


Presentation Transcript


  1. Cooperative Communication in Sensor Networks: Relay Channels with Correlated Sources Brian Smith and Sriram Vishwanath University of Texas at Austin October 1, 2004 Allerton Conference on Communication, Control, and Computing

  2. Overview • The Sensor Network Problem • The Relay Channel • Block-Markov Coding • Correlated Side Information and the Relay Channel • Communicating Two Sources • Example: A MIMO Gaussian Relay Channel • Conclusion

  3. Sensor Networks [Figure: example relay configurations, with sensor nodes propagating data toward a terminal node] • Many distributed nodes can make measurements and cooperate to propagate information through the system, perhaps to a single endpoint • Key observation: Nodes located near each other may have correlated information. What effect does this have on information flow?

  4. The Relay Channel • Introduced by van der Meulen • “Three-terminal communication channels,” Adv. Appl. Prob., vol. 3, pp. 120-54, 1971. • The discrete memoryless relay channel consists of • An input X • A relay output Y1 • A relay sender X1, which can depend upon previously received Y1 • A channel output Y • A conditional probability distribution p(y,y1|x,x1) [Figure: transmitter with input X; relay receives Y1 and inputs X1; receiver receives Y]

  5. The Relay Channel in Sensor Networks • Multi-hop strategies are thought to be “good” methods for conveying information through wireless networks • G. Kramer, M. Gastpar, and P. Gupta, “Capacity Theorems for Wireless Relay Channels,” Proc. 41st Allerton Conf. on Comm., Control, and Computing, Oct. 1-3, 2003 • P. Gupta and P. R. Kumar, “The Capacity of Wireless Networks,” IEEE Trans. Inform. Theory, vol. 46, no. 2, pp. 388-404, Mar. 2000 • A relay channel embedded in a sensor network differs from the previous definition in that both the sender and the relay have access to sources of information • If those sources are correlated, the relay’s source can be exploited as side information • In some instances, it is desirable to transmit the relay’s information to the receiver as well as the transmitter’s

  6. The Relay Channel • Capacity of the general relay channel is unknown • Upper bound by a cut-set argument • Capacity known for special cases, e.g. the degraded relay channel • T. M. Cover and A. A. El Gamal, “Capacity theorems for the relay channel,” IEEE Trans. Information Theory, vol. IT-25, no. 5, pp. 572-84, Sep. 1979 • The degraded-channel rate is also a lower bound on the capacity of the general relay channel • Often the best known lower bound • This rate is achieved by block-Markov coding • Introduce a correlation between the transmitter input and the relay input to aid decoding at the receiver • In the block-Markov scheme, the relay completely decodes the message meant for the receiver

  7. Block-Markov Coding • Overview of block-Markov coding for the classic relay channel • The relay terminal completely decodes a message index w from the set {1..2^(nR)} sent by the transmitter • The relay sends the bin index of the message that it received to aid the receiver in decoding • This is helpful because the transmitter’s codeword depends on the bin index that the relay is transmitting in the same block • Codebook generation • Fix any joint distribution p(x,x1) = p(x1)p(x|x1) • Generate 2^(nR0) codewords x1^n i.i.d. according to ∏ p(x1i) • Index them as x1^n(s) • For each of these x1^n codewords, generate 2^(nR) codewords x^n according to ∏ p(xi|x1i(s)) • Index these as x^n(w|s) • Independently bin the messages w into the 2^(nR0) bins s
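As a toy illustration of this codebook-generation step, the sketch below draws random binary codebooks and randomly bins the messages. The block length n, the rates, and the Bernoulli input distributions are all hypothetical choices for illustration, not values from the talk (real block lengths make these tables exponentially large).

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8            # block length (assumed small for illustration)
R, R0 = 0.5, 0.25
num_msgs = 2 ** int(n * R)    # 2^(nR) messages w
num_bins = 2 ** int(n * R0)   # 2^(nR0) bin / relay indices s

# Relay codewords x1^n(s), drawn i.i.d. from a Bernoulli(1/2) p(x1).
x1_codebook = rng.integers(0, 2, size=(num_bins, n))

# For each relay codeword, transmitter codewords x^n(w|s) drawn from a
# conditional p(x|x1); here each relay symbol is flipped w.p. 0.25.
x_codebook = (x1_codebook[None, :, :]
              ^ (rng.random((num_msgs, num_bins, n)) < 0.25)).astype(int)

# Independently bin the messages w into the 2^(nR0) bins s.
bin_of = rng.integers(0, num_bins, size=num_msgs)

print(x1_codebook.shape, x_codebook.shape)  # codebook dimensions
```

Note the key structural feature: each x^n(w|s) is generated conditionally on x1^n(s), which is what creates the transmitter–relay correlation that the receiver exploits.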

  8. Block-Markov Coding • Encoding • Messages are sent over a total of B blocks • In the first block • The relay sends the codeword x1^n corresponding to a pre-determined null message, say x1^n(φ) • The transmitter sends the codeword x^n, dependent on the first message w_1 and the null message, say x^n(w_1|φ) • In block b • Assuming the relay correctly decoded the message w_{b-1} sent in the previous block, the relay sends the codeword x1^n(s_{b-1}) • The transmitter sends x^n(w_b|s_{b-1}) • Same bin index s that the relay is simultaneously sending • Shifted by one block to allow the relay to decode the current message

  9. Block-Markov Encoding
Block:              b=1           b=2             ...  b-1                    b
Message:            w_1           w_2             ...  w_{b-1}                w_b
Transmitter sends:  x^n(w_1|φ)    x^n(w_2|s_1)    ...  x^n(w_{b-1}|s_{b-2})   x^n(w_b|s_{b-1})
Relay sends:        x1^n(φ)       x1^n(s_1)       ...  x1^n(s_{b-2})          x1^n(s_{b-1})
At the end of each block:
Relay decodes:      w_1           w_2             ...  w_{b-1}                w_b
Receiver decodes:   —             s_1, w_1        ...  s_{b-2}, w_{b-2}       s_{b-1}, w_{b-1}
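The schedule above can be traced mechanically. The short sketch below walks through B blocks with symbolic indices, assuming every decoding step succeeds; the `bin_index` map is a hypothetical placeholder for the random binning of messages.

```python
# Symbolic walk through the block-Markov schedule, assuming error-free
# decoding at the relay; bin_index stands in for the w -> s binning.
B = 4
messages = [f"w{b}" for b in range(1, B + 1)]

def bin_index(w):
    return f"s({w})"   # bin of message w (placeholder)

tx_sends, relay_sends = [], []
prev_bin = "phi"       # null message in the first block
for w in messages:
    tx_sends.append(f"x^n({w}|{prev_bin})")   # transmitter: current w, previous bin
    relay_sends.append(f"x1^n({prev_bin})")   # relay: bin of the previous message
    prev_bin = bin_index(w)                   # relay decodes w, forwards its bin next

for b in range(B):
    print(f"block {b + 1}: tx = {tx_sends[b]:16} relay = {relay_sends[b]}")
```

The one-block shift is visible in the loop: the transmitter uses `prev_bin`, the bin of the message the relay decoded in the previous block.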

  10. Block-Markov Coding [Figure: source U at input X; relay receives Y1 and inputs X1; receiver receives Y] • Decoding • At the relay • Can determine w_b correctly whp if R < I(X;Y1|X1) • At the receiver • Can determine s_{b-1} correctly whp if R0 < I(X1;Y) • Make a list of all messages w_b for which the codewords x^n(w_b|s_{b-1}), x1^n(s_{b-1}), and y^n are jointly typical; put the list aside until the end of the next block • In block b+1, determine s_b as above, and declare w_b to be the message sent in block b if it is the only message on the list in the bin s_b • Done correctly whp when R − R0 < I(X;Y|X1) • Intuitively – about 2^(n(R−R0)) messages in the bin, and only one should be jointly typical • If both constraints on R are fulfilled, then the rate is achievable: R < min{ I(X;Y1|X1), I(X1;Y) + I(X;Y|X1) } = min{ I(X;Y1|X1), I(X,X1;Y) } • Next, apply this to our sensor-network (correlated) configuration

  11. Relay Channel with Correlated Side Information [Figure: source U at input X; source V at the relay, correlated with U; relay receives Y1 and inputs X1; receiver receives Y] • Key addition to the coding strategy: Slepian-Wolf coding • With no side information at the relay, the block-Markov strategy requires that all of H(U) be transmitted to the relay • However, if the relay has access to a source V correlated with U, only H(U|V) of information needs to be pushed across that link • Will show that this relay channel has achievable rate H(U) < min{ I(X;Y1|X1) + I(U;V), I(X,X1;Y) } • Implies that if, for the p* which maximizes I(X,X1;Y), the first term I(X;Y1|X1) + I(U;V) is greater than the second term I(X,X1;Y), then we have found capacity (the rate meets the cut-set bound)

  12. Strategy for Relay Channel with Correlated Side Information • Codebook generation: • Identical to block-Markov • Encoding: • For the source U, generate 2^(nR) sequences U^n and index them by w; the rest of the encoding is identical to block-Markov • Decoding at the relay: • Form two lists of message indices w • Those whose codewords x^n(w_b|s_{b-1}) are jointly typical with the received y1^n and x1^n(s_{b-1}) • Those whose sequences U^n are jointly typical with the V^n at the relay • Choose the unique index w that appears in both lists
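The two-list step amounts to a set intersection. In the sketch below, the index values are hypothetical stand-ins for the outcomes of the two joint-typicality tests; in the real scheme each list contains the true index plus a few random survivors.

```python
# Toy stand-ins for the relay's two typicality lists (indices hypothetical).
true_w = 17
list_channel = {true_w, 3, 42, 8}    # w: x^n(w|s) jointly typical with the received signal
list_source = {true_w, 55, 9, 30}    # w: U^n(w) jointly typical with the relay's V^n

candidates = list_channel & list_source
# Decode only if exactly one index survives both tests; because the two
# error events are independent, a wrong index survives both only with
# the product of the two individual survival probabilities.
decoded = next(iter(candidates)) if len(candidates) == 1 else None
print("decoded:", decoded)  # -> decoded: 17
```

This product of survival probabilities is exactly why the relay's rate constraint loosens from R < I(X;Y1|X1) to R < I(X;Y1|X1) + I(U;V) on the next slide.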

  13. Strategy for Relay Channel with Correlated Side Information [Figure: source U at input X; relay receives Y1 and inputs X1; receiver receives Y] • Decoding at the relay • Choose the unique index w that appears in both lists • Correct with high probability if R < I(X;Y1|X1) + I(U;V) • Probability that an incorrect codeword is jointly typical with the received sequence: ≈ 2^(−n I(X;Y1|X1)) • Probability that an incorrect sequence is jointly typical with V^n: ≈ 2^(−n I(U;V)) • The events are independent, and both errors must occur, so the error probability is the product • Decoding at the receiver • Identical to block-Markov, leading to the same constraint

  14. Summary for Relay Channel with Correlated Side Information • Achievable rates: H(U) < min{ I(X;Y1|X1) + I(U;V), I(X,X1;Y) } • Not closed form, because I(U;V) and H(U) are related • If the correlation is great enough, capacity is found • Specifically, if I(U;V) ≥ I(X,X1;Y) − I(X;Y1|X1) under the p* which maximizes the cooperative rate I(X,X1;Y) • MIMO example with two receive antennae • The strategy utilizes joint source-channel coding

  15. Relay Channel with Two Correlated Sources • Desire to send both the source U at the transmitter and the source V located at the relay • Resembles a multiple-access channel in which one source can send some information to the other • Achievable rates follow from the decoding constraints on the later slides • Interpretation: • In the previous case, the codeword x1^n carried information about the bin index s only; now it must also describe the sequence V^n

  16. Two Sources Block-Markov Coding • Overview • Have data sequences U^n and V^n • As before, indices w in {1..2^(nRu)} refer to typical U^n sequences • Indices s in {1..2^(nR0)} correspond to bins of U^n sequences • New indices k in {1..2^(nR1)} for bins of V^n sequences

  17. Two Sources Block-Markov Coding • Codebook generation • Fix a joint distribution p(z)p(x|z)p(x1|z) • Generate 2^(nR0) sequences z^n i.i.d. according to ∏ p(zi) • Index them as z^n(s) • For each of these sequences z^n, generate 2^(nRu) codewords x^n according to ∏ p(xi|zi(s)) • Index them as x^n(w|s) • For each of the sequences z^n, generate 2^(nR1) codewords x1^n according to ∏ p(x1i|zi(s)) • Index them as x1^n(k,s) • Independently bin the messages w into the 2^(nR0) bins s • Generate 2^(nRu) sequences U^n and 2^(nRv) sequences V^n • Randomly place the V^n sequences into the 2^(nR1) bins k

  18. Encoding Graphic [Figure: binning structure] • 2^(nRu) U^n sequences, indexed by w, are placed into 2^(nR0) bins indexed by s (s = 1, 2, ..., 2^(nR0)); each s bin holds ~2^(n(Ru−R0)) U^n sequences • 2^(nRv) V^n sequences are placed into 2^(nR1) bins indexed by k (k = 1, 2, ..., 2^(nR1)); each k bin holds ~2^(n(Rv−R1)) V^n sequences • Codewords for each block: • z^n – depends on s • x^n(w_b|s_{b-1}) – depends on the current w and the previous s (through z) • x1^n(k_b,s_{b-1}) – depends on the current k and the previous s (through z)

  19. Two Sources Block-Markov Decoding [Figure: source U at the transmitter, which sends x^n(w_b|s_{b-1}); correlated source V at the relay, which receives Y1 and sends z^n(s_{b-1}), x1^n(s_{b-1},k_b); receiver receives Y] • At the relay • Decodes w_b by intersecting the channel-decoding list with the list of U^n sequences jointly typical with its V^n, as in the side-information case • At the receiver • Decodes s_{b-1}, then w_{b-1}, then k_b, each by joint typicality • Chooses V^n_{b-1} if it is the only sequence in bin k_{b-1} jointly typical with U^n_{b-1} • Correct choice whp if Rv − R1 < I(U;V)

  20. Two Sources Summary • The constraints taken together give the achievable rates • Codebook generation is not over an arbitrary p(x,x1) • Cannot say that this is capacity • Correlation helps improve the rate along two links in the channel • Independent (separate) compression and channel coding would impose strictly tighter constraints on the achievable rates

  21. Example: MIMO Channel • An example where capacity is found • Description of the example MIMO system • Transmitter to relay is point-to-point with gain h: Y1 = hX + η1 • Relay and transmitter each have a single antenna, with power constraints P and P1 • Receiver has two antennae and matrix gain H: Y = H[X, X1]^T + [η_{Y1}, η_{Y2}]^T • Independent Gaussian noise model with unit noise power • Desire to transmit the single source U

  22. Example: MIMO Channel • Achievable rate: • Treat (X,X1) → Y as a MIMO point-to-point channel • Solve for the input covariance Q of (X,X1) which maximizes I(X,X1;Y) = (1/2) log det(I + H Q H^T), subject to the power constraints • Given the covariance, the Gaussian p*(x,x1) is uniquely defined; calculate I(X;Y1|X1) under it
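A numerical sketch of this covariance search, sweeping the correlation ρ between the Gaussian inputs X and X1. The channel matrix H below is a hypothetical example, since the slides do not reproduce the one used in the talk.

```python
import numpy as np

# Maximize I(X,X1;Y) = (1/2) log2 det(I + H Q H^T) over the correlation
# rho between X and X1, with per-antenna powers P and P1.
P, P1 = 2.0, 2.0
H = np.array([[1.0, 0.5],       # hypothetical receiver gain matrix
              [0.5, 1.0]])

def coop_rate(rho):
    Q = np.array([[P, rho * np.sqrt(P * P1)],
                  [rho * np.sqrt(P * P1), P1]])   # covariance of (X, X1)
    return 0.5 * np.log2(np.linalg.det(np.eye(2) + H @ Q @ H.T))

rhos = np.linspace(0.0, 0.999, 1000)
rates = [coop_rate(r) for r in rhos]
best = rhos[int(np.argmax(rates))]
print(f"best rho = {best:.3f}, I(X,X1;Y) = {max(rates):.3f} bits")
```

With this H, positive correlation (coherent combining at the receiver) beats independent inputs, so the maximizing ρ is strictly positive; the optimal ρ and rate depend entirely on the assumed H.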

  23. Example: MIMO Channel • Achievable rate: H(U) < min{ I(X;Y1|X1) + I(U;V), I(X,X1;Y) } • Knowing two of the mutual information terms tells how much correlation there must be between U and V for the cooperative rate to be capacity • Numerical example: • Channel: Y1 = 3X + η1, Y = H[X, X1]^T + [η_{Y1}, η_{Y2}]^T • Power constraints: P = 2, P1 = 2

  24. Example: MIMO Channel • Maximizing the correlation in the input distribution • Leads to H(U) = I(X,X1;Y) = 2.569 and I(X;Y1|X1) = 1.971 • So if I(U;V) > .598, then the cooperative rate is capacity • Assume a model for correlation such that I(U;V) = βH(U) • β = 0 for totally independent U and V • β = 1 for U = V • For β > .233, the cooperative rate is capacity
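The thresholds quoted on this slide follow from simple arithmetic on the two mutual-information values:

```python
# Re-derive the two thresholds from the slide's quoted values.
H_U = 2.569      # H(U) = I(X,X1;Y) at the maximizing input distribution
I_X_Y1 = 1.971   # I(X;Y1|X1)

iuv_threshold = H_U - I_X_Y1           # gap that I(U;V) must cover
beta_threshold = iuv_threshold / H_U   # using the model I(U;V) = beta * H(U)
print(f"I(U;V) > {iuv_threshold:.3f}  <=>  beta > {beta_threshold:.3f}")
# -> I(U;V) > 0.598  <=>  beta > 0.233
```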

  25. Summary • Started with the observation that nodes in a sensor network may have correlated data • Showed that this side information can be used to increase rate (or decrease power usage) • Jointly coding over both source and channel can be more powerful than performing each individually • For some relay channels with correlated sources, capacity can be found
