
Random coding for wireless multicast

Presentation Transcript


  1. Random coding for wireless multicast: Performance for fading channels and inter-session coding. Brooke Shrader and Anthony Ephremides, University of Maryland. Joint work with Randy Cogill, University of Virginia. May 9, 2008.

  2. Introduction and Motivation. Alternatives to overcome MAC-layer channel errors in multicast transmission: repeatedly send packets (ARQ), or network coding. Previous work: • network coding outperforms ARQ for time-invariant channels • coding used within (but not between) multicast sessions.

  3. Random linear coding for multicast. Form random linear combinations of K packets s1, …, sK: transmit x = a1 s1 + a2 s2 + … + aK sK, where the coefficients ai are generated randomly and uniformly from a u-ary alphabet and the sum is taken in the finite field. If ai = 0 for all i, generate new coefficients. Transmit the coefficients ai in the packet header. Decode: solve a system of linear equations in the si.
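
A minimal sketch of this encode/decode step, assuming arithmetic over a small prime field as a stand-in for the u-ary alphabet on the slide; the field size, packet contents, and helper names are illustrative, not the authors' implementation.

```python
# Random linear coding sketch over GF(P), P prime (stand-in for the u-ary alphabet).
import random

P = 257  # illustrative prime field size

def encode(packets, rng=random):
    """Form one random linear combination x = a1*s1 + ... + aK*sK over GF(P).
    Coefficients are redrawn if all zero, as on the slide; they go in the header."""
    K, n = len(packets), len(packets[0])
    while True:
        a = [rng.randrange(P) for _ in range(K)]
        if any(a):
            break
    x = [sum(a[i] * packets[i][j] for i in range(K)) % P for j in range(n)]
    return a, x

def rank(rows):
    """Rank of a list of vectors over GF(P), by row reduction."""
    M, r = [row[:] for row in rows], 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c]), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        inv = pow(M[r][c], P - 2, P)
        M[r] = [(v * inv) % P for v in M[r]]
        for i in range(len(M)):
            if i != r and M[i][c]:
                f = M[i][c]
                M[i] = [(M[i][j] - f * M[r][j]) % P for j in range(len(M[0]))]
        r += 1
    return r

def decode(coeff_rows, coded, K):
    """Recover s1..sK by Gaussian elimination on the augmented system [A | X]."""
    n = len(coded[0])
    A = [coeff_rows[r][:] + coded[r][:] for r in range(K)]
    for col in range(K):
        piv = next(r for r in range(col, K) if A[r][col])
        A[col], A[piv] = A[piv], A[col]
        inv = pow(A[col][col], P - 2, P)
        A[col] = [(v * inv) % P for v in A[col]]
        for r in range(K):
            if r != col and A[r][col]:
                f = A[r][col]
                A[r] = [(A[r][j] - f * A[col][j]) % P for j in range(K + n)]
    return [A[r][K:] for r in range(K)]

# Example: K = 3 packets of 5 symbols; keep only innovative combinations, then decode.
rng = random.Random(0)
packets = [[rng.randrange(P) for _ in range(5)] for _ in range(3)]
coeffs, coded = [], []
while len(coeffs) < 3:
    a, x = encode(packets, rng)
    if rank(coeffs + [a]) > len(coeffs):
        coeffs.append(a)
        coded.append(x)
assert decode(coeffs, coded, 3) == packets
```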

  4. Multicast throughput. Let Tm denote the number of slots needed for destination m to collect K linearly independent random linear combinations. The multicast throughput is K / E[max(T1, T2, …, TM)] packets per slot. Difficulty: the Tm are correlated due to correlation in the random linear combinations sent to different destination nodes. This is true even if the channels to the destination nodes are independent.
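
A short Monte Carlo sketch of this throughput definition. It assumes (beyond the slide) that the channels to the M destinations are independent erasures with reception probability q and that every received combination is innovative, which is a good approximation for a large alphabet.

```python
# Monte Carlo estimate of K / E[max_m Tm] under an illustrative erasure model.
import random

def sample_Tm(K, q, rng):
    """Slots until one destination has collected K (assumed innovative) packets."""
    slots, received = 0, 0
    while received < K:
        slots += 1
        if rng.random() < q:
            received += 1
    return slots

def estimate_throughput(K, M, q, trials=20000, seed=1):
    rng = random.Random(seed)
    total_max = 0
    for _ in range(trials):
        total_max += max(sample_Tm(K, q, rng) for _ in range(M))
    return K / (total_max / trials)   # packets delivered per slot

print(estimate_throughput(K=10, M=10, q=0.8))  # a single destination alone would achieve about q = 0.8
```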

  5. Lower bound on multicast throughput. Assume the channels to the M destinations are identically distributed (but not necessarily independent). For random variables X1, X2, …, XM that are identically distributed and correlated, and for any t > 0, E[max(X1, X2, …, XM)] ≤ (1/t) ln( M E[exp(t X1)] ). Then the multicast throughput is lower bounded, for any t > 0, as throughput ≥ Kt / ln( M E[exp(t T1)] ).
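
A sketch of evaluating this bound numerically, under the same illustrative model as above: T1 is treated as a sum of K geometric(q) slot counts (every reception innovative), so its moment generating function is available in closed form; the grid search over t is an implementation choice.

```python
# Evaluate the lower bound: throughput >= K*t / ln(M * E[exp(t*T1)]) for any valid t > 0.
import math

def mgf_T1(t, K, q):
    """E[exp(t*T1)] for T1 = sum of K i.i.d. geometric(q) slot counts (support 1, 2, ...)."""
    z = math.exp(t)
    assert (1 - q) * z < 1, "MGF only exists for t < -ln(1-q)"
    return (q * z / (1 - (1 - q) * z)) ** K

def throughput_lower_bound(K, M, q, steps=2000):
    """Maximize K*t / ln(M * E[exp(t*T1)]) over a grid of valid t > 0."""
    t_max = -math.log(1 - q)            # MGF diverges at this point
    best = 0.0
    for i in range(1, steps):
        t = t_max * i / steps
        best = max(best, K * t / math.log(M * mgf_T1(t, K, q)))
    return best

print(throughput_lower_bound(K=10, M=10, q=0.8))  # compare with the Monte Carlo estimate above
```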

  6. Our contributions. We use this bound to quantify the multicast throughput for random linear coding • over a fading wireless channel where the reception probability depends on packet length, overhead, and SNR • across multiple multicast sessions. Random linear network coding naturally adapts the coding rate to variations in the channel. Coding across sessions means that receivers decode additional packets that aren't intended for them.

  7. I: Packet length and overhead. Network coding can approach the min-cut capacity in the limit as the alphabet size approaches infinity. Random network coding: overhead is needed to transmit the coefficients of the random code. The packet length (symbols per packet and alphabet size) must be sufficiently large in order to: approach the min-cut capacity, and ensure small (fractional) overhead. Our approach: model the packet erasure probability as a function of packet length (symbols per packet and alphabet size).

  8. I: Packet erasure probability. Assume that there is no channel coding within packets, so for a packet to be received, every symbol must be received. Pu: u-ary symbol error probability for the modulation scheme; depends on SNR and the channel model (e.g., AWGN). q: probability that a transmitted packet is successfully received at a destination node; for a packet of n u-ary symbols, q = (1 − Pu)^n.
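
A small sketch of this erasure model. The symbol error probability Pu is taken as a plain numeric input here rather than derived from a modulation/SNR formula, which is an assumption for illustration.

```python
# Packet reception probability when every symbol of an uncoded packet must arrive intact.
def packet_reception_prob(Pu, n):
    """q: probability that all n u-ary symbols of a packet are received correctly."""
    return (1.0 - Pu) ** n

for n in (50, 100, 250, 500):
    print(n, packet_reception_prob(Pu=1e-3, n=n))
```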

  9. I: Accounting for overhead. Each packet consists of n u-ary symbols, of which K are the random coding coefficients carried in the header (one u-ary symbol per coefficient). Coding is performed on groups of K packets. The multicast throughput is lower bounded, for any t > 0, as throughput ≥ ((n − K)/n) · Kt / ln( M E[exp(t T1)] ), where (n − K)/n is the ratio of information to information+overhead.

  10. I: Fading channel model. The channel to each destination node evolves as a Markov chain with “Good” and “Bad” states: a packet-erasure version of the Gilbert channel model. qG: probability that a transmitted packet is received in the “Good” state. qB: probability that a transmitted packet is received in the “Bad” state.
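
A minimal simulator for this packet-erasure Gilbert model; the Good/Bad switching probabilities (p_gb, p_bg) are illustrative values, since the slides do not state them.

```python
# Two-state (Gilbert) packet-erasure channel: reception probability depends on the state.
import random

def gilbert_trace(slots, q_good, q_bad, p_gb, p_bg, rng=random):
    """Yield (state, received) per slot: reception uses the current state,
    then the state flips G<->B according to the Markov chain."""
    state = "G"                                   # start in the Good state
    for _ in range(slots):
        q = q_good if state == "G" else q_bad
        yield state, rng.random() < q
        flip = p_gb if state == "G" else p_bg
        if rng.random() < flip:
            state = "B" if state == "G" else "G"

trace = list(gilbert_trace(10, q_good=0.9, q_bad=0.0, p_gb=0.1, p_bg=0.3,
                           rng=random.Random(2)))
print(trace)
```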

  11. I: Augmented Markov chain for reception at each destination. State (S, j), where S is the “Good” or “Bad” channel state and j = 0, 1, …, K is the number of linearly independent random linear combinations that have been received. P: transition probability matrix. Assume qB = 0, so the initial state is always S = G. Transmission time T1: time to reach a state (S, K) from (G, 0).
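
A sketch of computing E[T1] from this augmented chain by first-step analysis. Beyond the slide, it assumes every received packet is innovative, that the channel state updates after each slot's reception, and illustrative values for the Good/Bad switching probabilities.

```python
# Expected time to collect K combinations on a Gilbert channel, via the augmented chain (S, j).
import numpy as np

def expected_T1(K, q_g, q_b, p_gb, p_bg):
    """Expected slots to go from (G, 0) to j = K, by first-step analysis."""
    # Transient states (S, j) for j = 0..K-1; index = 2*j + (0 if G else 1).
    idx = lambda s, j: 2 * j + (0 if s == "G" else 1)
    n = 2 * K
    Q = np.zeros((n, n))
    for j in range(K):
        for s, q, flip in (("G", q_g, p_gb), ("B", q_b, p_bg)):
            other = "B" if s == "G" else "G"
            for received, pr in ((True, q), (False, 1 - q)):
                j2 = j + 1 if received else j
                if j2 == K:
                    continue                      # absorbed: contributes 0 to the hitting time
                Q[idx(s, j), idx(s, j2)] += pr * (1 - flip)
                Q[idx(s, j), idx(other, j2)] += pr * flip
    # Solve (I - Q) h = 1 for the expected hitting times of the transient states.
    h = np.linalg.solve(np.eye(n) - Q, np.ones(n))
    return h[idx("G", 0)]

print(expected_T1(K=10, q_g=0.9, q_b=0.0, p_gb=0.1, p_bg=0.3))
```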

  12. I: Multicast throughput versus K (figure). Comparison against a time-invariant channel with a fixed probability of reception. Parameters: M = 10, n = 250, u = 8, QAM modulation over an AWGN channel with SNR per bit of 3.5 dB in the “Good” state and −∞ dB in the “Bad” state.

  13. II: Coding across multicast sessions. • One source node • K multicast sessions, each with an independent arrival process of equal rate • Each session serves M destination nodes • Channels to all MK destination nodes are identically distributed with reception probability q. Random linear coding: create random linear combinations from the K head-of-line packets, one from each session (see the sketch below).
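
A short sketch of the head-of-line combining step, using the same illustrative prime-field arithmetic as the earlier sketch; the queue contents, field size, and function name are assumptions.

```python
# Inter-session coding: one coded packet spans the K head-of-line packets, one per session queue.
from collections import deque
import random

P = 257                                            # illustrative prime field size
rng = random.Random(3)
K, n = 4, 5                                        # sessions, symbols per packet
queues = [deque([[rng.randrange(P) for _ in range(n)]]) for _ in range(K)]

def combine_heads(queues, rng):
    """Form one random linear combination of the K head-of-line packets."""
    heads = [q[0] for q in queues]                 # heads stay queued until all M*K destinations decode
    while True:
        a = [rng.randrange(P) for _ in heads]      # redraw if all coefficients are zero
        if any(a):
            break
    x = [sum(a[i] * heads[i][j] for i in range(len(heads))) % P
         for j in range(len(heads[0]))]
    return a, x

print(combine_heads(queues, rng))
```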

  14. II: Coding across multicast sessions. For successful decoding, each destination must decode the packets from all K multicast sessions. Using the bound on E[max(X1, X2, …, XMK)] over all MK destinations, we bound the throughput, for any t > 0, as throughput ≥ Kt / ln( MK E[exp(t T1)] ).

  15. II: Multicast throughput for coding across sessions (figure). For a large number of sessions and receivers per session, coding outperforms retransmissions. Parameters: K = 50, q = 0.8.

  16. Conclusions • We provided a lower bound on multicast throughput for random linear coding while accounting for packet length, overhead, SNR, and fading. • We demonstrated that random linear coding across multiple multicast sessions often outperforms ARQ. • Future work: incorporate channel coding within packets and study how to allocate coding within and among packets; code over multiple packets from multiple flows.
