## Andrea Goldsmith, Stanford University


## Capacity Limits of Wireless Channels with Multiple Antennas: Challenges, Insights, and New Mathematical Methods

Andrea Goldsmith, Stanford University
Co-authors: T. Holliday, S. Jafar, N. Jindal, S. Vishwanath
Princeton-Rutgers Seminar Series, Rutgers University, April 23, 2003

## Future Wireless Systems

Ubiquitous communication among people and devices:

- Nth-generation cellular
- Nth-generation WLANs
- Wireless entertainment
- Wireless ad hoc networks
- Sensor networks
- Smart homes/appliances
- Automated cars/factories
- Telemedicine/learning
- All this and more…

## Challenges

- The wireless channel is a randomly varying broadcast medium with limited bandwidth.
- Fundamental capacity limits and good protocol designs for wireless networks are open problems.
- Hard energy and delay constraints change fundamental design principles.
- Many applications fail miserably with a “generic” network approach: need for cross-layer design.

## Outline

- Wireless channel capacity
- Capacity of MIMO channels
  - Imperfect channel information
  - Channel correlations
- Multiuser MIMO channels
  - Duality and dirty paper coding
- Lyapunov exponents and capacity

## Wireless Channel Capacity: Fundamental Limit on Data Rates

Capacity: the set of simultaneously achievable rates {R1, …, Rn}.

Main drivers of channel capacity:

- Bandwidth and power
- Statistics of the channel
- Channel knowledge and how it is used
- Number of antennas at TX and RX

(Figure: capacity regions in the (R1, R2, R3) rate space.)

## MIMO Channel Model

- n TX antennas, m RX antennas.
- (Figure: 3×3 antenna diagram with channel gains h11, …, h33 between TX symbols x1, x2, x3 and RX symbols y1, y2, y3.)
- The model applies to any channel described by a matrix (e.g., ISI channels).

## What’s so great about MIMO?

- Fantastic capacity gains (Foschini/Gans ’96, Telatar ’99):
  - Capacity grows linearly with antennas when the channel is known perfectly at TX and RX.
  - Vector codes (or scalar codes with SIC) are optimal.
- Assumptions:
  - Perfect channel knowledge
  - Spatially uncorrelated fading: rank(HᵀQH) = min(n, m)
- What happens when these assumptions are relaxed?

## Realistic Assumptions

- No transmitter knowledge of H: capacity is much smaller.
- No receiver knowledge of H: capacity does not increase as the number of antennas increases (Marzetta/Hochwald ’99).
- Will the promise of MIMO be realized in practice?

## Partial Channel Knowledge

- Model the channel as H ~ N(μ, Σ).
- The receiver knows the channel H perfectly.
- The transmitter has partial information θ about H.

## Partial Information Models

- Channel mean information: the mean is measured, the covariance is unknown.
- Channel covariance information: the mean is unknown, the covariance is measured.
- We have developed necessary and sufficient conditions for the optimality of beamforming:
  - Obtained for both MISO and MIMO channels.
  - The optimal transmission strategy is also known.

## Beamforming

- Scalar codes with transmit precoding.
- Transforms the MIMO system into a SISO system.
- Greatly simplifies encoding and decoding.
- Channel knowledge indicates the best direction to beamform.
- Need “sufficient” knowledge for optimality.

## No Tx or Rx Knowledge

- Increasing the number of TX antennas beyond the coherence time T in a block-fading channel does not increase capacity (Marzetta/Hochwald ’99).
- Assumes uncorrelated fading.
- We have shown that with correlated fading, adding TX antennas always increases capacity.
- Small transmit antenna spacing is good!
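The linear capacity growth claimed above can be checked numerically. Below is a minimal sketch, not the authors' code, assuming i.i.d. Rayleigh fading, equal per-antenna power (rather than the waterfilling allocation used with perfect TX knowledge), and channel knowledge at the receiver only; the `snr` value and trial count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
snr = 10.0  # total transmit SNR, linear scale (assumed value)

def ergodic_mimo_capacity(n, trials=500):
    """Average log2 det(I + (SNR/n) H H^H) over i.i.d. Rayleigh channels
    with n TX and n RX antennas and equal power per antenna."""
    total = 0.0
    for _ in range(trials):
        # Complex Gaussian channel matrix, unit-variance entries
        H = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
        G = np.eye(n) + (snr / n) * (H @ H.conj().T)
        total += np.log2(np.linalg.det(G).real)
    return total / trials

# Capacity grows roughly in proportion to the number of antennas
for n in (1, 2, 4, 8):
    print(n, round(ergodic_mimo_capacity(n), 2))
```

The near-linear scaling with `n` is the Foschini/Gans and Telatar result referenced on the slide; with correlated fading or imperfect channel knowledge the growth is slower, which is the subject of the next slides.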
## Impact of Spatial Correlations on Channel Capacity

- Perfect RX and TX knowledge: correlation hurts (Boche/Jorswieck ’03).
- Perfect RX knowledge, no TX knowledge: hurts (BJ ’03).
- Perfect RX knowledge, TX knows the correlation: helps.
- TX and RX only know the correlation: helps.

## Gaussian Broadcast and Multiple Access Channels

- Broadcast channel (BC): one transmitter to many receivers.
- Multiple access channel (MAC): many transmitters to one receiver.
- Transmit power constraint.
- Perfect TX and RX knowledge.

(Figure: BC and MAC diagrams with time-varying gains h1(t), h21(t), h22(t), h3(t).)

## Comparison of MAC and BC

- Differences:
  - Shared vs. individual power constraints.
  - Near-far effect in the MAC.
- Similarities:
  - Optimal BC “superposition” coding is also optimal for the MAC (sum of Gaussian codewords).
  - Both decoders exploit successive decoding and interference cancellation.

## MAC-BC Capacity Regions

- The MAC capacity region is known for many cases: a convex optimization problem.
- The BC capacity region is typically known only for (parallel) degraded channels, and the formulas are often not convex.
- Can we find a connection between the BC and MAC capacity regions? Duality.

## Dual Broadcast and MAC Channels

Gaussian BC and MAC with the same channel gains and the same noise power at each receiver.

(Figure: dual MAC and BC channel diagrams.)

## The BC from the MAC

(Figure: rate regions for (P1, P2) = (0.5, 1.5), (1, 1), (1.5, 0.5); blue = BC, red = MAC; MAC with sum-power constraint.)

## Sum-Power MAC

- MAC with a sum power constraint:
  - Power is pooled between the MAC transmitters.
  - No transmitter coordination.
- Same capacity region!
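The sum-power MAC-BC equivalence can be illustrated for the scalar two-user Gaussian case. The sketch below uses standard superposition-coding and successive-decoding rate formulas with example gains and power (the values `h1`, `h2`, `P` are assumptions, not from the slides), and checks that the BC sum capacity matches the best sum rate over all power splits in the dual sum-power MAC.

```python
import numpy as np

h1, h2, P = 2.0, 0.5, 1.0  # example gains (user 1 stronger) and sum power

def bc_rates(a):
    """Two-user Gaussian BC, superposition coding: fraction a of power P
    carries user 1; user 1 (stronger) decodes and cancels user 2 first."""
    r1 = np.log2(1 + h1 * a * P)
    r2 = np.log2(1 + h2 * (1 - a) * P / (1 + h2 * a * P))
    return r1, r2

def mac_rates(p1, p2):
    """Dual Gaussian MAC: user 2 decoded first (sees user 1 as noise),
    user 1 decoded last, interference-free."""
    r2 = np.log2(1 + h2 * p2 / (1 + h1 * p1))
    r1 = np.log2(1 + h1 * p1)
    return r1, r2

# BC sum capacity: all power to the strongest user.
bc_sum = np.log2(1 + h1 * P)
# Sum-power MAC: sweep power splits p1 + p2 = P and take the best sum rate.
mac_sum = max(sum(mac_rates(t * P, (1 - t) * P)) for t in np.linspace(0, 1, 1001))
print(bc_sum, mac_sum)  # the two sum capacities coincide
```

The MAC sum rate for any split collapses to log2(1 + h1·p1 + h2·p2), which is maximized by giving all power to the stronger user, exactly matching the BC sum capacity, which is the scalar shadow of the duality result on these slides.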
## BC to MAC: Channel Scaling

- Scale the channel gain by α and the power by 1/α: the MAC capacity region is unaffected.
- The scaled MAC capacity region is a subset of the scaled BC capacity region for any α.
- Hence the MAC region lies inside the scaled BC region for any scaling.

## The BC from the MAC

(Figure: blue = scaled BC, red = MAC.)

## Duality: Constant AWGN Channels

- The BC region in terms of the MAC, and the MAC region in terms of the BC.
- What is the relationship between the optimal transmission strategies?

## Transmission Strategy Transformations

- Equate rates, solve for powers.
- Opposite decoding order:
  - The stronger user (User 1) is decoded last in the BC.
  - The weaker user (User 2) is decoded last in the MAC.

## Duality Applies to Different Fading Channel Capacities

- Ergodic (Shannon) capacity: maximum rate averaged over all fading states.
- Zero-outage capacity: maximum rate that can be maintained in all fading states.
- Outage capacity: maximum rate that can be maintained in all non-outage fading states.
- Minimum-rate capacity: a minimum rate is maintained in all states; maximize the average rate in excess of the minimum.
- Explicit transformations exist between the transmission strategies.

## Duality: Minimum Rate Capacity

The MAC in terms of the BC. (Figure: blue = scaled BC, red = MAC.)

- The BC region is known.
- The MAC region can only be obtained by duality.
- What other unknown capacity regions can be obtained by duality?

## Dirty Paper Coding (Costa ’83)

- Basic premise: if the interference is known, the channel capacity is the same as if there were no interference.
- Accomplished by cleverly distributing the writing (codewords) and coloring their ink.
- The decoder must know how to read these codewords.

(Figure: dirty paper coding on a clean channel vs. a dirty channel.)

## Modulo Encoding/Decoding

- Received signal Y = X + S, with −1 ≤ X ≤ 1.
- S is known to the transmitter, not the receiver.
- The modulo operation removes the interference effects.
- Set X so that Y modulo [−1, 1] equals the desired message (e.g., 0.5).
- The receiver demodulates modulo [−1, 1].

(Figure: the real line …, −7, −5, −3, −1, 0, +1, +3, +5, +7, … showing the message replicated in each modulo interval, shifted by S.)

## Broadcast MIMO Channel

- t ≥ 1 TX antennas; r1 ≥ 1, r2 ≥ 1 RX antennas.
- Perfect CSI at TX and RX.
- Non-degraded broadcast channel.

## Capacity Results

- Non-degraded broadcast channel: receivers are not necessarily “better” or “worse” due to multiple transmit/receive antennas, and the capacity region for the general case is unknown.
- Pioneering work by Caire/Shamai (Allerton ’00):
  - Two TX antennas / two RXs (one antenna each).
  - Dirty paper coding / lattice precoding (extended by Yu/Cioffi).
  - Computationally very complex.
  - A MIMO version of the Sato upper bound.

## Dirty-Paper Coding (DPC) for the MIMO BC

- Coding scheme:
  - Choose a codeword for User 1.
  - Treat this codeword as interference to User 2.
  - Pick the signal for User 2 using “pre-coding”.
- Receiver 2 experiences no interference; the signal for Receiver 2 interferes with Receiver 1.
- The encoding order can be switched.

## Does DPC achieve capacity?

- DPC yields a MIMO BC achievable region, which we call the dirty-paper region.
- Is this region the capacity region?
- We use duality, dirty paper coding, and Sato’s upper bound to address this question.

## MIMO MAC with Sum Power

- MAC with sum power: the transmitters code independently but share power.
- Theorem: the dirty-paper BC region equals the dual sum-power MAC region.

## Transformations: MAC to BC

- Show that any rate achievable in the sum-power MAC is also achievable with DPC for the BC:
  - A sum-power MAC strategy for the point (R1, …, RN) has a given input covariance matrix and encoding order.
  - We find the corresponding PSD covariance matrix and encoding order that achieve (R1, …, RN) with DPC on the BC.
- The rank-preserving transform “flips the effective channel” and reverses the order.
- Side result: beamforming is optimal for the BC with one RX antenna at each mobile.

## Transformations: BC to MAC

- Show that any rate achievable with DPC in the BC is also achievable in the sum-power MAC:
  - We find a transformation between the optimal DPC strategy and the optimal sum-power MAC strategy.
  - “Flip the effective channel” and reverse the order.

## Computing the Capacity Region

- The DPC region is hard to compute directly (Caire/Shamai ’00).
- The MIMO MAC capacity region is “easy” to compute.
- Obtain the DPC region by solving for the sum-power MAC and applying the theorem; fast iterative algorithms have been developed.
- This greatly simplifies calculation of the DPC region and the associated transmit strategy.

## Sato Upper Bound on the BC Capacity Region

- Based on receiver cooperation: the BC sum-rate capacity is upper-bounded by the cooperative capacity with a joint receiver.

## The Sato Bound for the MIMO BC

- Introduce noise correlation between the receivers: the BC capacity region is unaffected, since it depends only on the noise marginals.
- Tight bound (Caire/Shamai ’00): the cooperative capacity with the worst-case noise correlation.
- Explicit formula for the worst-case noise covariance.
- By Lagrangian duality, the cooperative BC region equals the sum-rate capacity region of the MIMO MAC.

## Sum-Rate Proof

(Diagram: proof chain linking the DPC achievable region, duality, the sum-power MAC computed from the MAC, Lagrangian duality, and the Sato bound. The same result was obtained by Vishwanath/Tse for one RX antenna.)

## MIMO BC Capacity Bounds

(Figure: single-user capacity bounds, the dirty-paper achievable region, the BC sum-rate point, and the Sato upper bound.)

- Does the DPC region equal the capacity region?

## Full Capacity Region

- DPC gives us an achievable region.
- The Sato bound touches it only at the sum-rate point.
- We need a tighter bound to prove that DPC is optimal.

## A Tighter Upper Bound

- Give the data of one user to the other users: the channel becomes a degraded BC.
- The capacity region of the degraded BC is known, giving a tight upper bound on the original channel capacity.
- This bound and duality prove that DPC achieves capacity under a Gaussian input restriction.
- It remains to be shown that Gaussian inputs are optimal.

## Full Capacity Region Proof

(Diagram: the tight upper bound combined with duality, computed from the MAC; the worst-case noise diagonalizes, yielding the final result.)

## Time-Varying Channels with Memory

- Time-varying channels with finite memory induce infinite memory in the channel output.
- The capacity of time-varying infinite-memory channels is known only in terms of a limit.
- Closed-form capacity solutions are known in only a few cases, e.g., the Gilbert-Elliott and finite-state Markov channels.

## A New Characterization of Channel Capacity

- Capacity using Lyapunov exponents: the Lyapunov exponent λ(X) is defined via a product of random matrices B_{X_i} whose entries depend on the input symbol X_i.
- Similar definitions hold for λ(Y) and λ(X; Y); the matrices B_{Y_i} and B_{X_i Y_i} depend on the input and the channel.

## Lyapunov Exponents and Entropy

- The Lyapunov exponent equals the entropy under certain conditions.
- Entropy as a product of random matrices: a connection between information theory and dynamical systems theory.
- We still have a limiting expression for entropy, and the sample entropy has poor convergence properties.

## Lyapunov Direction Vector

- The vector p_n is the “direction” associated with λ(X) for any m.
- It also defines the conditional channel state probability.
- The vector has a number of interesting properties:
  - It is the standard prediction filter in hidden Markov models.
  - Under certain conditions we can use its stationary distribution to directly compute λ(X).

## Computing Lyapunov Exponents

- Define π as the stationary distribution of the direction vector p_n.
- We prove that these Lyapunov exponents can be computed in closed form.
- This result is a significant advance in the theory of Lyapunov exponent computation.

(Figure: evolution of the direction vectors p_n, p_{n+1}, p_{n+2} toward the stationary distribution π.)

## Computing Capacity

- Closed-form formula for mutual information.
- We prove continuity of the Lyapunov exponents with respect to the input distribution and the channel.
- We can thus maximize mutual information over the channel input distribution to obtain capacity.
- Numerical results for time-varying SISO and MIMO channel capacity have been obtained.
- We have also developed a new CLT and confidence-interval methodology for sample entropy.
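The direction-vector recursion behind these slides can be sketched numerically: propagate a vector through a random matrix product, renormalize at each step (the normalized vector is p_n), and average the log norms to estimate the top Lyapunov exponent. The two 2×2 matrices below are made-up stand-ins for the input-dependent matrices B_{X_i}, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-ins for the matrices B_{X_i}: two fixed nonnegative 2x2
# matrices, one drawn uniformly at random per step (i.i.d. "inputs").
B = [np.array([[0.9, 0.1], [0.2, 0.8]]),
     np.array([[0.5, 0.5], [0.4, 0.6]])]

def lyapunov_exponent(steps=100_000):
    """Estimate the top Lyapunov exponent of the product B_n ... B_1 by
    accumulating log norms of a repeatedly renormalized direction vector."""
    p = np.array([0.5, 0.5])
    log_sum = 0.0
    for _ in range(steps):
        p = B[rng.integers(2)] @ p
        norm = p.sum()      # 1-norm keeps p a probability-like direction
        log_sum += np.log(norm)
        p /= norm           # renormalized p is the direction vector p_n
    return log_sum / steps

print(lyapunov_exponent())
```

This is exactly the "poor convergence" route the slides improve upon: the running average converges slowly, whereas the closed-form result integrates the per-step log norm against the stationary distribution π of p_n instead of simulating.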