Presentation Transcript



Workshop on Random Matrix Theory and Wireless Communications

Bridging the Gaps: Free Probability and Channel Capacity

  • Antonia Tulino

  • Università degli Studi di Napoli

  • Chautauqua Park, Boulder, Colorado, July 17, 2008



    Linear Vector Channel

    noise = AWGN + interference

    N-dimensional output

    K-dimensional input

    (N × K) channel matrix

    • A variety of communication problems are obtained by simply reinterpreting K, N, and H:

    • Fading

    • Wideband

    • Multiuser

    • Multiantenna



    Role of the Singular Values

    Mutual Information:

    Ergodic case:

    Non-Ergodic case:
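    The expressions on this slide are images. As a hedged reference, with isotropic Gaussian inputs and a suitably normalized SNR (notation assumed, not taken from the slide), the mutual information per receive dimension depends on H only through its squared singular values:

    \[
    \frac{1}{N}\, I(\mathbf{x};\mathbf{y}) \;=\; \frac{1}{N}\log\det\!\left(\mathbf{I} + \mathsf{SNR}\,\mathbf{H}\mathbf{H}^{\dagger}\right)
    \;=\; \frac{1}{N}\sum_{i=1}^{N}\log\!\left(1 + \mathsf{SNR}\,\lambda_i(\mathbf{H}\mathbf{H}^{\dagger})\right),
    \]

    averaged over H in the ergodic case, and treated as a random variable (outage formulation) in the non-ergodic case.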


    Role of the Singular Values

    Minimum Mean-Square Error (MMSE):
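    The MMSE expression on the slide is likewise an image. For isotropic Gaussian inputs, the normalized MMSE of estimating x from y also depends on H only through its squared singular values; a standard form (notation assumed) is:

    \[
    \mathsf{MMSE} \;=\; \frac{1}{K}\,\mathrm{tr}\!\left\{\left(\mathbf{I} + \mathsf{SNR}\,\mathbf{H}^{\dagger}\mathbf{H}\right)^{-1}\right\}
    \;=\; \frac{1}{K}\sum_{i=1}^{K}\frac{1}{1 + \mathsf{SNR}\,\lambda_i(\mathbf{H}^{\dagger}\mathbf{H})}.
    \]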


    H-Model

    • Independent and identically distributed entries

    • Separable correlation model

    • UIU model with independent, arbitrarily distributed entries

    • with which is uniformly distributed over the manifold of complex matrices such that



    Gaussian Erasure Channels

    Flat Fading & Deterministic ISI:

    Random erasure mechanisms:

    • link congestion/failure (networks)

    • cellular system with unreliable wired infrastructure

    • impulse noise (DSL)

    • faulty transducers (sensor networks)


    d-Fold Vandermonde Matrix

    i.i.d. with uniform distribution in [0, 1]^d

    • Sensor networks

    • Multiantenna multiuser communications

    • Detection of distributed targets
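    The matrix itself appears on the slide only as an image. One common definition of a d-fold Vandermonde matrix with phases drawn i.i.d. uniformly on [0, 1]^d is (assumed here for concreteness, including the normalization):

    \[
    \left[\mathbf{V}\right]_{\mathbf{j},k} \;=\; \frac{1}{\sqrt{N^{d}}}\;
    e^{-2\pi i\,\mathbf{j}\cdot\boldsymbol{\omega}_k},
    \qquad \mathbf{j}\in\{0,\dots,N-1\}^{d},\quad
    \boldsymbol{\omega}_k \ \text{i.i.d. uniform on}\ [0,1]^{d},
    \]

    so that V has N^d rows (one per grid point) and one column per phase vector.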


    Flat Fading & Deterministic ISI:

    an i.i.d. sequence


    Formulation

    = asymptotically circulant matrix (stationary input with PSD)

    = asymptotically circulant matrix

    Grenander-Szegő theorem

    Eigenvalues of S
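    Since the symbols around the theorem are images, here is the standard statement being invoked (a paraphrase of the Grenander-Szegő limit theorem, notation assumed): if T_n is the n×n Toeplitz (or asymptotically circulant) matrix generated by a bounded spectral density S(f), then for any continuous function F,

    \[
    \lim_{n\to\infty}\frac{1}{n}\sum_{i=1}^{n} F\big(\lambda_i(\mathbf{T}_n)\big)
    \;=\; \int_{0}^{1} F\big(S(f)\big)\,df,
    \]

    i.e., the eigenvalues of T_n are asymptotically distributed as samples of S(f) with f uniform on [0, 1].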


    Deterministic ISI & |Ai| = 1

    where the waterfilling input power spectral density is given by:

    the water level is chosen so that:

    Key Tool: Grenander-Szegő theorem on the distribution of the eigenvalues of large Toeplitz matrices
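    The waterfilling expressions themselves are images. As a hedged reference, for a deterministic ISI channel with frequency response H(f) and noise power spectral density N_0 (notation assumed), the classical waterfilling solution reads

    \[
    P(f) \;=\; \left(\nu - \frac{N_0}{|H(f)|^{2}}\right)^{\!+},
    \qquad \text{with } \nu \text{ chosen so that } \int_{0}^{1} P(f)\,df = P,
    \]

    where (x)^+ = max(x, 0) and P is the input power constraint.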



    Deterministic ISI & Flat Fading

    Key Question: The distribution of the eigenvalues of a large-dimensional random matrix:

    • S = asymptotically circulant matrix

    • A = random diagonal fading matrix


    Ai = ei ∈ {0, 1}

    Key Question: The distribution of the eigenvalues of a large-dimensional random matrix:

    • S = asymptotically circulant matrix

    • E = random 0-1 diagonal matrix


    RANDOM MATRIX THEORY: η- & Shannon-Transform

    The η- and Shannon-transform of a nonnegative definite random matrix with asymptotic ESD:

    with X a nonnegative random variable whose distribution is the asymptotic ESD, while γ is a nonnegative real number.

    A. M. Tulino and S. Verdú, “Random Matrices and Wireless Communications,” Foundations and Trends in Communications and Information Theory, vol. 1, no. 1, June 2004.
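    The definitions on the slide are images; from the cited monograph, the standard definitions are (with F_X the distribution of X, γ ≥ 0, and the logarithm base set by convention):

    \[
    \eta_X(\gamma) \;=\; \mathbb{E}\!\left[\frac{1}{1+\gamma X}\right]
    \;=\; \int \frac{1}{1+\gamma x}\,dF_X(x),
    \qquad
    \mathcal{V}_X(\gamma) \;=\; \mathbb{E}\!\left[\log\!\left(1+\gamma X\right)\right],
    \]

    applied to a random matrix through a random variable X distributed as its asymptotic ESD.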


    RANDOM MATRIX THEORY: Shannon-Transform

    Theorem:

    • Let A be a nonnegative definite random matrix.

    The Shannon transform and η-transform are related through:

    where the auxiliary variable is defined by the fixed-point equation:
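    The specific relation and its fixed-point equation are images on the slide, so they are not reproduced here. A closely related identity that follows directly from the definitions above links the two transforms through a derivative; in nats,

    \[
    \gamma\,\frac{d}{d\gamma}\,\mathcal{V}_X(\gamma) \;=\; 1 - \eta_X(\gamma),
    \]

    since \(\frac{d}{d\gamma}\,\mathbb{E}[\ln(1+\gamma X)] = \mathbb{E}\!\left[\frac{X}{1+\gamma X}\right] = \frac{1-\eta_X(\gamma)}{\gamma}\).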


    Property of

    • is monotonically increasing with γ

    • which is the solution to the equation

    • is monotonically decreasing with y


    Theorem: η-Transform

    Theorem:

    The η-transform of is

    where is the solution to:



    Theorem: Shannon-Transform

    Theorem:

    The Shannon-transform of is

    where a and n are the solutions to:


    Flat Fading & Deterministic ISI:

    Stationary Gaussian inputs with power spectral density

    Theorem:

    The mutual information is:

    with





    Special Case: No Fading



    Special Case: Memoryless Channels


    Special Case: Gaussian Erasure Channels

    Stationary Gaussian inputs with power spectral density

    Theorem:

    The mutual information is:

    with


    Flat Fading & Deterministic ISI:

    Stationary Gaussian inputs with power spectral density

    Let

    with:

    Theorem:

    The mutual information is:


    Example: n = 200


    Example: n = 1000
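    The plots for n = 200 and n = 1000 are images. Below is a minimal Python sketch of the kind of finite-size Monte Carlo such comparisons rest on, under assumptions of my own (an example input PSD, erasure probability 0.3, SNR = 10); it is not the authors' code.

    # Hedged sketch: finite-n mutual information of a Gaussian erasure channel.
    # Sigma is an n x n circulant matrix whose eigenvalues sample an assumed PSD,
    # and E is a random 0-1 diagonal erasure matrix.

    import numpy as np

    def mutual_info_per_dim(n, snr, erasure_prob, trials=20, seed=0):
        rng = np.random.default_rng(seed)
        f = np.arange(n) / n
        psd = 1.0 + 0.5 * np.cos(2 * np.pi * f)          # assumed example PSD
        F = np.fft.fft(np.eye(n)) / np.sqrt(n)           # unitary DFT matrix
        Sigma = (F.conj().T @ np.diag(psd) @ F).real     # circulant, eigenvalues = psd
        vals = []
        for _ in range(trials):
            keep = rng.random(n) > erasure_prob          # sample kept w.p. 1 - e
            E = np.diag(keep.astype(float))
            M = np.eye(n) + snr * E @ Sigma @ E
            _sign, logdet = np.linalg.slogdet(M)
            vals.append(logdet / (n * np.log(2)))        # bits per dimension
        return np.mean(vals)

    if __name__ == "__main__":
        for n in (200, 1000):
            print(n, mutual_info_per_dim(n, snr=10.0, erasure_prob=0.3))

    The empirical values of (1/n) log det(I + SNR·EΣE) can then be compared against the asymptotic expressions of the preceding theorems as n grows.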


    Input Optimization

    Theorem:

    • be a random matrix

    • such that

    with so that:

    i-th column of

    A. M. Tulino, A. Lozano and S. Verdú, “Capacity-Achieving Input Covariance for Single-User Multi-Antenna Channels,” IEEE Trans. on Wireless Communications, 2006.



    Input Optimization

    Theorem:

    The capacity-achieving input power spectral density is:

    where

    and is chosen so that


    Input Optimization

    Corollary:

    Effect of fading on the capacity-achieving input power spectral density = SNR penalty

    with

    the waterfilling solution for γ

    the fading-free water level for γ

    k < 1 regulates the amount of water admitted on each frequency, tailoring the waterfilling for no-fading to fading channels.
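    The corollary compares the fading case against the fading-free waterfilling baseline. As a reference point only, here is a minimal sketch of standard discrete-frequency waterfilling (the textbook algorithm, not the slide's closed form; the example gains and power budget are assumptions).

    # Hedged sketch: allocate power over frequencies with gains g[k] = |H(f_k)|^2 / N0
    # by bisecting on the water level nu so that sum((nu - 1/g)^+) equals the budget.

    import numpy as np

    def waterfill(gains, p_total, tol=1e-9):
        """Return per-frequency powers (nu - 1/g)^+ meeting the total power budget."""
        inv = 1.0 / np.asarray(gains, dtype=float)
        lo, hi = 0.0, inv.max() + p_total            # water level is bracketed here
        while hi - lo > tol:
            nu = 0.5 * (lo + hi)
            p = np.clip(nu - inv, 0.0, None)
            if p.sum() > p_total:
                hi = nu
            else:
                lo = nu
        nu = 0.5 * (lo + hi)
        return np.clip(nu - inv, 0.0, None), nu

    if __name__ == "__main__":
        f = np.linspace(0, 1, 64, endpoint=False)
        gains = 1.0 + 0.5 * np.cos(2 * np.pi * f)    # assumed example |H(f)|^2 / N0
        p, nu = waterfill(gains, p_total=64.0)
        print("water level:", nu, "allocated power:", p.sum())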


    Theorem: η-Transform

    Theorem:

    The η-transform of is

    where is the solution to:


    Proof: Key Ingredient

    • We can replace S by its asymptotically equivalent circulant counterpart, FΛF†

    • Let Q = EF, denote by qi the ith column of Q, and let



    Proof:

    Matrix inversion lemma:



    Proof:



    Proof:

    Lemma:


    Asymptotics

    • Low-power (SNR → 0)

    • High-power (SNR → ∞)


    Asymptotics: High-SNR

    At large SNR we can closely approximate it linearly; this requires the high-SNR dB offset and the high-SNR slope

    where

    High-SNR dB offset

    High-SNR slope
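    The defining expressions are images; the usual high-SNR affine expansion behind these two quantities (assumed notation, following the standard S∞ / L∞ framework) is

    \[
    C(\mathsf{SNR}) \;=\; S_{\infty}\!\left(\log_{2}\mathsf{SNR} - \mathcal{L}_{\infty}\right) + o(1),
    \qquad
    S_{\infty} = \lim_{\mathsf{SNR}\to\infty}\frac{C(\mathsf{SNR})}{\log_{2}\mathsf{SNR}},
    \quad
    \mathcal{L}_{\infty} = \lim_{\mathsf{SNR}\to\infty}\left(\log_{2}\mathsf{SNR} - \frac{C(\mathsf{SNR})}{S_{\infty}}\right),
    \]

    i.e., the slope in bits per 3 dB and the zero-order dB offset.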



    Asymptotics: High-SNR

    Theorem:

    Let ,

    and the generalized bandwidth,


    Asymptotics

    • Sporadic Erasures (e → 0)

    • Sporadic Non-Erasures (e → 1)


    Asymptotics: Sporadic Erasures (e → 0)

    Memoryless noisy erasure channel

    High SNR

    where is the water level of the PSD that achieves

    Low SNR

    Theorem:

    For any output power spectral density and

    Theorem:

    For sporadic erasures:


    Asymptotics: Sporadic Non-Erasures (e → 1)

    Theorem:

    Optimizing over with

    with the maximum channel gain

    Theorem:


    Bounds:

    Theorem:

    The mutual information rate is lower bounded by:

    Equality: S(f) = 1



    Bounds:

    Theorem:

    The mutual information rate is upper bounded by:


    d-Fold Vandermonde Matrix

    Diagonal matrix (either random or deterministic) with compactly supported measure

    Diagonal matrix (either random or deterministic) with compactly supported measure


    d-Fold Vandermonde Matrix

    Theorem:

    The η-transform of is

    The Shannon-transform is


    d-Fold Vandermonde Matrix

    Theorem:

    The p-th moment of is:


    Summary

    • Asymptotic eigenvalue distribution of A S A: a new result at the intersection of the asymptotic eigenvalue distribution of Toeplitz matrices and that of random matrices.

    • The mutual information of a channel with ISI and fading.

    • Optimality of waterfilling in the presence of fading known at the receiver.

    • Easily computable asymptotic expressions in various regimes (low and high SNR).

    • New results for d-fold Vandermonde matrices and their products with diagonal matrices.

