
Wireless Communication
ELEC 534, Set I
September 9, 2007

Behnaam Aazhang


The Course

  • Light homework

  • Team project

  • Individual paper presentations

    • Mid October

  • Team project presentations

    • Early December


Multiuser Network

  • Multiple nodes with information


Outline

  • Transmission over simple channels

    • Information theoretic approach

    • Fundamental limits

    • Approaching capacity

  • Fading channel models

    • Multipath

    • Rayleigh

    • Rician


Outline

  • Transmission over fading channels

    • Information theoretic approach

    • Fundamental limits

    • Approaching achievable rates

  • Communication with “additional” dimensions

    • Multiple input multiple output (MIMO)

      • Achievable rates

      • Transmission techniques

    • User cooperation

      • Achievable rates

      • Transmission techniques


Outline

  • Wireless network

    • Cellular radios

    • Multiple access

      • Achievable rate region

      • Multiuser detection

    • Random access


Why Information Theory?

  • Information is modeled as random

  • Information is quantified

  • Transmission of information

    • Model driven

    • Reliability measured

    • Rate is established


Information

  • Entropy

    • Higher entropy (more randomness) means higher information content

  • Random variable

    • Discrete

    • Continuous


Communication

  • Information transmission

  • Mutual information

[Figure: channel block diagram; useful information enters the channel, at most the maximum useful information gets through, and noise contributes useless information]


Wireless

  • Information transmission

[Figure: channel block diagram; useful information enters the channel, at most the maximum useful information gets through, limited by randomness due to the channel, by noise (useless information), and by interference]


Multiuser Network

  • Multiple nodes with information


References

  • C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, 1949.

  • T.M. Cover and J. Thomas, Elements of Information Theory, 1991.

  • R. Gallager, Information Theory and Reliable Communication, 1968.

  • J. Proakis, Digital Communications, 4th edition.

  • D. Tse and P. Viswanath, Fundamentals of Wireless Communication, 2005.

  • A. Goldsmith, Wireless Communications, Cambridge University Press, 2005.


References

  • E. Biglieri, J. Proakis, and S. Shamai, Fading Channels: Information-Theoretic and Communications Aspects, IEEE Trans. Inform. Theory, 1998.

  • A. Goldsmith and P. Varaiya, Capacity of Fading Channels with Channel Side Information, IEEE Trans. Inform. Theory, 1997.

  • I. E. Telatar, Capacity of Multi-antenna Gaussian Channels, European Trans. Telecommun., 1999.

  • A. Sendonaris, E. Erkip, and B. Aazhang, User Cooperation Diversity, Part I: System Description, IEEE Trans. Commun., Nov. 2003.

  • ——, User Cooperation Diversity, Part II: Implementation Aspects and Performance Analysis, IEEE Trans. Commun., Nov. 2003.

  • J. N. Laneman, D. N. C. Tse, and G. W. Wornell, Cooperative Diversity in Wireless Networks: Efficient Protocols and Outage Behavior, IEEE Trans. Inform. Theory, Dec. 2004.

  • M. A. Khojastepour, A. Sabharwal, and B. Aazhang, On Capacity of Gaussian “Cheap” Relay Channel, GLOBECOM, Dec. 2003.


Reading for Set 1

  • Tse and Viswanath

    • Chapters 5.1-5.3, 3.1

    • Appendices A, B.1-B.5

  • Goldsmith

    • Chapters 1, 4.1, 5

    • Appendices A, B, C


Single Link AWGN Channel

  • Model

    r(t) = b(t) + n(t)

    where r(t) is the baseband received signal, b(t) is the information-bearing signal, and n(t) is noise.

  • The signal b(t) is assumed to be band-limited to W.

  • The time period is assumed to be T.

  • The dimension of the signal space is N = 2WT


Signal Dimensions

  • A signal with bandwidth W sampled at the Nyquist rate.

  • W complex (independent) samples per second.

  • Each complex sample is one dimension or degree of freedom.

  • A signal of duration T and bandwidth W has 2WT real degrees of freedom and can be represented in 2WT real dimensions
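As a tiny numerical illustration of this counting (the values below are illustrative, not from the course):

```python
def real_dof(W, T):
    """Real degrees of freedom of a signal of bandwidth W (Hz) and duration T (s): N = 2WT."""
    return 2 * W * T

# Example: W = 3 Hz for T = 1 s gives 3 complex samples, i.e. 6 real dimensions.
print(real_dof(3, 1))  # 6
```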


Signals in Time Domain

  • Sampled at Nyquist rate

  • Example: three independent samples per second means three degrees of freedom

[Figure: voltage versus time over 1 second, with samples spaced 1/W apart]


Signal in Frequency Domain

  • Bandwidth W at carrier frequency fc

[Figure: power spectrum of bandwidth W centered at the carrier frequency fc]


Baseband Signal in Frequency Domain

  • Passband signal down converted

  • Bandwidth W

[Figure: baseband power spectrum of bandwidth W]


Sampling

  • The baseband signal sampled at rate W

    b(t) = Σn b[n] sinc(Wt − n)

    where sinc(x) = sin(πx)/(πx)

  • The sinc function is an example of an expansion basis
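A quick numerical check of this expansion, using NumPy's `np.sinc` (which computes sin(πx)/(πx), matching the convention above); the sample values are made up for illustration:

```python
import numpy as np

def sinc_reconstruct(samples, W, t):
    """Evaluate b(t) = sum_n b[n] sinc(W t - n) from Nyquist-rate samples b[n]."""
    n = np.arange(len(samples))
    return float(np.sum(samples * np.sinc(W * t - n)))

W = 4.0
b = np.array([0.0, 1.0, -0.5, 0.25])
# At the sample instants t = n/W the series returns the samples exactly.
print(sinc_reconstruct(b, W, 1 / W))  # 1.0
```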


Model

  • There are N orthonormal basis functions fi(t) to represent the information signal space.

  • For example, the shifted sinc pulses above.

  • The discrete-time version: ri = bi + ni, i = 1, …, N


Noise

  • Assumed to be a Gaussian process

    • Zero mean

    • Wide sense stationary

    • Flat power spectral density with height N0/2

  • Passed through a filter with BW of W

    • Samples at the rate W are Gaussian

    • Samples are independent
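A small simulation consistent with these properties (the value of N0 is assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N0 = 2.0                                            # assumed PSD height N0/2 = 1
n = rng.normal(0.0, np.sqrt(N0 / 2), size=200_000)  # i.i.d. Gaussian noise samples n_i

# Empirically: zero mean and variance N0/2, as stated above.
print(abs(n.mean()) < 0.02, abs(n.var() - N0 / 2) < 0.05)
```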


Noise

  • Projection of noise

  • The projections ni of the noise onto the orthonormal bases fi(t) are

    • zero mean

    • Gaussian

    • of variance N0/2


Noise

  • The samples of noise are Gaussian and independent

  • The received signal samples, given the information samples, are also Gaussian


Model

  • The discrete time formulation can come from sampling the received signal at the Nyquist rate of W

  • The final model: ri = bi + ni, i = 1, …, N

  • The discrete time model could have come from projection or simple sampling


Statistical Model

  • Key part of the model: the conditional density of the received samples given the information samples

  • The discrete time received signals are independent since noise is assumed white
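Because the noise is white, the conditional density factors across samples and the log-likelihood is a sum; a sketch (function and variable names are my own):

```python
import math

def log_likelihood(r, b, N0):
    """log p(r | b) for r_i = b_i + n_i with i.i.d. Gaussian noise of variance N0/2.
    Whiteness turns the product of per-sample densities into a sum of logs."""
    var = N0 / 2
    return sum(-0.5 * math.log(2 * math.pi * var) - (ri - bi) ** 2 / (2 * var)
               for ri, bi in zip(r, b))

# One sample with r = b and N0 = 2: the density peaks at 1/sqrt(2*pi).
print(math.exp(log_likelihood([0.0], [0.0], 2.0)))
```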


Entropy

  • Differential entropy

    h(X) = −∫ f(x) log f(x) dx

  • Differential conditional entropy

    h(X|Y) = −∫∫ f(x, y) log f(x|y) dx dy

    with f(x|y) = f(x, y)/f(y)


Example

  • A Gaussian random variable with mean μ and variance σ²

  • The differential entropy is h = ½ log(2πeσ²)

  • If complex, it is h = log(πeσ²)

  • Among all random variables with fixed variance Gaussian has the largest differential entropy
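A closed-form spot check of this maximum-entropy property, comparing a Gaussian against a uniform distribution of equal variance (entropies in nats):

```python
import math

def h_gaussian(var):
    """Differential entropy of a real Gaussian: 0.5 * log(2*pi*e*var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def h_uniform(var):
    """Differential entropy of a uniform r.v. with the same variance:
    width a = sqrt(12*var), so h = log(a)."""
    return math.log(math.sqrt(12 * var))

print(h_gaussian(1.0) > h_uniform(1.0))  # True for any fixed variance
```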


Proof

  • Consider two zero mean random variables X and Y with the same variance

  • Assume X is Gaussian



Proof

  • Kullback-Leibler distance

    D(fY ∥ fX) = ∫ fY(y) log [fY(y)/fX(y)] dy ≥ 0

    due to Gibbs’ inequality!


Gibbs’ Inequality

  • The KL distance is nonnegative: D(p ∥ q) ≥ 0, with equality if and only if p = q
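A discrete-case sketch of the inequality (the two distributions are chosen arbitrarily for illustration):

```python
import math

def kl(p, q):
    """Kullback-Leibler distance D(p || q) = sum_i p_i log(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [1 / 3, 1 / 3, 1 / 3]
print(kl(p, q) >= 0.0)        # nonnegative, per Gibbs' inequality
print(abs(kl(p, p)) < 1e-12)  # zero when the distributions coincide
```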


Capacity

  • Formally defined by Shannon as

    C = max I(b; r)

    where the mutual information is

    I(b; r) = h(r) − h(r|b)

    with the maximization taken over the distribution of the input b


Capacity

  • Maximum reliable rate of information through the channel with this model.

  • In our model, ri = bi + ni


Mutual Information

  • Information flow

[Figure: channel block diagram; useful information enters the channel, at most the maximum useful information gets through, and noise contributes useless information]


Capacity

  • In this model

    the maximum is achieved when the information vector has mutually independent, Gaussian distributed elements.


AWGN Channel Capacity

  • The average power of the information signal is P

  • The noise variance per sample is N0/2


AWGN Capacity

  • The original Shannon formula per unit time

    C = W log2(1 + P/(N0W)) bits/s

  • An alternate form in terms of the energy per bit Eb = P/C

    C/W = log2(1 + (Eb/N0)(C/W))
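The capacity formula can be evaluated directly; the numbers below are illustrative, and the wideband behavior Eb/N0 → ln 2 (about −1.59 dB) is a classical consequence of the energy-per-bit form:

```python
import math

def awgn_capacity(W, P, N0):
    """Shannon capacity in bits/s: C = W * log2(1 + P / (N0 * W))."""
    return W * math.log2(1 + P / (N0 * W))

# Illustrative: W = 1 MHz, SNR = P/(N0*W) = 15  ->  C = 4 Mb/s.
print(awgn_capacity(1e6, 15.0, 1e-6))

# Wideband limit: as W grows, Eb/N0 = P/(N0*C) approaches ln 2 ~ 0.693.
P, N0 = 1.0, 1e-6
print(P / (N0 * awgn_capacity(1e9, P, N0)))
```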


Achievable Rate and Converse

  • Construct a codebook with 2^(NR) codewords

  • N-dimensional space

  • Law of large numbers

  • Sphere packing


Sphere Packing

  • Number of spheres (ratio of volumes)

    M = [N(P + σ²)]^(N/2) / (Nσ²)^(N/2)

  • Non-overlapping

    • As N grows the probability of codeword error vanishes

  • Higher rates are not possible without overlap
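The volume-ratio count translates into a rate per dimension; a sketch under the sphere-packing picture above:

```python
import math

def spheres_log2(snr, N):
    """log2 of the number of non-overlapping noise spheres: (N/2) * log2(1 + SNR)."""
    return (N / 2) * math.log2(1 + snr)

def rate_per_dim(snr):
    """Achievable bits per real dimension from sphere packing: 0.5 * log2(1 + SNR)."""
    return 0.5 * math.log2(1 + snr)

print(rate_per_dim(15.0))      # 2.0 bits per real dimension
print(spheres_log2(15.0, 100)) # 200 bits across N = 100 dimensions
```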


Achievable Rate and Converse

  • Construct a codebook carrying NR bits in N channel uses (rate R bits per channel use)


Achieving Capacity

  • The information vector should have mutually independent elements with Gaussian distribution

  • The dimension N should be large

    • Complexity

  • Source has information to transmit

    • Full buffer

  • Channel is available

    • No contention for access

    • Point to point


Achieving Capacity

  • Accurate model

    • Statistical

      • Noise

    • Deterministic

      • Linear channel

  • Signal model at the receiver

    • Timing

    • Synchronization


Approaching Capacity

  • High SNR:

    • Coded modulation with large constellation size

    • Large constellation with binary codes

  • Low SNR:

    • Binary modulation

    • Turbo coding

    • LDPC coding