Wireless Communication Elec 534 Set I September 9, 2007
Behnaam Aazhang
The Course
  • Light homework
  • Team project
  • Individual paper presentations
    • Mid October
  • Team project presentations
    • Early December
Multiuser Network
  • Multiple nodes with information
Outline
  • Transmission over simple channels
    • Information theoretic approach
    • Fundamental limits
    • Approaching capacity
  • Fading channel models
    • Multipath
    • Rayleigh
    • Rician
Outline
  • Transmission over fading channels
    • Information theoretic approach
    • Fundamental limits
    • Approaching achievable rates
  • Communication with “additional” dimensions
    • Multiple-input multiple-output (MIMO)
      • Achievable rates
      • Transmission techniques
    • User cooperation
      • Achievable rates
      • Transmission techniques
Outline
  • Wireless network
    • Cellular radios
    • Multiple access
      • Achievable rate region
      • Multiuser detection
    • Random access
Why Information Theory?
  • Information is modeled as random
  • Information is quantified
  • Transmission of information
    • Model driven
    • Reliability measured
    • Rate is established
Information
  • Entropy
    • Higher entropy (more randomness) means higher information content
  • Random variable
    • Discrete
    • Continuous
Communication
  • Information transmission
  • Mutual information

[Diagram: useful information flows into the channel; the mutual information is the maximum useful information that gets through, while noise adds useless information.]
Wireless
  • Information transmission

[Diagram: useful information flows into the wireless channel; the maximum useful information that gets through is limited by noise (useless information), by interference, and by randomness due to the channel.]
Multiuser Network
  • Multiple nodes with information
References
  • C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, 1949.
  • T. M. Cover and J. A. Thomas, Elements of Information Theory, 1991.
  • R. Gallager, Information Theory and Reliable Communication, 1968.
  • J. Proakis, Digital Communications, 4th edition.
  • D. Tse and P. Viswanath, Fundamentals of Wireless Communication, 2005.
  • A. Goldsmith, Wireless Communications, Cambridge University Press, 2005.
References
  • E. Biglieri, J. Proakis, and S. Shamai, "Fading Channels: Information-Theoretic and Communications Aspects," IEEE Trans. Inform. Theory, 1999.
  • A. Goldsmith and P. Varaiya, "Capacity of Fading Channels with Channel Side Information," IEEE Trans. Inform. Theory, 1997.
  • I. E. Telatar, "Capacity of Multi-antenna Gaussian Channels," European Trans. Telecommun., 1999.
  • A. Sendonaris, E. Erkip, and B. Aazhang, "User Cooperation Diversity, Part I: System Description," IEEE Trans. Commun., Nov. 2003.
  • A. Sendonaris, E. Erkip, and B. Aazhang, "User Cooperation Diversity, Part II: Implementation Aspects and Performance Analysis," IEEE Trans. Commun., Nov. 2003.
  • J. N. Laneman, D. N. C. Tse, and G. W. Wornell, "Cooperative Diversity in Wireless Networks: Efficient Protocols and Outage Behavior," IEEE Trans. Inform. Theory, Dec. 2004.
  • M. A. Khojastepour, A. Sabharwal, and B. Aazhang, "On Capacity of Gaussian 'Cheap' Relay Channel," GLOBECOM, Dec. 2003.
Reading for Set 1
  • Tse and Viswanath
    • Chapters 5.1-5.3, 3.1
    • Appendices A, B.1-B.5
  • Goldsmith
    • Chapters 1, 4.1, 5
    • Appendices A, B, C
Single Link AWGN Channel
  • Model: r(t) = b(t) + n(t)

where r(t) is the baseband received signal, b(t) is the information-bearing signal, and n(t) is additive noise.

  • The signal b(t) is assumed to be band-limited to W.
  • The time period is assumed to be T.
  • The dimension of the signal space is N = 2WT.
Signal Dimensions
  • A signal with bandwidth W is sampled at the Nyquist rate.
  • W complex (independent) samples per second.
  • Each complex sample is one dimension or degree of freedom.
  • A signal of duration T and bandwidth W has 2WT real degrees of freedom and can be represented in 2WT real dimensions (a numerical example follows below).
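As a quick illustration (the numbers here are made up, not from the original slides): a signal confined to a bandwidth of W = 1 MHz for T = 1 ms occupies

\[
N = 2WT = 2 \times 10^{6}\,\mathrm{Hz} \times 10^{-3}\,\mathrm{s} = 2000
\]

real dimensions, i.e., 1000 complex samples.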
Signals in Time Domain
  • Sampled at the Nyquist rate
  • Example: three independent samples per second means three degrees of freedom

[Figure: sampled waveform, voltage versus time over 1 second, with samples spaced 1/W apart.]
Signal in Frequency Domain
  • Bandwidth W at carrier frequency fc

[Figure: power spectrum of the passband signal, a band of width W centered at the carrier frequency fc.]
Baseband Signal in Frequency Domain
  • Passband signal down-converted
  • Bandwidth W

[Figure: power spectrum of the baseband signal, a band of width W centered at zero frequency.]
Sampling
  • The baseband signal sampled at rate W can be recovered by sinc interpolation:

b(t) = Σ_n b[n] sinc(Wt − n),   where sinc(t) = sin(πt) / (πt)

  • The sinc function is an example of an expansion basis (a numerical check follows below).
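A minimal numerical sketch of the interpolation formula above; it is not part of the original slides, and the bandwidth and sample values are made up for illustration. It checks that the sinc expansion reproduces a band-limited signal at its own sample instants.

```python
import numpy as np

rng = np.random.default_rng(0)
W = 4.0                          # bandwidth in Hz (illustrative)
n = np.arange(-20, 21)           # sample indices
b_n = rng.normal(size=n.size)    # samples b[n] of a band-limited signal (made up)

def interpolate(t, samples, indices, W):
    """Sinc interpolation: b(t) = sum_n b[n] * sinc(W*t - n).
    np.sinc(x) is the normalized sinc, sin(pi*x)/(pi*x)."""
    return np.sum(samples * np.sinc(W * t - indices))

# At a sample instant t = k/W the expansion returns b[k] exactly.
k = 5
print(np.isclose(interpolate(k / W, b_n, n, W), b_n[n == k][0]))  # True
```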
Model
  • There are N orthonormal basis functions f_i(t) to represent the information signal space:

b(t) = Σ_{i=1}^{N} b_i f_i(t)

  • For example, suitably normalized time-shifted sinc pulses can serve as the basis.
  • The discrete time version is the vector of coefficients b_i = ∫ b(t) f_i(t) dt, i = 1, ..., N.
Noise
  • Assumed to be a Gaussian process
    • Zero mean
    • Wide sense stationary
    • Flat power spectral density with height N0/2
  • Passed through a filter with BW of W
    • Samples at the rate W are Gaussian
    • Samples are independent
Noise
  • Projection of noise onto the signal space
  • The projections n_i of n(t) onto the orthonormal bases f_i(t) are
    • zero mean
    • Gaussian
    • of variance N0/2 (a short verification follows below)
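The variance claim can be verified in one line (this calculation is added here; it uses the white-noise autocorrelation E[n(t)n(s)] = (N0/2) δ(t − s), which follows from the flat power spectral density of height N0/2):

\[
\mathbb{E}[n_i n_j]
= \int\!\!\int \mathbb{E}[n(t)\,n(s)]\, f_i(t)\, f_j(s)\, dt\, ds
= \frac{N_0}{2} \int f_i(t)\, f_j(t)\, dt
= \frac{N_0}{2}\, \delta_{ij}.
\]

So the projections are uncorrelated and, being jointly Gaussian, independent, each with variance N0/2.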
Noise
  • The samples of noise are Gaussian and independent
  • Given the information samples, the received signal samples are also Gaussian
Model
  • The discrete time formulation can come from sampling the received signal at the Nyquist rate W
  • The final model:

r_i = b_i + n_i,   i = 1, ..., N

  • The discrete time model could have come from projection or from simple sampling (a small simulation sketch follows below)
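A minimal simulation of the discrete-time model above, added for illustration; the power and noise levels are arbitrary values, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 10_000          # number of dimensions / channel uses (illustrative)
P = 1.0             # average signal power per dimension (assumed)
N0 = 0.5            # noise spectral density, so variance N0/2 per dimension (assumed)

b = rng.normal(0.0, np.sqrt(P), size=N)        # information samples b_i
n = rng.normal(0.0, np.sqrt(N0 / 2), size=N)   # noise samples n_i, variance N0/2
r = b + n                                      # received samples r_i = b_i + n_i

print("empirical signal power:", b.var())      # ~ P
print("empirical noise variance:", n.var())    # ~ N0/2
print("empirical SNR:", b.var() / n.var())     # ~ P / (N0/2)
```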
Statistical Model
  • Key part of the model: the conditional density of the received samples given the information samples
  • The discrete time received signals are independent (given the information samples) since the noise is assumed white; the factored density is given below
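In full (added here; this follows directly from the Gaussian noise model with per-dimension variance N0/2):

\[
p(r_1,\dots,r_N \mid b_1,\dots,b_N)
= \prod_{i=1}^{N} \frac{1}{\sqrt{\pi N_0}}
  \exp\!\left(-\frac{(r_i - b_i)^2}{N_0}\right).
\]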
Entropy
  • Differential entropy:

h(X) = −∫ f_X(x) log f_X(x) dx

  • Differential conditional entropy:

h(X|Y) = −∫∫ f_{X,Y}(x,y) log f_{X|Y}(x|y) dx dy

with f_{X|Y}(x|y) = f_{X,Y}(x,y) / f_Y(y).
Example
  • A Gaussian random variable with mean μ and variance σ²
  • The differential entropy is h(X) = ½ log(2πeσ²)
  • If complex (circularly symmetric with variance σ²), it is h(X) = log(πeσ²)
  • Among all random variables with fixed variance, the Gaussian has the largest differential entropy (a numerical check follows below)
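A quick numerical sanity check of the real-Gaussian entropy formula, added here; it uses natural logarithms (nats) and an arbitrary variance.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 2.0                        # variance (illustrative)
x = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)

# Monte Carlo estimate of h(X) = -E[log f_X(X)] for the Gaussian density
log_f = -0.5 * np.log(2 * np.pi * sigma2) - x**2 / (2 * sigma2)
h_mc = -log_f.mean()

h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma2)   # (1/2) log(2*pi*e*sigma^2)
print(h_mc, h_closed)                                # the two agree closely
```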
Proof
  • Consider two zero-mean random variables X and Y with the same variance
  • Assume X is Gaussian:

f_X(x) = (1/√(2πσ²)) exp(−x²/(2σ²)),   where σ² is the variance of X
Proof
  • Kullback-Leibler distance:

D(f_Y ∥ f_X) = ∫ f_Y(y) log( f_Y(y) / f_X(y) ) dy ≥ 0

due to Gibbs' inequality!
Gibbs' Inequality
  • The KL distance is nonnegative, with equality if and only if the two densities coincide (the completion of the maximum-entropy argument follows below)
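Putting the two proof slides together (these steps are added here; logarithms are natural, and X and Y share the variance σ²):

\[
0 \le D(f_Y \,\|\, f_X)
= \int f_Y \log \frac{f_Y}{f_X}
= -h(Y) - \int f_Y(y) \log f_X(y)\, dy .
\]

Since \(-\log f_X(y) = \tfrac{1}{2}\log(2\pi\sigma^2) + \tfrac{y^2}{2\sigma^2}\) and \(\mathbb{E}[Y^2] = \sigma^2\),

\[
-\int f_Y(y) \log f_X(y)\, dy = \tfrac{1}{2}\log(2\pi\sigma^2) + \tfrac{1}{2} = h(X),
\]

so \(h(Y) \le h(X)\): the Gaussian has the largest differential entropy for a given variance.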
Capacity
  • Formally defined by Shannon as

C = max_{p(b)} I(b; r)

where the mutual information is

I(b; r) = h(r) − h(r | b)

with h(·) the differential entropy defined earlier.
Capacity
  • Maximum rate of reliable information transmission through the channel with this model
  • In our model the maximization is over the input distribution, subject to an average power constraint on the information signal
Mutual Information
  • Information flow

[Diagram: useful information flows into the channel; the mutual information is the maximum useful information that gets through, while noise adds useless information.]
Capacity
  • In this model

C = max I(b; r) = max [ h(r) − h(n) ],

and the maximum is achieved when the information vector has mutually independent and Gaussian distributed elements (the per-dimension calculation follows below).
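For completeness, the standard one-dimensional calculation (added here; σ_b² denotes the variance of each information sample and σ² = N0/2 the per-dimension noise variance, symbols introduced for this note rather than taken from the slides):

\[
I(b_i; r_i) = h(r_i) - h(n_i)
\le \tfrac{1}{2}\log\!\big(2\pi e(\sigma_b^2 + \sigma^2)\big) - \tfrac{1}{2}\log\!\big(2\pi e\,\sigma^2\big)
= \tfrac{1}{2}\log\!\Big(1 + \frac{\sigma_b^2}{\sigma^2}\Big),
\]

with equality when \(b_i\) is Gaussian, so that \(r_i\) is Gaussian with variance \(\sigma_b^2 + \sigma^2\). Summing over the \(2WT\) independent real dimensions (and converting per-dimension power back to power per unit time) gives the per-second Shannon formula below.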

AWGN Channel Capacity
  • The average power of the information signal is constrained to P
  • The noise variance per real dimension is N0/2
AWGN Capacity
  • The original Shannon formula per unit time:

C = W log2(1 + P / (N0 W))   bits per second

  • An alternate form with energy per bit E_b (a numerical illustration follows below):

C/W = log2(1 + (E_b/N0) · (C/W))
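A small numerical illustration of the formula, added here; the bandwidth, power, and noise level are arbitrary example values.

```python
import numpy as np

W = 1e6        # bandwidth in Hz (illustrative)
P = 1e-6       # received signal power in watts (illustrative)
N0 = 1e-12     # noise power spectral density in W/Hz (illustrative)

snr = P / (N0 * W)                 # signal-to-noise ratio in the band
C = W * np.log2(1 + snr)           # Shannon capacity in bits per second

print(f"SNR = {10 * np.log10(snr):.1f} dB")
print(f"C   = {C / 1e6:.2f} Mbit/s")
```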
Achievable Rate and Converse
  • Construct a codebook with 2^(NR) codewords
  • N-dimensional space
  • Law of large numbers
  • Sphere packing
Sphere Packing
  • Number of spheres (ratio of volumes, worked out below)
  • Non-overlapping
    • As N grows, the probability of codeword error vanishes
  • Higher rates are not possible without overlap
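The ratio-of-volumes count behind the slide, written out (added here; P is the per-dimension signal power and σ² the per-dimension noise variance, symbols introduced for this note): the received vectors concentrate in a sphere of radius √(N(P + σ²)), while each codeword needs its own decoding sphere of radius √(Nσ²). Since the volume of an N-dimensional sphere scales as r^N,

\[
M \le \frac{\big(\sqrt{N(P+\sigma^2)}\big)^{N}}{\big(\sqrt{N\sigma^2}\big)^{N}}
= \Big(1 + \frac{P}{\sigma^2}\Big)^{N/2},
\]

so the rate is at most \(\frac{1}{N}\log_2 M \le \frac{1}{2}\log_2\!\big(1 + P/\sigma^2\big)\) bits per dimension, matching the capacity formula.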

Achievable Rate and Converse
  • Construct a codebook with 2^(NR) codewords, i.e., NR bits in N channel uses (a toy random-coding experiment follows below)
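A toy random-coding experiment in the spirit of the achievability argument, added for illustration; the block length, rate, and SNR are arbitrary small values so it runs quickly, and minimum-distance decoding stands in for the typicality decoder.

```python
import numpy as np

rng = np.random.default_rng(2)

N = 20                       # block length (channel uses), kept small for speed
R = 0.25                     # rate in bits per channel use, below capacity here
P, sigma2 = 3.0, 1.0         # per-symbol power and noise variance (illustrative)
M = int(2 ** (N * R))        # number of codewords: 2^(NR) = 32

# i.i.d. Gaussian codebook, one row per codeword
codebook = rng.normal(0.0, np.sqrt(P), size=(M, N))

errors, trials = 0, 2000
for _ in range(trials):
    m = rng.integers(M)                                      # message index
    r = codebook[m] + rng.normal(0.0, np.sqrt(sigma2), N)    # AWGN channel
    m_hat = np.argmin(np.sum((codebook - r) ** 2, axis=1))   # nearest codeword
    errors += int(m_hat != m)

print("capacity per dimension:", 0.5 * np.log2(1 + P / sigma2))  # 1 bit
print("rate:", R, "estimated error probability:", errors / trials)
```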

Achieving Capacity
  • The elements of the information vector should be mutually independent and Gaussian distributed
  • The dimension N should be large
    • Complexity
  • Source has information to transmit
    • Full buffer
  • Channel is available
    • No contention for access
    • Point to point
Achieving Capacity
  • Accurate model
    • Statistical
      • Noise
    • Deterministic
      • Linear channel
  • Signal model at the receiver
      • Timing
      • Synchronization
Approaching Capacity
  • High SNR:
    • Coded modulation with large constellation size
    • Large constellation with binary codes
  • Low SNR:
    • Binary modulation
    • Turbo coding
    • LDPC coding
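One consequence of the energy-per-bit form of the capacity formula that is worth noting in the low-SNR regime (added here; it is the standard wideband limit, not stated on the slides): letting the spectral efficiency \(C/W \to 0\) in \(C/W = \log_2\!\big(1 + (E_b/N_0)(C/W)\big)\) gives

\[
\frac{E_b}{N_0} \ge \frac{2^{C/W} - 1}{C/W} \;\longrightarrow\; \ln 2 \approx -1.59\ \text{dB},
\]

the minimum energy per bit for reliable communication; well-designed binary modulation with turbo or LDPC codes operates close to this limit.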