
EE 561 Communication Theory Spring 2003


Presentation Transcript


  1. EE 561 Communication Theory Spring 2003 Instructor: Matthew Valenti Date: Jan. 17, 2003 Lecture #3: Random Processes

  2. Review/Preview • Last time: • Review of probability and random variables. • Random variables, CDF, pdf, expectation. • Pairs of RVs, random vectors, autocorrelation, covariance. • Uniform, Gaussian, Bernoulli, and binomial RVs. • This time: • Random processes. • Upcoming assignments: • HW #1 is due in 1 week. • Computer Assignment #1 will be posted soon.

  3. Random Variables vs. Random Processes • Random variables model unknown values. • Random variables are numbers. • Random processes model unknown signals. • Random processes are functions of time. • One interpretation: a random process is just a collection of random variables. • A random process evaluated at a specific time t is a random variable. • If X(t) is a random process, then X(1), X(1.5), and X(37.5) are all random variables.
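The "collection of random variables" view can be sketched numerically. This is a minimal illustration, assuming a random-phase sinusoid as the process (the specific process is my choice, not from the slide): each realization of the underlying outcome Θ gives one sample function, and evaluating all realizations at a fixed time gives the samples of one random variable.

```python
import numpy as np

# A random process X(t) = cos(2*pi*t + Theta) with a random phase Theta.
# Evaluating the process at a fixed time yields a random variable:
# each realization (sample function) contributes one number.
rng = np.random.default_rng(0)
n_realizations = 10_000
theta = rng.uniform(0.0, 2 * np.pi, size=n_realizations)  # one Theta per realization

def X(t, theta):
    """Sample function of the process, parameterized by the outcome theta."""
    return np.cos(2 * np.pi * t + theta)

# X(1), X(1.5), and X(37.5) are ordinary random variables -- here, arrays of samples.
x_at_1 = X(1.0, theta)
x_at_1p5 = X(1.5, theta)
x_at_37p5 = X(37.5, theta)
```

Each of the three arrays is a set of draws of one random variable; histogramming any of them would show its (identical) marginal density.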

  4. Random Variables • Random variables map the outcome of a random experiment to a number. • [Figure: sample space S = {heads, tails} mapped by X to the numbers 0 and 1.]

  5. Random Processes • Random processes map the outcome of a random experiment to a signal (function of time). • [Figure: sample space S = {heads, tails}; each outcome has an associated signal (a sample function), and the set of all sample functions is the ensemble.] • A random process evaluated at a particular time is a random variable.

  6. Random Process Terminology • The expected value, ensemble average, or mean of a random process is: m_x(t) = E[X(t)]. • The autocorrelation function (ACF) is: R_x(t1, t2) = E[X(t1) X(t2)]. • Autocorrelation is a measure of how alike the random process is from one time instant to another. • Autocovariance: C_x(t1, t2) = R_x(t1, t2) − m_x(t1) m_x(t2).
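These three ensemble statistics can be checked by Monte Carlo. A minimal sketch, assuming a toy process X(t) = A·cos(2πt) with A ~ N(1, 1) (the process and its parameters are illustrative choices, not from the slides): here E[A] = 1 and E[A²] = 2, so m_x(t) = cos(2πt), R_x(t1,t2) = 2 cos(2πt1) cos(2πt2), and C_x(t1,t2) = cos(2πt1) cos(2πt2).

```python
import numpy as np

# Estimate mean, autocorrelation, and autocovariance by averaging over realizations.
rng = np.random.default_rng(2)
A = rng.normal(loc=1.0, scale=1.0, size=500_000)  # one amplitude per realization

def X(t):
    return A * np.cos(2 * np.pi * t)  # one value per realization at time t

t1, t2 = 0.1, 0.3
m1, m2 = X(t1).mean(), X(t2).mean()          # estimates m_x(t1), m_x(t2)
R12 = (X(t1) * X(t2)).mean()                 # estimates R_x(t1, t2)
C12 = R12 - m1 * m2                          # autocovariance from the definition
theory_C12 = np.cos(2 * np.pi * t1) * np.cos(2 * np.pi * t2)
```

The estimated autocovariance matches the closed-form value to within Monte Carlo error, confirming the definition C_x = R_x − m_x(t1) m_x(t2).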

  7. Mean and Autocorrelation • Finding the mean and autocorrelation is not as hard as it might appear! • Why: because oftentimes a random process can be expressed as a function of a random variable. • We already know how to work with functions of random variables. • Example: X(t) = cos(2πft + Θ), where Θ is a random variable. • For each fixed t, this is just a function g() of Θ: X(t) = g(Θ). • We know how to find the expected value of a function of a random variable: E[g(Θ)] = ∫ g(θ) f_Θ(θ) dθ. • To find this you need to know the pdf of Θ.

  8. An Example • If Θ is uniform between 0 and 2π, then: m_x(t) = E[cos(2πft + Θ)] = (1/2π) ∫₀^{2π} cos(2πft + θ) dθ = 0, and R_x(t1, t2) = E[cos(2πft1 + Θ) cos(2πft2 + Θ)] = (1/2) cos(2πf(t1 − t2)).
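The slides' example appears to be the classic random-phase sinusoid. Assuming X(t) = cos(2πft + Θ) with Θ uniform on [0, 2π) (my reading of the stripped equations), a Monte Carlo sketch confirms the mean is zero and the ACF depends only on t1 − t2:

```python
import numpy as np

# Monte Carlo check: for Theta ~ Uniform[0, 2*pi), the random-phase sinusoid has
# m_x(t) = 0 and R_x(t1, t2) = (1/2) cos(2*pi*f*(t1 - t2)).
rng = np.random.default_rng(7)
f = 3.0
theta = rng.uniform(0.0, 2 * np.pi, size=300_000)

def X(t):
    return np.cos(2 * np.pi * f * t + theta)

t1, t2 = 0.45, 0.30
mean_est = X(t1).mean()                              # should be near 0
acf_est = (X(t1) * X(t2)).mean()                     # ensemble-average estimate
acf_theory = 0.5 * np.cos(2 * np.pi * f * (t1 - t2)) # closed form from slide 8
```

That the ACF depends only on t1 − t2 (and the mean is constant) is exactly what makes this process wide-sense stationary, foreshadowing the next slide.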

  9. Stationarity • A process is strict-sense stationary (SSS) if all its joint densities are invariant to a time shift: f(x(t1), …, x(tn)) = f(x(t1 + Δ), …, x(tn + Δ)) for all n and all shifts Δ. • In general, it is difficult to prove that a random process is strict-sense stationary. • A process is wide-sense stationary (WSS) if: • The mean is a constant: m_x(t) = m_x. • The autocorrelation is a function of the time difference only: R_x(t1, t2) = R_x(t1 − t2) = R_x(τ). • If a process is strict-sense stationary, then it is also wide-sense stationary.

  10. Properties of the Autocorrelation Function • If x(t) is wide-sense stationary, then its autocorrelation function has the following properties: • R_x(0) = E[x²(t)] (this is the second moment, i.e. the average power). • R_x(τ) = R_x(−τ) (even symmetry). • |R_x(τ)| ≤ R_x(0). • Examples: Which of the following are valid ACF's? [Candidate ACF plots omitted from the transcript.]

  11. Power Spectral Density • Power spectral density (PSD) is a measure of a random process's power content per unit frequency. • Denoted Φ_x(f). • Units of W/Hz. • Φ_x(f) is a nonnegative function. • For real-valued processes, Φ_x(f) is an even function. • The total power of the process is found by: P = ∫_{−∞}^{∞} Φ_x(f) df. • The power within a bandwidth B is found by integrating Φ_x(f) over that band.

  12. Wiener-Khintchine Theorem • We can easily find the PSD of a WSS random process. • Wiener-Khintchine theorem: If x(t) is a wide-sense stationary random process, then Φ_x(f) = F{R_x(τ)} = ∫_{−∞}^{∞} R_x(τ) e^{−j2πfτ} dτ. • i.e., the PSD is the Fourier transform of the ACF. • Example: Find the PSD of a WSS R.P. with autocorrelation: [equation omitted from the transcript].

  13. Example: [Worked solution omitted from the transcript.]
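Since the worked example did not survive the transcript, here is a numerical stand-in for the Wiener-Khintchine theorem, assuming the common textbook ACF R_x(τ) = e^{−a|τ|} (an assumption, not necessarily the slide's example). Its Fourier transform is known in closed form, Φ_x(f) = 2a / (a² + (2πf)²), so we can compare a direct numerical transform against it:

```python
import numpy as np

# Verify Phi_x(f) = integral of R_x(tau) * exp(-j*2*pi*f*tau) d(tau)
# for the assumed ACF R_x(tau) = exp(-a*|tau|).
a = 2.0
tau = np.linspace(-40.0, 40.0, 400_001)   # fine grid; R decays to ~e^-80 at the edges
dtau = tau[1] - tau[0]
R = np.exp(-a * np.abs(tau))

def psd_numeric(f):
    # Riemann-sum approximation of the Fourier transform of the ACF
    return np.real(np.sum(R * np.exp(-2j * np.pi * f * tau)) * dtau)

f_test = 0.5
phi_numeric = psd_numeric(f_test)
phi_exact = 2 * a / (a**2 + (2 * np.pi * f_test)**2)
```

The numerical transform of the ACF agrees with the closed-form PSD, which is the content of the theorem: no Fourier transform of the random signal itself is ever needed.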

  14. White Gaussian Noise • A process is Gaussian if any n samples placed into a vector form a Gaussian vector. • If a Gaussian process is WSS, then it is SSS. • A process is white if the following hold: • WSS. • Zero-mean, i.e. m_x(t) = 0. • Flat PSD, i.e. Φ_x(f) = constant. • A white Gaussian noise process: • Is Gaussian. • Is white. • The PSD is Φ_x(f) = N0/2. • N0/2 is called the two-sided noise spectral density. • Since it is WSS and Gaussian, it is also SSS.
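In discrete time, white Gaussian noise is just an i.i.d. Gaussian sequence, and the flat-PSD property shows up as an autocorrelation that is a delta at lag 0. A small sketch (the per-sample variance value is an assumed stand-in for N0/2):

```python
import numpy as np

# Discrete-time stand-in for white Gaussian noise: i.i.d. N(0, sigma^2) samples.
# "White" in the time domain: R[k] = sigma^2 at lag 0 and ~0 at every other lag.
rng = np.random.default_rng(3)
sigma2 = 2.0                  # plays the role of N0/2 per sample (assumed value)
n = 1_000_000
w = rng.normal(0.0, np.sqrt(sigma2), size=n)

r0 = np.mean(w * w)           # lag-0 autocorrelation  -> sigma^2
r5 = np.mean(w[:-5] * w[5:])  # lag-5 autocorrelation  -> 0
```

Because the samples are jointly Gaussian and the process is WSS, the sequence is also strict-sense stationary, mirroring the slide's last bullet.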

  15. Linear Systems • The output of a linear time-invariant (LTI) system is found by convolution: y(t) = h(t) * x(t). • However, if the input to the system is a random process, we can't find X(f) (sample functions are not, in general, Fourier transformable). • Solution: use power spectral densities: Φ_y(f) = |H(f)|² Φ_x(f). • This implies that the output of an LTI system is WSS if the input is WSS. [Block diagram: x(t) → h(t) → y(t).]
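The PSD relation can be sanity-checked in discrete time: pass white noise (flat PSD of height σ²) through an FIR filter h, so that integrating Φ_y(f) = |H(f)|² σ² over frequency gives, by Parseval, an output power of σ² Σ h[k]². This is an illustrative sketch; the filter taps are an arbitrary assumed example.

```python
import numpy as np

# Output power of filtered white noise should equal sigma^2 * sum(h^2),
# which is the frequency-domain prediction integral |H(f)|^2 * sigma^2 df.
rng = np.random.default_rng(4)
sigma2 = 1.0
x = rng.normal(0.0, np.sqrt(sigma2), size=2_000_000)  # white input
h = np.array([0.5, 0.3, 0.2])                          # assumed example FIR filter
y = np.convolve(x, h, mode="valid")                    # LTI filtering by convolution

power_measured = np.mean(y * y)
power_predicted = sigma2 * np.sum(h**2)
```

The measured output power matches the spectral prediction, even though no Fourier transform of the noise waveform itself was ever computed.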

  16. Example • A white Gaussian noise process with PSD Φ_x(f) = N0/2 = 10⁻⁵ W/Hz is passed through an ideal lowpass filter with cutoff at 1 kHz. • Compute the noise power at the filter output. • Answer: the output PSD is N0/2 over the two-sided band [−B, B] and zero elsewhere, so P = ∫_{−B}^{B} (N0/2) df = (N0/2)(2B) = 10⁻⁵ × 2000 = 0.02 W.
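The arithmetic for this example is short enough to spell out: an ideal lowpass filter with cutoff B passes the flat PSD over the two-sided band [−B, B], so the output power is (N0/2) × 2B.

```python
# Worked numbers for the lowpass-filtered white-noise example.
N0_over_2 = 1e-5        # two-sided noise spectral density, W/Hz
B = 1e3                 # lowpass cutoff, Hz
P = N0_over_2 * 2 * B   # integrate flat PSD over [-B, B]: 0.02 W = 20 mW
```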

  17. Ergodicity • A random process is said to be ergodic if it is ergodic in the mean and ergodic in correlation. • Time average operator: <g(x(t))> = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} g(x(t)) dt. • Ergodic in the mean: <x(t)> = m_x. • Ergodic in the correlation: <x(t) x(t + τ)> = R_x(τ). • In order for a random process to be ergodic, it must first be wide-sense stationary. • If a R.P. is ergodic, then we can compute power three different ways: • From any sample function: P = <x²(t)>. • From the autocorrelation: P = R_x(0). • From the power spectral density: P = ∫_{−∞}^{∞} Φ_x(f) df.
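Ergodicity can be illustrated with the random-phase sinusoid (assumed here, consistent with the earlier example, where the ensemble statistics are m_x = 0 and R_x(τ) = (1/2)cos(2πf0τ)): time averages over a single long sample function should reproduce those ensemble values.

```python
import numpy as np

# One sample function (one draw of Theta), long observation window.
rng = np.random.default_rng(5)
f0 = 5.0
theta = rng.uniform(0.0, 2 * np.pi)        # a single outcome -> one sample function
t = np.arange(0.0, 200.0, 1e-3)
x = np.cos(2 * np.pi * f0 * t + theta)

time_mean = x.mean()                        # <x(t)>  ->  m_x = 0
lag = 100                                   # tau = 0.1 s at 1 kHz sampling
time_acf = np.mean(x[:-lag] * x[lag:])      # <x(t) x(t+tau)>  ->  R_x(tau)
ensemble_acf = 0.5 * np.cos(2 * np.pi * f0 * lag * 1e-3)
```

Both time averages land on the ensemble values, so for this process the three ways of computing power (sample function, R_x(0), integrated PSD) all agree at 1/2 W.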

  18. Cross-correlation • If we have two random processes x(t) and y(t), we can define a cross-correlation function: R_xy(t1, t2) = E[x(t1) y(t2)]. • If x(t) and y(t) are jointly stationary, then the cross-correlation becomes a function of the time difference only: R_xy(τ) = E[x(t + τ) y(t)]. • If x(t) and y(t) are uncorrelated, then: R_xy(τ) = m_x m_y. • If x(t) and y(t) are independent, then they are also uncorrelated, and thus R_xy(τ) = m_x m_y.
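The independent-processes case is easy to check numerically. A toy discrete-time sketch with two independent i.i.d. Gaussian sequences (the means m_x = 2 and m_y = −1 are assumed values for illustration):

```python
import numpy as np

# For independent processes the cross-correlation factors into the product
# of the means: R_xy -> m_x * m_y.
rng = np.random.default_rng(6)
n = 1_000_000
x = 2.0 + rng.normal(size=n)    # m_x = 2
y = -1.0 + rng.normal(size=n)   # m_y = -1, generated independently of x

r_xy = np.mean(x * y)           # estimates R_xy; expect m_x * m_y = -2
```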

  19. Summary of Random Processes • A random process is a random function of time. • Or conversely, an indexed set of random variables. • A particular realization of a random process is called a sample function. • Furthermore, a random process evaluated at a particular point in time is a random variable. • A random process is ergodic in the mean if the time average of every sample function is the same as the expected value of the random process at any time.
