
ELEC 303 – Random Signals


Presentation Transcript


  1. ELEC 303 – Random Signals Lecture 19 – Random processes Dr. Farinaz Koushanfar ECE Dept., Rice University Nov 9, 2010

  2. Lecture outline • Basic concepts • Statistical averages • Autocorrelation function • Wide sense stationary (WSS) • Multiple random processes

  3. Random processes • A random process (RP) is an extension of a RV • Applied to random, time-varying signals • Example: “thermal noise” in circuits, caused by the random movement of electrons • An RP is a natural way to model information sources • An RP is a set of possible realizations of signal waveforms governed by probabilistic laws • An instance of an RP is a signal (not just a single number, as in the case of a RV)

  4. Example 1 • A signal generator generates six possible sinusoids with amplitude one and phase zero • We throw a die; if the outcome is F, the sinusoid frequency is 100F • Thus, each of the six possible signals is realized with equal probability • The random process is X(t) = cos(2π·100F·t)
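
A rough numerical sketch of this example (not part of the original slides; NumPy assumed, time grid chosen arbitrarily) that draws the die value F and forms the corresponding realization of X(t):

import numpy as np

rng = np.random.default_rng()

t = np.linspace(0, 0.05, 1000)         # illustrative time grid
F = rng.integers(1, 7)                 # die throw: F in {1, ..., 6}, equally likely
x = np.cos(2 * np.pi * 100 * F * t)    # realized sample function X(t) = cos(2*pi*100*F*t)

print(F, x[:5])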

  5. Example 2 • Randomly choose a phase Θ ~ U[0, 2π] • Generate a sinusoid with fixed amplitude (A) and fixed frequency (f0) but a random phase Θ • The RP is X(t) = A cos(2πf0t + Θ)

  6. X(t)= A cos(2f0t + )

  7. Example 3 • X(t)=X • Random variable X~U[-1,1]

  8. Random processes • Corresponding to each ωi in the sample space Ω, there is a signal x(t; ωi) called a sample function or a realization of the RP • For the different ωi's at a fixed time t0, the numbers x(t0; ωi) constitute a RV X(t0) • In other words, at any time instant, the value of a random process is a random variable

  9. Example: sample functions of a random process

  10. Example 4 • We throw a die; if the outcome is F, the sinusoid frequency is 100F • Thus, each of the six possible signals is realized with equal probability • The random process is X(t) = cos(2π·100F·t) • Determine the values of the RV X(0.001) • The possible values are cos(0.2π), cos(0.4π), …, cos(1.2π), each with probability 1/6
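
A quick check of this enumeration (illustrative, not from the slides; NumPy assumed), using X(0.001) = cos(2π·100F·0.001) = cos(0.2πF):

import numpy as np

# Enumerate the six equally likely outcomes of F and the value X(0.001) each one produces
for F in range(1, 7):
    print(F, np.cos(2 * np.pi * 100 * F * 0.001))   # equals cos(0.2*pi*F)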

  11. Example 5 • Ω is the sample space for throwing a die • For each ωi, let x(t; ωi) = ωi e^(−t) • Then X(1) is a RV taking the values e^(−1), 2e^(−1), …, 6e^(−1), each with probability 1/6

  12. Example 6 • Example of a discrete-time random process • Let ωn denote the outcome of the n-th independent drawing from N(0,1) • The discrete-time RP is {Xn}, n = 1, 2, …, with X0 = 0 and Xn = Xn−1 + ωn for all n ≥ 1
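
A minimal sketch of this discrete-time process, a Gaussian random walk (not from the slides; NumPy assumed, the number of steps is an arbitrary truncation):

import numpy as np

rng = np.random.default_rng()
n_max = 100                                 # arbitrary number of steps to simulate
w = rng.standard_normal(n_max)              # independent drawings from N(0,1)
X = np.concatenate(([0.0], np.cumsum(w)))   # X_0 = 0, X_n = X_{n-1} + w_n

print(X[:5])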

  13. Statistical averages • mX(t) is the mean of the random process X(t) • At each t = t0, it is the mean of the RV X(t0) • Thus, mX(t) = E[X(t)] for all t • The PDF of X(t0) is denoted by fX(t0)(x)
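
Written out explicitly (standard definition, consistent with the slide; LaTeX notation), the mean at time t is the expectation over the PDF of the RV X(t):

m_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x \, f_{X(t)}(x) \, dx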

  14. Mean of a random process

  15. Example 7 • Randomly choose a phase Θ ~ U[0, 2π] • Generate a sinusoid with fixed amplitude (A) and fixed frequency (f0) but a random phase Θ • The RP is X(t) = A cos(2πf0t + Θ) • We can compute the mean • For θ ∈ [0, 2π], fΘ(θ) = 1/(2π), and zero otherwise • E[X(t)] = ∫{0 to 2π} A cos(2πf0t + θ) · (1/(2π)) dθ = 0
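
A Monte Carlo spot check of this result (illustrative, not from the slides; NumPy assumed, A, f0 and the time instants chosen arbitrarily), averaging over many random phases:

import numpy as np

rng = np.random.default_rng()
A, f0 = 1.0, 5.0
theta = rng.uniform(0, 2 * np.pi, size=200_000)   # many draws of Theta ~ U[0, 2*pi]

for t in (0.0, 0.13, 0.31):                       # a few arbitrary time instants
    # sample mean of A*cos(2*pi*f0*t + Theta); each should be close to 0
    print(t, np.mean(A * np.cos(2 * np.pi * f0 * t + theta)))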

  16. Autocorrelation function • The autocorrelation function of the RP X(t) is denoted by RX(t1,t2)=E[X(t1)X(t2)] • RX(t1,t2) is a deterministic function of t1 and t2

  17. Example 8 • The autocorrelation of the RP in ex. 7 is RX(t1,t2) = (A²/2) cos(2πf0(t1 − t2)) • We have used cos(a)cos(b) = ½[cos(a − b) + cos(a + b)]
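
The intermediate steps, reconstructed from the product-to-sum identity above (LaTeX notation); the Θ-dependent term averages to zero because Θ ~ U[0, 2π]:

R_X(t_1, t_2) = E[ A\cos(2\pi f_0 t_1 + \Theta) \, A\cos(2\pi f_0 t_2 + \Theta) ]
              = \frac{A^2}{2} \, E[ \cos(2\pi f_0 (t_1 - t_2)) + \cos(2\pi f_0 (t_1 + t_2) + 2\Theta) ]
              = \frac{A^2}{2} \, \cos(2\pi f_0 (t_1 - t_2))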

  18. Example 9 • X(t)=X • Random variable X~U[-1,1] • Find the autocorrelation function
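
A short worked answer (reconstructed; it is not on the slide): since X(t1)X(t2) = X² for every t1 and t2,

R_X(t_1, t_2) = E[X(t_1) X(t_2)] = E[X^2] = \int_{-1}^{1} x^2 \cdot \frac{1}{2} \, dx = \frac{1}{3}

which does not depend on t1 and t2; together with mX(t) = E[X] = 0, this process is WSS.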

  19. Wide sense stationary process • A process is wide sense stationary (WSS) if its mean and autocorrelation do not depend on the choice of the time origin • WSS RP: the following two conditions hold • mX(t) = E[X(t)] is independent of t • RX(t1,t2) depends only on the time difference τ = t1 − t2 and not on t1 and t2 individually • From the definition, RX(t1,t2) = RX(t2,t1), so if the RP is WSS, then RX(τ) = RX(−τ)

  20. Example 8 (cont’d) • The autocorrelation of the RP in ex. 7 is RX(t1,t2) = (A²/2) cos(2πf0(t1 − t2)), which depends only on τ = t1 − t2 • Also, we saw that mX(t) = 0 • Thus, this process is WSS

  21. Example 10 • Randomly choose a phase Θ ~ U[0, π] • Generate a sinusoid with fixed amplitude (A) and fixed frequency (f0) but a random phase Θ • The new RP is Y(t) = A cos(2πf0t + Θ) • We can compute the mean • For θ ∈ [0, π], fΘ(θ) = 1/π, and zero otherwise • mY(t) = E[Y(t)] = ∫{0 to π} A cos(2πf0t + θ) · (1/π) dθ = −(2A/π) sin(2πf0t) • Since mY(t) is not independent of t, Y(t) is a nonstationary RP
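
The evaluation step behind the stated mean (reconstructed from the integral on the slide; LaTeX notation):

m_Y(t) = \frac{A}{\pi} \Big[ \sin(2\pi f_0 t + \theta) \Big]_{\theta=0}^{\theta=\pi}
       = \frac{A}{\pi} \big( \sin(2\pi f_0 t + \pi) - \sin(2\pi f_0 t) \big)
       = -\frac{2A}{\pi} \sin(2\pi f_0 t)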

  22. Multiple RPs • Two RPs X(t) and Y(t) are independent if for all t1 and t2, the RVs X(t1) and Y(t2) are independent • Similarly, X(t) and Y(t) are uncorrelated if for all t1 and t2, the RVs X(t1) and Y(t2) are uncorrelated • Recall that independence implies uncorrelatedness, but the reverse is not generally true • A notable exception is Gaussian processes (TBD next time), where the two are equivalent

  23. Cross correlation and joint stationarity • The cross correlation between two RPs X(t) and Y(t) is defined as RXY(t1,t2) = E[X(t1)Y(t2)]; clearly, RXY(t1,t2) = RYX(t2,t1) • Two RPs X(t) and Y(t) are jointly WSS if both are individually WSS and the cross correlation depends only on τ = t1 − t2 • For X and Y jointly WSS, RXY(τ) = RYX(−τ)
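
An illustrative Monte Carlo sketch (not from the slides; NumPy assumed, and the pair X, Y with A, f0 is a made-up example) using X(t) = A cos(2πf0t + Θ) and Y(t) = A sin(2πf0t + Θ) with a shared random phase; it estimates the cross correlation at a lag and at the reversed lag, showing that in general RXY(τ) ≠ RXY(−τ) even though RXY(t1,t2) = RYX(t2,t1) by definition:

import numpy as np

rng = np.random.default_rng()
A, f0 = 1.0, 5.0
theta = rng.uniform(0, 2 * np.pi, size=500_000)    # shared random phase Theta ~ U[0, 2*pi]

def X(t):
    return A * np.cos(2 * np.pi * f0 * t + theta)  # X(t) = A cos(2*pi*f0*t + Theta)

def Y(t):
    return A * np.sin(2 * np.pi * f0 * t + theta)  # Y(t) = A sin(2*pi*f0*t + Theta)

t1, t2 = 0.02, 0.05
Rxy = np.mean(X(t1) * Y(t2))          # estimate of RXY(t1, t2) = E[X(t1) Y(t2)]
Rxy_swapped = np.mean(X(t2) * Y(t1))  # RXY(t2, t1), i.e. the lag reversed

print(Rxy, Rxy_swapped)               # roughly +0.4 and -0.4 for these choices, so not equal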
