
Stochastic processes


Presentation Transcript


  1. Stochastic processes, Lecture 7: Linear time invariant systems

  2. Random process

  3. 1st order Distribution & density function • First-order distribution • First-order density function

  4. 2nd order Distribution & density function • 2nd order distribution • 2nd order density function

  5. EXPECTATIONS • Expected value • The autocorrelation

  6. Some random processes • Single pulse • Multiple pulses • Periodic Random Processes • The Gaussian Process • The Poisson Process • Bernoulli and Binomial Processes • The Random Walk and Wiener Processes • The Markov Process

  7. Recap: Power spectral density

  8. Power spectral density • Since the integral of the squared magnitude of the Fourier transform contains the full power of the signal, it is a density function. • So the power spectral density of a random process is defined as shown below. • Because of the squared magnitude, the PSD is always real.
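A minimal sketch of the definition the second bullet refers to, assuming the standard finite-observation-window form:

```latex
S_{XX}(f) \;=\; \lim_{T \to \infty} \frac{1}{2T}\,
\mathrm{E}\!\left[\;\left|\int_{-T}^{T} X(t)\, e^{-j 2\pi f t}\, \mathrm{d}t\right|^{2}\;\right]
```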

  9. Power spectral density • The PSD is a density function. • In the case of a random process the PSD is the density function of the process, and not necessarily the frequency spectrum of a single realization. • Example • A random process is defined as • Where ωr is a uniformly distributed random variable with a range from 0 to π • What is the PSD for the process, and • The power spectrum for a single realization
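A minimal simulation sketch of the distinction the example makes, assuming (as an illustration only) that the process is a unit-amplitude sinusoid X(t) = sin(ωr·t) with ωr uniform on [0, π] rad/s:

```python
# Hypothetical sketch: X(t) = sin(w_r * t), w_r ~ Uniform(0, pi) rad/s (assumed form).
import numpy as np

fs = 10.0                                  # sampling frequency [Hz]
t = np.arange(0, 100, 1 / fs)              # time axis [s]
rng = np.random.default_rng(0)

def periodogram(x):
    """|FFT|^2 / (N * fs): a basic PSD estimate of one realization."""
    return np.abs(np.fft.rfft(x)) ** 2 / (len(x) * fs)

# A single realization has one fixed frequency, so its spectrum is a single peak.
single = periodogram(np.sin(rng.uniform(0, np.pi) * t))

# Averaging over many realizations spreads the power over the whole 0..pi rad/s
# range, which is what the PSD of the *process* describes.
ensemble = np.mean([periodogram(np.sin(w * t)) for w in rng.uniform(0, np.pi, 500)],
                   axis=0)
freqs = np.fft.rfftfreq(len(t), 1 / fs)    # frequency axis [Hz]
```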

  10. Properties of the PSD • Sxx(f) is real and nonnegative • The average power in X(t) is given by the integral of Sxx(f) (see below) • If X(t) is real, Rxx(τ) and Sxx(f) are also even • If X(t) has periodic components, Sxx(f) has impulses • Independent of phase
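The average-power bullet refers to the standard relation:

```latex
\mathrm{E}\!\left[X^{2}(t)\right] \;=\; R_{XX}(0) \;=\; \int_{-\infty}^{\infty} S_{XX}(f)\, \mathrm{d}f
```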

  11. Wiener-Khinchin 1 • If X(t) is stationary in the wide sense, the PSD is the Fourier transform of the autocorrelation (see below)
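In the usual notation (with ordinary frequency f), the theorem and its inverse read:

```latex
S_{XX}(f) = \int_{-\infty}^{\infty} R_{XX}(\tau)\, e^{-j 2\pi f \tau}\, \mathrm{d}\tau
\qquad\Longleftrightarrow\qquad
R_{XX}(\tau) = \int_{-\infty}^{\infty} S_{XX}(f)\, e^{\,j 2\pi f \tau}\, \mathrm{d}f
```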

  12. Wiener-Khinchin: Two methods for estimation of the PSD • Method 1: X(t) → Fourier transform → |X(f)|² → Sxx(f) • Method 2: X(t) → Autocorrelation → Fourier transform → Sxx(f)
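A small numerical sketch of the two paths in the diagram, assuming a discrete-time implementation on a single white-noise realization:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 100.0                                   # sampling frequency [Hz]
x = rng.standard_normal(1024)                # example realization: white noise
N = len(x)

# Method 1: X(t) -> Fourier transform -> |X(f)|^2 -> Sxx(f)  (the periodogram)
Sxx_direct = np.abs(np.fft.fft(x)) ** 2 / (N * fs)
freqs_direct = np.fft.fftfreq(N, 1 / fs)

# Method 2: X(t) -> autocorrelation -> Fourier transform -> Sxx(f)
r = np.correlate(x, x, mode="full") / N      # biased autocorrelation, lags -(N-1)..N-1
Sxx_via_r = np.real(np.fft.fft(np.fft.ifftshift(r))) / fs
freqs_via_r = np.fft.fftfreq(len(r), 1 / fs)

# Apart from the different frequency grids, the two estimates agree,
# which is the content of the Wiener-Khinchin relation.
```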

  13. The inverse Fourier transform of the PSD • Since the PSD is the Fourier transform of the autocorrelation • The inverse Fourier transform of the PSD is the autocorrelation

  14. Cross spectral densities • If X(t) and Y(t) are two jointly wide-sense stationary processes, the cross spectral density is the Fourier transform of their cross-correlation (see below) • And likewise for Syx(f)
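A sketch of the usual definitions:

```latex
S_{XY}(f) = \int_{-\infty}^{\infty} R_{XY}(\tau)\, e^{-j 2\pi f \tau}\, \mathrm{d}\tau
\qquad\text{and}\qquad
S_{YX}(f) = \int_{-\infty}^{\infty} R_{YX}(\tau)\, e^{-j 2\pi f \tau}\, \mathrm{d}\tau
```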

  15. Properties of cross spectral densities • Since Rxy(τ) = Ryx(−τ), it follows that Sxy(f) = Syx(−f) = Syx*(f) for real processes • Syx(f) is not necessarily real • If X(t) and Y(t) are orthogonal, Sxy(f) = 0 • If X(t) and Y(t) are independent, Sxy(f) = E[X(t)] E[Y(t)] δ(f)

  16. Cross spectral densities example • A 1 Hz sine wave in white noise, where w(t) is Gaussian noise (see the sketch below)
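A minimal sketch of the example, assuming both channels contain the same 1 Hz sinusoid plus independent Gaussian noise (amplitudes and noise levels are illustrative assumptions):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
fs = 100.0                                       # sampling frequency [Hz]
t = np.arange(0, 60, 1 / fs)
s = np.sin(2 * np.pi * 1.0 * t)                  # common 1 Hz component

x = s + rng.standard_normal(len(t))              # channel 1: sinusoid + noise
y = s + rng.standard_normal(len(t))              # channel 2: sinusoid + noise

# Cross spectral density (Welch-style estimate): the peak at 1 Hz reflects the
# component the two signals share, while the independent noise averages out.
f, Pxy = signal.csd(x, y, fs=fs, nperseg=1024)
```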

  17. The periodogram: the estimate of the PSD • The PSD can be estimated from the autocorrelation • Or directly from the signal

  18. Bias in the estimates of the autocorrelation (figure: example with N = 12)
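A short sketch of why the common estimator is biased, using a record of N = 12 samples as on the slide (the signal itself is an arbitrary white-noise example):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(12)
N = len(x)
lags = np.arange(N)

# Biased estimate: divide every lag by N, even though only N - k products
# exist at lag k, so large-lag values are pulled towards zero.
r_biased = np.array([np.sum(x[:N - k] * x[k:]) / N for k in lags])

# Unbiased estimate: divide by the number of terms actually summed, N - k.
r_unbiased = np.array([np.sum(x[:N - k] * x[k:]) / (N - k) for k in lags])
```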

  19. Variance in the PSD • The variance of the periodogram is approximately equal to the square of the PSD

  20. Averaging • Divide the signal into K segments of length M • Calculate the periodogram of each segment • Calculate the average periodogram (see the sketch below)
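A minimal sketch of the averaging procedure (segment length M = 256 is an illustrative choice; scipy.signal.welch implements a windowed, overlapped version of the same idea):

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 100.0                                   # sampling frequency [Hz]
x = rng.standard_normal(4096)                # example signal
M = 256                                      # segment length
K = len(x) // M                              # number of segments

segments = x[:K * M].reshape(K, M)
periodograms = np.abs(np.fft.rfft(segments, axis=1)) ** 2 / (M * fs)
Sxx_avg = periodograms.mean(axis=0)          # averaging reduces the variance roughly as 1/K
freqs = np.fft.rfftfreq(M, 1 / fs)
```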

  21. Illustrations of Averaging

  22. PSD units • Typical units: • Electrical measurements: V²/Hz or dB V/Hz • Sound: Pa²/Hz or dB/Hz • How to calculate dB in a power spectrum: PSDdB(f) = 10 log10{ PSD(f) }

  23. Agenda (Lec. 7) • Recap: Linear time invariant systems • Stochastic signals and LTI systems • Mean value function • Mean square value • Cross correlation function between input and output • Autocorrelation function and spectrum of the output • Filter examples • Intro to system identification

  24. Focus: continuous signals and systems • Continuous signal: x(t) • Discrete signal: x[n]

  25. Systems

  26. Recap: Linear time invariant systems (LTI) • [Figure: y(t) versus x(t) for a linear system (y = x[n]) and nonlinear systems (y = 2·√x[n] and y = 20·log(x[n]))] • What is a linear system: • The system satisfies superposition (see below)
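Superposition written out in the usual form:

```latex
T\{a_{1}x_{1}(t) + a_{2}x_{2}(t)\} \;=\; a_{1}\,T\{x_{1}(t)\} + a_{2}\,T\{x_{2}(t)\}
```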

  27. Recap: Linear time invariant systems (LTI) • Time invariant: • A time invariant system is independent of explicit time • (The coefficients are independent of time) • That means if y1(t) = f[x1(t)], then y1(t + t0) = f[x1(t + t0)] • The system behaves the same today, tomorrow and in 1000 years • [Figure: a non-time-invariant example at 20 years, 45 years and 70 years]

  28. Examples • A linear system: y(t) = 3·x(t) • A nonlinear system: y(t) = 3·x(t)² • A time invariant system: y(t) = 3·x(t) • A time variant system: y(t) = 3t·x(t)

  29. The impulse response • The impulse response h(t) is the output of a system T{·} when a Dirac delta is the input

  30. Convolution • The output of an LTI system can be determined by convolving the input with the impulse response (see below)
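The convolution integral in its usual form:

```latex
y(t) \;=\; (h * x)(t) \;=\; \int_{-\infty}^{\infty} h(\alpha)\, x(t - \alpha)\, \mathrm{d}\alpha
```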

  31. Fourier transform of the impulse response • The transfer function (system function) is the Fourier transform of the impulse response • The impulse response can be determined from the transfer function with the inverse Fourier transform

  32. Fourier transform of LTI systems • Convolution in the time domain (y = h * x) corresponds to multiplication in the frequency domain (Y(f) = H(f)·X(f))
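A small numerical check of the convolution theorem for a discrete example (the short FIR impulse response and input values are arbitrary illustrations):

```python
import numpy as np

x = np.array([1.0, 2.0, 0.0, -1.0, 3.0, 1.0])    # input
h = np.array([0.5, 0.3, 0.2])                    # impulse response (assumed example)

y_time = np.convolve(x, h)                       # convolution in the time domain

L = len(x) + len(h) - 1                          # zero-pad both to the full output length
y_freq = np.real(np.fft.ifft(np.fft.fft(x, L) * np.fft.fft(h, L)))

assert np.allclose(y_time, y_freq)               # both paths give the same output
```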

  33. Causal systems • The output does not depend on future values of the input signal

  34. Stochastic signals and LTI systems • Estimation of the output from an LTI system when the input is a stochastic process • α is a delay variable, like τ

  35. Statistical estimates of the output • The specific distribution function fX(x,t) is difficult to estimate. Therefore we stick to • Mean • Autocorrelation • PSD • Mean square value.

  36. Expected value of Y(t) (1/2) • How do we estimate the mean of the output, if the mean of x(t) is defined as mx(t)?

  37. Expected value of Y(t) (2/2) • If x(t) is wide-sense stationary • Alternative estimate: at 0 Hz the transfer function is equal to the DC gain, therefore (see below):
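The chain of equalities behind the DC-gain argument, in the usual notation:

```latex
m_{Y} \;=\; \mathrm{E}[Y(t)] \;=\; m_{X} \int_{-\infty}^{\infty} h(\alpha)\, \mathrm{d}\alpha \;=\; m_{X}\, H(0)
```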

  38. Expected Mean square value (1/2)

  39. Expected mean square value (2/2) • By substitution: • If X(t) is WSS, the expected mean square value is thereby independent of time (see below)
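A sketch of the resulting expression in the usual notation, which depends only on the impulse response and the autocorrelation of the input, not on t:

```latex
\mathrm{E}\!\left[Y^{2}(t)\right] \;=\; \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} h(\alpha)\, h(\beta)\, R_{XX}(\alpha - \beta)\, \mathrm{d}\alpha\, \mathrm{d}\beta
```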

  40. Cross correlation function between input and output • Can we estimate the cross-correlation between input and output if X(t) is wide-sense stationary? • The cross-correlation is thereby the convolution of the autocorrelation of x(t) with the impulse response (see below)
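A sketch of the relation, assuming the convention Rxy(τ) = E[X(t)Y(t + τ)]:

```latex
R_{XY}(\tau) \;=\; \mathrm{E}\!\left[X(t)\,Y(t+\tau)\right] \;=\; \int_{-\infty}^{\infty} h(\alpha)\, R_{XX}(\tau - \alpha)\, \mathrm{d}\alpha \;=\; (h * R_{XX})(\tau)
```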

  41. Autocorrelation of the output (1/2) • The autocorrelation of Y(t) and Y(t + τ) is:

  42. Autocorrelation of the output (2/2) By substitution: α=-β Remember:

  43. Spectrum of output • Given the autocorrelation of the output • The power spectrum of the output is the input PSD multiplied by the squared magnitude of the transfer function (see below)
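Putting the previous slides together, in the usual notation:

```latex
R_{YY}(\tau) \;=\; h(\tau) * h(-\tau) * R_{XX}(\tau)
\qquad\Longrightarrow\qquad
S_{YY}(f) \;=\; |H(f)|^{2}\, S_{XX}(f)
```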

  44. Filter examples

  45. Typical LTI filters • FIR filters (Finite impulse response) • IIR filters (Infinite impulse response) • Butterworth • Chebyshev • Elliptic

  46. Ideal filters • Highpass filter • Band-stop filter • Bandpass filter

  47. Filter types and ripples

  48. Analog lowpass Butterworth filter • Is an "all-pole" filter • Squared magnitude of the frequency transfer function (see below) • N: filter order • fc: 3 dB cutoff frequency • Estimate PSD from filter
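The squared magnitude response in its usual form (unity DC gain assumed):

```latex
|H(f)|^{2} \;=\; \frac{1}{1 + \left( f / f_{c} \right)^{2N}}
```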

  49. Chebyshev filter type I • Transfer function (see below) • Where ε is related to the ripples in the passband • Where TN is an Nth-order Chebyshev polynomial
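The usual form of the type I squared magnitude response:

```latex
|H(f)|^{2} \;=\; \frac{1}{1 + \varepsilon^{2}\, T_{N}^{2}\!\left( f / f_{c} \right)}
```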

  50. Transformation of a lowpass filter to other types (the s-domain) • Quantities used in the transformations: old cutoff frequency, new cutoff frequency, lowest cutoff frequency, highest cutoff frequency
