
Linear Filters



  1. Linear Filters

  2. Let {(xt, yt) : t ∈ T} denote a bivariate time series with zero mean.

  3. Suppose that the time series {yt : t ∈ T} is constructed as follows: yt = Σs as xt-s. The time series {yt : t ∈ T} is then said to be constructed from {xt : t ∈ T} by means of a Linear Filter.
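For concreteness, here is a minimal sketch (not part of the original slides) that constructs {yt} from {xt} with a short, one-sided filter; the weights (0.5, 0.3, 0.2) and the simulated input are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean input series {x_t}
x = rng.standard_normal(500)

# Filter weights a_s for s = 0, 1, 2 (illustrative values only)
a = np.array([0.5, 0.3, 0.2])

# y_t = sum_s a_s x_{t-s}; np.convolve computes exactly this sum,
# truncated here to the length of x
y = np.convolve(x, a, mode="full")[: len(x)]
```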

  4. The autocovariance function of the filtered series is σyy(h) = Σs Σr as ar σxx(h - s + r).

  5. Thus the spectral density of the time series {yt : t ∈ T} is fyy(λ) = |A(λ)|² fxx(λ), where A(λ) = Σs as e^(-iλs).

  6. Comment A: A(λ) = Σs as e^(-iλs) is called the Transfer function of the linear filter. |A(λ)| is called the Gain of the filter, while φ(λ) = arg A(λ) is called the Phase Shift of the filter.
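Continuing the same illustrative example, the transfer function, gain, and phase shift can be evaluated numerically as sketched below (the filter weights are again assumed, not from the source).

```python
import numpy as np

# Illustrative filter weights a_s, s = 0, 1, 2
a = np.array([0.5, 0.3, 0.2])
s = np.arange(len(a))

# Frequencies lambda in (0, pi]
lam = np.linspace(0.01, np.pi, 200)

# Transfer function A(lambda) = sum_s a_s * exp(-i * lambda * s)
A = np.array([np.sum(a * np.exp(-1j * l * s)) for l in lam])

gain = np.abs(A)       # gain of the filter
phase = np.angle(A)    # phase shift of the filter
```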

  7. Also, the cross-covariance function of the two series is σxy(h) = Σs as σxx(h - s), where σxy(h) = Cov(yt+h, xt).

  8. Thus the cross spectrum of the bivariate time series is fxy(λ) = A(λ) fxx(λ).

  9. Definition: K²xy(λ) = |fxy(λ)|² / [fxx(λ) fyy(λ)] = the Squared Coherency function. Note: 0 ≤ K²xy(λ) ≤ 1.

  10. Comment B: K²xy(λ) = |fxy(λ)|² / [fxx(λ) fyy(λ)] = |A(λ)|² fxx(λ)² / [fxx(λ) · |A(λ)|² fxx(λ)] = 1 for all λ, the Squared Coherency function is identically one, if {yt : t ∈ T} is constructed from {xt : t ∈ T} by means of a linear filter.
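As a quick numerical check of Comment B (an illustrative sketch, not from the slides), the estimated squared coherency of a purely filtered series should be close to 1 at all frequencies; scipy.signal.coherence gives a Welch-type estimate of K²xy(λ).

```python
import numpy as np
from scipy.signal import lfilter, coherence

rng = np.random.default_rng(1)
x = rng.standard_normal(4096)

# y_t constructed from x_t by a linear filter (illustrative weights)
y = lfilter([0.5, 0.3, 0.2], [1.0], x)

# Welch-type estimate of the squared coherency; values should be close to 1
f, k2 = coherence(x, y, nperseg=256)
print(k2.min(), k2.max())
```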

  11. Linear Filters with additive noise at the output

  12. Let {(xt, yt) : t ∈ T} denote a bivariate time series with zero mean. Suppose that the time series {yt : t ∈ T} is constructed as follows: yt = Σs as xt-s + vt, t = ..., -2, -1, 0, 1, 2, .... The noise {vt : t ∈ T} is independent of the series {xt : t ∈ T} (and may be white noise).

  13. [Block diagram: xt → Linear Filter → output plus added noise nt → yt]

  14. The autocovariance function of the filtered series with added noise is σyy(h) = Σs Σr as ar σxx(h - s + r) + σvv(h).

  15. Continuing, the spectral density of the time series {yt : t ∈ T} is therefore fyy(λ) = |A(λ)|² fxx(λ) + fvv(λ).

  16. Also, since the noise is independent of {xt : t ∈ T}, the cross-covariance function is unchanged: σxy(h) = Σs as σxx(h - s).

  17. Thus the cross spectrum of the bivariate time series is again fxy(λ) = A(λ) fxx(λ).

  18. Thus the Squared Coherency function becomes K²xy(λ) = |A(λ)|² fxx(λ) / [|A(λ)|² fxx(λ) + fvv(λ)] = 1 / [1 + fvv(λ)/(|A(λ)|² fxx(λ))], where fvv(λ)/(|A(λ)|² fxx(λ)) is the Noise to Signal Ratio.
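The noise-to-signal formula can be checked numerically. The sketch below (illustrative values only: a white input series, white noise with standard deviation 0.5, and an assumed three-term filter) compares a Welch coherency estimate with 1/(1 + noise-to-signal ratio).

```python
import numpy as np
from scipy.signal import lfilter, coherence

rng = np.random.default_rng(3)
n = 8192
x = rng.standard_normal(n)              # white input series, variance 1
a = np.array([0.5, 0.3, 0.2])           # illustrative filter weights
sigma_v = 0.5                           # noise standard deviation (assumed)
y = lfilter(a, [1.0], x) + sigma_v * rng.standard_normal(n)

# Welch-type estimate of the squared coherency
f, k2_hat = coherence(x, y, nperseg=512)

# Theoretical coherency for white x (variance 1) and white noise v:
# |A(lambda)|^2 / (|A(lambda)|^2 + sigma_v^2) = 1 / (1 + noise-to-signal ratio)
lam = 2 * np.pi * f
s = np.arange(len(a))
gain2 = np.abs(np.array([np.sum(a * np.exp(-1j * l * s)) for l in lam])) ** 2
k2_theory = gain2 / (gain2 + sigma_v**2)

print(np.max(np.abs(k2_hat - k2_theory)))   # small, up to estimation error
```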

  19. Box-Jenkins Parametric Modelling of a Linear Filter

  20. Consider the Linear Filter of the time series {Xt : t ∈ T}: Yt = Σs≥0 as Xt-s = A(B)Xt, where B is the backshift operator (BXt = Xt-1) and A(B) = a0 + a1B + a2B² + ... = the Transfer function of the filter.

  21. {at : tT} is called the impulse response function of the filter since if X0 =1and Xt = 0 for t ≠ 0, then : for tT Xt at Linear Filter

  22. Also note that DYt = Yt - Yt-1 = A(B)Xt - A(B)Xt-1 = A(B)(Xt - Xt-1) = A(B)DXt.

  23. Hence {DYt} and {DXt} are related by the same Linear Filter. Definition: The Linear Filter is said to be stable if A(B) = a0 + a1B + a2B² + ... converges for all |B| ≤ 1.

  24. Discrete Dynamic Models:

  25. Many physical systems whose output is represented by Y(t) are modeled by a linear differential equation in Y(t) driven by a forcing function X(t).

  26. If X and Y are measured at discrete times, this equation can be replaced by a difference equation in which derivatives are replaced by differences, where D = I - B denotes the differencing operator (DYt = Yt - Yt-1).

  27. This equation can in turn be represented with the operator B as δ(B)Yt = ω(B)Xt-b, where δ(B) = 1 - δ1B - δ2B² - ... - δrB^r and ω(B) = ω0 + ω1B + ω2B² + ... + ωsB^s.

  28. This equation can also be written in the form of a Linear Filter as Yt = A(B)Xt, with A(B) = δ⁻¹(B)ω(B)B^b. Stability: it can easily be shown that this filter is stable if the roots of δ(x) = 0 lie outside the unit circle.
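A minimal sketch of the stability check, assuming δ(x) = 1 - δ1x - ... - δrx^r as above (the numerical δ values below are placeholders, not from the source):

```python
import numpy as np

def is_stable(delta):
    """delta = [d1, d2, ..., dr] from delta(x) = 1 - d1*x - ... - dr*x**r.
    Returns True if all roots of delta(x) = 0 lie outside the unit circle."""
    # Coefficients of delta(x) ordered from the constant term upward
    coefs = np.concatenate(([1.0], -np.asarray(delta, dtype=float)))
    roots = np.polynomial.polynomial.polyroots(coefs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stable([0.5]))        # delta(x) = 1 - 0.5x, root at x = 2 -> stable
print(is_stable([1.2, 0.3]))   # illustrative unstable case (a root inside the unit circle)
```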

  29. Linear filter of a time series. Suppose that the time series {yt : t ∈ T} is constructed as follows: yt = Σs as xt-s. The time series {yt : t ∈ T} is then said to be constructed from {xt : t ∈ T} by means of a Linear Filter.

  30. The spectral density of the filtered time series {yt : t ∈ T} is fyy(λ) = |A(λ)|² fxx(λ), where A(λ) = Σs as e^(-iλs) is called the Transfer function, |A(λ)| is called the Gain of the filter, and arg A(λ) is called the Phase Shift of the filter.

  31. Thus the cross spectrum of the bivariate time series is fxy(λ) = A(λ) fxx(λ), and the Squared Coherency function K²xy(λ) = |fxy(λ)|² / [fxx(λ) fyy(λ)] = 1 for all λ if {yt : t ∈ T} is constructed from {xt : t ∈ T} by means of a linear filter.

  32. Linear filter of a time series plus noise. Let {(xt, yt) : t ∈ T} denote a bivariate time series with zero mean. Suppose that the time series {yt : t ∈ T} is constructed as follows: yt = Σs as xt-s + vt, t = ..., -2, -1, 0, 1, 2, .... The noise {vt : t ∈ T} is independent of the series {xt : t ∈ T} (and may be white noise).

  33. Then fyy(λ) = |A(λ)|² fxx(λ) + fvv(λ), fxy(λ) = A(λ) fxx(λ), and K²xy(λ) = 1 / [1 + fvv(λ)/(|A(λ)|² fxx(λ))].

  34. The Box-Jenkins model of a Linear Filter: Yt = δ⁻¹(B)ω(B)B^b Xt = A(B)Xt. Stability: δ(x) and ω(x) are polynomials and A(x) is a power series; it can easily be shown that this filter is stable if the roots of δ(x) = 0 lie outside the unit circle.

  35. The sequence {at : tT} is called the impulse response function of the filter since if X0 =1and Xt = 0 for t ≠ 0, then : for tT Xt at Linear Filter

  36. Determining the Impulse Response function from the Parameters of the Filter:

  37. Now A(B) = δ⁻¹(B)ω(B)B^b, or δ(B)A(B) = ω(B)B^b. Hence (1 - δ1B - ... - δrB^r)(a0 + a1B + a2B² + ...) = (ω0 + ω1B + ... + ωsB^s)B^b.

  38. Equating coefficients results in the following conclusions: aj = 0 for j < b; aj - δ1aj-1 - δ2aj-2 - ... - δraj-r = ωj-b, or aj = δ1aj-1 + δ2aj-2 + ... + δraj-r + ωj-b, for b ≤ j ≤ b + s (the ω subscript is j - b); and aj - δ1aj-1 - δ2aj-2 - ... - δraj-r = 0, or aj = δ1aj-1 + δ2aj-2 + ... + δraj-r, for j > b + s.

  39. Thus the coefficients of the transfer function, a0, a1, a2, ..., satisfy the following properties: 1) b zeroes a0, a1, a2, ..., ab-1. 2) No pattern for the next s - r + 1 values ab, ab+1, ab+2, ..., ab+s-r. 3) The remaining values ab+s-r+1, ab+s-r+2, ab+s-r+3, ... follow the pattern of an rth order difference equation aj = δ1aj-1 + δ2aj-2 + ... + δraj-r.

  40. Example: r = 1, s = 2, b = 3, δ1 = δ. Then a0 = a1 = a2 = 0; a3 = δa2 + ω0 = ω0; a4 = δa3 + ω1 = δω0 + ω1; a5 = δa4 + ω2 = δ[δω0 + ω1] + ω2 = δ²ω0 + δω1 + ω2; and aj = δaj-1 for j ≥ 6.
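The recursion of slides 38 to 40 is easy to compute directly. The sketch below reproduces the r = 1, s = 2, b = 3 example with illustrative values δ = 0.6 and (ω0, ω1, ω2) = (1.0, 0.5, 0.25); these numbers are assumptions, not from the source.

```python
import numpy as np

def impulse_response(delta, omega, b, n):
    """First n weights a_0, ..., a_{n-1} of the transfer function
    A(B) = omega(B) B^b / delta(B), with
    delta(B) = 1 - d1 B - ... - dr B^r and omega(B) = w0 + w1 B + ... + ws B^s."""
    r, s = len(delta), len(omega) - 1
    a = np.zeros(n)
    for j in range(b, n):           # a_j = 0 for j < b
        # a_j = d1 a_{j-1} + ... + dr a_{j-r} (+ w_{j-b} while b <= j <= b+s)
        a[j] = sum(delta[i] * a[j - 1 - i] for i in range(r) if j - 1 - i >= 0)
        if j - b <= s:
            a[j] += omega[j - b]
    return a

# Example from slide 40: r = 1, s = 2, b = 3
print(impulse_response(delta=[0.6], omega=[1.0, 0.5, 0.25], b=3, n=8))
```

With these values the printed weights are 0, 0, 0, 1.0, 1.1, 0.91, 0.546, 0.3276, matching the closed-form expressions above.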

  41. [Figure: plot of the transfer function weights {at} for this example.]

  42. Identification of the Box-Jenkins Transfer Model with r=2

  43. Recall that the solution to the second order difference equation aj = δ1aj-1 + δ2aj-2 follows one of two patterns: 1) a mixture of exponentials if the roots of 1 - δ1x - δ2x² = 0 are real; 2) a damped cosine wave if the roots of 1 - δ1x - δ2x² = 0 are complex. These are the patterns of the Impulse Response function one looks for when identifying b, r and s.
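A small sketch (illustrative δ values only) of how the two patterns can be told apart from the roots of 1 - δ1x - δ2x² = 0:

```python
import numpy as np

def r2_pattern(d1, d2):
    """Classify the decay pattern of a_j = d1*a_{j-1} + d2*a_{j-2}."""
    # Roots of 1 - d1*x - d2*x^2 = 0 (coefficients ordered constant term first)
    roots = np.polynomial.polynomial.polyroots([1.0, -d1, -d2])
    if np.iscomplexobj(roots) and np.any(np.abs(roots.imag) > 1e-12):
        return "damped cosine wave (complex roots)"
    return "mixture of exponentials (real roots)"

print(r2_pattern(1.0, -0.5))   # complex roots -> damped cosine wave
print(r2_pattern(0.9, -0.2))   # real roots    -> mixture of exponentials
```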

  44. Estimation of the Impulse Response function, aj (without pre-whitening).

  45. Suppose that {Yt : tT} and {Xt : t T}are weakly stationary time series satisfying the following equation: Also assume that {Nt : tT} is a weakly stationary "noise" time series, uncorrelated with {Xt : tT}. Then

  46. Suppose that as = 0 for s > M. Then a0, a1, ..., aM can be found by solving the following equations: σXY(h) = a0σXX(h) + a1σXX(h - 1) + ... + aMσXX(h - M), for h = 0, 1, ..., M.

  47. If the Cross autocovariance function, σXY(h), and the Autocovariance function, σXX(h), are unknown, they can be replaced by their sample estimates CXY(h) and CXX(h), yielding estimates of the impulse response function.

  48. In matrix notation this set of linear equations can be written as a system whose (M + 1) × (M + 1) symmetric (Toeplitz) coefficient matrix has (h, s) entry σXX(h - s), with right-hand side (σXY(0), σXY(1), ..., σXY(M))′ and unknowns (a0, a1, ..., aM)′.

  49. If σXY(h) and σXX(h) are unknown, they can be replaced by their sample estimates CXY(h) and CXX(h) in this matrix equation, yielding estimates â0, â1, ..., âM of the impulse response function.
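A minimal sketch of the estimation procedure in slides 45 to 49, using simulated data and arbitrary "true" weights (all numerical values are illustrative assumptions): build the (M + 1) × (M + 1) system in the sample autocovariances CXX(h) and cross-covariances CXY(h) and solve for the estimated impulse response.

```python
import numpy as np
from scipy.linalg import toeplitz, solve

def sample_cov(u, v, h):
    """Sample cross-covariance C_uv(h) = (1/n) * sum_t (u_{t+h} - ubar)(v_t - vbar), h >= 0."""
    n = len(u)
    uc, vc = u - u.mean(), v - v.mean()
    return np.sum(uc[h:] * vc[: n - h]) / n

rng = np.random.default_rng(2)
x = rng.standard_normal(5000)
# Illustrative "true" filter: y_t = 1.0*x_t + 0.5*x_{t-1} + 0.25*x_{t-2} + noise
y = np.convolve(x, [1.0, 0.5, 0.25], mode="full")[: len(x)] + 0.3 * rng.standard_normal(len(x))

M = 5
cxx = np.array([sample_cov(x, x, h) for h in range(M + 1)])   # C_XX(0), ..., C_XX(M)
cxy = np.array([sample_cov(y, x, h) for h in range(M + 1)])   # C_XY(h) = est. Cov(Y_{t+h}, X_t)

# C_XY(h) = sum_{s=0}^{M} a_s C_XX(h - s), h = 0..M: symmetric Toeplitz system
a_hat = solve(toeplitz(cxx), cxy)
print(np.round(a_hat, 2))   # roughly [1.0, 0.5, 0.25, 0, 0, 0]
```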
