
R. Werner Solar Terrestrial Influences Institute - BAS



Presentation Transcript


  1. Time Series Analysis by means of inference statistical methods R. Werner Solar Terrestrial Influences Institute - BAS

  2. Inferential statistical analysis of the time series. Now: measures of the significance of extrapolated trends and of causal relations between two variables. Cross-section analysis: Y is a realization of a stochastic process; for example, the errors must have a determined probability distribution. Time series analysis: prognosis for y_{t+1}; on this basis the influences of exogenous parameters can be investigated.

  3. A model that describes probability structures is called a stochastic process. The model includes assumptions about the mechanisms generating the observed time series. A general assumption is stationarity (weak stationarity). 4a) Autocovariances with a lag greater than k are assumed to be zero → moving-average models. 4b) Autocovariances of higher order can be calculated from those of lower order → autoregressive models.

  4. Figure: wavelet transformation, MMNR*100 hab data.

  5. Autoregressive (AR) models of order p: z_t = φ1·z_{t−1} + φ2·z_{t−2} + … + φp·z_{t−p} + a_t, where the error term a_t is white noise.

  6. AR(1) process: z_t = φ1·z_{t−1} + a_t.
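
The AR(1) recursion is easy to verify numerically. A minimal sketch (the coefficient φ = 0.8, sample length, and seed are illustrative assumptions, not values from the slides):

```python
import numpy as np

# Simulate an AR(1) process z_t = phi*z_{t-1} + a_t with white-noise a_t.
# phi = 0.8 is an assumed illustrative value.
def simulate_ar1(phi, n, sigma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    a = rng.normal(0.0, sigma, n)      # white-noise shocks
    z = np.zeros(n)
    for t in range(1, n):
        z[t] = phi * z[t - 1] + a[t]
    return z

z = simulate_ar1(0.8, 5000)
# For a stationary AR(1), the lag-1 sample autocorrelation approaches phi.
r1 = np.corrcoef(z[:-1], z[1:])[0, 1]
```

For a long enough sample, r1 lies close to the assumed φ = 0.8.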

  7. Theoretical autocorrelation function (ACF) for AR(1): ρ_k = φ1^k.

  8. Yule-Walker equations. AR(1): ρ1 = φ1. AR(2): ρ1 = φ1 + φ2·ρ1, ρ2 = φ1·ρ1 + φ2. AR(p): ρ_k = φ1·ρ_{k−1} + φ2·ρ_{k−2} + … + φp·ρ_{k−p} for k = 1, …, p.
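
The Yule-Walker system is linear in the φ coefficients and can be solved directly once the autocorrelations are known. A sketch; the AR(2) values φ1 = 0.5, φ2 = 0.3 are assumed for the round-trip check:

```python
import numpy as np

# Solve the Yule-Walker equations for an AR(p) process:
# R * phi = r, with R the Toeplitz matrix of rho_0..rho_{p-1}
# and r the vector rho_1..rho_p.
def yule_walker(rho, p):
    R = np.array([[rho[abs(i - j)] for j in range(p)] for i in range(p)])
    r = np.array([rho[k] for k in range(1, p + 1)])
    return np.linalg.solve(R, r)

# AR(2) check with assumed phi1 = 0.5, phi2 = 0.3: the Yule-Walker
# equations give rho1 = phi1/(1 - phi2) and rho2 = phi1*rho1 + phi2.
phi1, phi2 = 0.5, 0.3
rho1 = phi1 / (1 - phi2)
rho2 = phi1 * rho1 + phi2
phi = yule_walker([1.0, rho1, rho2], 2)
```

Solving the system recovers the coefficients that generated the autocorrelations.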

  9. Figure: stationarity region of the AR(2) model in the (φ1, φ2) plane. Conditions of stationarity: φ1 + φ2 < 1, φ2 − φ1 < 1, |φ2| < 1. In the area under the parabola φ1² + 4φ2 = 0 the AR(2) model describes a quasi-cyclic process.
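
The triangle conditions and the quasi-cycle region can be checked programmatically: complex roots of the characteristic polynomial, and hence quasi-cyclic behaviour, occur when φ1² + 4φ2 < 0. A sketch using the φ1 = 1.7, φ2 = −0.95 pair that appears later on slide 13:

```python
# Classify an AR(2) model by its (phi1, phi2) parameters.
# Stationarity (triangle): phi1 + phi2 < 1, phi2 - phi1 < 1, |phi2| < 1.
# Quasi-cyclic: stationary with complex roots, i.e. phi1**2 + 4*phi2 < 0.
def ar2_character(phi1, phi2):
    stationary = (phi1 + phi2 < 1) and (phi2 - phi1 < 1) and (abs(phi2) < 1)
    quasi_cyclic = stationary and (phi1 ** 2 + 4 * phi2 < 0)
    return stationary, quasi_cyclic

s1, c1 = ar2_character(1.7, -0.95)   # slide-13 parameters: quasi-cycle
s2, c2 = ar2_character(0.5, 0.3)     # stationary with real roots
```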

  10. Figure: simulated realizations with z(1) = 1, z(2) = 2, plotted against time.

  11. Figure: simulated time series plotted against time.

  12. Model identification tool: the partial autocorrelation function (PACF), as known from cross-section statistics. Stepwise calculation of the coefficients from the Yule-Walker equations: k = 1: φ11 = ρ1; k = 2: φ22 = (ρ2 − ρ1²)/(1 − ρ1²). The theoretical PACF of an AR(p) process has values different from zero only for k = 1, 2, …, p!
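
The stepwise calculation can be sketched directly: the PACF value at lag k is the last coefficient φ_kk of the AR(k) fit obtained from the first k Yule-Walker equations. The AR(1) example with φ = 0.6 is an assumption for the check:

```python
import numpy as np

# PACF from a given ACF: for each k, solve the k-th order Yule-Walker
# system and keep the last coefficient phi_kk.
def pacf_from_acf(rho, kmax):
    pacf = []
    for k in range(1, kmax + 1):
        R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])
        r = np.array(rho[1:k + 1])
        phi = np.linalg.solve(R, r)
        pacf.append(phi[-1])           # phi_kk = PACF at lag k
    return np.array(pacf)

# For an AR(1) with phi = 0.6, rho_k = 0.6**k; the PACF is 0.6 at lag 1
# and (theoretically) zero for all higher lags.
rho = [0.6 ** k for k in range(6)]
pacf = pacf_from_acf(rho, 5)
```

This reproduces the cut-off property stated on the slide: non-zero values only up to the AR order.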

  13. Theoretical autocorrelation function (ACF) and partial autocorrelation function (PACF) for an AR(2) process with φ1 = 1.7, φ2 = −0.95.

  14. Yule 1927

  15. Parameter estimation; residuals.

  16. Distribution of the residuals; autocorrelation function of the residuals.

  17. Moving-average (MA) models. AR models describe processes as a function of past z values; however, as was shown for the AR(2) process z_t = 1.7·z_{t−1} − 0.95·z_{t−2} + a_t, the process is driven by the noise a_t (with a theoretically infinite influence of the shocks). The idea now is, as for the AR process, to describe z_t by a finite series of the a_t with time lags, keeping the number of process parameters minimal.

  18. Autocorrelation function for an MA(1) process z_t = a_t − θ1·a_{t−1}: ρ1 = −θ1/(1 + θ1²), ρ_k = 0 for k ≥ 2. For an MA(2) process: ρ1 = (−θ1 + θ1·θ2)/(1 + θ1² + θ2²), ρ2 = −θ2/(1 + θ1² + θ2²), ρ_k = 0 for k ≥ 3.
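
The cut-off of the MA autocorrelation function after lag q can be confirmed by simulation. A sketch for MA(1) with the convention z_t = a_t − θ·a_{t−1}; the value θ = 0.5 and the sample size are assumptions:

```python
import numpy as np

# Theoretical lag-1 autocorrelation of an MA(1): rho1 = -theta/(1+theta**2);
# all higher lags are zero. theta = 0.5 is an assumed value.
theta = 0.5
rho1_theory = -theta / (1.0 + theta ** 2)

rng = np.random.default_rng(1)
a = rng.normal(size=200_000)
z = a[1:] - theta * a[:-1]             # MA(1) sample path
r1 = np.corrcoef(z[:-1], z[1:])[0, 1]  # empirical lag-1 autocorrelation
r2 = np.corrcoef(z[:-2], z[2:])[0, 1]  # empirical lag-2: should be near zero
```

The empirical r1 matches −θ/(1 + θ²) and r2 is indistinguishable from zero, i.e. the ACF cuts off after lag 1.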

  19. Figure: invertibility region of the MA(2) model in the (θ1, θ2) plane. Invertibility conditions: θ1 + θ2 < 1, θ2 − θ1 < 1, |θ2| < 1. In the area under the parabola the MA(2) model describes a quasi-cyclic process. The empirical ACF is a tool for identification of the MA order. PACF?

  20. Invertibility condition: for an MA(1) process we have |θ1| < 1.

  21. The MA(1) process can be represented by an AR(∞) process. In general, an MA(q) process can be represented by an AR(∞) process, and an AR(p) process can be represented by an MA(∞) process. Box-Jenkins principle: models with the minimum number of parameters have to be used.
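
For an invertible MA(1), z_t = a_t − θ·a_{t−1}, the AR(∞) weights are π_k = −θ^k, so a truncated expansion recovers the shocks from past observations. A sketch with an assumed θ = 0.4:

```python
import numpy as np

# AR(inf) representation of an invertible MA(1):
# z_t = -sum_k theta**k * z_{t-k} + a_t, valid for |theta| < 1.
theta = 0.4                                 # assumed MA(1) parameter
pi = [-theta ** k for k in range(1, 30)]    # truncated AR(inf) weights

rng = np.random.default_rng(2)
a = rng.normal(size=1000)
z = a[1:] - theta * a[:-1]                  # MA(1) path; shock of z[i] is a[i+1]

# Reconstruct the shock driving z[t] from past z via the truncated AR form.
t = 500
a_hat = z[t] - sum(pi[k - 1] * z[t - k] for k in range(1, 30))
```

Because |θ| < 1 the weights decay geometrically, so the truncation error after 29 lags is negligible and a_hat coincides with the true shock.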

  22. Other models: ARMA: mixed model of AR and MA; ARIMA: autoregressive integrated moving-average model, which uses differences of the time series; SARIMA: seasonal ARIMA model with a constant seasonal figure; VARMA: vector ARMA.

  23. Forecast with an AR(1) model: ẑ_t(l) = φ1^l·z_t for lead time l.
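
For a zero-mean AR(1), the minimum-MSE l-step forecast from origin t is ẑ_t(l) = φ^l·z_t, decaying geometrically toward the mean. A minimal sketch; z_t = 2.0 and φ = 0.5 are assumed values:

```python
# l-step-ahead forecasts of a zero-mean AR(1) from origin value z_t:
# z_hat(l) = phi**l * z_t for l = 1..horizon.
def ar1_forecast(z_t, phi, horizon):
    return [phi ** l * z_t for l in range(1, horizon + 1)]

fc = ar1_forecast(2.0, 0.5, 3)   # forecasts for lead times l = 1, 2, 3
```

Each additional step shrinks the forecast by the factor φ, illustrating why AR forecasts revert to the process mean.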

  24. MA(1): it can be shown that ẑ_t(1) = −θ1·a_t and ẑ_t(l) = 0 for l > 1. The MA models are not useful for prognosis.

  25. Forecast of the SSN by an AR(9) model.

  26. Dynamical regression. Ordinary linear regression: Y_i = α + β·X_i + ε_i. X_i may be transformed; Y_i is normally distributed; X_i need not be stochastic (it can, for example, be fixed). α and β can be optimally estimated by ordinary least squares (OLS) under the assumptions: • E(ε_i) = 0 • ε_i is not autocorrelated: Cov(ε_i, ε_j) = 0 for i ≠ j • ε_i is normally distributed • equilibrium conditions.

  27. For time series this can formally be written (i → t): Y_t = α + β·X_t + ε_t. • The assumption of equilibrium is not necessary. • However: in time series the error term is often autocorrelated. • The estimates are then not efficient (they do not have minimal variance). • Autocorrelations of X_t can be transferred to ε; autocorrelations of ε produce deviations of σ_ε from the true value, and this in turn implies an incorrect value of σ_β (γ: autocorrelation of the residuals; λ: autocorrelation of the predictors).

  28. Simple lag model (models dynamical in X): Y_t = α + β·X_{t−1} + ε_t. Distributed lag model: the influence is distributed over k lags, for example for k = 2: Y_t = α + β0·X_t + β1·X_{t−1} + β2·X_{t−2} + ε_t. The statistical interpretation of the individual β does not make sense.

  29. Therefore more restrictions are needed. For the model where the influence decreases exponentially with k, β_k = β0·δ^k with 0 < δ < 1, the model has only three parameters: α, β0, δ.

  30. How to determine the parameters? Koyck transformation: Y_t = α·(1 − δ) + β0·X_t + δ·Y_{t−1} + u_t, where u_t = ε_t − δ·ε_{t−1}. Using OLS, δ and β0 can be estimated, and after this β_k = β0·δ^k.
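
The Koyck estimation step amounts to an OLS regression of Y_t on X_t and Y_{t−1}. A sketch: the data are simulated from the exact (noise-free) Koyck recursion, with α = 1, β0 = 2, δ = 0.6 as assumed true values for the check:

```python
import numpy as np

# Simulate data satisfying the Koyck-transformed equation
# Y_t = alpha*(1-delta) + beta0*X_t + delta*Y_{t-1}  (noise omitted so the
# OLS recovery is exact). alpha, beta0, delta are assumed true values.
rng = np.random.default_rng(3)
n, alpha, beta0, delta = 5000, 1.0, 2.0, 0.6
x = rng.normal(size=n)

y = np.zeros(n)
y[0] = alpha
for t in range(1, n):
    y[t] = alpha * (1 - delta) + beta0 * x[t] + delta * y[t - 1]

# OLS on the transformed equation: regress Y_t on a constant, X_t, Y_{t-1}.
X = np.column_stack([np.ones(n - 1), x[1:], y[:-1]])
c_hat, beta0_hat, delta_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]

# Implied distributed-lag weights beta_k = beta0 * delta**k.
beta_k = beta0_hat * delta_hat ** np.arange(3)
```

The coefficient on Y_{t−1} estimates δ, the one on X_t estimates β0, and the whole lag distribution follows from just these two numbers, as the slide states.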

  31. Similar models: adaptive expectation model; partial adjustment model; models with two or more input variables.

  32. Model with autocorrelated error term. Recall that ε_t in linear regression has to be N(0, σ). Here ε_t is an AR(1) process: ε_t = γ·ε_{t−1} + u_t. Estimation of the regression coefficients by the Cochrane-Orcutt method: 1. Estimate α and β by OLS, calculate the residuals e_t, and estimate the autocorrelation coefficient γ.

  33. 2. New regression equation: Y_t − γ·Y_{t−1} = α·(1 − γ) + β·(X_t − γ·X_{t−1}) + u_t. Note: to test whether ε_t is autocorrelated, the Durbin-Watson test can be applied.
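
One iteration of the Cochrane-Orcutt procedure can be sketched as follows; the simulated data with α = 1, β = 2 and AR(1) errors with γ = 0.7 are assumptions for the check (the variable `rho` below plays the role of γ):

```python
import numpy as np

# One Cochrane-Orcutt iteration: (1) OLS of y on x, (2) estimate the AR(1)
# coefficient of the residuals, (3) OLS on the quasi-differenced data.
def cochrane_orcutt(y, x):
    X = np.column_stack([np.ones_like(x), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]       # step 1: plain OLS
    e = y - X @ b                                   # residuals e_t
    rho = (e[:-1] @ e[1:]) / (e[:-1] @ e[:-1])      # step 2: AR(1) coeff. (gamma)
    ys = y[1:] - rho * y[:-1]                       # step 3: quasi-differences
    xs = x[1:] - rho * x[:-1]
    Xs = np.column_stack([np.ones_like(xs), xs])
    a_star, beta = np.linalg.lstsq(Xs, ys, rcond=None)[0]
    return a_star / (1 - rho), beta, rho            # intercept is alpha*(1-rho)

# Simulated example with assumed alpha = 1, beta = 2, gamma = 0.7.
rng = np.random.default_rng(4)
n = 20_000
x = rng.normal(size=n)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.7 * eps[t - 1] + rng.normal()
alpha_hat, beta_hat, rho_hat = cochrane_orcutt(1.0 + 2.0 * x + eps, x)
```

The quasi-differenced regression has an (approximately) white error term u_t, so the second-stage OLS estimates are efficient again.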

  34. Figure: linear trend of the residuals, e = 0.0626·t − 121.35.

  35. Autocorrelation function of the detrended residuals.

  36. Partial autocorrelation function of the detrended residuals.

  37. Acknowledgement. I want to thank the Ministry of Education and Science for supporting this work under contract DVU01/0120.
