Time Series Models
    Presentation Transcript
    1. Time Series Models

    2. Topics • Stochastic processes • Stationarity • White noise • Random walk • Moving average processes • Autoregressive processes • More general processes

    3. Stochastic Processes

    4. Stochastic processes • Time series are an example of a stochastic or random process • A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic laws' • Mathematically, a stochastic process is an indexed collection of random variables {Xt, t ∈ T}

    5. Stochastic processes • We are concerned only with processes indexed by time, either discrete-time processes such as {Xt, t = 0, 1, 2, ...} or continuous-time processes such as {X(t), t ≥ 0}

    6. Inference • We usually base our inference on a single observation or realization of the process, either over some period of time, say [0, T] (a continuous interval of time), or at a sequence of time points {0, 1, 2, ..., T}

    7. Specification of a process • To describe a stochastic process fully, we must specify all the finite-dimensional distributions, i.e. the joint distribution of the random variables (Xt1, Xt2, ..., Xtn) for any finite set of times t1, t2, ..., tn

    8. Specification of a process • A simpler approach is to specify only the moments; this is sufficient if all the joint distributions are normal • The mean and variance functions are given by μ(t) = E(Xt) and σ²(t) = Var(Xt)

    9. Autocovariance • Because the random variables comprising the process are not independent, we must also specify their covariance • The autocovariance function is γ(t1, t2) = Cov(Xt1, Xt2) = E[(Xt1 − μ(t1))(Xt2 − μ(t2))]

    10. Stationarity

    11. Stationarity • Inference is easiest when a process is stationary, i.e. its distribution does not change over time; this is strict stationarity • A process is weakly stationary if its mean and autocovariance functions do not change over time

    12. Weak stationarity • The autocovariance then depends only on the time difference or lag between the two time points involved, so we can write γ(k) = Cov(Xt, Xt+k) for all t

    13. Autocorrelation • It is useful to standardize the autocovariance function (acvf) • Consider the stationary case only • Use the autocorrelation function (acf) ρ(k) = γ(k)/γ(0)

    14. Autocorrelation • More than one process can have the same acf • Its properties are: ρ(0) = 1, |ρ(k)| ≤ 1, and ρ(k) = ρ(−k)

    15. White Noise

    16. White noise • This is a purely random process, a sequence of independent and identically distributed random variables • Has constant mean and variance • Also γ(k) = 0 for all k ≠ 0, so ρ(k) = 0 for k ≠ 0
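
    A quick numpy sketch (not part of the original slides) simulating Gaussian white noise and checking that the sample mean, variance, and acf behave as stated; the seed, sample size, and lags are arbitrary choices:

        import numpy as np

        rng = np.random.default_rng(0)
        z = rng.normal(loc=0.0, scale=1.0, size=1000)   # i.i.d. N(0, 1) draws

        # Constant mean and variance: sample values close to 0 and 1
        print(z.mean(), z.var())

        def sample_acf(x, k):
            # sample autocorrelation at lag k
            x = x - x.mean()
            return (x[:-k] * x[k:]).sum() / (x * x).sum()

        # rho(k) should be near zero for k != 0
        for k in (1, 2, 5):
            print(k, sample_acf(z, k))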

    17. Random Walk

    18. Random walk • Start with {Zt} being white noise or purely random • {Xt} is a random walk if Xt = Xt-1 + Zt (starting from X0 = 0, say)

    19. Random walk • The random walk is not stationary: E(Xt) = 0 but Var(Xt) = tσZ² grows with t • First differences ∇Xt = Xt − Xt-1 = Zt are stationary
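
    A short sketch (again not from the slides) showing the growing variance of a simulated random walk and that differencing recovers the white noise; all parameter choices are illustrative:

        import numpy as np

        rng = np.random.default_rng(1)
        z = rng.normal(size=1000)   # white noise {Z_t}
        x = np.cumsum(z)            # random walk X_t = X_{t-1} + Z_t, X_0 = 0
        w = np.diff(x)              # first differences

        # Var(X_t) = t * sigma_Z^2 grows with t: compare early vs late segments
        print(x[:200].var(), x[-200:].var())

        # The differenced series is exactly the driving white noise again
        print(np.allclose(w, z[1:]))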

    20. Moving Average Processes

    21. Moving average processes • Start with {Zt} being white noise or purely random, mean zero, s.d. σZ • {Xt} is a moving average process of order q (written MA(q)) if, for some constants β0, β1, ..., βq, we have Xt = β0Zt + β1Zt-1 + ... + βqZt-q • Usually β0 = 1

    22. Moving average processes • The mean and variance are given by E(Xt) = 0 and Var(Xt) = σZ²(β0² + β1² + ... + βq²) • The process is weakly stationary because the mean is constant and the covariance does not depend on t

    23. Moving average processes • If the Zt's are normal then so is the process, and it is then strictly stationary • The autocorrelation is ρ(k) = (Σi=0..q−k βiβi+k) / (Σi=0..q βi²) for k = 0, 1, ..., q, and ρ(k) = 0 for k > q

    24. Moving average processes • Note the autocorrelation cuts off at lag q • For the MA(1) process with β0 = 1: ρ(0) = 1, ρ(1) = β1/(1 + β1²), and ρ(k) = 0 for k ≥ 2
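
    A numpy sketch (not in the original slides) simulating an MA(1) with an illustrative β1 = 0.6 and checking the cut-off of the acf after lag 1:

        import numpy as np

        rng = np.random.default_rng(2)
        beta = 0.6
        z = rng.normal(size=100_000)
        x = z[1:] + beta * z[:-1]   # MA(1): X_t = Z_t + beta * Z_{t-1}

        def sample_acf(x, k):
            x = x - x.mean()
            return (x[:-k] * x[k:]).sum() / (x * x).sum()

        print(sample_acf(x, 1), beta / (1 + beta**2))  # ~ theoretical rho(1)
        print(sample_acf(x, 2), sample_acf(x, 3))      # near zero: cut-off at lag q = 1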

    25. Moving average processes • In order to ensure there is a unique MA process for a given acf, we impose the condition of invertibility • This ensures that when the process is written in series form, the series converges • For the MA(1) process Xt = Zt + βZt-1, the condition is |β| < 1

    26. Moving average processes • For general processes introduce the backward shift operator B, defined by BXt = Xt-1 • Then the MA(q) process is given by Xt = (β0 + β1B + ... + βqB^q)Zt = β(B)Zt

    27. Moving average processes • The general condition for invertibility is that all the roots of the equation β(B) = 0 lie outside the unit circle (have modulus greater than one)

    28. Autoregressive Processes

    29. Autoregressive processes • Assume {Zt} is purely random with mean zero and s.d. σZ • Then the autoregressive process of order p or AR(p) process is Xt = α1Xt-1 + α2Xt-2 + ... + αpXt-p + Zt

    30. Autoregressive processes • The first order autoregression is Xt = αXt-1 + Zt • Provided |α| < 1 it may be written as an infinite order MA process • Using the backshift operator we have (1 − αB)Xt = Zt

    31. Autoregressive processes • From the previous equation we have Xt = (1 − αB)⁻¹Zt = (1 + αB + α²B² + ...)Zt = Zt + αZt-1 + α²Zt-2 + ...

    32. Autoregressive processes • Then E(Xt) = 0, and if |α| < 1, Var(Xt) = σZ²(1 + α² + α⁴ + ...) = σZ²/(1 − α²)
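
    A sketch (not from the slides) checking the AR(1) mean and variance formulas by simulation, with α = 0.7 and σZ = 1 chosen arbitrarily:

        import numpy as np

        rng = np.random.default_rng(3)
        alpha, n = 0.7, 100_000
        z = rng.normal(size=n)      # sigma_Z = 1
        x = np.empty(n)
        x[0] = z[0]
        for t in range(1, n):
            x[t] = alpha * x[t - 1] + z[t]   # X_t = alpha * X_{t-1} + Z_t

        # E(X_t) ~ 0 and Var(X_t) ~ 1 / (1 - alpha^2)
        print(x.mean(), x.var(), 1 / (1 - alpha**2))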

    33. Autoregressive processes • The AR(p) process can be written as (1 − α1B − α2B² − ... − αpB^p)Xt = Zt, or φ(B)Xt = Zt

    34. Autoregressive processes • This is Xt = φ(B)⁻¹Zt = (1 + β1B + β2B² + ...)Zt for some β1, β2, ... • This gives Xt as an infinite MA process, so it has mean zero

    35. Autoregressive processes • Conditions are needed to ensure that the various series converge, and hence that the variance exists and the autocovariance can be defined • Essentially these are requirements that the βi become small quickly enough for large i

    36. The i may not be able to be found however. The alternative is to work with the i The acf is expressible in terms of the roots i, i=1,2, ...p of the auxiliary equation Autoregressive processes

    37. Autoregressive processes • Then a necessary and sufficient condition for stationarity is that for every i, |λi| < 1 • An equivalent way of expressing this is that the roots of the equation φ(B) = 1 − α1B − ... − αpB^p = 0 must lie outside the unit circle
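
    The root criterion is easy to check numerically; a sketch (not from the slides) for an illustrative AR(2), using numpy's polynomial root finder (the same check applied to β(B) = 0 tests MA invertibility):

        import numpy as np

        # AR(2): X_t = 0.5 X_{t-1} + 0.3 X_{t-2} + Z_t,
        # so phi(B) = 1 - 0.5 B - 0.3 B^2.
        # np.roots takes coefficients from the highest power down.
        roots = np.roots([-0.3, -0.5, 1.0])
        print(roots, np.all(np.abs(roots) > 1))   # all outside unit circle: stationary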

    38. ARMA processes • Combine AR and MA processes • An ARMA process of order (p, q) is given by Xt = α1Xt-1 + ... + αpXt-p + Zt + β1Zt-1 + ... + βqZt-q

    39. ARMA processes • Alternative expressions are possible using the backshift operator: φ(B)Xt = θ(B)Zt, where φ(B) = 1 − α1B − ... − αpB^p and θ(B) = 1 + β1B + ... + βqB^q
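
    A sketch (not from the slides) of the same lag-polynomial form in statsmodels, whose ArmaProcess helper takes φ(B) and θ(B) directly; the coefficient values are illustrative:

        from statsmodels.tsa.arima_process import ArmaProcess

        # ARMA(1, 1): (1 - 0.7B) X_t = (1 + 0.4B) Z_t,
        # passed as the two lag polynomials with the zero-lag term first
        proc = ArmaProcess(ar=[1, -0.7], ma=[1, 0.4])
        print(proc.isstationary, proc.isinvertible)   # both True for these values
        x = proc.generate_sample(nsample=500)         # one simulated realization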

    40. ARMA processes • An ARMA process can be written in pure MA or pure AR form, the operators possibly being of infinite order • Usually the mixed form requires fewer parameters

    41. ARIMA processes • General autoregressive integrated moving average processes are called ARIMA processes • When differenced, say d times, the process is an ARMA process • Call the differenced process Wt = (1 − B)^d Xt; then Wt is an ARMA process and φ(B)Wt = θ(B)Zt

    42. ARIMA processes • Alternatively specify the process as φ(B)(1 − B)^d Xt = θ(B)Zt • This is an ARIMA process of order (p, d, q)

    43. ARIMA processes • The model for Xt is non-stationary because the AR operator on the left hand side has d roots on the unit circle • d is often 1 • The random walk is ARIMA(0, 1, 0) • Can include seasonal terms (see later)
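
    A sketch (not from the slides) fitting an ARIMA model with statsmodels; the random-walk data and the chosen (1, 1, 1) order are arbitrary illustrations:

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(4)
        x = np.cumsum(rng.normal(size=500))   # a random walk, i.e. ARIMA(0, 1, 0)

        # Fit an over-specified ARIMA(1, 1, 1); the AR and MA estimates
        # should come out near zero for random-walk data
        res = ARIMA(x, order=(1, 1, 1)).fit()
        print(res.params)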

    44. Non-zero mean • We have assumed that the mean is zero in the ARIMA models • There are two alternatives: mean-correct all the Wt terms in the model, or incorporate a constant term in the model

    45. The Box-Jenkins Approach

    46. Topics • Outline of the approach • Sample autocorrelation & partial autocorrelation • Fitting ARIMA models • Diagnostic checking • Example • Further ideas

    47. Outline of the Box-Jenkins Approach

    48. Box-Jenkins approach • The approach is an iterative one involving: model identification, model fitting, model checking • If the model checking reveals that there are problems, the process is repeated
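
    One pass of the iteration might look like the following sketch (not from the slides); the simulated AR(1) data, the chosen order, and the lag for the Ljung-Box check are all illustrative:

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA
        from statsmodels.stats.diagnostic import acorr_ljungbox
        from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

        rng = np.random.default_rng(5)
        z = rng.normal(size=500)
        x = np.empty(500)
        x[0] = z[0]
        for t in range(1, 500):
            x[t] = 0.6 * x[t - 1] + z[t]        # data from an AR(1)

        plot_acf(x)                             # 1. identification: inspect acf/pacf
        plot_pacf(x)

        res = ARIMA(x, order=(1, 0, 0)).fit()   # 2. fitting

        # 3. checking: residuals should look like white noise
        print(acorr_ljungbox(res.resid, lags=[10]))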

    49. Models • Models to be fitted are from the ARIMA class (or SARIMA class if the data are seasonal) • The major tools in the identification process are the (sample) autocorrelation function and partial autocorrelation function

    50. Autocorrelation • Use the sample autocovariance and sample variance to estimate the autocorrelation • The obvious estimator of the autocovariance at lag k is ck = (1/N) Σt=1..N−k (xt − x̄)(xt+k − x̄)
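
    A direct implementation of this estimator (not from the slides); the divisor N rather than N − k is the usual convention, since it keeps the estimated acvf sequence positive semi-definite:

        import numpy as np

        def sample_acvf(x, k):
            # c_k = (1/N) * sum_{t=1}^{N-k} (x_t - xbar)(x_{t+k} - xbar)
            x = np.asarray(x, dtype=float)
            xbar = x.mean()
            n = len(x)
            return ((x[:n - k] - xbar) * (x[k:] - xbar)).sum() / n

        def sample_acf(x, k):
            # r_k = c_k / c_0
            return sample_acvf(x, k) / sample_acvf(x, 0)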