Time Series Models
Topics

Stochastic processes

Stationarity

White noise

Random walk

Moving average processes

Autoregressive processes

More general processes
Stochastic processes

Time series are an example of a stochastic or random process

A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic laws'

Mathematically, a stochastic process is an indexed collection of random variables {Xt : t ∈ T}
Inference

We usually base our inference on a single observation or realization of the process over some period of time, say a continuous interval [0, T] or a sequence of time points {0, 1, 2, ..., T}
Specification of a process

To describe a stochastic process fully, we must specify all of its finite-dimensional distributions, i.e. the joint distribution of the random variables Xt1, ..., Xtn for any finite set of times t1, ..., tn
A simpler approach is to specify only the first and second moments; this is sufficient if all the joint distributions are normal

The mean and variance functions are given by

μ(t) = E(Xt),  σ²(t) = Var(Xt)
Autocovariance

Because the random variables comprising the process are not independent, we must also specify their covariance, the autocovariance function

γ(t, s) = Cov(Xt, Xs) = E[(Xt − μ(t))(Xs − μ(s))]
Stationarity

Inference is easiest when a process is stationary: its distribution does not change over time

This is strict stationarity

A process is weakly stationary if its mean is constant and its autocovariance function depends only on the lag, so that γ(t, t + k) = γ(k)
Autocorrelation

It is useful to standardize the autocovariance function (acvf)

Considering the stationary case only, we use the autocorrelation function (acf)

ρ(k) = γ(k) / γ(0)
White noise

This is a purely random process: a sequence of independent and identically distributed random variables

It has constant mean and variance; also

γ(k) = 0 for k ≠ 0, and hence ρ(0) = 1 and ρ(k) = 0 for k ≠ 0
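As a quick check (a minimal NumPy sketch; the seed and sample size are arbitrary choices), simulating white noise shows the constant mean and variance and near-zero sample autocorrelations:

```python
import numpy as np

rng = np.random.default_rng(42)
z = rng.normal(loc=0.0, scale=1.0, size=1000)  # white noise: iid N(0, 1)

print(z.mean(), z.var())              # roughly 0 and 1
for k in (1, 2, 3):
    r_k = np.corrcoef(z[:-k], z[k:])[0, 1]
    print(f"lag {k}: r = {r_k:.3f}")  # each close to 0
```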
Random walk

The random walk Xt = Xt-1 + Zt, with {Zt} white noise, is not stationary: its variance grows with t

Its first differences Xt − Xt-1 = Zt are white noise, and so are stationary
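A short NumPy sketch (illustrative seed and length) showing that the walk's variance grows over time while first differencing recovers the stationary white noise:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=1000)
x = np.cumsum(z)                       # random walk: Xt = Xt-1 + Zt

# The variance grows over time, so the walk is not stationary ...
print(x[:500].var(), x[500:].var())

# ... but first differencing recovers the stationary white noise exactly
print(np.allclose(np.diff(x), z[1:]))  # True
```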
Moving average processes

Start with {Zt} being white noise or purely random, with mean zero and standard deviation σZ

{Xt} is a moving average process of order q (written MA(q)) if for some constants β0, β1, ..., βq we have

Xt = β0Zt + β1Zt-1 + ... + βqZt-q

Usually β0 = 1
The mean and variance are given by

E(Xt) = 0,  Var(Xt) = σZ²(β0² + β1² + ... + βq²)

The process is weakly stationary because the mean is constant and the covariance does not depend on t
In order to ensure there is a unique MA process for a given acf, we impose the condition of invertibility

This ensures that when the process is written in series form, the series converges

For the MA(1) process Xt = Zt + βZt-1, the condition is |β| < 1
The general condition for invertibility is that all the roots of the equation β(B) = β0 + β1B + ... + βqB^q = 0 lie outside the unit circle, i.e. have modulus greater than one
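This root condition is easy to check numerically. A sketch using NumPy, with an illustrative MA(1) coefficient of 0.5:

```python
import numpy as np

# MA(1) with beta = 0.5: beta(B) = 1 + 0.5B, coefficients listed lowest power first
beta_poly = [1.0, 0.5]
roots = np.roots(beta_poly[::-1])        # np.roots wants the highest power first
print(roots, np.all(np.abs(roots) > 1))  # root at -2 lies outside the unit circle: invertible
```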
Autoregressive processes

Assume {Zt} is purely random with mean zero and standard deviation σZ

Then the autoregressive process of order p, or AR(p) process, is

Xt = α1Xt-1 + α2Xt-2 + ... + αpXt-p + Zt
The first order autoregression is

Xt = αXt-1 + Zt

Provided |α| < 1 it may be written as an infinite order MA process

Using the backshift operator B (where BXt = Xt-1) we have

(1 − αB)Xt = Zt
From the previous equation we have

Xt = Zt + αZt-1 + α²Zt-2 + ...

Then E(Xt) = 0, and if |α| < 1,

Var(Xt) = σZ²(1 + α² + α⁴ + ...) = σZ² / (1 − α²)

The AR(p) process can be written as

(1 − α1B − α2B² − ... − αpB^p)Xt = Zt

This is

Xt = (1 − α1B − α2B² − ... − αpB^p)⁻¹Zt = Zt + β1Zt-1 + β2Zt-2 + ...

for some β1, β2, ...

This gives Xt as an infinite MA process, so it has mean zero
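The MA(∞) weights can be computed numerically. A sketch using statsmodels' arma2ma with an arbitrary α = 0.6; for AR(1) the weights are simply βi = α^i:

```python
import numpy as np
from statsmodels.tsa.arima_process import arma2ma

alpha = 0.6
# Lag-polynomial convention: AR polynomial (1 - alpha*B), MA polynomial 1
psi = arma2ma(ar=[1, -alpha], ma=[1], lags=6)
print(psi)                                      # [1, 0.6, 0.36, 0.216, ...]
print(np.allclose(psi, alpha ** np.arange(6)))  # True: the weights are alpha**i
```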
Conditions are needed to ensure that the various series converge, and hence that the variance exists and the autocovariance can be defined

Essentially these are requirements that the βi become small quickly enough for large i

It may not be possible to find the βi explicitly, however; the alternative is to work with the αi

The acf is expressible in terms of the roots πi, i = 1, 2, ..., p of the auxiliary equation

y^p − α1y^(p−1) − ... − αp = 0

Then a necessary and sufficient condition for stationarity is that |πi| < 1 for every i

An equivalent way of expressing this is that the roots of the equation

1 − α1B − α2B² − ... − αpB^p = 0

must lie outside the unit circle
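Again the condition is easy to check numerically; a NumPy sketch for an illustrative AR(2) with α1 = 0.5 and α2 = 0.3:

```python
import numpy as np

# AR(2): Xt = 0.5 Xt-1 + 0.3 Xt-2 + Zt
# Stationarity: roots of 1 - 0.5B - 0.3B^2 = 0 must lie outside the unit circle
ar_poly = [1.0, -0.5, -0.3]              # coefficients listed lowest power first
roots = np.roots(ar_poly[::-1])          # np.roots wants the highest power first
print(roots, np.all(np.abs(roots) > 1))  # True -> stationary
```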
ARMA processes

Combine AR and MA processes

An ARMA process of order (p, q) is given by

Xt = α1Xt-1 + ... + αpXt-p + Zt + β1Zt-1 + ... + βqZt-q
An ARMA process can be written in pure MA or pure AR form, the operators being possibly of infinite order

Usually the mixed form requires fewer parameters
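For experimentation, statsmodels can simulate an ARMA process directly and report whether the chosen coefficients satisfy the stationarity and invertibility conditions (the coefficients below are arbitrary):

```python
from statsmodels.tsa.arima_process import ArmaProcess

# ARMA(1, 1): Xt = 0.7 Xt-1 + Zt + 0.4 Zt-1
# ArmaProcess takes lag-polynomial coefficients: AR side (1 - 0.7B), MA side (1 + 0.4B)
proc = ArmaProcess(ar=[1, -0.7], ma=[1, 0.4])
print(proc.isstationary, proc.isinvertible)  # True, True

x = proc.generate_sample(nsample=500)        # a realization for later fitting
```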
ARIMA processes

General autoregressive integrated moving average processes are called ARIMA processes

When differenced, say d times, the process is an ARMA process

Call the differenced process Wt. Then Wt is an ARMA process and

Wt = ∇^d Xt = (1 − B)^d Xt
Alternatively, specify the process as

(1 − α1B − ... − αpB^p)(1 − B)^d Xt = (1 + β1B + ... + βqB^q)Zt

This is an ARIMA process of order (p, d, q)
The model for Xt is non-stationary because the AR operator on the left hand side has d roots on the unit circle

d is often 1

The random walk is ARIMA(0, 1, 0)

Can include seasonal terms—see later
Non-zero mean

We have assumed that the mean is zero in the ARIMA models

There are two alternatives:

mean-correct all the Wt terms in the model

incorporate a constant term in the model
Topics

Outline of the approach

Sample autocorrelation & partial autocorrelation

Fitting ARIMA models

Diagnostic checking

Example

Further ideas
Box-Jenkins approach

The approach is an iterative one involving

model identification

model fitting

model checking

If the model checking reveals that there are problems, the process is repeated
Models

Models to be fitted are from the ARIMA class of models (or the SARIMA class if the data are seasonal)

The major tools in the identification process are the (sample) autocorrelation function and partial autocorrelation function
Autocorrelation

Use the sample autocovariance and sample variance to estimate the autocorrelation

The obvious estimator of the autocovariance at lag k is

ck = (1/N) Σ (xt − x̄)(xt+k − x̄), summing over t = 1, ..., N − k

and the sample autocorrelation is rk = ck / c0
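A direct NumPy implementation of these estimators (a sketch; the white-noise test series is arbitrary):

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocovariances c_k (divisor N) and autocorrelations r_k = c_k / c_0."""
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), x.mean()
    c = np.array([np.sum((x[: n - k] - xbar) * (x[k:] - xbar)) / n
                  for k in range(max_lag + 1)])
    return c / c[0]

rng = np.random.default_rng(1)
print(sample_acf(rng.normal(size=200), max_lag=5))  # r_0 = 1, the rest near 0
```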
Autocovariances

The sample autocovariances are not unbiased estimates of the autocovariances—bias is of order 1/N

Sample autocovariances are correlated, so may display smooth ripples at long lags which are not in the actual autocovariances
Can use a different divisor (N − k instead of N) to decrease bias—but this may increase mean square error

Can use jackknifing to reduce bias to order 1/N²: divide the sample in half and estimate using the whole series and both halves
Autocorrelation

It is more difficult to obtain the properties of the sample autocorrelation

It is generally still biased

When the process is white noise,

E(rk) ≈ −1/N

Var(rk) ≈ 1/N

and the rk are approximately normally distributed for large N
This gives a rough test of whether an autocorrelation is non-zero

If |rk| > 2/√N, suspect that the autocorrelation at that lag is non-zero

Note that when examining many autocorrelations, the chance of falsely identifying a non-zero one increases

Consider the physical interpretation
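A sketch of the rough test (NumPy only; illustrative series). Since ten lags are tested at roughly the 5% level, the occasional false flag is expected:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=400)  # white noise, so any flagged lag is a false positive
bound = 2 / np.sqrt(len(x))

for k in range(1, 11):
    r_k = np.corrcoef(x[:-k], x[k:])[0, 1]
    flag = "  <- suspect non-zero" if abs(r_k) > bound else ""
    print(f"lag {k:2d}: r = {r_k:+.3f}{flag}")
```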
Partial autocorrelation

Broadly speaking, the partial autocorrelation is the correlation between Xt and Xt+k with the effect of the intervening variables removed

Sample partial autocorrelations are found from sample autocorrelations by solving a set of equations known as the Yule-Walker equations
Model identification

Plot the autocorrelations and partial autocorrelations for the series

Use these to try and identify an appropriate model

Consider stationary series first
Stationary series

For an MA(q) process the autocorrelation is zero at lags greater than q, while the partial autocorrelations tail off in exponential fashion

For an AR(p) process the partial autocorrelation is zero at lags greater than p, while the autocorrelations tail off in exponential fashion

For mixed ARMA processes, both the acf and pacf will have large values up to q and p respectively, then tail off in an exponential fashion

See graphs in M&W, pp. 136–137

The approach used is to try fitting a model and then examine the residuals
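In practice these plots are drawn by software. A sketch using statsmodels and matplotlib on a simulated AR(2) series (coefficients chosen for illustration):

```python
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf
from statsmodels.tsa.arima_process import ArmaProcess

# AR(2) sample: expect the acf to tail off and the pacf to cut off after lag 2
x = ArmaProcess(ar=[1, -0.5, -0.3], ma=[1]).generate_sample(nsample=500)

fig, axes = plt.subplots(2, 1, figsize=(8, 6))
plot_acf(x, lags=20, ax=axes[0])
plot_pacf(x, lags=20, ax=axes[1])
plt.show()
```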
Non-stationary series

Non-stationarity is indicated by an acf which remains large at long lags

Induce stationarity by differencing

Differencing once is generally sufficient; twice may be needed

Overdifferencing introduces autocorrelation and should be avoided
Estimation

We will always fit the model using Minitab

AR models may be fitted by least squares, or by solving the Yule-Walker equations

MA models require an iterative procedure

ARMA models are fitted like MA models
Minitab

Minitab uses an iterative least squares approach to fitting ARMA models

Standard errors can be calculated for the parameter estimates, so confidence intervals and tests of significance can be carried out
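The course uses Minitab; for readers working in Python, a rough equivalent sketch using statsmodels (simulated data, arbitrary coefficients) that likewise reports estimates and standard errors:

```python
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

# Simulated ARMA(1, 1) data stand in for a real series
x = ArmaProcess(ar=[1, -0.7], ma=[1, 0.4]).generate_sample(nsample=500)

fit = ARIMA(x, order=(1, 0, 1)).fit()  # iterative maximum likelihood fit
print(fit.params)  # parameter estimates
print(fit.bse)     # standard errors, for confidence intervals and significance tests
```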
Diagnostic checking

Checking is based on the residuals

Residuals should be normally distributed with zero mean, be uncorrelated, and have minimum variance or dispersion
Procedures

Plot residuals against time

Draw a histogram

Obtain a normal scores plot

Plot the acf and pacf of the residuals

Plot residuals against fitted values

Note that the residuals are not uncorrelated, but are approximately so at long lags
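A sketch of these checks in Python (matplotlib and statsmodels on a simulated series; the fitted model is illustrative):

```python
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

x = ArmaProcess(ar=[1, -0.7], ma=[1, 0.4]).generate_sample(nsample=500)
fit = ARIMA(x, order=(1, 0, 1)).fit()

fig, axes = plt.subplots(2, 2, figsize=(10, 6))
axes[0, 0].plot(fit.resid)                            # against time: look for drift or changing spread
axes[0, 1].hist(fit.resid, bins=30)                   # roughly normal, centred on zero
plot_acf(fit.resid, lags=20, ax=axes[1, 0])           # no significant autocorrelation
axes[1, 1].scatter(fit.fittedvalues, fit.resid, s=5)  # against fitted values: no pattern
plt.show()
```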
Portmanteau test

Box and Pierce proposed a statistic which tests the magnitudes of the residual autocorrelations as a group

Their test compares

Q = N Σ rk², summing over k = 1, ..., K

with the chi-square distribution with K − p − q d.f. when fitting an ARMA(p, q) model

Ljung and Box found that the test performs poorly unless n is very large

Instead use the modified Box-Pierce or Ljung-Box statistic

Q* = N(N + 2) Σ rk² / (N − k), summing over k = 1, ..., K

and reject the model if Q* is too large
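A sketch using the Ljung-Box test as implemented in statsmodels (simulated data; model_df is set to p + q as above):

```python
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

# Simulate an ARMA(1, 1) series and fit the matching model (coefficients illustrative)
x = ArmaProcess(ar=[1, -0.7], ma=[1, 0.4]).generate_sample(nsample=500)
resid = ARIMA(x, order=(1, 0, 1)).fit().resid

# Q* is referred to chi-square with K - p - q d.f.; model_df = p + q handles the correction
print(acorr_ljungbox(resid, lags=[20], model_df=2))  # small p-value -> reject the model
```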
Overfitting

Suppose we think an AR(2) model is appropriate. We fit an AR(3) model.

The estimate of the additional parameter should not be significantly different from zero

The other parameters should not change much

This is an example of overfitting
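A sketch of the overfitting check on simulated data (a true AR(2) with illustrative coefficients, deliberately overfitted by an AR(3)):

```python
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

# True model AR(2); deliberately overfit with AR(3)
x = ArmaProcess(ar=[1, -0.5, -0.3], ma=[1]).generate_sample(nsample=1000)

fit2 = ARIMA(x, order=(2, 0, 0)).fit()
fit3 = ARIMA(x, order=(3, 0, 0)).fit()

# The AR(3) coefficient should sit within about two standard errors of zero,
# and the first two coefficients should barely change between the fits
print(fit2.params)
print(fit3.params)
print(fit3.bse)
```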
Other identification tools

Chatfield (1979, JRSS A), among others, has suggested the use of the inverse autocorrelation function to assist with identification of a suitable model

Abraham & Ledolter (1984, Biometrika) show that although the inverse autocorrelation cuts off after lag p for an AR(p) model, it is less effective than the partial autocorrelation for detecting the AR order
AIC

The Akaike Information Criterion is AIC = −2 log(maximum likelihood) + 2 × (number of parameters)

The parameter count in the formula penalizes models with too many parameters
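A sketch comparing AIC across a few candidate orders with statsmodels (simulated ARMA(1,1) data; the candidate list is arbitrary):

```python
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

x = ArmaProcess(ar=[1, -0.7], ma=[1, 0.4]).generate_sample(nsample=500)

# The 2 x (number of parameters) term penalizes the larger models
for order in [(1, 0, 0), (0, 0, 1), (1, 0, 1), (2, 0, 2)]:
    fit = ARIMA(x, order=order).fit()
    print(order, round(fit.aic, 1))
```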
Parsimony

One principle generally accepted is that models should be parsimonious: having as few parameters as possible

Note that any ARMA model can be represented as a pure AR or pure MA model, but the number of parameters may be infinite

AR models are easier to fit, so there is a temptation to fit a less parsimonious AR model when a mixed ARMA model is appropriate

Ledolter & Abraham (1981, Technometrics) show that fitting unnecessary extra parameters, or an AR model when an MA model is appropriate, results in a loss of forecast accuracy
Exponential smoothing

Most exponential smoothing techniques are equivalent to fitting an ARIMA model of some sort

Winters' multiplicative seasonal smoothing has no ARIMA equivalent

Winters' additive seasonal smoothing has a very non-parsimonious ARIMA equivalent
For example, simple exponential smoothing is the optimal method of fitting the ARIMA(0, 1, 1) process

Optimality is obtained by taking the smoothing parameter α to be 1 − β when the model is written

Xt − Xt-1 = Zt − βZt-1
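A sketch of this equivalence using statsmodels (simulated ARIMA(0,1,1) series with an illustrative MA coefficient); the two fits should produce very similar one-step forecasts:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

# Build an ARIMA(0,1,1) series by cumulating an MA(1): Xt - Xt-1 = Zt - 0.4 Zt-1
w = ArmaProcess(ar=[1], ma=[1, -0.4]).generate_sample(nsample=500)
x = np.cumsum(w)

ses = SimpleExpSmoothing(x).fit()          # estimates the smoothing parameter
arima = ARIMA(x, order=(0, 1, 1)).fit()

print(ses.params["smoothing_level"])       # should be near 1 - 0.4 = 0.6
print(ses.forecast(1), arima.forecast(1))  # one-step forecasts should nearly agree
```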