5 – Autoregressive Integrated Moving Average (ARIMA) Models





ARIMA

Box-Jenkins Methodology



Example 1/4

The series shows an upward trend.

The first several autocorrelations are persistently large and trail off to zero rather slowly, which means that a trend exists and this time series is nonstationary (it does not vary about a fixed level).

Idea: difference the data to see if we can eliminate the trend and create a stationary series.



Example 2/4

First-order differences.

A plot of the differenced data appears to vary about a fixed level.

Comparing the autocorrelations with their error limits, the only significant autocorrelation is at lag 1. Similarly, only the lag 1 partial autocorrelation is significant. The PACF appears to cut off after lag 1, indicating AR(1) behavior; the ACF also appears to cut off after lag 1, indicating MA(1) behavior. We will therefore try both ARIMA(1,1,0) and ARIMA(0,1,1).

A constant term in each model will be included to allow for the fact that the series of differences appears to vary about a level greater than zero.
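As a rough cross-check of this identification step outside Minitab, the sketch below fits both candidate models with Python's statsmodels. This is a minimal sketch only: the series name y and the CSV file are hypothetical stand-ins for the data on the slides, and trend="t" is used to supply the constant/drift term discussed above.

import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical data source; any trending univariate series will do.
y = pd.read_csv("series.csv", index_col=0).squeeze("columns")

# With d = 1, trend="t" adds a drift term, which plays the role of the
# constant for the differenced series.
fit_110 = ARIMA(y, order=(1, 1, 0), trend="t").fit()  # ARIMA(1,1,0)
fit_011 = ARIMA(y, order=(0, 1, 1), trend="t").fit()  # ARIMA(0,1,1)

print(fit_110.summary())
print(fit_011.summary())

# If both models are adequate, their one-step-ahead forecasts
# should nearly coincide (see Example 4/4).
print(fit_110.forecast(1), fit_011.forecast(1))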



Example 3/4

ARIMA(1,1,0)

ARIMA(0,1,1)

For both models, the LBQ statistics are not significant, as indicated by the large p-values.



Example 4/4

Finally, there is no significant residual autocorrelation for the ARIMA(1,1,0) model. The results for the ARIMA(0,1,1) are similar.

Therefore, either model is adequate, and the two provide nearly the same one-step-ahead forecasts.



Examples

Makridakis

  • ARIMA 7.1

  • ARIMA PIGS

  • ARIMA DJ

  • ARIMA Electricity

  • ARIMA Computers

  • ARIMA Sales Industry

  • ARIMA Pollution

Minitab

  • Employ (Food)

Montgomery

  • Example, page 267

  • Example, page 271

  • Example, page 278

  • Example, page 283



ARIMA Basic Model



Basic Models

ARIMA (0, 0, 0)―WHITE NOISE

ARIMA (0, 1, 0)―RANDOM WALK

ARIMA (1, 0, 0)―AUTOREGRESSIVE MODEL (order 1)

ARIMA (0, 0, 1)―MOVING AVERAGE MODEL (order 1)

ARIMA (1, 0, 1)―SIMPLE MIXED MODEL



AR MA Example Models

ARIMA (0,0,1) = MA(1)

ARIMA (1,0,0) = AR(1)

ARIMA (0,0,2) = MA(2)

ARIMA (2,0,0) = AR(2)



ARMA Example Models

ARIMA(1,0,1) = ARMA(1,1)



Autocorrelation - ACF

Lag   ACF          T      LBQ
1     0.0441176    0.15   0.03
2     -0.0916955   -0.32  0.17

(The differences are due to small modifications in the Regression and Time Series formulas.)



Partial Correlation

  • Suppose X, Y and Z are random variables. We define the notion of partial correlation between X and Y adjusting for Z.

  • First consider simple linear regression of X on Z

  • Also the linear regression of Y on Z



Partial Correlation

  • Now consider the errors

  • Then the partial correlation between X and Y, adjusting for Z, is
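The regression and partial-correlation formulas on these slides did not survive extraction; a standard reconstruction is:

\[ e_X = X - (a_1 + b_1 Z), \qquad e_Y = Y - (a_2 + b_2 Z) \]

\[ r_{XY \cdot Z} = \operatorname{Corr}(e_X, e_Y) = \frac{r_{XY} - r_{XZ}\, r_{YZ}}{\sqrt{(1 - r_{XZ}^2)\,(1 - r_{YZ}^2)}} \]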



Partial Autocorrelation - PACF

Correlations: X*; Y*

Pearson correlation of X* and Y* = 0.770

P-Value = 0.000

Partial Autocorrelation Function: X

Lag   PACF        T
1     0.900575    6.98
2     -0.151346   -1.17
3     0.082229    0.64

(The differences are due to small modifications in the Regression and Time Series formulas.)


Theoretical Behavior for AR(1)

ACF → 0

PACF = 0 for lag > 1


Theoretical Behavior for AR(2)

ACF → 0

PACF = 0 for lag > 2


Theoretical Behavior for MA(1)

PACF → 0

ACF = 0 for lag > 1


Theoretical Behavior for MA(2)

PACF → 0

ACF = 0 for lag > 2


Theoretical Behavior

Note that:

  • ARMA(p,0) = AR(p)

  • ARMA(0,q) = MA(q)

In this context…

  • “Die out” means “tend to zero gradually”

  • “Cut off” means “disappear” or “is zero”

In practice, the values of p and q each rarely exceed 2.



Review of Main Characteristics of ACF and PACF



Example 5.1

The weekly data tend to have short runs, and the data do indeed seem to be autocorrelated. Next, we visually inspect for stationarity. Although there might be a slight drop in the mean for the second year (weeks 53-104), in general it seems safe to assume stationarity.

  • Weekly total number of loan applications

EXEMPLO PAG 267.MPJ



Example 5.1

1. It cuts off after lag 2 (or maybe even 3), suggesting an MA(2) (or MA(3)) model.

2. It shows an exponential decay pattern (or a mixture of exponential decays), suggesting an AR(p) model.



Example 5.1

It cuts off after lag 2. Hence we use the second interpretation of the sample ACF plot and assume that the appropriate model to fit is the AR(2) model.



Example 5.1

The modified Box-Pierce test suggests that there is no autocorrelation left in the residuals.


Example 5.1


Example 5.1


Example 5.1



Example 5.2

Example: page 271

  • Dow Jones Index

The process shows signs of nonstationarity, with changing mean and possibly variance.



Example 5.2

The slowly decreasing sample ACF, together with the significant sample PACF value at lag 1 (close to 1), confirms that the process can indeed be deemed nonstationary.



Example 5.2

One might argue that the significant sample PACF value at lag 1 suggests that an AR(1) model might also fit the data well.

We will consider this interpretation first and fit an AR(1) model to the Dow Jones Index data.



Example 5.2

The modified Box-Pierce test suggests that there is no autocorrelation left in the residuals. This is also confirmed by the sample ACF and PACF plots of the residuals.


Example 5.2


Example 5.2

The only concern in the residual plots is the changing variance observed in the time series plot of the residuals.


Example 5.2


Example 5.2


Example 5.2


Example 5.2



Example 5.3

Prediction with AR(2)

Example: page 278


Example 5.3


Example 5.3



Example 5.5

Example: page 283

The data obviously exhibit some seasonality and an upward linear trend.

  • U.S. Clothing Sales Data



Example 5.5

The sample ACF and PACF indicate a monthly seasonality, s = 12, as the ACF values at lags 12, 24, and 36 are significant and slowly decreasing.


Example 5.5



Example 5.5

There is a significant PACF value at lag 12 that is close to 1. Moreover, the slowly decreasing ACF in general also indicates a nonstationarity that can be remedied by taking the first difference. Hence we would now consider the differenced series as well.


Example 5.5


Example 5.5



Example 5.5

The figure shows that the first difference, together with seasonal differencing, helps in terms of stationarity and eliminates the seasonality.


Example 5.5


Example 5.5

The sample ACF, with a significant value at lag 1, and the sample PACF, with exponentially decaying values at the first 8 lags, suggest that a nonseasonal MA(1) model should be used.



Example 5.5

The interpretation of the remaining seasonality is a bit more difficult. For that we should focus on the sample ACF and PACF values at lags 12, 24, 36, and so on. The sample ACF at lag 12 seems to be significant, and the sample PACF values at lags 12, 24, 36 (albeit not significant) seem to alternate in sign. That suggests that a seasonal MA(1) model can be used as well. Hence an ARIMA(0,1,1) × (0,1,1)12 model is used to model the data yt, as written out below.
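In operator notation, this multiplicative seasonal model takes the standard textbook form (a reconstruction of the slide's equation):

\[ (1 - B)(1 - B^{12})\, y_t = (1 - \theta B)(1 - \Theta B^{12})\, \varepsilon_t \]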


Example 5.5


Example 5.5

Both the MA(1) and the seasonal MA(1) coefficient estimates are significant.

As we can see from the sample ACF and PACF plots, while there are still some small significant values, as indicated by the modified Box-Pierce statistic, most of the autocorrelation is now modeled out.


Example 5.5


Example 5.5


Example 5.5



Introduction

Recall exponential smoothing. The general assumption for those models was that any time series can be represented as the sum of two distinct components: deterministic and stochastic (random). The former (deterministic) is modeled as a function of time, whereas for the latter (stochastic) we assumed that random noise added on top of the deterministic signal generates the stochastic behavior of the time series.

One very important assumption is that the random noise is generated through independent shocks to the process.

In practice, however, this assumption is often violated: successive observations usually show serial dependence. Under these circumstances, forecasting methods based on exponential smoothing may be inefficient and sometimes inappropriate, because they do not take advantage of the serial dependence in the observations in the most effective way.

To formally incorporate this dependent structure, we will explore a general class of models called autoregressive integrated moving average models or ARIMA models (also known as Box-Jenkins models).



Linear Models for Stationary Time Series

  • A linear filter is defined as a transformation of an input time series into an output series (a concept borrowed from Signal Processing); the defining equation and the conditions under which a filter is said to be stable are reconstructed below
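A standard reconstruction (the slide's own equation did not survive extraction):

\[ y_t = L(x_t) = \sum_{i=-\infty}^{\infty} \psi_i\, x_{t-i} \]

The filter is time-invariant if the weights ψi do not depend on t, physically realizable (causal) if ψi = 0 for i < 0, and stable if \( \sum_i |\psi_i| < \infty \).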


Stationarity



Some Examples



Stationary Time Series

  • Many time series do not exhibit a stationary behavior

  • Stationarity is in fact a rarity in real life

  • However, it provides a foundation to build upon since (as we will see later on) if a time series is not stationary, its first difference (yt − yt−1) will often be



Linear Filter



If Input is White Noise



Using the Backshift Operator



Wold’s Decomposition Theorem

  • Any nondeterministic weakly stationary time series can be written as an infinite sum of weighted random shocks (disturbances), as in the reconstruction below
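A reconstruction of the missing equation:

\[ y_t = \mu + \sum_{i=0}^{\infty} \psi_i\, \varepsilon_{t-i}, \qquad \psi_0 = 1, \qquad \sum_{i=0}^{\infty} \psi_i^2 < \infty \]

where the εt are uncorrelated random shocks (white noise).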



How useful is this?

How can we come up with “infinitely” many terms?

Well, not so much!!!



Maybe we should consider some special cases:


Finite Order Moving Average Processes (MA(q))



Some Properties

  • Expected Value

  • Variance
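For the MA(q) process \( y_t = \mu + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \dots - \theta_q \varepsilon_{t-q} \), the two quantities listed above are (a reconstruction of the slide's formulas):

\[ E[y_t] = \mu, \qquad \operatorname{Var}(y_t) = \gamma_0 = \sigma^2 \left( 1 + \theta_1^2 + \dots + \theta_q^2 \right) \]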



Some Properties

  • Autocovariance Function

  • Autocorrelation Function (ACF)
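A reconstruction of the corresponding formulas:

\[ \gamma_k = \sigma^2 \left( -\theta_k + \sum_{i=1}^{q-k} \theta_i\, \theta_{i+k} \right), \quad k = 1, \dots, q; \qquad \gamma_k = 0, \quad k > q \]

\[ \rho_k = \frac{\gamma_k}{\gamma_0} \]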



Autocorrelation Function of MA(q)

  • The ACF of an MA(q) “cuts off” after lag q

  • This is very useful in the identification of an MA(q) process



Example

Employ.mtw


Differences



Autocorrelation

The graphs for the autocorrelation function (ACF) of the ARIMA residuals include lines representing two standard errors to either side of zero. Values that extend beyond two standard errors are statistically significant at approximately α = 0.05 and show evidence that the model does not explain all the autocorrelation in the data.

Because you did not specify the lag length, autocorrelation uses the default length of n / 4 for a series with 240 or fewer observations. Minitab generates an autocorrelation function (ACF) with approximate α = 0.05 critical bands for the hypothesis that the correlations are equal to zero.



Autocorrelation

The ACF for these data shows large positive, significant spikes at lags 1 and 2 with subsequent positive autocorrelations that do not die off quickly. This pattern is typical of an autoregressive process.


Ljung-Box Q Statistic

Use it to test whether a series of observations over time is random and independent. If observations are not independent, one observation may be correlated with another observation k time units later, a relationship called autocorrelation. Autocorrelation can impair the accuracy of a time-based predictive model, such as a time series plot, and lead to misinterpretation of the data.

For example, an electronics company tracks monthly sales of batteries for five years. They want to use the data to develop a time series model to help forecast future sales. However, monthly sales may be affected by seasonal trends. For example, every year a rise in sales occurs when people buy batteries for Christmas toys. Thus a monthly sales observation in one year could be correlated with a monthly sales observation 12 months later (a lag of 12).

Before choosing their time series model, they can evaluate autocorrelation for the monthly differences in sales. The Ljung-Box Q (LBQ) statistic tests the null hypothesis that autocorrelations up to lag k equal zero (i.e., the data values are random and independent up to a certain number of lags -- in this case 12). If the LBQ is greater than a specified critical value, autocorrelations for one or more lags may be significantly different from zero, suggesting the values are not random and independent over time.

LBQ is also used to evaluate assumptions after fitting a time series model, such as ARIMA, to ensure that the residuals are independent.

The Ljung-Box test is a portmanteau test and is a modified version of the Box-Pierce chi-square statistic.
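For reference, the statistic itself (not shown on the slide) is commonly written as

\[ Q = n\,(n+2) \sum_{k=1}^{m} \frac{r_k^2}{n - k} \]

where n is the sample size, rk the lag-k sample autocorrelation, and m the number of lags tested; Q is compared against a chi-square distribution (with the degrees of freedom reduced by the number of estimated parameters when applied to model residuals).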



You can use the Ljung-Box Q (LBQ) statistic to test the null hypothesis that the autocorrelations for all lags up to lag k equal zero.

Let's test that all autocorrelations up to a lag of 6 are zero. The LBQ statistic is 56.03.

H0: autocorrelations (lag ≤ 6) = 0

Variable CumProb is created



In this example, the p-value is 0.000000, which means the p-value is less than 0.0000005. The very small p-value implies that one or more of the autocorrelations up to lag 6 can be judged as significantly different from zero at any reasonable α level.



Partial autocorrelation computes and plots the partial autocorrelations of a time series. Partial autocorrelations, like autocorrelations, are correlations between sets of ordered data pairs of a time series. As with partial correlations in the regression case, partial autocorrelations measure the strength of a relationship with the other terms accounted for. The partial autocorrelation at lag k is the correlation between the residuals at time t from an autoregressive model and the observations at lag k, with terms for all intervening lags present in the autoregressive model. The plot of partial autocorrelations is called the partial autocorrelation function, or PACF. View the PACF to guide your choice of terms to include in an ARIMA model.



You obtain a partial autocorrelation function (PACF) of the food industry employment data, after taking a difference of lag 12, in order to help determine a likely ARIMA model.



Minitab generates a partial autocorrelation function with critical bands at approximately α = 0.05 for the hypothesis that the correlations are equal to zero.

In the food data example, there is a single large spike of 0.7 at lag 1, which is typical of an autoregressive process of order one. There is also a significant spike at lag 9, but you have no evidence of a nonrandom process occurring there.



Sample ACF

  • The sample ACF will not be exactly zero after lag q for an MA(q)

  • But it will be small

  • For a sample of size N, this can be tested using the limits reconstructed below:
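A reconstruction of the usual significance limits: treat rk as significantly different from zero when it falls outside

\[ \pm \frac{2}{\sqrt{N}} \]

or, more precisely, outside two standard errors computed with Bartlett's approximation \( \operatorname{se}(r_k) \approx \sqrt{\left( 1 + 2 \sum_{i=1}^{q} r_i^2 \right) / N} \) for k > q.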



First-Order Moving Average Process MA(1)

for which the autocovariance and autocorrelation functions are given as follows.
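A reconstruction of these functions for \( y_t = \mu + \varepsilon_t - \theta\, \varepsilon_{t-1} \):

\[ \gamma_0 = \sigma^2 (1 + \theta^2), \qquad \gamma_1 = -\theta\, \sigma^2, \qquad \rho_1 = \frac{-\theta}{1 + \theta^2}, \qquad \gamma_k = \rho_k = 0 \text{ for } k > 1 \]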



Some Examples

Note the behavior of the sample ACF.



Second-Order Moving Average Process MA(2)

for which the autocovariance and autocorrelation functions are given as follows.
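A reconstruction of these functions for \( y_t = \mu + \varepsilon_t - \theta_1 \varepsilon_{t-1} - \theta_2 \varepsilon_{t-2} \):

\[ \gamma_0 = \sigma^2 (1 + \theta_1^2 + \theta_2^2), \qquad \rho_1 = \frac{-\theta_1 + \theta_1 \theta_2}{1 + \theta_1^2 + \theta_2^2}, \qquad \rho_2 = \frac{-\theta_2}{1 + \theta_1^2 + \theta_2^2}, \qquad \rho_k = 0 \text{ for } k > 2 \]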



An Example



Finite Order Autoregressive Processes (AR(p))

  • MA(q) processes take into account disturbances up to q lags in the past

  • What if all past disturbances have some lingering effects? Back to square one?

  • We may be able to come up with some special cases though



A very special case

  • What if we let the weights decay geometrically, ψ_i = φ^i ?



Decomposition

and



Combining the two equations

This is an AR(1) model
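A reconstruction of the lost derivation: with geometrically decaying weights ψ_i = φ^i, the infinite sum splits as

\[ y_t = \sum_{i=0}^{\infty} \phi^i\, \varepsilon_{t-i} = \varepsilon_t + \phi \sum_{i=0}^{\infty} \phi^{i}\, \varepsilon_{t-1-i} = \varepsilon_t + \phi\, y_{t-1} \]

which, once a constant term δ is allowed for, is written \( y_t = \delta + \phi\, y_{t-1} + \varepsilon_t \).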



First-Order Autoregressive Process (AR(1))



Properties

  • Expected Value

  • Autocovariance Function

  • Autocorrelation Function
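A reconstruction of these properties for a stationary AR(1) with |φ| < 1:

\[ E[y_t] = \mu = \frac{\delta}{1 - \phi}, \qquad \gamma_0 = \frac{\sigma^2}{1 - \phi^2}, \qquad \gamma_k = \phi^k\, \gamma_0, \qquad \rho_k = \phi^k, \quad k = 0, 1, 2, \dots \]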



Some Examples



Second-Order Autoregressive Process (AR(2))



Conditions for Stationarity



AR(2) is stationary if …



AR(2) is stationary if …



AR(2)

  • Hence {yt} satisfies a second-order linear difference equation, so yt can be expressed as the solution to this equation in terms of the two roots m1 and m2 of the associated polynomial

  • The process is stationary if the roots m1 and m2 satisfy the condition reconstructed below
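A reconstruction of the associated polynomial and the stationarity condition:

\[ m^2 - \phi_1 m - \phi_2 = 0, \qquad |m_1| < 1 \text{ and } |m_2| < 1 \]

equivalently, \( \phi_1 + \phi_2 < 1 \), \( \phi_2 - \phi_1 < 1 \), and \( |\phi_2| < 1 \).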



AR(2) is stationary if the roots m1 and m2 of the associated polynomial are both less than one in absolute value



ACF of a stationary AR(2)



ACF of a stationary AR(2)

Yule-Walker Equations
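A reconstruction of the Yule-Walker equations for the AR(2):

\[ \rho_1 = \phi_1 + \phi_2\, \rho_1 \;\Rightarrow\; \rho_1 = \frac{\phi_1}{1 - \phi_2}, \qquad \rho_k = \phi_1\, \rho_{k-1} + \phi_2\, \rho_{k-2}, \quad k \ge 2 \]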



ACF of a stationary AR(2)

  • Hence the ACF satisfies a second-order linear difference equation, so ρ(k) can be expressed as the solution to this equation in terms of the two roots m1 and m2 of the associated polynomial



ACF of a stationary AR(2)



Some Examples



AR(p)



AR(p) is Stationary

  • If the roots of the associated polynomial (reconstructed below) are less than one in absolute value.
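A reconstruction of the missing polynomial:

\[ m^p - \phi_1 m^{p-1} - \dots - \phi_p = 0 \]

(equivalently, all roots of \( \Phi(B) = 1 - \phi_1 B - \dots - \phi_p B^p = 0 \) lie outside the unit circle).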



Infinite MA representation



Expected Value of an AR(p)



Autocovariance Function of an AR(p)



Autocorrelation Function of an AR(p)



ACF of AR(p)

In general, the ACF of an AR(p) can be a mixture of exponential decay and damped sinusoidal behavior, depending on the solution to the corresponding Yule-Walker equations.



ACF of AR(p)



ACF for AR(p) and MA(q)

  • ACF of MA(q) “cuts off” after q

  • ACF of AR(p) can be a mixture of exponential decay and damped sinusoidal



So how are we going to determine p in the AR(p) model?



Partial Correlation

  • Suppose X, Y and Z are random variables. We define the notion of partial correlation between X and Y adjusting for Z.

  • First consider simple linear regression of X on Z

  • Also the linear regression of Y on Z



Partial Correlation

  • Now consider the errors

  • Then the partial correlation between X and Y, adjusting for Z, is


Partial Autocorrelation Function (PACF)


Partial Autocorrelation Function (PACF)


Partial Autocorrelation Function (PACF)


Partial Autocorrelation Function (PACF)


Sample Partial Autocorrelation Function


Some Examples


PACF

  • For an AR(p) process, the PACF cuts off after lag p.

  • For an MA(q) process, the PACF dies out in an exponential decay and/or damped sinusoid form.
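A minimal Python sketch of this identification rule, using the statsmodels plotting helpers on a simulated AR(1) series (all names here are illustrative):

import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(0)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.8 * y[t - 1] + rng.normal()  # simulated AR(1), phi = 0.8

fig, axes = plt.subplots(2, 1, figsize=(8, 6))
plot_acf(y, lags=24, ax=axes[0])   # dies out gradually for an AR process
plot_pacf(y, lags=24, ax=axes[1])  # should cut off after lag 1 here
plt.show()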


Invertibility of an MA Process


Invertibility of an MA Process


Invertibility of an MA Process

We have



The ACF and PACF do have very distinct and indicative properties for MA and AR models, respectively. Therefore, in model identification, it is strongly recommended to use both the sample ACF and the sample PACF simultaneously.



Mixed Autoregressive-Moving Average (ARMA(p,q)) Process



Stationarity of ARMA(p,q)



Invertibility of ARMA(p,q)



ACF and PACF of an ARMA(p,q)

  • Both ACF and PACF of an ARMA(p,q) can be a mixture of exponential decay and damped sinusoids depending on the roots of the AR operator.



ARMA Models

  • For ARMA models, except for possible special cases, neither ACF nor PACF has distinctive features that would allow “easy identification”

  • For this reason, there have been many additional sample functions considered to help with identification problem:

    • Extended sample ACF (ESACF)

    • Generalized sample PACF (GPACF)

    • Inverse ACF

    • Use of “canonical correlations”


Some Examples


Review of Main Characteristics of ACF and PACF


Review of Main Characteristics of Sample ACF and PACF


Some Examples


Some Examples


Some Examples


Some Examples


Some Examples


ARIMA Models

  • A process {yt} is ARIMA(p,d,q) if the dth-order differences wt = (1 − B)^d yt form a stationary ARMA(p,q) process

  • Thus {yt} satisfies the equation reconstructed below
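In operator notation (a reconstruction of the slide's equation):

\[ \Phi(B)\,(1 - B)^d\, y_t = \delta + \Theta(B)\, \varepsilon_t \]

where \( \Phi(B) = 1 - \phi_1 B - \dots - \phi_p B^p \) and \( \Theta(B) = 1 - \theta_1 B - \dots - \theta_q B^q \).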


Some Examples


Some Examples


Model Building

  • Given T observations from a process, we want to obtain a model that adequately represents the main features of the time series data. The model can then be used for purposes such as forecasting and control.



3-Stage Procedure

  • STAGE 1: Model Specification or Identification

    • Consider the issue of nonstationarity vs. stationarity of the series. Use procedures such as differencing to obtain a stationary series, say wt = (1 − B)^d yt

      • Examine the sample ACF and PACF of wt and use features of these functions to identify an appropriate ARMA model. The specification is “tentative”


Review of Main Characteristics of ACF and PACF


Review of Main Characteristics of Sample ACF and PACF


ARMA Models

  • For ARMA models, except for possible special cases, neither ACF nor PACF has distinctive features that would allow “easy identification”

  • For this reason, there have been many additional sample functions considered to help with identification problem:

    • Extended sample ACF (ESACF)

    • Generalized sample PACF (GPACF)

    • Inverse ACF

    • Use of “canonical correlations”



3-Stage Procedure

  • STAGE 2: Estimation of Parameters in Tentatively Specified Model

    • Method of moments

    • Least Squares

    • Maximum Likelihood



3-Stage Procedure

  • STAGE 3: Model Checking

    • Based on examining features of residuals



3-Stage Procedure

  • STAGE 3: If the specified model is appropriate (i.e., the orders p and q are right), then we expect the residuals to behave similarly to the “true” white noise εt.
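A sketch of this residual check in Python, assuming a fitted-model object such as fit_110 from the earlier sketch (the variable names are illustrative):

from statsmodels.stats.diagnostic import acorr_ljungbox

resid = fit_110.resid  # residuals of the tentatively specified model
# model_df accounts for the p + q estimated ARMA parameters.
print(acorr_ljungbox(resid, lags=[12, 24], model_df=1))
# Large p-values: no autocorrelation left; residuals look like white noise.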


Example 5.1

  • Weekly total number of loan applications


Example 5.1


Example 5.1


Example 5.1


Example 5.1


Example 5.1


Example 5.1


Example 5.2

  • Dow Jones Index


Example 5.2


Example 5.2


Example 5.2


Example 5.2


Example 5.2


Example 5.2


Example 5.2


Example 5.2


Example 5.2


Forecasting ARIMA Processes


Forecasting ARIMA Processes


The “best” forecast


Forecast Error


Prediction Intervals


Two Issues



Illustration Using ARIMA(1,1,1)

  • The ARIMA(1,1,1) process is given by the equation reconstructed below

  • There are two commonly used approaches to forecasting it
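A reconstruction of the slide's equation:

\[ (1 - \phi B)(1 - B)\, y_t = \delta + (1 - \theta B)\, \varepsilon_t \]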



Approach 1



Approach 2



Example 5.3


Seasonal Processes


Seasonal Processes


Seasonal Processes


Seasonal Processes


Example 5.4


Example 5.5

  • U.S. Clothing Sales Data


Example 5.5


Example 5.5


Example 5.5


Example 5.5


Example 5.5


Example 5.5


Example 5.5


Example 5.5


Example 5.5


Example 5.5



Use ARIMA to model time series behavior and to generate forecasts. ARIMA fits a Box-Jenkins ARIMA model to a time series. ARIMA stands for Autoregressive Integrated Moving Average, with each term representing a step taken in the model construction until only random noise remains. ARIMA modeling differs from the other time series methods in that it uses correlational techniques. ARIMA can be used to model patterns that may not be visible in plotted data.



The ACF and PACF of the food employment data suggest an autoregressive model of order 1, or AR(1), after taking a difference of order 12. You fit that model here, examine diagnostic plots, and examine the goodness of fit. To take a seasonal difference of order 12, you specify the seasonal period to be 12 and the order of the difference to be 1.
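For readers without Minitab, an equivalent specification in Python's statsmodels is sketched below (the CSV export of Employ.mtw is hypothetical):

import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical export of the Employ.mtw food-employment series.
employ = pd.read_csv("employ.csv", index_col=0).squeeze("columns")

# AR(1) on the lag-12 seasonally differenced series:
# order=(p, d, q), seasonal_order=(P, D, Q, s)
fit = ARIMA(employ, order=(1, 0, 0), seasonal_order=(0, 1, 0, 12)).fit()
print(fit.summary())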



1    The model is specified by the usual notation (p d q) x (P D Q) S:

(p d q) is the nonseasonal part of the model; (P D Q) is the seasonal part; and S is the seasonality.

2    At least one of the p, P, q, or Q parameters must be non-zero, and none may exceed five.

3    The maximum number of parameters you can estimate is ten.

4    At least three data points must remain after differencing. That is, S * D + d + 2 must be less than the number of points, where S is the length of a season.

5    The maximum "back order" for the model is 100. In practice, this condition is always satisfied if S * D + d + p + P + q + Q is at most 100.

6    The ARIMA model normally includes a constant term only if there is no differencing (that is, d = D = 0).

7    Missing observations are only allowed at the beginning or the end of a series, not in the middle.

8    The seasonal component of this model is multiplicative, and thus is appropriate when the amount of cyclical variation is proportional to the mean.



The ARIMA model converged after nine iterations. The AR(1) parameter had a t-value of 7.42. As a rule of thumb, you can consider values over two as indicating that the associated parameter can be judged as significantly different from zero. The MSE (1.1095) can be used to compare fits of different ARIMA models.

The Ljung-Box statistics give nonsignificant p-values, indicating that the residuals appear to be uncorrelated. The ACF and PACF of the residuals corroborate this. You assume that the spikes in the ACF and PACF at lag 9 are the result of random events.

The coefficients are estimated using an iterative algorithm that calculates least squares estimates. At each iteration, the back forecasts are computed and the SSE is calculated.

Back forecasts are calculated using the specified model and the current iteration's parameter estimates.



Box and Jenkins [2] present an iterative approach for fitting ARIMA models to time series. This iterative approach involves identifying the model, estimating the parameters, checking model adequacy, and forecasting, if desired. The model identification step generally requires judgment from the analyst.

1    First, decide if the data are stationary. That is, do the data possess a constant mean and variance?

·    Examine a time series plot to see if a transformation is required to give constant variance.

·    Examine the ACF to see if large autocorrelations do not die out, indicating that differencing may be required to give a constant mean.

A seasonal pattern that repeats every kth time interval suggests taking the kth difference to remove a portion of the pattern. Most series should not require more than two difference operations or orders. Be careful not to overdifference. If spikes in the ACF die out rapidly, there is no need for further differencing. A sign of an overdifferenced series is a first autocorrelation close to -0.5 and small values elsewhere.



2    Next, examine the ACF and PACF of your stationary data in order to identify what autoregressive or moving average model terms are suggested.

·    An ACF with large spikes at initial lags that decay to zero or a PACF with a large spike at the first and possibly at the second lag indicates an autoregressive process.

·    An ACF with a large spike at the first and possibly at the second lag and a PACF with large spikes at initial lags that decay to zero indicates a moving average process.

·    The ACF and the PACF both exhibiting large spikes that gradually die out indicates that both autoregressive and moving average processes are present.

For most data, no more than two autoregressive parameters or two moving average parameters are required in ARIMA models.



3    Once you have identified one or more likely models, you are ready to use the ARIMA procedure.

·    Fit the likely models and examine the significance of parameters and select one model that gives the best fit.

·    Check that the ACF and PACF of the residuals indicate a random process, signified when there are no large spikes. You can easily obtain an ACF and a PACF of the residuals using ARIMA's Graphs subdialog box. If large spikes remain, consider changing the model.

·    You may perform several iterations in finding the best model. When you are satisfied with the fit, go ahead and make forecasts.

The ARIMA algorithm will perform up to 25 iterations to fit a given model. If the solution does not converge, store the estimated parameters and use them as starting values for a second fit. You can store the estimated parameters and use them as starting values for a subsequent fit as often as necessary.



The graphs for the ACF and PACF of the ARIMA residuals include lines representing two standard errors to either side of zero. Values that extend beyond two standard errors are statistically significant at approximately α = 0.05, and show evidence that the model has not explained all the autocorrelation in the data.



The AR(1) model appears to fit well, so you use it to forecast employment.



  • The ARIMA algorithm is based on the fitting routine in the TSERIES package written by Professor William Q. Meeker, Jr., of Iowa State University.

  • W.Q.Meeker, Jr. (1977). "TSERIES-A User-orientedComputerProgram for Identifying, FittingandForecasting ARIMA Time Series Models," ASA 1977 ProceedingsoftheStatistical Computing Section.

  • W.Q.Meeker, Jr. (1977). TSERIES User's Manual, StatisticalLaboratory, IowaStateUniversity.

