Time Series Analysis and Forecasting I
Introduction
  • A time series is a set of observations generated sequentially in time
  • Continuous vs. discrete time series
  • The observations from a discrete time series, made at some fixed interval h at times 1, 2, …, N, may be denoted by z(1), z(2), …, z(N)
Introduction (cont.)
  • Discrete time series may arise in two ways (contrasted in the sketch after this list):
    • 1- By sampling a continuous time series
    • 2- By accumulating a variable over a period of time
  • Characteristics of time series
    • Time periods are of equal length
    • No missing values
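A minimal sketch of the two mechanisms, assuming pandas (the library choice and the hypothetical half-hourly record are illustrative, not from the slides):

```python
import numpy as np
import pandas as pd

# Hypothetical half-hourly record standing in for a continuous series
idx = pd.date_range("2024-01-01", periods=48 * 7, freq="30min")
x = pd.Series(np.random.default_rng(0).normal(10.0, 2.0, len(idx)), index=idx)

sampled = x.resample("D").first()    # 1- sampling: one reading per day
accumulated = x.resample("D").sum()  # 2- accumulating: daily total (e.g., rainfall)
```

Either way, the resulting daily series satisfies the characteristics above: time periods of equal length and no missing values.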
Areas of application
  • Forecasting
  • Determination of a transfer function of a system
  • Design of simple feed-forward and feedback control schemes
Forecasting
  • Applications
    • Economic and business planning
    • Inventory and production control
    • Control and optimization of industrial processes
  • Lead time of the forecasts
    • The period over which forecasts are needed
  • Degree of sophistication
    • Simple ideas
      • Moving averages
      • Simple regression techniques
    • Complex statistical concepts
      • Box-Jenkins methodology
Approaches to forecasting (cont.)
  • Self-projecting approach
    • Advantages
      • Quickly and easily applied
      • A minimum of data is required
      • Reasonably accurate short- to medium-term forecasts
      • They provide a baseline against which forecasts developed through other models can be measured
    • Disadvantages
      • Not useful for forecasting into the far future
      • Do not take into account external factors
  • Cause-and-effect approach
    • Advantages
      • Bring more information into the model
      • More accurate medium- to long-term forecasts
    • Disadvantages
      • Forecasts of the explanatory time series are required
Some traditional self-projecting models
  • Overall trend models
    • The trend could be linear, exponential, parabolic, etc.
    • A linear trend has the form
      • Trend_t = A + Bt
    • Short-term changes are difficult to track
  • Smoothing models
    • Respond to the most recent behavior of the series
    • Employ the idea of weighted averages
    • They range in the degree of sophistication
    • The simple exponential smoothing method: S_t = αz_t + (1 - α)S_{t-1}, with smoothing constant 0 < α < 1 (sketched in code after this list)
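A minimal sketch of that recursion (initializing the smoothed level with the first observation is a common convention, assumed here):

```python
import numpy as np

def simple_exponential_smoothing(z, alpha):
    """S_t = alpha * z_t + (1 - alpha) * S_{t-1}."""
    z = np.asarray(z, dtype=float)
    s = np.empty(len(z))
    s[0] = z[0]                              # assumed initialization
    for t in range(1, len(z)):
        s[t] = alpha * z[t] + (1 - alpha) * s[t - 1]
    return s
```

A larger alpha makes the smoothed level respond faster to the most recent behavior of the series, at the cost of smoothing away less noise.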
Some traditional self-projecting models (cont.)
  • Seasonal models
    • Very common
    • Most seasonal time series also contain long- and short-term trend patterns
  • Decomposition models
    • The series is decomposed into its separate patterns
    • Each pattern is modeled separately (a sketch follows this list)
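A minimal decomposition sketch, assuming the statsmodels library and a synthetic monthly series (both assumptions; the slides name no tool):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly series: linear trend + annual seasonality + noise
t = np.arange(96)
idx = pd.date_range("2015-01-01", periods=96, freq="MS")
z = pd.Series(10 + 0.1 * t + 3 * np.sin(2 * np.pi * t / 12)
              + np.random.default_rng(1).normal(0, 0.5, 96), index=idx)

parts = seasonal_decompose(z, model="additive", period=12)
# parts.trend, parts.seasonal and parts.resid hold the separated patterns,
# each of which can then be modeled on its own
```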
Drawbacks of the use of traditional models
  • There is no systematic approach for the identification and selection of an appropriate model, and therefore, the identification process is mainly trial-and-error
  • There is difficulty in verifying the validity of the model
    • Most traditional methods were developed from intuitive and practical considerations rather than from a statistical foundation
  • Too narrow to deal efficiently with all time series
ARIMA models
  • Autoregressive Integrated Moving-average
  • Can represent a wide range of time series
  • A “stochastic” modeling approach that can be used to calculate the probability of a future value lying between two specified limits
ARIMA models (Cont.)
  • In the 1960’s Box and Jenkins recognized the importance of these models in the area of economic forecasting
  • “Time Series Analysis: Forecasting and Control”
    • George E. P. Box and Gwilym M. Jenkins
    • 1st edition was in 1970 (revised edition, 1976)
  • Often called the Box-Jenkins approach
Transfer function modeling
  • Yt = (B)Xt where

(B) = 0 + 1B + 2B2 + …..

  • B is the backshift operator

BmXt = Xt - m
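A minimal numerical sketch of Y_t = ν(B)X_t with a truncated, hypothetical weight sequence (real applications estimate the ν_j from data):

```python
import numpy as np

def apply_transfer_function(x, nu):
    """y_t = nu_0*x_t + nu_1*x_{t-1} + ... for a finite set of weights."""
    y = np.full(len(x), np.nan)            # first len(nu)-1 outputs are undefined
    for t in range(len(nu) - 1, len(x)):
        y[t] = sum(nu[j] * x[t - j] for j in range(len(nu)))
    return y

x = np.random.default_rng(2).normal(size=20)
print(apply_transfer_function(x, nu=[0.5, 0.3, 0.1]))   # hypothetical weights
```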

Transfer function modeling (cont.)
  • The study of process dynamics can achieve:
    • Better control
    • Improved design
  • Methods for estimating transfer function models
    • Classical methods
      • Based on deterministic perturbations
      • Uncontrollable disturbances (“noise”) are not accounted for, and hence these methods have not always been successful
    • Statistical methods
      • Make allowance for “noise”
      • The Box-Jenkins methodology
Process control
  • Feed-forward control
  • Feedback control
  [Figure: block diagrams of the feed-forward and feedback control loops, showing the deviation from target output, the compensating variable X_t, and the control equation for each scheme]
Process control (cont.)
  • The Box-Jenkins approach to control is to typify the disturbance by a suitable time series (stochastic) model and the inertial characteristics of the system by a suitable transfer function model
  • The “control equation” allows the action that should be taken at any given time to be calculated from the present and previous states of the system
  • Various ways, corresponding to various levels of technological sophistication, can be used to execute a “control action” called for by the control equation
The Box-Jenkins model building process
  [Flowchart: Model identification → Model estimation → Is the model adequate? If no, modify the model and return to identification; if yes, proceed to forecasts]
The Box-Jenkins model building process (cont.)
  • Model identification
    • Autocorrelations
    • Partial-autocorrelations
  • Model estimation
    • The objective is to minimize the sum of squared errors
  • Model validation
    • Certain diagnostics are used to check the validity of the model
  • Model forecasting
    • The estimated model is used to generate forecasts and confidence limits of the forecasts (the full cycle is sketched in code below)
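A minimal sketch of one pass through this cycle, assuming the statsmodels library; the ARIMA(1, 1, 1) order stands in for a candidate chosen from the ACs/PACs, and note that statsmodels estimates by maximum likelihood rather than a literal sum-of-squares minimization:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
z = np.cumsum(rng.normal(size=200))     # nonstationary toy series

fit = ARIMA(z, order=(1, 1, 1)).fit()   # identification (assumed) + estimation
print(fit.summary())                    # validation: inspect the diagnostics

forecast = fit.get_forecast(steps=12)   # forecasting with confidence limits
print(forecast.predicted_mean)
print(forecast.conf_int())
```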
Important Fundamentals
  • A Normal process
  • Stationarity
  • Regular differencing
  • Autocorrelations (ACs)
  • The white noise process
  • The linear filter model
  • Invertibility
A Normal process (a Gaussian process)
  • The Box-Jenkins methodology analyzes a time series as a realization of a stochastic process
    • The observation z_t at a given time t can be regarded as a realization of a random variable z_t with probability density function p(z_t)
    • The observations at any two times t_1 and t_2 may be regarded as realizations of two random variables z_{t1} and z_{t2} with joint probability density function p(z_{t1}, z_{t2})
    • If the probability distribution associated with any set of times is a multivariate Normal distribution, the process is called a Normal or Gaussian process
Stationary stochastic processes
  • In order to model a time series with the Box-Jenkins approach, the series has to be stationary
  • In practical terms, the series is stationary if it tends to wander more or less uniformly about some fixed level
  • In statistical terms, a stationary process is assumed to be in a particular state of statistical equilibrium, i.e., p(z_t) is the same for all t
Stationary stochastic processes (cont.)
  • The process is called “strictly stationary” if the joint probability distribution of any m observations made at times t_1, t_2, …, t_m is the same as that of m observations made at times t_1 + k, t_2 + k, …, t_m + k
  • When m = 1, the stationarity assumption implies that the probability distribution p(z_t) is the same for all times t
Stationary stochastic processes (cont.)
  • In particular, if z_t is a stationary process, then the first difference ∇z_t = z_t - z_{t-1} and higher differences ∇^d z_t are stationary
  • Most time series are nonstationary
Achieving stationarity
  • Regular differencing (RD)
    • (1st order) ∇z_t = (1 - B)z_t = z_t - z_{t-1}
    • (2nd order) ∇^2 z_t = (1 - B)^2 z_t = z_t - 2z_{t-1} + z_{t-2}
    • “B” is the backward shift operator
  • It is unlikely that more than two regular differences would ever be needed
  • Sometimes regular differencing by itself is not sufficient, and a prior transformation is also needed (differencing is sketched in code below)
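A minimal sketch of both orders of regular differencing, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(4)
z = np.cumsum(1.0 + rng.normal(size=100))  # trending, nonstationary series

dz = np.diff(z)        # 1st order: (1 - B)z_t = z_t - z_{t-1}
d2z = np.diff(z, n=2)  # 2nd order: (1 - B)^2 z_t = z_t - 2z_{t-1} + z_{t-2}
```

Each difference shortens the series by one observation, so d2z holds N - 2 values.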
Some nonstationary series (cont.)
  • How can we determine the required order of regular differencing?
Autocorrelations (ACs)
  • Autocorrelations are statistical measures that indicate how a time series is related to itself over time
  • The autocorrelation at lag 1 is the correlation between the original series z_t and the same series lagged by one period, z_{t-1}
Autocorrelations (cont.)
  • The theoretical autocorrelation function: ρ_k = γ_k / γ_0, where γ_k = Cov(z_t, z_{t+k}) is the autocovariance at lag k
  • The sample autocorrelation: r_k = Σ_{t=1..N-k} (z_t - z̄)(z_{t+k} - z̄) / Σ_{t=1..N} (z_t - z̄)^2
Autocorrelations (cont.)
  • A graph of the autocorrelation values is called a “correlogram”
  • In practice, to obtain a useful estimate of the autocorrelation function, at least 50 observations are needed
  • The estimated autocorrelations r_k should be calculated up to a lag no larger than N/4 (see the code sketch below)
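A minimal sketch of the sample formula above (the helper name is hypothetical):

```python
import numpy as np

def sample_acf(z, max_lag):
    """r_k = sum (z_t - zbar)(z_{t+k} - zbar) / sum (z_t - zbar)^2, k = 1..max_lag."""
    dev = np.asarray(z, dtype=float) - np.mean(z)
    denom = np.sum(dev ** 2)
    return np.array([np.sum(dev[: len(dev) - k] * dev[k:]) / denom
                     for k in range(1, max_lag + 1)])

z = np.random.default_rng(5).normal(size=200)
print(sample_acf(z, max_lag=200 // 4))   # lags up to N/4, per the guideline above
```

For this uncorrelated series every r_k hovers near zero, which previews the white noise process on the next slide.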
The white noise process
  • The Box-Jenkins models are based on the idea that a time series can be usefully regarded as generated from (driven by) a series of uncorrelated, independent “shocks” a_t
  • Such a sequence a_t, a_{t-1}, a_{t-2}, … is called a “white noise process”
The linear filter model
  [Figure: white noise a_t enters a linear filter ψ(B), whose output is the observed series z_t]
  • A “linear filter” is a model that transforms the white noise process a_t into the process that generated the time series z_t
The linear filter model (cont.)
  • z_t = μ + a_t + ψ_1a_{t-1} + ψ_2a_{t-2} + … = μ + ψ(B)a_t
  • ψ(B) = 1 + ψ_1B + ψ_2B^2 + … is the “transfer function” of the filter
The linear filter model (cont.)
  • The linear filter can be put in another form, expressing the current deviation z̃_t = z_t - μ in terms of past deviations:
    • z̃_t = π_1z̃_{t-1} + π_2z̃_{t-2} + … + a_t
  • This form can be written π(B)z̃_t = a_t, where π(B) = 1 - π_1B - π_2B^2 - … (a simulation sketch follows)
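A minimal simulation of the ψ form, assuming NumPy and a truncated geometric weight sequence ψ_j = 0.8^j (hypothetical; a stationary filter's weights must die out in roughly this fashion):

```python
import numpy as np

rng = np.random.default_rng(6)
a = rng.normal(size=500)       # white-noise shocks a_t
psi = 0.8 ** np.arange(8)      # truncated psi-weights, psi_0 = 1

# z_t = a_t + psi_1*a_{t-1} + ... is a convolution of the shocks
z = np.convolve(a, psi)[: len(a)]
```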
Stationarity and invertibility conditions for a linear filter
  • For a linear process to be stationary, the ψ weights must form a convergent series: ψ(B) must converge for B on or within the unit circle
  • If the current observation z_t depends on past observations with weights that decrease as we go back in time, the series is called invertible
  • For a linear process to be invertible, the π weights must form a convergent series: π(B) must converge for B on or within the unit circle
Model building blocks
  • Autoregressive (AR) models
  • Moving-average (MA) models
  • Mixed ARMA models
  • Nonstationary models (ARIMA models)
  • The mean parameter
  • The trend parameter
Autoregressive (AR) models
  • An autoregressive model of order “p”:
    • z̃_t = φ_1z̃_{t-1} + φ_2z̃_{t-2} + … + φ_pz̃_{t-p} + a_t, where z̃_t = z_t - μ
  • The autoregressive process can be thought of as the output from a linear filter with transfer function φ^{-1}(B), when the input is white noise a_t
  • The equation φ(B) = 0 is called the “characteristic equation” (a root check is sketched below)
Moving-average (MA) models
  • A moving-average model of order “q”:
    • z̃_t = a_t - θ_1a_{t-1} - θ_2a_{t-2} - … - θ_qa_{t-q}
  • The moving-average process can be thought of as the output from a linear filter with transfer function θ(B), when the input is white noise a_t
  • The equation θ(B) = 0 is called the “characteristic equation”
Mixed AR and MA (ARMA) models
  • A moving-average process of 1st order can be written as an infinite autoregression:
    • z̃_t = a_t - θ_1a_{t-1}  ⇒  z̃_t = -θ_1z̃_{t-1} - θ_1^2 z̃_{t-2} - … + a_t
  • Hence, if the process were really MA(1), we would obtain a non-parsimonious representation in terms of an autoregressive model
Mixed AR and MA (ARMA) models (cont.)
  • In order to obtain a parsimonious model, sometimes it is necessary to include both AR and MA terms in the model
  • An ARMA(p, q) model:
    • z̃_t = φ_1z̃_{t-1} + … + φ_pz̃_{t-p} + a_t - θ_1a_{t-1} - … - θ_qa_{t-q}
  • The ARMA process can be thought of as the output from a linear filter with transfer function θ(B)/φ(B), when the input is white noise a_t
The Box-Jenkins model building process
  • Model identification
    • Autocorrelations
    • Partial-autocorrelations
  • Model estimation
  • Model validation
    • Certain diagnostics are used to check the validity of the model
  • Model forecasting
Partial-autocorrelations (PACs)
  • Partial-autocorrelations are another set of statistical measures used to identify time series models
  • The PAC is similar to the AC, except that when it is calculated the effects of the intervening lags are partialled out (Box & Jenkins, 1976)
Partial-autocorrelations (cont.)
  • PACs can be calculated from the values of the ACs: each PAC is obtained from a different set of linear equations that describe a pure autoregressive model whose order equals the lag of the partial-autocorrelation being computed
  • The PAC at lag k is denoted by φ_kk
    • The double notation kk emphasizes that φ_kk is the autoregressive parameter φ_k of the autoregressive model of order k
Model identification
  • Stationarity and invertibility conditions
  • Theoretical ACs and PACs
  • The sample ACs and PACs are computed for the series and compared to the theoretical autocorrelation and partial-autocorrelation functions of the candidate models investigated
Stationarity requirements for the AR(1) model
  • For an AR(1) to be stationary:
    • -1 < φ_1 < 1
    • i.e., the root of the characteristic equation 1 - φ_1B = 0 lies outside the unit circle
  • For an AR(1) it can be shown that:
    • ρ_k = φ_1ρ_{k-1}, which with ρ_0 = 1 has the solution ρ_k = φ_1^k, k > 0
    • i.e., for a stationary AR(1) model, the theoretical autocorrelation function decays exponentially to zero, while the theoretical partial-autocorrelation function cuts off after lag 1
Invertibility requirements for the MA(1) model
  • For a MA(1) to be invertible:
    • -1 < θ_1 < 1
    • i.e., the root of the characteristic equation 1 - θ_1B = 0 lies outside the unit circle
  • For a MA(1) it can be shown that:
    • ρ_1 = -θ_1 / (1 + θ_1^2) and ρ_k = 0 for k > 1
    • i.e., for an invertible MA(1) model, the theoretical autocorrelation function cuts off after lag 1, while the theoretical partial-autocorrelation function decays exponentially to zero (both patterns are illustrated in the sketch below)
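A minimal simulation of this AR(1)/MA(1) duality, assuming statsmodels (whose lag-polynomial convention includes the leading 1 and negates the coefficients); φ_1 = θ_1 = 0.7 are arbitrary illustrative values:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

np.random.seed(7)
models = {
    "AR(1), phi_1=0.7":   ArmaProcess(ar=[1, -0.7], ma=[1]),
    "MA(1), theta_1=0.7": ArmaProcess(ar=[1], ma=[1, -0.7]),
}
for name, proc in models.items():
    z = proc.generate_sample(nsample=5000)
    print(name, "ACF :", np.round(acf(z, nlags=4)[1:], 2))   # decay vs. cutoff
    print(name, "PACF:", np.round(pacf(z, nlags=4)[1:], 2))  # cutoff vs. decay
```

The AR(1) run shows the ACF decaying roughly as 0.7^k with the PACF cutting off after lag 1; the MA(1) run shows the mirror image.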
Higher order models
  • For an AR model of order p > 1:
    • The autocorrelation function consists of a mixture of damped exponentials and damped sine waves
    • The partial-autocorrelation function cuts off after lag p
  • For MA models of order q > 1:
    • The autocorrelation function cuts off after lag q
    • The partial-autocorrelation function consists of a mixture of damped exponentials and damped sine waves