
STAT 497 LECTURE NOTES 7



  1. STAT 497 LECTURE NOTES 7: FORECASTING

  2. FORECASTING • One of the most important objectives in time series analysis is to forecast the future values of the series; it is the primary objective of modeling. • ESTIMATION (tahmin): the value of an estimator for a parameter. • PREDICTION (kestirim): the value of a random variable computed using the estimates of the parameters. • FORECASTING (öngörü): the value of a future random variable that is not observed in the sample.

  3. FORECASTING

  4. FORECASTING FROM AN ARMA MODEL: THE MINIMUM MEAN SQUARED ERROR FORECASTS • Observed time series: Y1, Y2, …, Yn, where n is the forecast origin. • Values to be forecast: Yn+1, Yn+2, …

  5. FORECASTING FROM AN ARMA MODEL

  6. FORECASTING FROM AN ARMA MODEL • The stationary ARMA model for Yt is φ(B)Yt = θ(B)at, or, in random shock form, Yt = ψ(B)at. • Assume that we have data Y1, Y2, …, Yn and we want to forecast Yn+l (i.e., l steps ahead from forecast origin n). Then the actual value is Yn+l = an+l + ψ1an+l-1 + ψ2an+l-2 + …

  7. FORECASTING FROM AN ARMA MODEL • Considering the random shock form of the series, Yn+l = an+l + ψ1an+l-1 + … + ψl-1an+1 + ψlan + ψl+1an-1 + …, which splits into shocks occurring after time n and shocks already observed by time n.

  8. FORECASTING FROM AN ARMA MODEL • Taking the conditional expectation of Yn+l given Y1, …, Yn, we have the minimum MSE forecast Ŷn(l) = E(Yn+l|Y1,…,Yn) = ψlan + ψl+1an-1 + …, where E(an+j|Y1,…,Yn) = 0 for j > 0 and E(an+j|Y1,…,Yn) = an+j for j ≤ 0.

  9. FORECASTING FROM AN ARMA MODEL • The forecast error: en(l) = Yn+l − Ŷn(l) = an+l + ψ1an+l-1 + … + ψl-1an+1. • The expectation of the forecast error: E[en(l)] = 0. • So, the forecast is unbiased. • The variance of the forecast error: Var[en(l)] = σa²(1 + ψ1² + … + ψl-1²).

  10. FORECASTING FROM AN ARMA MODEL • One-step-ahead (l = 1): en(1) = an+1, so Var[en(1)] = σa².

  11. FORECASTING FROM AN ARMA MODEL • Two-step-ahead (l = 2): en(2) = an+2 + ψ1an+1, so Var[en(2)] = σa²(1 + ψ1²).

  12. FORECASTING FROM AN ARMA MODEL • Note that, as the lead l grows, Ŷn(l) converges to the process mean and Var[en(l)] converges to the process variance Var(Yt). • That's why ARMA (or ARIMA) forecasting is useful only for short-term forecasting.

  13. PREDICTION INTERVAL FOR Yn+l • A 95% prediction interval for Yn+l (l steps ahead) is Ŷn(l) ± 1.96·σa·√(1 + ψ1² + … + ψl-1²). • For one-step-ahead this simplifies to Ŷn(1) ± 1.96·σa. • For two-step-ahead this simplifies to Ŷn(2) ± 1.96·σa·√(1 + ψ1²). • When computing prediction intervals from data, we substitute estimates for parameters, giving approximate prediction intervals.
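A minimal sketch in Python of the ψ-weight and interval calculations above, assuming the ARMA convention Yt = φ1Yt-1 + … + φpYt-p + at + θ1at-1 + … + θqat-q; the ARMA(1,1) coefficients at the bottom are hypothetical.

import numpy as np

def psi_weights(phi, theta, n_weights):
    # psi-weights of the random shock form Y_t = sum_j psi_j * a_{t-j}:
    # psi_0 = 1, psi_j = theta_j + sum_{i <= min(j, p)} phi_i * psi_{j-i}
    psi = np.zeros(n_weights)
    psi[0] = 1.0
    for j in range(1, n_weights):
        psi[j] = theta[j - 1] if j - 1 < len(theta) else 0.0
        for i in range(1, min(j, len(phi)) + 1):
            psi[j] += phi[i - 1] * psi[j - i]
    return psi

def prediction_interval(forecast, sigma_a, psi, l):
    # 95% interval: Y_hat_n(l) +/- 1.96 * sigma_a * sqrt(1 + psi_1^2 + ... + psi_{l-1}^2)
    half_width = 1.96 * sigma_a * np.sqrt(np.sum(psi[:l] ** 2))
    return forecast - half_width, forecast + half_width

psi = psi_weights(phi=[0.6], theta=[0.3], n_weights=10)   # hypothetical ARMA(1,1)
print(prediction_interval(0.0, 1.0, psi, 1))              # +/- 1.96
print(prediction_interval(0.0, 1.0, psi, 2))              # +/- 1.96*sqrt(1 + 0.9^2)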

  14. REASONS NEEDING A LONG REALIZATION • Estimate the correlation structure (i.e., the ACF and PACF) and get accurate standard errors. • Estimate the seasonal pattern (need at least 4 or 5 seasonal periods). • Approximate prediction intervals assume that parameters are known (a good approximation if the realization is large). • Fewer estimation problems (the likelihood function is better behaved). • Possible to check forecasts by withholding recent data. • Can check model stability by dividing the data in two and analyzing both halves.

  15. REASONS FOR USING A PARSIMONIOUS MODEL • Fewer numerical problems in estimation. • Easier to understand the model. • With fewer parameters, forecasts are less sensitive to deviations between parameters and estimates. • The model may be applied more generally to similar processes. • Rapid real-time computations for control or other action. • Having a parsimonious model is less important if the realization is large.

  16. EXAMPLES (zero-mean forms, with Yt = φYt-1 + at + θat-1 as the ARMA(1,1) convention) • AR(1): Ŷn(l) = φ^l·Yn, decaying geometrically to the mean. • MA(1): Ŷn(1) = θan; Ŷn(l) = 0 for l ≥ 2. • ARMA(1,1): Ŷn(1) = φYn + θan; Ŷn(l) = φŶn(l-1) for l ≥ 2.
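A small Python illustration of the AR(1) case under the same zero-mean convention; the starting value and coefficient are hypothetical.

def ar1_forecasts(y_n, phi, horizon):
    # AR(1) minimum MSE forecasts: Y_hat_n(l) = phi**l * y_n (zero-mean model)
    return [phi ** l * y_n for l in range(1, horizon + 1)]

print(ar1_forecasts(y_n=2.0, phi=0.8, horizon=5))
# [1.6, 1.28, 1.024, 0.8192, 0.65536] -- decays geometrically toward the mean 0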

  17. UPDATING THE FORECASTS • Let's say we have n observations at time t = n, find a good model for this series, and obtain the forecasts for Yn+1, Yn+2, and so on. At t = n+1, we observe the value of Yn+1. Now, we want to update our forecasts using the observed value of Yn+1 and its earlier forecast.

  18. UPDATING THE FORECASTS • The forecast error is en(1) = Yn+1 − Ŷn(1) = an+1. • We can also write the updated forecast as Ŷn+1(l) = Ŷn(l+1) + ψl·(Yn+1 − Ŷn(1)), i.e., the old forecast of Yn+l+1 corrected by ψl times the newest one-step-ahead forecast error.
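A sketch of this update rule in Python; the forecasts and ψ-weights below are hypothetical (psi[l] is ψl, with psi[0] = 1).

def update_forecasts(old_forecasts, y_new, psi):
    # old_forecasts[l-1] holds Y_hat_n(l); returns Y_hat_{n+1}(l) for l = 1, 2, ...
    error = y_new - old_forecasts[0]            # one-step forecast error a_{n+1}
    return [old_forecasts[l] + psi[l] * error   # Y_hat_n(l+1) + psi_l * a_{n+1}
            for l in range(1, len(old_forecasts))]

old = [1.6, 1.28, 1.024]                        # Y_hat_n(1..3) for an AR(1), phi = 0.8
print(update_forecasts(old, y_new=2.0, psi=[1.0, 0.8, 0.64]))
# [1.6, 1.28] = Y_hat_{n+1}(1..2)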

  19. UPDATING THE FORECASTS n=100

  20. FORECASTS OF THE TRANSFORMED SERIES • If you use a variance-stabilizing transformation, then after forecasting you have to convert the forecasts back to the original series. • If you use the log-transformation, you have to consider the fact that exp(E[ln Yn+l]) is not E[Yn+l]; simply exponentiating the forecast of the log series does not give the minimum MSE forecast of the original series.

  21. FORECASTS OF THE TRANSFORMED SERIES • If X has a normal distribution with mean μ and variance σ², then E[exp(X)] = exp(μ + σ²/2). • Hence, if Zt = ln Yt was modeled, the minimum mean square error forecast for the original series is given by Ŷn(l) = exp(Ẑn(l) + Var[en(l)]/2), where Ẑn(l) and Var[en(l)] are the forecast and the forecast error variance of the log series.
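A one-function sketch of the bias-corrected back-transformation in Python; the numbers are hypothetical.

import numpy as np

def backtransform_log_forecast(z_hat, var_e):
    # minimum MSE forecast of Y when Z = ln(Y) was modeled:
    # E[exp(Z)] = exp(mu + sigma^2 / 2) for normal Z
    return np.exp(z_hat + var_e / 2.0)

print(np.exp(1.2))                                  # naive exp(): 3.32, too small
print(backtransform_log_forecast(1.2, var_e=0.5))   # bias-corrected: 4.26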

  22. MEASURING THE FORECAST ACCURACY • Accuracy measures compare the forecasts Ŷt with the realized values Yt over a hold-out period; common summaries include the mean error (ME), mean absolute error (MAE), mean squared error (MSE), and mean absolute percentage error (MAPE).

  23. MEASURING THE FORECAST ACCURACY

  24. MEASURING THE FORECAST ACCURACY
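A Python sketch of the standard accuracy measures, assuming y and y_hat are aligned arrays of actual and forecast values; the sample numbers are hypothetical.

import numpy as np

def accuracy_measures(y, y_hat):
    # standard forecast accuracy summaries over a hold-out sample
    e = y - y_hat
    return {
        "ME":   np.mean(e),                      # bias
        "MAE":  np.mean(np.abs(e)),              # mean absolute error
        "MSE":  np.mean(e ** 2),                 # mean squared error
        "MAPE": np.mean(np.abs(e / y)) * 100.0,  # percent error (requires y != 0)
    }

print(accuracy_measures(np.array([10.0, 12.0, 11.0]),
                        np.array([ 9.5, 12.5, 10.0])))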

  25. MOVING AVERAGE AND EXPONENTIAL SMOOTHING • This is a forecasting procedure based on simple updating equations that calculate forecasts from the underlying pattern of the series. It is not based on the ARIMA approach. • Recent observations are expected to carry more predictive power, so a model can be constructed that places more weight on recent observations than on older ones.

  26. MOVING AVERAGE AND EXPONENTIAL SMOOTHING • Smoothed curve (eliminate up-and-down movement) • Trend • Seasonality

  27. SIMPLE MOVING AVERAGES • 3-period moving average: Ŷt = (Yt-1 + Yt-2 + Yt-3)/3 • Also, a 5-period MA can be considered.

  28. SIMPLE MOVING AVERAGES • One can impose weights and use weighted moving averages (WMA), e.g. Ŷt = 0.6Yt-1 + 0.3Yt-2 + 0.1Yt-3 (see the sketch after this list). • How many periods to use is a question; longer lags give a more significant smoothing-out effect. • Peaks and troughs (bottoms) are not predicted. • Events are being averaged out. • Since any moving average is serially correlated, any sequence of random numbers can appear to exhibit cyclical fluctuation.
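A minimal Python sketch of both forecasts; the series is hypothetical and the WMA weights are taken from the example above.

import numpy as np

def sma_forecast(y, window=3):
    # one-step-ahead simple moving average: mean of the last `window` values
    return np.mean(y[-window:])

def wma_forecast(y, weights=(0.6, 0.3, 0.1)):
    # weighted moving average; weights[0] multiplies the most recent value
    w = np.asarray(weights)
    recent = np.asarray(y, dtype=float)[-len(w):][::-1]   # most recent first
    return float(np.dot(w, recent))

y = [112, 118, 132, 129, 121]        # hypothetical series
print(sma_forecast(y))               # (132 + 129 + 121) / 3 = 127.33
print(wma_forecast(y))               # 0.6*121 + 0.3*129 + 0.1*132 = 124.5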

  29. SIMPLE MOVING AVERAGES • Exchange Rates: Forecasts using the SMA(3) model

  30. SIMPLE EXPONENTIAL SMOOTHING (SES) • Suppressing short-run fluctuation by smoothing the series • Weighted averages of all previous values with more weights on recent values • No trend, No seasonality

  31. SIMPLE EXPONENTIAL SMOOTHING (SES) • Observed time series Y1, Y2, …, Yn • The equation for the model is St = αYt + (1 − α)St-1 where α: the smoothing parameter, 0 ≤ α ≤ 1; Yt: the value of the observation at time t; St: the value of the smoothed observation at time t.

  32. SIMPLE EXPONENTIAL SMOOTHING (SES) • The equation can also be written in error-correction form as St = St-1 + α(Yt − St-1). • Then, the forecast is Ŷt+1 = St.
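A minimal SES sketch in Python, using S1 = Y1 as the initial value (one of the initializations discussed on slide 38); the series is hypothetical.

def ses(y, alpha):
    # S_t = alpha*Y_t + (1 - alpha)*S_{t-1}, with S_1 = Y_1
    s = y[0]
    smoothed = [s]
    for value in y[1:]:
        s = alpha * value + (1 - alpha) * s
        smoothed.append(s)
    return smoothed

y = [112, 118, 132, 129, 121]
print(ses(y, alpha=0.3)[-1])   # S_n = one-step-ahead forecast Y_hat_{n+1}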

  33. SIMPLE EXPONENTIAL SMOOTHING (SES) • Why exponential? For the observed time series Y1, Y2, …, Yn, Yn+1 can be expressed as a weighted sum of previous observations: Ŷn+1 = c0Yn + c1Yn-1 + c2Yn-2 + … where the ci's are the weights. • Giving more weight to the recent observations, we can use the geometric weights ci = α(1 − α)^i (decreasing by a constant ratio for every unit increase in lag).

  34. SIMPLE EXPONENTIAL SMOOTHING (SES) • Then, substituting repeatedly, St = αYt + α(1 − α)Yt-1 + α(1 − α)²Yt-2 + …, so the forecast Ŷt+1 = St is exactly such an exponentially weighted average, and St+1 = αYt+1 + (1 − α)St updates it.

  35. SIMPLE EXPONENTIAL SMOOTHING (SES) • Remarks on α (the smoothing parameter). • Choose α between 0 and 1. • If α = 1, it becomes a naive model; if α is close to 1, more weight is put on recent values. The model fully utilizes forecast errors. • If α is close to 0, distant values are given weights comparable to recent values. Choose α close to 0 when there are big random variations in the data. • α is often selected so as to minimize the MSE.

  36. SIMPLE EXPONENTIAL SMOOTHING (SES) • Remarks on α (the smoothing parameter). • In empirical work, 0.05 ≤ α ≤ 0.3 is commonly used. Values close to 1 are used rarely. • Numerical minimization process: • Take different α values ranging between 0 and 1. • Calculate the 1-step-ahead forecast errors for each α. • Calculate the MSE for each case. • Choose the α which has the minimum MSE.

  37. SIMPLE EXPONENTIAL SMOOTHING (SES) • EXAMPLE: • Calculate this for α = 0.2, 0.3, …, 0.9, 1 and compare the MSEs. Choose the α with the minimum MSE.
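A sketch of this grid search in Python, following the steps on slide 36; the series is hypothetical.

import numpy as np

def one_step_mse(y, alpha):
    # MSE of the one-step-ahead SES forecasts Y_hat_{t+1} = S_t, with S_1 = Y_1
    s = y[0]
    errors = []
    for value in y[1:]:
        errors.append(value - s)
        s = alpha * value + (1 - alpha) * s
    return float(np.mean(np.square(errors)))

y = [112, 118, 132, 129, 121, 135, 148, 148]
grid = np.arange(0.2, 1.01, 0.1)
mses = [one_step_mse(y, a) for a in grid]
print(grid[int(np.argmin(mses))])   # alpha with the smallest one-step MSE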

  38. SIMPLE EXPONENTIAL SMOOTHING (SES) • Some software packages automatically choose the optimal α using the search method or non-linear optimization techniques. INITIAL VALUE PROBLEM • Setting S1 to Y1 is one method of initialization. • Alternatively, take the average of, say, the first 4 or 5 observations and use this as the initial value.

  39. DOUBLE EXPONENTIAL SMOOTHING OR HOLT'S EXPONENTIAL SMOOTHING • Introduce a trend factor to the simple exponential smoothing method • Trend, but still no seasonality: SES + Trend = DES • Two equations are now needed to handle the trend. The trend term is the expected increase or decrease per unit time period in the current level (mean level).

  40. HOLT'S EXPONENTIAL SMOOTHING • Two parameters: α = smoothing parameter, β = trend coefficient • The h-step ahead forecast at time t is Ŷt+h = St + h·Tt, where St is the current level and Tt is the current slope. • The trend prediction is added in the h-step ahead forecast.

  41. HOLT'S EXPONENTIAL SMOOTHING • Now, we have two updating equations: St = αYt + (1 − α)(St-1 + Tt-1) and Tt = β(St − St-1) + (1 − β)Tt-1. The first smoothing equation adjusts St directly for the trend of the previous period, Tt-1, by adding it to the last smoothed value, St-1. This helps to bring St to the appropriate base of the current value. The second smoothing equation updates the trend, which is expressed as the difference between the last two level values.

  42. HOLT'S EXPONENTIAL SMOOTHING • Initial value problem: • S1 is set to Y1 • T1 = Y2 − Y1 or (Yn − Y1)/(n − 1) • α and β can be chosen as values between 0.02 < α, β < 0.2 or by minimizing the MSE as in SES.

  43. HOLT'S EXPONENTIAL SMOOTHING • Example: (use α = 0.6, β = 0.7; S1 = 4, T1 = 1)
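A Python sketch of Holt's recursions, using the example's parameters and initial values; the data series itself is hypothetical.

def holt(y, alpha, beta, s1, t1):
    # S_t = alpha*Y_t + (1 - alpha)*(S_{t-1} + T_{t-1})
    # T_t = beta*(S_t - S_{t-1}) + (1 - beta)*T_{t-1}
    level, trend = s1, t1
    for value in y[1:]:                  # S_1, T_1 are given
        prev_level = level
        level = alpha * value + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level, trend

y = [4, 6, 9, 11, 14]                    # hypothetical series
level, trend = holt(y, alpha=0.6, beta=0.7, s1=4, t1=1)
print([level + h * trend for h in (1, 2, 3)])   # Y_hat_{t+h} = S_t + h*T_t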

  44. HOLT-WINTER'S EXPONENTIAL SMOOTHING • Introduce both trend and seasonality factors • Seasonality can be added additively or multiplicatively. • Model (multiplicative): St = α(Yt/It-s) + (1 − α)(St-1 + Tt-1), Tt = β(St − St-1) + (1 − β)Tt-1, It = γ(Yt/St) + (1 − γ)It-s.

  45. HOLT-WINTER'S EXPONENTIAL SMOOTHING • Here, (Yt/St) captures the seasonal effects. • s = # of periods in the seasonal cycle (s = 4 for quarterly data) • Three parameters: α = smoothing parameter, β = trend coefficient, γ = seasonality coefficient.

  46. HOLT-WINTER'S EXPONENTIAL SMOOTHING • The h-step ahead forecast is Ŷt+h = (St + h·Tt)·It-s+h (for h ≤ s). • The seasonal factor is multiplied into the h-step ahead forecast. • α, β and γ can be chosen as values between 0.02 < α, β, γ < 0.2 or by minimizing the MSE as in SES.
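A Python sketch of the multiplicative recursions and the h-step forecast; all parameter values, initial values, and data below are hypothetical.

def holt_winters(y, s, alpha, beta, gamma, level0, trend0, seasonals):
    # S_t = alpha*(Y_t/I_{t-s}) + (1 - alpha)*(S_{t-1} + T_{t-1})
    # T_t = beta*(S_t - S_{t-1}) + (1 - beta)*T_{t-1}
    # I_t = gamma*(Y_t/S_t) + (1 - gamma)*I_{t-s}
    level, trend = level0, trend0
    idx = list(seasonals)                        # s initial seasonal indices
    for t, value in enumerate(y):
        season = idx[t % s]                      # index from one season ago
        prev_level = level
        level = alpha * (value / season) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        idx[t % s] = gamma * (value / level) + (1 - gamma) * season
    # Y_hat_{n+h} = (S_n + h*T_n) * I_{n-s+h}
    return [(level + h * trend) * idx[(len(y) + h - 1) % s] for h in range(1, s + 1)]

y = [20, 35, 50, 25, 22, 38, 54, 27]             # two years of quarterly data
print(holt_winters(y, s=4, alpha=0.2, beta=0.1, gamma=0.1,
                   level0=32.5, trend0=0.5, seasonals=[0.6, 1.1, 1.5, 0.8]))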

  47. HOLT-WINTER'S EXPONENTIAL SMOOTHING • To initialize Holt-Winters, we need at least one complete season's data to determine the initial estimates of It-s. • Initial value: the level can be started at the mean of the first season, Ss = (Y1 + Y2 + … + Ys)/s.

  48. HOLT-WINTER'S EXPONENTIAL SMOOTHING • For the seasonal indices, say we have 6 years of quarterly data (s = 4). STEPS TO FOLLOW STEP 1: Compute the average of each of the 6 years.

  49. HOLT-WINTER’S EXPONENTIAL SMOOTHING • STEP 2: Divide the observations by the appropriate yearly mean.

  50. HOLT-WINTER'S EXPONENTIAL SMOOTHING • STEP 3: The seasonal indices are formed by computing the average of the ratios for each quarter across the 6 years.
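A compact Python sketch of STEPS 1-3, shown here with 3 hypothetical years of quarterly data rather than the slides' 6.

import numpy as np

def initial_seasonal_indices(y, s):
    data = np.asarray(y, dtype=float).reshape(-1, s)   # one row per year
    yearly_means = data.mean(axis=1, keepdims=True)    # STEP 1: yearly averages
    ratios = data / yearly_means                       # STEP 2: divide by yearly mean
    return ratios.mean(axis=0)                         # STEP 3: average per quarter

y = [20, 35, 50, 25, 22, 38, 54, 27, 24, 42, 58, 30]
print(initial_seasonal_indices(y, s=4))                # one index per quarter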
