
Business Forecasting


Presentation Transcript


  1. Business Forecasting Chapter 6 Adaptive Filtering as a Forecasting Technique

  2. Chapter Topics • Introduction • How the Model is Used in Forecasting • Chapter Summary

  3. Introduction • The adaptive filtering approach also depends heavily on historical observations. • More complicated data patterns, such as those with cycles, are easily accommodated. • The model makes it possible to learn from past errors and correct for them before making a final forecast.

  4. Introduction • In both the moving average and the exponential smoothing models, the forecaster depended on selecting the right value for the smoothing constant or the weight. • In the adaptive filtering technique, the model seeks to determine the “best” set of weights to use in making a forecast.

  5. Introduction • The model is capable of generating information about past inaccuracies and correcting itself.

  6. Basic Elements of the Technique • The technique has two distinct phases: • The first phase is the adapting or training of a set of weights with historical data. • The second phase uses these weights to make a forecast. • As in the previous methods, we compute the error for the forecast. • The weights are adjusted based on the errors, and a new forecast is made.

  7. Basic Elements of the Technique • Note how we forecast for period 5, for example, using the weights: F5 = w1·Y4 + w2·Y3 + w3·Y2 + w4·Y1 (each weight wi applies to the observation Y(t−i+1), so w1 goes with the most recent value) • The error will be computed as: e5 = Y5 − F5
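The forecast-and-error computation above can be sketched in a few lines of Python. The function name and the observation values are illustrative (the chapter's actual data are not listed here); note that in this sketch weights[0] applies to the oldest observation in the window, matching the chapter's Q1..Q4 labeling of quarterly weights:

```python
# One-step-ahead forecast from the p most recent observations.
# weights[0] applies to the oldest observation in the window.
def adaptive_forecast(weights, history):
    p = len(weights)
    window = history[-p:]                   # the p most recent observations
    return sum(w * y for w, y in zip(weights, window))

weights = [0.25, 0.25, 0.25, 0.25]          # equal starting weights
history = [100, 104, 108, 116]              # illustrative Y1..Y4
f5 = adaptive_forecast(weights, history)    # forecast for period 5: 107.0
error = 110 - f5                            # error = observed - forecast: 3.0
```

With equal weights the forecast is simply the average of the last four observations; unequal weights shift emphasis toward particular periods.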

  8. Adapting and Training Process

  9. Training Phase • In the training phase we use the following equation to adjust or revise the weights: w'i = wi + 2k·e't·Y(t−i+1)

  10. Training Phase
  w'i = the revised i-th weight
  wi = the old i-th weight
  k = a constant term referred to as the learning constant
  e't = the standardized error of the forecast in period t
  Y(t−i+1) = the observed value at period t−i+1
  i = 1, 2, …, p (p = number of weights)
  t = p+1, p+2, …, N (N = number of observations)
  y = the largest of the most recent N values of Y
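A minimal sketch of this updating rule, assuming the standardization described above (both the error and the observations are divided by y, the largest observed value). The function name is hypothetical, and weights[0] applies to the oldest observation in the window:

```python
# One adjustment of the weights: w'_i = w_i + 2k * e'_t * Y'(t-i+1),
# where e'_t and the Y values are standardized by y, the largest observed value.
def update_weights(weights, history, observed, k):
    p = len(weights)
    window = history[-p:]                     # the p most recent observations
    y = max(history)                          # standardizing value
    forecast = sum(w * obs for w, obs in zip(weights, window))
    e_std = (observed - forecast) / y         # standardized error of the forecast
    return [w + 2 * k * e_std * (obs / y) for w, obs in zip(weights, window)]
```

Each call nudges every weight in the direction that would have reduced that period's error; a forecast recomputed from the adjusted weights lies closer to the observed value.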

  11. Training Phase • As you can see, the revised set of weights is equal to the old set of weights plus some adjustments made for the error. • The adjustment for each weight is based on: • The observed value • The value of the learning constant k • The error for that forecast.

  12. Training Phase • The learning constant allows the weights to be changed automatically as the time series changes its patterns. • The steps for this adjustment process involve: • Specifying the number of weights • Specifying the learning constant (k).

  13. Training Phase • There are at least two ways of assigning the initial weights: • The forecaster uses his or her judgment to assign the weights. • A statistical approach is used to determine the weights, setting each initial weight to wi = 1/N, where N = the number of observations in the data series.

  14. Training Phase • For quarterly data, N would equal 4, and for monthly data, N would equal 12. • The minimum number of weights that can be used in adaptive filtering is two. • The learning constant (k) has a value between 0 and 1.
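The statistical starting point above amounts to one line of code (function name hypothetical): each of the N weights begins at 1/N.

```python
# Equal starting weights: w_i = 1/N (N = 4 for quarterly data, 12 for monthly).
def initial_weights(n):
    return [1.0 / n] * n

initial_weights(4)   # quarterly data: [0.25, 0.25, 0.25, 0.25]
```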

  15. Training Phase • If we choose a higher value for k, the weights adjust rapidly to changes in patterns, but they may overshoot, making it difficult to find the optimal weights. • If we select a small value for k, the number of iterations needed to reach the optimal weights may increase significantly.

  16. Training Phase • As a rule, k can be set equal to 1/p, where p is the number of weights. • Alternative values of k are used, and the one that yields the smallest standard error is selected. • When the data have a great deal of variation, a smaller k is recommended.

  17. Example

  18. Example Step 1: Graph the data to see what observations can be made from the scatter plot.

  19. Example

  20. Example Step 2: Select the weights. Given that we have quarterly data, the weight will be 1/N = 0.25 for each quarter. Step 3: Since there is seasonality in the data, we may want to give more weight to the quarter prior to the one for which the forecast is being made.

  21. Example Suppose we arbitrarily give the following weights for each quarter: Q1 = 0.2 Q2 = 0.2 Q3 = 0.2 Q4 = 0.4

  22. Example Step 4: The forecast for the first quarter of 2006 (period 5) will be: Forecast = 0.2·Y1 + 0.2·Y2 + 0.2·Y3 + 0.4·Y4 = 107 Step 5: The error associated with this forecast will be: Error = Observed value in period 5 − Forecast for period 5 = 110 − 107 = 3.0

  23. Example Step 6: Compute a new adjusted (or optimized) set of weights. To do this, we have to know the learning constant (k). We can set k to equal 1/p, which in this case would be 0.25, since we have used four weights.

  24. Example Step 7: Apply the updating equation to compute a new adjusted set of weights.
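Steps 4 through 7 can be traced in code. The quarterly observations below are hypothetical values chosen so that the forecast comes out to the chapter's 107 (the text does not list the underlying data); the weight list is ordered Q1..Q4, oldest quarter first:

```python
k = 0.25                                 # learning constant, 1/p with four weights
weights = [0.2, 0.2, 0.2, 0.4]           # Q1..Q4, heaviest on the most recent quarter
history = [100, 105, 110, 110]           # hypothetical Y1..Y4
observed = 110                           # actual value in period 5

# Step 4: Forecast = 0.2*Y1 + 0.2*Y2 + 0.2*Y3 + 0.4*Y4
forecast = sum(w * y for w, y in zip(weights, history))   # 107.0
# Step 5: Error = observed - forecast
error = observed - forecast                               # 3.0
# Steps 6-7: standardize by the largest observed value, then adjust each weight
y_max = max(history)
new_weights = [w + 2 * k * (error / y_max) * (y / y_max)
               for w, y in zip(weights, history)]
```

Because the error is positive, every weight moves upward here, and a forecast recomputed from the new weights lands closer to the observed 110.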

  25. Example Step 8: The forecast for period 6, the second quarter of 2006, is now based on the new weights that we have just computed.

  26. Example Step 9: This process of using the newly adjusted weights is used to compute the forecasts for the subsequent quarters (Table 6.4).

  27. Example

  28. Example Step 10: Refine the weights by feeding the weights generated at the end of the training period back in as the starting weights for another pass.

  29. Example Step 11: Using the computer program provided with this book, this process is continued until the standard error is minimized and no further reduction is noted with repeated iterations.
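The iterative procedure described here (repeated training passes until the standard error stops falling) can be sketched as follows. The function name, stopping rule, and sample data are all hypothetical; this is not the book's program:

```python
# Train adaptive-filter weights: repeated passes over the data, one weight
# adjustment per one-step forecast, stopping once the standard error shows
# no further reduction. weights[0] applies to the oldest value in each window.
def train_adaptive_filter(data, p=4, k=0.25, max_iters=500, tol=1e-6):
    weights = [1.0 / p] * p                   # equal starting weights
    y = max(data)                             # standardizing value
    best_se = float("inf")                    # smallest standard error seen so far
    for _ in range(max_iters):
        sq_errors = []
        for t in range(p, len(data)):
            window = data[t - p:t]            # the p observations before period t
            fcst = sum(w * obs for w, obs in zip(weights, window))
            err = data[t] - fcst
            sq_errors.append(err ** 2)
            weights = [w + 2 * k * (err / y) * (obs / y)
                       for w, obs in zip(weights, window)]
        se = (sum(sq_errors) / len(sq_errors)) ** 0.5
        if best_se - se < tol:                # no further reduction: stop
            break
        best_se = se
    return weights, best_se

# Three years of hypothetical quarterly data
data = [100, 105, 110, 110, 102, 107, 112, 113, 104, 109, 114, 116]
final_weights, se = train_adaptive_filter(data)
```

Alternative learning constants can then be compared by calling the trainer with each candidate k and keeping the one with the smallest standard error, as the text recommends.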

  30. Example

  31. Example Step 12: The forecast for period 13 will be based on the final revised weights: F13 = w'1·Y12 + w'2·Y11 + w'3·Y10 + w'4·Y9

  32. Chapter Summary • The adaptive filtering approach depends heavily on historical observations. • But in this technique, more complicated data patterns, such as those with cycles, are easily accommodated. • Additionally, adaptive filtering makes it possible to learn from past errors and correct for them before making a final forecast.

  33. Chapter Summary • In comparison with the moving average and the exponential smoothing models, the adaptive filtering model seeks to determine the “best” set of weights to use in making a forecast. • The model is capable of generating information about past inaccuracies and correcting itself.

  34. Chapter Summary • The technique has two distinct phases. The first phase is the adapting or training of a set of weights with historical data, and the second is to use these weights to make a forecast. • We discussed how the weights are selected at the beginning and how finally the revised weights are used in making a forecast.

  35. Chapter Summary • Two important elements of the technique are the selection of the weights and the learning constant (k). • Use of the computer program specially designed for this chapter makes the process of forecasting very easy.
