
Sales Forecasting using Dynamic Bayesian Networks

Steve Djajasaputra

SNN Nijmegen

The Netherlands


Table of Contents

  • Why Sales Forecasting?

  • Method

  • Results & Discussions

  • Conclusions

  • Further Research

  • Acknowledgements


1. Why Sales Forecasting?

  • Sales forecasting brings advantages for your business:

    • reducing logistics costs

    • improving your services

      • targeted marketing

      • fewer backorders

But in practice… is this really happening?


The Answer is… YES! An Example of a Success Story:

  • Bayesian statistical technology for predicting newspaper sales

    • 1 to 4% more sales with the same deliveries

    • 3 to 12% fewer deliveries to achieve the same total sales


But time-series forecasting is not always easy!

So…

  • Searching for better forecasting technology

  • Aggregation of different groups of products can be helpful

  • Clustering methodology for aggregation

  • Bayesian Methodology: generative model


2. Method

  • Dynamic Bayesian Networks

  • Forecasting

  • The Inputs


Dynamic Bayesian Networks

  • Y is our observation

    • e.g. sales of different products: beer y1, beer y2, …

    • X-axis: the time t (e.g. weeks)


Dynamic Bayesian Networks

  • In our model, we assume that our observation Y is generated with these dynamics (see the sketch after this list):

    • X are inputs, for example: sales of beer last week, weather information, prices, day labeling

    • θ are “hidden variables”, which are unobserved/unknown

    • ε is noise, N(0, σ²)
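The slide's equation itself is an image and is not preserved in the transcript. A minimal sketch of the assumed observation equation, in the usual linear-Gaussian (dynamic linear model) form; the exact notation of the original slide may differ:

```latex
% Observation model for product i in week t (assumed linear-Gaussian form)
y_{i,t} = X_{i,t}^{\top}\,\theta_{i,t} + \varepsilon_{i,t},
\qquad \varepsilon_{i,t} \sim \mathcal{N}(0,\sigma^{2})
```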


Dynamic Bayesian Networks

  • Hierarchical model (see the sketch after this list):

    • Our hidden variables θ depend on other unobserved/unknown hidden variables M.

  • Several θ from different products share the same M.

    • A is a transition matrix for θ

    • G is a transition matrix for M

    • the process noise of θ is N(0, Σθ)

    • the process noise of M is N(0, ΣM)
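A hedged reconstruction of the hierarchical transition equations, assuming the shared state M enters each product's coefficient dynamics additively; the noise symbols η and ζ are introduced here only for the sketch, and the exact coupling on the original slide may differ:

```latex
% Transition model (assumed form): product i has coefficients theta_{i,t};
% all products share the higher-level state M_t
\theta_{i,t} = A\,\theta_{i,t-1} + M_{t} + \eta_{i,t},
\qquad \eta_{i,t} \sim \mathcal{N}(0,\Sigma_{\theta}) \\
M_{t} = G\,M_{t-1} + \zeta_{t},
\qquad \zeta_{t} \sim \mathcal{N}(0,\Sigma_{M})
```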


Dynamic Bayesian Networks

  • Inference & Learning:

    • We have Y & X data in our model

    • But we don’t know the values of the hidden variables θ, M and their initial values

    • We also don’t know the correct values of the parameters Σθ, ΣM, A, G and their initial values

    • We solve these problems in the Bayesian paradigm, using the EM algorithm (a code sketch follows below).
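As an illustration only (not the authors' code), a minimal sketch of EM-based learning of a single-product linear-Gaussian state-space model with the pykalman library; the hierarchical M level is omitted and all data arrays are placeholders:

```python
import numpy as np
from pykalman import KalmanFilter

# Placeholder data: y is weekly sales of one product, X holds d inputs per week
T, d = 200, 4
X = np.random.randn(T, d)
y = np.random.randn(T)

# State = regression coefficients theta_t; observation y_t = X_t . theta_t + noise
kf = KalmanFilter(
    transition_matrices=np.eye(d),        # A (initial guess: random walk)
    observation_matrices=X[:, None, :],   # one 1 x d observation matrix per week
    em_vars=['transition_covariance',     # Sigma_theta
             'observation_covariance',    # sigma^2
             'initial_state_mean',
             'initial_state_covariance'],
)

# EM iterations (learning), then Kalman smoothing of theta_{1:T} (inference)
kf = kf.em(y[:, None], n_iter=10)
theta_mean, theta_cov = kf.smooth(y[:, None])
```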


Forecasting

Steps:

  • Training step: find the model parameters and hidden variables θ1:T given the data from the observation window X1:T, Y1:T, using the EM algorithm and Kalman smoothing.

  • Forecasting step: predict θT+h and YT+h, where h is the horizon of forecasting.

  • Updating step: update the hidden variables θ1:T+h given the real value YT+h.

  • Repeat the forecasting & updating steps above in iterations (a code sketch follows below).
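A minimal sketch of this rolling forecast-and-update loop for a one-week horizon (h = 1), continuing from the pykalman sketch above; the future arrays are placeholders and the loop illustrates the idea rather than the authors' implementation:

```python
# Continue from the previous sketch: kf is the EM-fitted model and
# theta_mean/theta_cov hold the smoothed states over the training window.
mean_t, cov_t = theta_mean[-1], theta_cov[-1]

X_future = np.random.randn(56, d)    # placeholder inputs for weeks T+1..T+56
y_future = np.random.randn(56)       # placeholder realised sales
predictions = []

for t in range(len(y_future)):
    # Forecasting step: one-step-ahead prediction of theta and Y
    theta_pred = kf.transition_matrices @ mean_t
    predictions.append(X_future[t] @ theta_pred)

    # Updating step: fold the realised sales value back into the hidden state
    mean_t, cov_t = kf.filter_update(
        mean_t, cov_t,
        observation=np.array([y_future[t]]),
        observation_matrix=X_future[t][None, :],
    )
```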


The Inputs

  • By autocorrelation & FFT spectrum analysis, for the inputs (Xi,t) I decided to use (see the sketch after this list):

    • Seasonality markers

    • Recent sales (1 week ago)

    • Last month sales (4 weeks ago)

  • We need to keep the number of inputs as small as possible to avoid over-fitting.

  • Since I consider seasonality & recent sales, my model is somewhat comparable with the “SC” model used by Pim Ouwehand.
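A hedged illustration of how such an input-selection check could look in numpy (placeholder data, not the authors' script): a large autocorrelation at lags 1 and 4 supports last-week and last-month sales as inputs, and a spectral peak near a 52-week period supports seasonality markers.

```python
import numpy as np

sales = np.random.randn(260)          # placeholder: weekly sales of one product
sales = sales - sales.mean()

# Autocorrelation at small lags (lag 1 = last week, lag 4 = last month)
acf = np.correlate(sales, sales, mode='full')[len(sales) - 1:]
acf = acf / acf[0]
print('lag 1:', acf[1], '  lag 4:', acf[4])

# FFT power spectrum: a peak near 1/52 cycles per week indicates yearly seasonality
power = np.abs(np.fft.rfft(sales)) ** 2
freqs = np.fft.rfftfreq(len(sales), d=1.0)      # cycles per week
dominant = 1.0 / freqs[np.argmax(power[1:]) + 1]
print('dominant period (weeks):', dominant)
```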


3. Results & Discussions

  • An Example of Result

  • Residual Analysis

  • Nonlinear Transformation

  • The Offset Problem

  • Removing Outliers

  • Our Bayesian Approach vs Conventional Econometric Methods

  • Need More Informative Inputs


An Example of Result

  • Mean Absolute Deviation (MAD) is 2346 beers

    [Figure: predicted vs. real sales. Y-axis: sales (O = real value, X = prediction); X-axis: weeks. Training steps: weeks 5..204; forecasting steps: weeks 205..260, 1-week horizon.]

  • This result is roughly in the range of the Winters method used by Pim Ouwehand.
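For reference, the Mean Absolute Deviation quoted above is the usual definition over the N forecast weeks:

```latex
\mathrm{MAD} = \frac{1}{N}\sum_{t=T+1}^{T+N}\left|\,y_{t}-\hat{y}_{t}\,\right|
```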


Residual Analysis

  • To validate our model.

  • It showed that the residuals (errors) are noise, as we assumed (a sketch of these diagnostic checks follows after this list):

    • Y predicted vs Error (figure on top)

    • Error vs time (figure on bottom)

    • Autocorrelation and FFT of Error

    • Cross correlation Error vs Inputs
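A minimal numpy illustration of such residual checks (placeholder arrays, not the authors' data):

```python
import numpy as np

# y_true, y_pred: realised and predicted sales over the forecast window;
# X: (N, d) matrix of the corresponding inputs (all placeholders)
N, d = 56, 4
y_true, y_pred = np.random.randn(N), np.random.randn(N)
X = np.random.randn(N, d)

err = y_true - y_pred                     # residuals ("error vs time")
err_c = err - err.mean()

# Autocorrelation of the residuals: should be near zero for all non-zero lags
acf = np.correlate(err_c, err_c, mode='full')[N - 1:] / np.dot(err_c, err_c)

# FFT of the residuals: the spectrum should be roughly flat (white noise)
power = np.abs(np.fft.rfft(err_c)) ** 2

# Cross-correlation of the residuals with each input: should also be near zero
xcorr = [np.corrcoef(err, X[:, j])[0, 1] for j in range(d)]
```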


Nonlinear Transformation

  • To make the data more linear & Gaussian, since our model is linear and the data is assumed to be Gaussian distributed (see the sketch below).

    e.g. Log, Sigmoid
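A hedged illustration of this kind of preprocessing with a log transform (the exact transformation used in the project may differ):

```python
import numpy as np

sales = np.array([1200.0, 900.0, 1500.0, 300.0, 2100.0])   # placeholder weekly sales

# Log transform compresses large values, making the series closer to Gaussian
log_sales = np.log1p(sales)      # log(1 + y), safe when sales are zero

# ...fit and forecast on log_sales, then map the prediction back...
pred_log = log_sales.mean()      # stand-in for a model forecast
pred = np.expm1(pred_log)        # inverse transform back to the sales scale
```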


The Offset Problem

  • Due to the stationarity assumption, the software gives over- (or under-) estimated forecasts if a trend exists.

  • Solutions:

    • Removing the trend (e.g. taking differences, as sketched below)

    • Updating the parameters after the forecasting step.

      [Figure legend: left, moving-averaged Beer-2 vs. weeks; right, predicted Beer-2 vs. weeks.]
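A minimal sketch of the differencing idea, assuming a simple first difference (the project may have used a different detrending scheme):

```python
import numpy as np

sales = np.cumsum(np.random.randn(260)) + 50.0    # placeholder trending series

# Remove the trend by first differencing and model the differences instead
diff = np.diff(sales)                             # d_t = y_t - y_{t-1}

# After forecasting a difference d_hat, undo the differencing for the sales forecast
d_hat = diff[-4:].mean()                          # stand-in for a model forecast
next_sales = sales[-1] + d_hat
```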


Removing Outliers

  • The plot shows that the data is very noisy.

  • Most of the outliers are below the mean, perhaps due to an “out of stock” problem. Thus it would be helpful to get an “out of stock” label as an input to our forecasting model.

Sales of 10 beers (normalized) vs weeks


Our Bayesian Approach vs Conventional Econometric Methods

  • Econometric regression methods (e.g. the Winters method used by Pim Ouwehand) work well to fit the data.


  • However, we don’t want to just fit the data. We want to understand the process behind the data that we observe (i.e. the hidden/unobserved variables).

  • We want to have a generative model of the beer buyers.

  • This generative model helps you to understand the “hidden process” in the market. This is a valuable insight for business decisions, e.g. via simulation.


We Need More Informative Inputs


4. Conclusions

  • This preliminary research (using only the sales data, without other informative inputs) showed that the result is roughly in the range of the Winters method.

  • We need more informative input data for a better model.

  • Massaging the data (e.g. removing the trend, nonlinear transformations) slightly improves the result, but this is not the main purpose of this research.


Conclusions… continued

  • We are not just fitting the data but constructing a generative model, which is useful for understanding the business process behind the sales.

  • This understanding helps you to shape your strategy to achieve more profit.


5. Further Research

  • Clustering and Structural Learning

  • Non-stationary processes

  • Non-linear models

  • Approximations

    • Variational

    • Factorial

    • Monte Carlo (MCMC)


6. Acknowledgements

  • Our sponsor: STW

  • Tom Heskes (KUN)

  • Pim Ouwehand (TUE)

  • Bart Bakker (Philips, formerly at KUN)

  • Data providers / business partners: Schuitema, Technische Unie, OPG.


Appendix: Clustering Insights

  • On Observed data Y


  • On hidden variables θ:

    • θ1, θ2: seasonality

    • θ3: last month

    • θ4: last week
