Time Series Analysis



  1. Time Series Analysis Negar Koochakzadeh

  2. Outline • Introduction: • Time Series Data • Stationary / Non-stationary TS Data • Existing TSA Models • AR (Auto-Regression) • MA (Moving Average) • ARMA (Auto-Regression Moving Average) • ARIMA (Auto-Regression Integrated Moving Average) • SARIMA (Seasonal ARIMA) • Examples • Example 1: International Airline Passengers • Examples 2 & 3: Energy Load Prediction • Time Series Data Mining • Time Series Classification (SVM) • Example • Example 4: Stock Market Analysis

  3. Time Series Data • In many fields of study, data is collected from a system over time. • This sequence of observations generates a time series. • Examples: • Closing prices of the stock market • A country’s unemployment rate • Temperature readings of an industrial furnace • Sea level changes in coastal regions • Number of flu cases in a region • Inventory levels at a production site

  4. Temporal Behaviour • Most physical processes do not change quickly, which often makes consecutive observations correlated. • Correlation between consecutive observations is called autocorrelation. • Standard modeling methods that are based on the assumption of independent observations can therefore be misleading. • We need to consider alternative methods that take into account the serial dependence in the data.

  5. Stationary Time Series Data • Stationary time series are characterized by having a distribution that is independent of time shifts. • The mean and variance of such a time series are constant over time. • If arbitrary snapshots of the time series we study exhibit similar behaviour in central tendency and spread, we can assume that the time series is indeed stationary.

  6. Stationary or Non-Stationary? • In practice, there is no clear demarcation line between a stationary and a non-stationary process. • Some methods to identify: • Visual inspection • Using intuition and knowledge about the process • Autocorrelation Function (ACF) • Variogram

  7. Visual Inspection • A properly constructed graph of a time series can dramatically improve the statistical analysis and accelerate the discovery of the hidden information in the data. • “You can observe a lot by watching.” This is particularly true with time series data analysis! [Yogi Berra, 1963]

  8. Intuition and Knowledge Inspection • Does it make sense... • for a tightly controlled chemical process to exhibit similar behaviour in mean and variance over time? • to expect the stock market “to remain in equilibrium about a constant mean level”? • The selection of a stationary or non-stationary model must often be made on the basis of not only the data but also a physical understanding of the process.

  9. Autocorrelation Function (ACF) • Autocorrelation is the correlation of a time series with itself at lag k. • The ACF summarizes, as a function of k, how correlated observations that are k lags apart are. • If the ACF does not dampen out quickly, the process is likely not stationary.
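A minimal sketch of checking stationarity with the sample ACF; Python with numpy and statsmodels is assumed here (neither is named in the slides), and the two series are synthetic stand-ins:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(0)
stationary = rng.normal(size=500)              # white noise: ACF dies out at once
random_walk = np.cumsum(rng.normal(size=500))  # non-stationary: ACF decays very slowly

for name, z in [("white noise", stationary), ("random walk", random_walk)]:
    r = acf(z, nlags=10)
    # Values that stay near 1 for many lags suggest non-stationarity.
    print(name, np.round(r[1:6], 2))
```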

  10. Variogram • The variogram G_k measures the variance of differences k time units apart relative to the variance of the differences one time unit apart: G_k = Var(Z_{t+k} - Z_t) / Var(Z_{t+1} - Z_t). • For a stationary process, G_k plotted as a function of k will level off at an asymptote. If the process is non-stationary, G_k will increase monotonically.
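A short sketch of the variogram computation, assuming numpy (the function name is mine):

```python
import numpy as np

def variogram(z, max_lag=20):
    """G_k = Var(Z_{t+k} - Z_t) / Var(Z_{t+1} - Z_t), for k = 1..max_lag."""
    base = np.var(z[1:] - z[:-1])  # variance of the one-step differences
    return np.array([np.var(z[k:] - z[:-k]) / base for k in range(1, max_lag + 1)])

rng = np.random.default_rng(1)
print(variogram(rng.normal(size=500))[:5])             # stationary: levels off near 1
print(variogram(np.cumsum(rng.normal(size=500)))[:5])  # random walk: grows roughly like k
```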

  11. Modeling and Prediction • “If we wish to make predictions, then clearly we must assume that something does not vary with time.” [Brockwell and Davis, 2002] • Let’s try to predict and build a model for our time series process based on: • Serial dependency • Leading indicators • Disturbances • True disturbances are caused by unknown and/or uncontrollable factors that have a direct impact on the process. • It is impossible to come up with a comprehensive deterministic model that accounts for all possible disturbances, since by definition they are unknown. • In these cases, a probabilistic or stochastic model is more appropriate to describe the behaviour of the process.

  12. Notations • Backshift operator: B Z_t = Z_{t-1}, and more generally B^k Z_t = Z_{t-k}. • The difference operator is then ∇ = (1 - B), so that ∇Z_t = Z_t - Z_{t-1}.

  13. Auto-Regressive Models • AR(p): Z_t = φ_1 Z_{t-1} + φ_2 Z_{t-2} + ... + φ_p Z_{t-p} + a_t • Here a_t is an error term (white noise) assumed to be uncorrelated with zero mean and constant variance. • The random error a_t cannot be observed. Instead we estimate it by using the one-step-ahead forecast error. • The regression coefficients φ_i, i = 1, ..., p, are parameters to be estimated from the data.
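A hedged sketch of estimating an AR(p) from data, assuming statsmodels (the simulated series and its coefficients are purely illustrative):

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(2)
z = np.zeros(500)
for t in range(2, 500):  # simulate Z_t = 0.6 Z_{t-1} + 0.2 Z_{t-2} + a_t
    z[t] = 0.6 * z[t - 1] + 0.2 * z[t - 2] + rng.normal()

res = AutoReg(z, lags=2).fit()
print(res.params)       # estimated intercept, phi_1, phi_2
print(res.forecast(5))  # one- to five-step-ahead forecasts
```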

  14. Moving Average • Current and previous disturbances affect the value. • We have a sequence of random shocks bombarding the system, not just a single shock. • MA(q): Z_t = a_t - θ_1 a_{t-1} - θ_2 a_{t-2} - ... - θ_q a_{t-q} • The a_t are uncorrelated random shocks with zero mean and constant variance. • The coefficients θ_i, i = 1, ..., q, are parameters to be determined from the data.

  15. Auto-Regressive Moving Average • ARMA(p, q): Z_t = φ_1 Z_{t-1} + ... + φ_p Z_{t-p} + a_t - θ_1 a_{t-1} - ... - θ_q a_{t-q} • Typical stationary time series models come in three general classes: auto-regressive (AR) models, moving average (MA) models, or a combination of the two (ARMA).
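In current statsmodels an ARMA(p, q) is fit as an ARIMA with d = 0; a minimal sketch on a simulated series (the coefficients follow statsmodels' lag-polynomial sign convention):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import arma_generate_sample

np.random.seed(3)
# ARMA(1, 1): the polynomials [1, -0.7] and [1, 0.4] encode phi_1 = 0.7, theta_1 = 0.4
z = arma_generate_sample(ar=[1, -0.7], ma=[1, 0.4], nsample=500)

res = ARIMA(z, order=(1, 0, 1)).fit()
print(res.params)  # estimated constant, AR and MA coefficients, innovation variance
```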

  16. Identifying an Appropriate Model • The ACF plays a crucial role in the identification of time series models. • The particular model within the ARMA class is identified by looking at the ACF and PACF.

  17. Partial Autocorrelation Function (PACF) • Partial autocorrelation is the partial correlation of a time series with itself at lag k. • A partial correlation is a conditional correlation: • It is the correlation between two variables under the assumption that we know and take into account the values of some other set of variables. • It measures how Z_t and Z_{t-k} are correlated, taking into account how both are related to the intermediate observations Z_{t-1}, Z_{t-2}, ..., Z_{t-k+1}. • The kth order PACF is the correlation between Z_t - Ẑ_t and Z_{t-k} - Ẑ_{t-k}, where Ẑ denotes the projection onto the space spanned by Z_{t-1}, Z_{t-2}, ..., Z_{t-k+1}.
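A quick sketch of the sample PACF cutting off for an AR(1), again assuming statsmodels:

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(4)
z = np.zeros(400)
for t in range(1, 400):
    z[t] = 0.7 * z[t - 1] + rng.normal()  # AR(1) with phi_1 = 0.7

print(np.round(pacf(z, nlags=5), 2))
# Expected: a large value at lag 1, then values near zero, i.e. the PACF
# "cuts off" after lag p = 1.
```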

  18. ARMA Model Identification from ACF and PACF • AR(p): ACF shows infinite damped exponentials and/or damped sine waves (tails off); PACF is finite and cuts off after p lags. • MA(q): ACF is finite and cuts off after q lags; PACF shows infinite damped exponentials and/or damped sine waves (tails off). • ARMA(p, q): both the ACF and the PACF tail off as infinite damped exponentials and/or damped sine waves. • Source: Adapted from BJR
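These patterns can be checked visually; a sketch using statsmodels' plotting helpers on a simulated MA(1):

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.tsa.arima_process import arma_generate_sample
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

np.random.seed(5)
z = arma_generate_sample(ar=[1], ma=[1, 0.8], nsample=500)  # MA(1)

fig, axes = plt.subplots(1, 2, figsize=(10, 3))
plot_acf(z, lags=20, ax=axes[0])   # should cut off after lag 1
plot_pacf(z, lags=20, ax=axes[1])  # should tail off gradually
plt.show()
```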

  19. Examples

  20. Models for Non-Stationary Data • Standard auto-regressive moving average (ARMA) time series models apply only to stationary time series. • The assumption that a time series is stationary is quite unrealistic. (Stationarity is not natural!) • For a system to exhibit stationary behaviour, it has to be tightly controlled and maintained over time. • Otherwise, systems will tend to drift away from stationarity.

  21. Converting Non-Stationary Data to Stationary • It is more realistic to claim that the changes to a process, i.e. the first difference, form a stationary process. • And if that is not realistic, we may try to see if the changes of the changes, the second difference, form a stationary process. • If that is the case, we can model the changes, make forecasts about the future values of these changes, and from the model of the changes build models and create forecasts of the original non-stationary time series. • In practice, we seldom need to go beyond second-order differencing.
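Differencing is a one-liner in pandas; a small sketch (the random-walk series is a stand-in):

```python
import numpy as np
import pandas as pd

z = pd.Series(np.cumsum(np.random.default_rng(6).normal(size=200)))  # random walk
d1 = z.diff().dropna()         # first difference: the changes
d2 = z.diff().diff().dropna()  # second difference: the changes of the changes
# Check the ACF of d1 (and, if necessary, d2) to confirm stationarity.
```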

  22. Auto-Regressive Integrated Moving Average (ARIMA) • In the case of non-stationary data, it is appropriate to difference the data before fitting the (stationary) ARMA model. • Because the inverse operation of differencing is summing, or integrating, an ARMA model applied to data differenced d times is called an auto-regressive integrated moving average process, ARIMA(p, d, q). • In practice, the orders p, d, and q are seldom higher than 2.
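A hedged sketch of fitting an ARIMA(1, 1, 1), with statsmodels performing the differencing internally through the d argument:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
z = np.cumsum(0.5 + rng.normal(size=300))  # a drifting, non-stationary series

res = ARIMA(z, order=(1, 1, 1)).fit()  # p = 1, d = 1, q = 1
print(res.summary())
print(res.forecast(steps=5))           # forecasts on the original (undifferenced) scale
```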

  23. Stages of the time series model building process using ARIMA: 1. Consider a general ARIMA model. 2. Identify the appropriate degree of differencing, if needed. 3. Using the ACF and PACF, find a tentative model. 4. Estimate the parameters of the model using appropriate software. 5. Perform the residual analysis. Is the model adequate? 6. Start forecasting.

  24. Model Evaluation • Once a model has been fitted to the data, we proceed to conduct a number of diagnostic checks. • If the model fits well, the residuals should essentially behave like white noise. • In other words, the residuals should be uncorrelated with constant variance. • Standard checks are to compute the ACF and PACF of the residuals. • If these stay within the confidence limits, there is no indication that the model fits poorly.
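A sketch of the residual checks, pairing the ACF idea with a Ljung-Box test from statsmodels (the fitted model here is just for demonstration):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(8)
z = np.cumsum(rng.normal(size=300))
res = ARIMA(z, order=(0, 1, 1)).fit()

# If the model is adequate, the residuals should behave like white noise:
print(acorr_ljungbox(res.resid, lags=[10, 20], return_df=True))
# Large p-values mean no evidence of leftover autocorrelation.
```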

  25. Exponentially Weighted Moving Average • The EWMA is a special case of the ARIMA model. • Unlike a regular average that assigns equal weight to all observations, an EWMA has a relatively short memory that assigns decreasing weights to past observations. • It makes good practical sense that a forecast should be a weighted average that assigns the most weight to the most recent observation, somewhat less weight to the second-to-last observation, and so on.
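The EWMA recursion is built into pandas; in this sketch alpha (the weight on the newest observation) is set to an arbitrary 0.3:

```python
import numpy as np
import pandas as pd

z = pd.Series(np.random.default_rng(9).normal(size=100).cumsum())
# Recursive form: s_t = alpha * z_t + (1 - alpha) * s_{t-1}
smoothed = z.ewm(alpha=0.3, adjust=False).mean()
print(smoothed.iloc[-1])  # the last smoothed value serves as the one-step-ahead forecast
```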

  26. Seasonal Models • For ARIMA models, the serial dependence of the current observation on previous observations is often strongest for the immediate past and follows a decaying pattern as we move further back in time. • For some systems, this dependence shows a repeating, cyclic behaviour. • This cyclic pattern, more commonly called a seasonal pattern, can be exploited to further improve forecasting performance. • ARIMA models are flexible enough to allow for modeling both seasonal and non-seasonal dependence.

  27. Example 1: International Airline Passengers

  28. Trend and Seasonal Relationship • Two relationships are going on simultaneously: • between observations for successive months within the same year, and • between observations for the same month in successive years. • Therefore, we essentially need to build two time series models and then combine the two. • If the season is s periods long (in this example s = 12 months), then observations that are s time intervals apart are alike.

  29. Pre-Processing: Log Transformation • The seasonal swings grow with the level of the series, so we take the natural log to stabilize the variance before modeling.

  30. Apply Differencing to Seasonal Data • For seasonal data, we may need to use not only the regular difference ∇Z_t = (1 - B) Z_t but also a seasonal difference ∇_s Z_t = (1 - B^s) Z_t. • Sometimes we may even need both (e.g., ∇∇_12 Z_t = (1 - B)(1 - B^12) Z_t) to obtain an ACF that dies out sufficiently quickly.
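A sketch of combining the log transform with regular and seasonal differencing for monthly data (s = 12); the synthetic series below is only a stand-in for the airline data:

```python
import numpy as np
import pandas as pd

t = np.arange(144)
z = pd.Series(np.exp(0.01 * t + 0.2 * np.sin(2 * np.pi * t / 12)
                     + np.random.default_rng(10).normal(0, 0.05, size=144)))

w = np.log(z).diff().diff(12).dropna()  # (1 - B)(1 - B^12) ln Z_t
# w should now have an ACF that dies out quickly.
```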

  31. Investigate ACFs • Only the last one (the combination of the regular difference and the seasonal difference) is stationary.

  32. Model Identification • Identifying stationary seasonal models is a modification of the procedure used for regular ARMA time series models, where the patterns of the sample ACF and PACF provide guidance. • First, look for similarities that are 12 lags apart: • the ACF seems to cut off after the first seasonal lag (k = 12), • which is a sign of a moving average model applied to the 12-month seasonal pattern. • Second, look for patterns between successive months: • the ACF seems to cut off after the first lag, • indicating a first-order MA term in the regular model.

  33. Model Evaluation • Inspect the ACF of the residuals after fitting a first-order seasonal moving average (SMA) model to the differenced log data ∇∇_12 ln Z_t. • The ACF shows a significant negative spike at lag 1, indicating that we need an additional regular moving average term.

  34. ARIMA(p, d, q) × (P, D, Q)_12 • The regular and seasonal models are combined multiplicatively. • For the airline passenger data, the identification steps above lead to the well-known “airline model”, ARIMA(0, 1, 1) × (0, 1, 1)_12.
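A hedged sketch of fitting this multiplicative model with statsmodels' SARIMAX; with the real airline data one would pass log(passengers) as y:

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic stand-in for ln(passengers): trend + seasonality + noise.
t = np.arange(144)
y = (0.01 * t + 0.2 * np.sin(2 * np.pi * t / 12)
     + np.random.default_rng(11).normal(0, 0.05, size=144))

res = SARIMAX(y, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
print(res.forecast(steps=12))  # the next year's forecasts, on the log scale
```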

  35. Example 2: Energy Peak Load Prediction • The hourly peak load follows a daily periodic pattern, so s = 24 hours. • Convert the peak load values into a seasonally differenced series (∇_24 Z_t = Z_t - Z_{t-24}) and then apply ARMA, using the ACF and PACF for identification.

  36. Example 3: Energy Load Prediction • Daily, weekly, and monthly periodic patterns • Exogenous variables (temperature) • The authors of [3] proposed applying Periodic Auto-Regression (PAR). * An auto-regression is periodic when its parameters are allowed to vary across seasons.

  37. Example 3 (cont’d) • Proposed model template (Periodic Auto-Regression): • a seasonally varying intercept term, • dummy variables for the weekly seasonal pattern, • dummy variables for the monthly seasonal pattern, and • an exogenous variable for temperature sensitivity.

  38. Time Series Data Mining • Use the serial dependency of the forecast variable to build the training set. • Leading indicators may exhibit behaviour similar to the forecast variable. • The important task is to find out whether there exists a lagged relationship between the indicators and the predicted variable. • If such a relationship exists, then from the current and past behaviour of the leading indicators, it may be possible to determine how the target (e.g., sales) will behave in the near future.

  39. Time Series SVM • Optimization problem in SVM regression: minimize (1/2)||w||^2 + C Σ_{i=1}^{N} (ξ_i + ξ_i*) subject to the ε-insensitive slack constraints. • Error in the standard SVM: every training point contributes to the empirical risk with the same weight C. • Error in the modified SVM: the weight on training point i increases with its recency, so recent observations influence the fit more than distant ones.
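The slide's exact formulas are not recoverable, but the idea in [4], namely weighting recent training errors more heavily, can be sketched with scikit-learn's per-sample weights; the ascending weight formula below is my assumption, not necessarily the authors' exact scheme:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(12)
X = rng.normal(size=(200, 3))  # e.g., lagged features or leading indicators
y = X @ np.array([0.5, -0.3, 0.2]) + rng.normal(0, 0.1, size=200)

n = len(y)
i = np.arange(1, n + 1)
w = 2.0 / (1.0 + np.exp(5.0 - 10.0 * i / n))  # assumed ascending weights: newer points count more

model = SVR(C=10.0, epsilon=0.01)
model.fit(X, y, sample_weight=w)  # per-sample weighted epsilon-insensitive loss
print(model.predict(X[-1:]))
```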

  40. Example 4: Stock Market Analysis • Portfolio optimization is the decision process of asset selection and weighting, such that the collection of assets satisfies an investor’s objectives. • We exploit the serial dependency, or lagged relationship, between stock performance and the companies’ financial indicators.

  41. Stock Ranking • Learn the relationship between stocks’ current features and their future rank score (a lagged relationship). • This is done by applying a modified version of the SVM-Rank algorithm for time series, based on an exponentially weighted error.

  42. References
  [1] S. Bisgaard and M. Kulahci, Time Series Analysis and Forecasting by Example. John Wiley & Sons, Inc., 2011.
  [2] R. P. Singh, P. X. Gao, and D. J. Lizotte, "On Hourly Home Peak Load Prediction," in IEEE SmartGridComm, 2012.
  [3] M. Espinoza, C. Joye, R. Belmans, and B. De Moor, "Short-Term Load Forecasting, Profile Identification, and Customer Segmentation: A Methodology Based on Periodic Time Series," IEEE Transactions on Power Systems, vol. 20, pp. 1622-1630, 2005.
  [4] F. E. H. Tay and L. Cao, "Modified support vector machines in financial time series forecasting," Neurocomputing, vol. 48, pp. 847-861, 2002.

  43. Questions?
