Some Useful Econometric Techniques

Selcuk Caner


Outline

  • Descriptive Statistics

  • Ordinary Least Squares

  • Regression Tests and Statistics

  • Violation of Assumptions in OLS Estimation

    • Multicollinearity

    • Heteroscedasticity

    • Autocorrelation

  • Specification Errors

  • Forecasting

  • Unit Roots, Spurious Regressions, and Cointegration


Descriptive Statistics

  • Useful statistics summarizing the probability distribution of a variable:

  • Mean: x̄ = (1/T) Σ xt

  • Standard deviation: s = √[ Σ(xt − x̄)² / (T − 1) ]


Descriptive Statistics (Cont.)

  • Skewness (symmetry of the distribution): S = (1/T) Σ [(xt − x̄)/s]³

  • Kurtosis (thickness of the tails): K = (1/T) Σ [(xt − x̄)/s]⁴
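As a sketch, the four statistics above can be computed with NumPy on made-up sample data (the data values are illustrative; skewness and kurtosis follow the moment definitions on this slide):

```python
import numpy as np

x = np.array([2.1, 3.4, 1.8, 4.0, 2.9, 3.3, 2.5, 3.8])

T = len(x)
mean = x.mean()
# Sample standard deviation (divide by T - 1)
std = x.std(ddof=1)
# Skewness: average cubed standardized deviation (0 for a symmetric distribution)
skew = np.mean(((x - mean) / std) ** 3)
# Kurtosis: average fourth-power standardized deviation (about 3 for a normal)
kurt = np.mean(((x - mean) / std) ** 4)

print(mean, std, skew, kurt)
```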


Ordinary Least Squares (OLS)

  • Estimation

    • Model: Yt = b0 + b1Xt + et

    • OLS requires:

      • A linear relationship between Y and X,

      • X is nonstochastic,

      • E(et) = 0, Var(et) = σ², and Cov(et, es) = 0

        for t not equal to s.


Ordinary Least Squares (OLS) (Cont.)

  • The OLS estimators of b0 and b1 are found by minimizing the sum of squared errors (SSE): SSE = Σ (Yt − b0 − b1Xt)²



Ordinary Least Squares (OLS) (Cont.)

  • Minimizing the SSE is equivalent to setting its partial derivatives with respect to b0 and b1 to zero (the normal equations).

  • The estimators are:

    • b1 = Σ(Xt − X̄)(Yt − Ȳ) / Σ(Xt − X̄)²

    • b0 = Ȳ − b1X̄
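A minimal sketch of the closed-form estimators on simulated data (the true coefficients b0 = 1 and b1 = 2 are chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=200)
Y = 1.0 + 2.0 * X + rng.normal(size=200)  # true b0 = 1, b1 = 2

# Closed-form OLS estimators from the normal equations
b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b0 = Y.mean() - b1 * X.mean()
print(b0, b1)
```

With 200 observations, the estimates land close to the true values.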


Ordinary Least Squares (OLS) (Cont.)

  • Properties of the OLS estimators:

    • When the errors are normal, the estimators are normally distributed

    • They are unbiased and have minimum variance among linear unbiased estimators (Gauss-Markov)


Ordinary Least Squares (OLS) (Cont.)

Multiple regression, in matrix form: Y = Xb + e

Y = T×1 vector of dependent-variable observations

X = T×k matrix of independent variables (first column is all ones)

b = k×1 vector of unknown parameters

e = T×1 vector of error terms


Ordinary Least Squares (OLS) (Cont.)

Estimator of the multiple regression model: b = (X′X)⁻¹X′Y

  • X′X collects the cross products of the regressors (proportional to their variance-covariance matrix when the data are demeaned).

  • X′Y is the corresponding vector of cross products between X and Y.

  • The estimator is unbiased and, with normal errors, normally distributed.
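The matrix-form estimator b = (X′X)⁻¹X′Y can be sketched directly on simulated data (true coefficients are illustrative; `solve` is used in place of an explicit inverse for numerical stability):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
X = np.column_stack([np.ones(T),          # first column all ones (intercept)
                     rng.normal(size=T),
                     rng.normal(size=T)])
beta_true = np.array([0.5, -1.0, 2.0])
Y = X @ beta_true + rng.normal(size=T)

# b = (X'X)^{-1} X'Y, solved as a linear system
b = np.linalg.solve(X.T @ X, X.T @ Y)
print(b)
```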


Example: Private Investment

  • FIRt = b0 + b1RINT t-1 + b2INFL t-1 + b3RGDP t-1 + b4NKFLOW t-1 + et

  • One can run this regression to model private fixed investment (FIR) as:

    • A negative function of real interest rates (RINT)

    • A negative function of inflation (INFL)

    • A positive function of real GDP (RGDP)

    • A positive function of net capital flows (NKFLOW)


Regression Statistics and Tests

  • R² is the measure of goodness of fit: R² = 1 − SSE/TSS

  • Limitations:

    • It depends on the assumption that the model is correctly specified

    • R² never decreases as independent variables are added

    • If the intercept is constrained to be zero, R² may be negative.


Meaning of R²

[Figure: scatter of observations (o) around the fitted regression line, with Xt on the horizontal axis. R² is the share of the variation of Y around its mean that is explained by the fitted line.]

Regression Statistics and Tests

  • Adjusted R² overcomes the sensitivity of R² to the number of regressors:

  • Adjusted R² = 1 − [SSE/(T − k)] / [TSS/(T − 1)]

  • Is bi statistically different from zero?

  • When et is normally distributed, use the t-statistic to test the null hypothesis bi = 0.

    • A simple rule: if |t(T−k)| > 2, then bi is significant (at roughly the 5% level).


Regression Statistics and Tests

  • Testing the model as a whole:

    • F-test: an F-statistic with k − 1 and T − k degrees of freedom is used to test the null hypothesis:

    • b1 = b2 = b3 = … = bk = 0

    • The F-statistic is: F = [R²/(k − 1)] / [(1 − R²)/(T − k)]

    • The F-test may reject the null hypothesis b1 = b2 = … = bk = 0 even when none of the coefficients is statistically significant by individual t-tests.
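As a sketch, the statistics from the last two slides (R², adjusted R², t-statistics, F) can be computed from scratch on simulated data (the coefficient values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
T, k = 120, 3                      # k counts the intercept as well
X = np.column_stack([np.ones(T), rng.normal(size=T), rng.normal(size=T)])
Y = X @ np.array([1.0, 0.8, -0.5]) + rng.normal(size=T)

b = np.linalg.solve(X.T @ X, X.T @ Y)
e = Y - X @ b
SSE = e @ e
TSS = np.sum((Y - Y.mean()) ** 2)

R2 = 1 - SSE / TSS
adj_R2 = 1 - (SSE / (T - k)) / (TSS / (T - 1))

s2 = SSE / (T - k)                              # error-variance estimate
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
t_stats = b / se                                # t-statistics for each b_i = 0

# F-statistic for the joint null that all slope coefficients are zero
F = (R2 / (k - 1)) / ((1 - R2) / (T - k))
print(R2, adj_R2, t_stats, F)
```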


Violations of OLS Assumptions

  • Multicollinearity

    • Arises when two or more explanatory variables (in the multi-variable case) are highly correlated with each other.

    • Result: high standard errors for the parameters and statistically insignificant coefficients.

    • Indications:

      • Relatively high correlations among one or more explanatory variables.

      • High R² with few significant t-statistics: the regressors jointly explain Y well, but their individual contributions cannot be separated.
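The symptom can be demonstrated by making one regressor a near-duplicate of another; the variance inflation factor VIF = 1/(1 − R²ⱼ) quantifies the blow-up (all data simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
T = 200
x1 = rng.normal(size=T)
x2 = x1 + 0.05 * rng.normal(size=T)     # x2 nearly duplicates x1
y = 1.0 + x1 + x2 + rng.normal(size=T)

X = np.column_stack([np.ones(T), x1, x2])
b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
s2 = e @ e / (T - 3)
# Standard errors of the slope coefficients are inflated by the collinearity
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))

# Variance inflation factor for x1: 1 / (1 - R^2 of x1 on x2)
r = np.corrcoef(x1, x2)[0, 1]
vif = 1.0 / (1.0 - r ** 2)
print(se[1:], vif)
```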



Violations of OLS Assumptions (Cont.)

  • Heteroscedasticity: the error terms do not have a constant variance σ².

    • Consequences for the OLS estimators:

    • They are unbiased [E(b) = b] but not efficient: their variances are no longer the minimum attainable, and the usual standard errors are biased.

    • Test: White's heteroscedasticity test.

  • If there are ARCH effects, use GARCH models to account for volatility-clustering effects.
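A minimal one-regressor version of the idea behind White's test, sketched by hand: regress the squared residuals on the regressor and its square; T·R² from this auxiliary regression is asymptotically χ² (here with 2 degrees of freedom). All data are simulated with error variance rising in x:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 400
x = rng.uniform(1, 5, size=T)
y = 1.0 + 2.0 * x + rng.normal(size=T) * x   # error std grows with x

X = np.column_stack([np.ones(T), x])
b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b

# Auxiliary regression: e^2 on a constant, x and x^2
Z = np.column_stack([np.ones(T), x, x ** 2])
g = np.linalg.solve(Z.T @ Z, Z.T @ (e ** 2))
u = e ** 2 - Z @ g
R2_aux = 1 - (u @ u) / np.sum((e ** 2 - np.mean(e ** 2)) ** 2)

white_stat = T * R2_aux
# 5% critical value of chi-squared with 2 degrees of freedom is about 5.99
print(white_stat, white_stat > 5.99)
```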


Violations of OLS Assumptions (Cont.)

  • Autocorrelation: the error terms from different time periods are correlated [et = f(et-1, et-2, …)]:

    • Consequences for the OLS estimators:

      • They are unbiased [E(b) = b] but not efficient.

    • Test for serial correlation: the Durbin-Watson statistic for first-order serial correlation: DW = Σ(et − et-1)² / Σet²


Violations of OLS Assumptions (Cont.)

  • Autocorrelation (cont.):

  • The DW statistic is approximately equal to DW ≈ 2(1 − r1),

    where r1 is the first-order autocorrelation of the residuals.

  • Note: if r1 = 1 then DW = 0; if r1 = −1 then DW = 4; for r1 = 0, DW = 2.

  • Use the Ljung-Box Q test statistic for higher-order serial correlation.
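A sketch of the DW statistic applied to a simulated AR(1) residual series (ρ = 0.7 is an illustrative choice), confirming the DW ≈ 2(1 − r1) approximation:

```python
import numpy as np

rng = np.random.default_rng(5)
T = 1000
# AR(1) errors with rho = 0.7 (positive first-order autocorrelation)
e = np.zeros(T)
shocks = rng.normal(size=T)
for t in range(1, T):
    e[t] = 0.7 * e[t - 1] + shocks[t]

# Durbin-Watson statistic: sum of squared first differences over sum of squares
DW = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
r1 = np.corrcoef(e[1:], e[:-1])[0, 1]
print(DW, 2 * (1 - r1))
```

With positive autocorrelation the statistic falls well below 2.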


Specification Errors

  • Omitted variables:

    • True model: Yt = b0 + b1X1t + b2X2t + et

    • Regression model: Yt = b0 + b1X1t + et

    • Then the estimator of b1 is biased (unless X1 and X2 are uncorrelated).


Specification Errors (Cont.)

  • Irrelevant variables:

    • True model: Yt = b0 + b1X1t + et

    • Regression model: Yt = b0 + b1X1t + b2X2t + et

    • Then the estimator b1* of b1 is still unbiased. Only efficiency declines, since the variance of b1* will be larger than the variance of b1.
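The omitted-variable case can be sketched by simulation (all coefficients illustrative): when a relevant, correlated regressor is dropped, the short regression's slope absorbs part of its effect.

```python
import numpy as np

rng = np.random.default_rng(6)
T = 5000
x2 = rng.normal(size=T)
x1 = 0.8 * x2 + rng.normal(size=T)                  # x1 correlated with x2
y = 1.0 + 1.0 * x1 + 1.0 * x2 + rng.normal(size=T)  # true model uses both

# Short regression omitting x2: the slope on x1 picks up part of x2's effect
b1_short = (np.sum((x1 - x1.mean()) * (y - y.mean()))
            / np.sum((x1 - x1.mean()) ** 2))
print(b1_short)   # biased away from the true value 1.0
```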


A Naïve Estimation

  • Estimate the aggregate demand elasticity e

  • Using historical consumption data:

  • Estimate the regression equation:

    • ln(QDt) = a + b·ln(Pt)

    • b is an estimate of e

  • Forecast the change in the consumption price, ΔP

  • Estimate the change in demand as:

    • (ΔQD/QD)F = e · (ΔP/P)F
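The two steps above can be sketched on simulated price/quantity data (the true elasticity of −0.6 and the 10% price change are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
T = 300
lnP = rng.uniform(0, 2, size=T)
lnQD = 5.0 - 0.6 * lnP + 0.1 * rng.normal(size=T)   # true elasticity e = -0.6

# Step 1: estimate e as the slope of the log-log regression
e_hat = (np.sum((lnP - lnP.mean()) * (lnQD - lnQD.mean()))
         / np.sum((lnP - lnP.mean()) ** 2))

# Step 2: forecast the percentage change in demand for a 10% price rise
dQ_over_Q = e_hat * 0.10
print(e_hat, dQ_over_Q)
```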




Error Correction Model (ECM) for Non-Stationarity

  • One can try a regression in first differences.

  • However, first differences discard the information contained in the levels,

  • and such a regression mixes the long-term relationship with the short-term changes.

  • The error correction model (ECM) can separate the long-term and short-term relationships.



Interpretation of the Estimated Regression

  • ln QDt − ln QD t-1 = −5.327 − 0.348·(ln CPIt − ln CPI t-1) − 0.697·(ln QD t-1 − 1.267·ln CPI t-1)

    • Short-run effect: −0.348, the coefficient on the change in ln CPI

    • Long-run effect: 1.267, the coefficient inside the cointegrating (levels) term

    • Error correction coefficient: −0.697, the speed at which deviations from the long-run relationship are corrected


Forecasting

  • A forecast is:

    • A quantitative estimate about the likelihood of future events which is developed on the basis of current and past information.

    • Some useful definitions:

    • Point forecast: predicts a single number for Y in each forecast period

    • Interval forecast: indicates an interval in which the realized value of Y will lie.


Unconditional Forecasting

  • First estimate the econometric model: Yt = b0 + b1Xt + et

  • Then compute the point forecast: ŶT+1 = b0 + b1XT+1,

    assuming XT+1 is known.


Unconditional Forecasting (Cont.)

  • The forecast error is: eT+1 = YT+1 − ŶT+1

  • The 95% confidence interval for YT+1 is: ŶT+1 ± 1.96·sf

  • where sf is the standard error of the forecast, sf = s·√[1 + 1/T + (XT+1 − X̄)²/Σ(Xt − X̄)²],

  • which provides a good measure of the precision of the forecast.
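A sketch of the point forecast and 95% interval for a simple regression, assuming the standard forecast-variance formula sf² = s²·[1 + 1/T + (XT+1 − X̄)²/Σ(Xt − X̄)²] (data and the new X value are simulated/illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)
T = 100
X = rng.normal(size=T)
Y = 1.0 + 2.0 * X + rng.normal(size=T)

b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
b0 = Y.mean() - b1 * X.mean()
s2 = np.sum((Y - b0 - b1 * X) ** 2) / (T - 2)   # error-variance estimate

x_new = 1.5                          # assumed known X_{T+1}
y_hat = b0 + b1 * x_new              # point forecast
# Standard error of the forecast for a simple regression
s_f = np.sqrt(s2 * (1 + 1 / T
                    + (x_new - X.mean()) ** 2 / np.sum((X - X.mean()) ** 2)))
lo, hi = y_hat - 1.96 * s_f, y_hat + 1.96 * s_f
print(y_hat, (lo, hi))
```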


Conditional Forecasting

  • If XT+1 is not known, it needs to be forecasted as well.

    • The stochastic nature of the predicted values of the Xs makes the resulting forecasts less reliable.

    • The forecasted value of Y at time T+1 is ŶT+1 = b0 + b1X̂T+1


Unit Roots, Spurious Regressions, and Cointegration

  • Simulate the two independent random walks

  • Yt = Yt-1 + et, where et ~ N(0,4), and

  • Xt = Xt-1 + ut, where ut ~ N(0,9).


Unit Roots, Spurious Regressions, and Cointegration (Cont.)

  • Spurious regressions:

    • Granger and Newbold (1974) demonstrated that macroeconomic data are often trended upward, and that in regressions involving the levels of such data the standard significance tests are misleading: the conventional t and F tests reject the hypothesis of no relationship even when in fact there is none.

    • Symptom: R² > DW is a good rule of thumb for suspecting that the estimated regression is spurious.


Unit Roots, Spurious Regressions, and Cointegration (Cont.)

  • Unit roots:

    • If a variable behaves like Yt = Yt-1 + et,

    • then its variance will be infinite, since Var(Yt) = t·σ² → ∞ as t → ∞.

    • This is a non-stationary variable. E.g., Yt = Yt-1 + et,

      where et ~ N(0,4). This produces a series that wanders ever farther from its starting point, with a variance that grows without bound.
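The Var(Yt) = t·σ² property can be checked by simulating many random walks with et ~ N(0, 4), i.e. σ² = 4 (simulation sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(9)
n_sims, T = 2000, 100
# Each row is one random walk Y_t = Y_{t-1} + e_t, with e_t ~ N(0, 4)
e = rng.normal(0, 2, size=(n_sims, T))     # std 2  =>  variance 4
Y = e.cumsum(axis=1)

# Cross-section variance at date t should be about t * sigma^2
print(Y[:, 24].var(), Y[:, 99].var())      # roughly 100 and 400
```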


Unit Roots, Spurious Regressions, and Cointegration (Cont.)

  • The series can be made stationary by taking the first difference of Yt: ΔYt = Yt − Yt-1 = et

  • The differenced series has finite variance and is a stationary variable. The original series Yt is said to be integrated of order one [I(1)].


Unit Roots, Spurious Regressions, and Cointegration (Cont.)

  • A trend-stationary variable, Yt = a + b·t + et,

    also has a finite variance (around its deterministic trend).

  • The process Yt = a + Yt-1 + et (a random walk with drift)

    is non-stationary and does not have a finite variance.


Unit Roots, Spurious Regressions, and Cointegration (Cont.)

  • But the variable Yt = r·Yt-1 + et

  • is stationary and has a finite variance if |r| < 1, since Var(Yt) = σ²/(1 − r²). E.g., such an AR(1) process with

    et ~ N(0,4).


Unit Roots, Spurious Regressions, and Cointegration (Cont.)

  • Tests for unit roots: the Dickey-Fuller test

    • Case of I(1)

    • Null hypothesis: r = 1 (unit root)

    • Alternative hypothesis: r < 1 (stationary)

  • Run the regression: ΔYt = (r − 1)·Yt-1 + et

  • and test (r − 1) = 0 by comparing the t-statistic with the MacKinnon critical values for rejection of the hypothesis of a unit root.


Unit Roots, Spurious Regressions, and Cointegration (Cont.)

Case of a Random Walk (RW) with drift

  • Null hypothesis: r = 1

  • Alternative hypothesis: r < 1

  • Run the regression: ΔYt = a + (r − 1)·Yt-1 + et

  • and test (r − 1) = 0 by comparing the t-statistic with the MacKinnon critical values for rejection of the hypothesis of a unit root.
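A hand-rolled sketch of the DF regression with a constant, ΔYt = a + (r − 1)·Yt-1 + et. The 5% critical value used below (about −2.87 for the with-constant case at large T) is quoted from memory of the MacKinnon tables and should be checked there; both test series are simulated for illustration:

```python
import numpy as np

def df_tstat(y):
    """t-statistic on (r - 1) in the regression dY_t = a + (r - 1)*Y_{t-1} + e_t."""
    dy, ylag = np.diff(y), y[:-1]
    X = np.column_stack([np.ones(len(ylag)), ylag])
    b = np.linalg.solve(X.T @ X, X.T @ dy)
    e = dy - X @ b
    s2 = e @ e / (len(dy) - 2)
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return b[1] / se

rng = np.random.default_rng(10)
walk = rng.normal(size=1000).cumsum()            # unit root: should fail to reject
ar = np.zeros(1000)
for t in range(1, 1000):
    ar[t] = 0.5 * ar[t - 1] + rng.normal()       # stationary AR(1): should reject

print(df_tstat(walk), df_tstat(ar))              # compare with about -2.87
```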


Unit Roots, Spurious Regressions, and Cointegration (Cont.)

  • DF tests on macroeconomic variables:

    • Most macroeconomic flows and stocks related to the population size, such as output, consumption, or employment, are I(1), while price levels tend to be I(2). E.g., GDP is I(1) while the price level is I(2).


Unit Roots, Spurious Regressions, and Cointegration (Cont.)

  • Cointegration

    • If two series Yt and Xt are both I(1), there may be a b1 such that Yt − b1·Xt

    • is I(0). The implication is that the two series drift upward together at roughly the same rate.

    • Two series satisfying this requirement are said to be cointegrated, and the vector (1, −b1)

      is a cointegrating vector.
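A sketch of the idea by simulation (the cointegrating coefficient b1 = 2 is an illustrative assumption): X is a random walk and Y = 2X plus stationary noise, so Yt − 2Xt is I(0). The first-step levels regression of Y on X recovers b1 very precisely (superconsistency), and the residuals keep a small, finite variance:

```python
import numpy as np

rng = np.random.default_rng(11)
T = 2000
X = rng.normal(0, 3, size=T).cumsum()           # I(1) driving series
Y = 2.0 * X + rng.normal(size=T)                # cointegrated: Y - 2X is I(0)

# First-step levels regression of Y on X (Engle-Granger style)
b1 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
resid = Y - b1 * X
print(b1, resid.var())   # b1 near 2; residual variance stays small and finite
```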

