## From Unit Root To Cointegration



Putting Economics into Econometrics

Many (perhaps most) macroeconomic variables are non-stationary.

Many of these are difference-stationary, i.e. I(1), variables.

- Economics: many variables have stable long-run relationships, e.g.
- Consumption and income
- Prices and wages
- Prices at home and prices abroad

We can use unit root testing techniques to identify variables which have stable long-run relationships with one another (as opposed to a spurious regression).

Example: Are Y (disposable income) and X (consumption) cointegrated?

- First, be satisfied that the two time series are I(1), e.g. apply unit root tests to X and Y in turn.
- They are clearly not stationary,
- but they seem to move together.

Economic theory tells us that there should be (at least in the long run) a relation like

Yt = β0 + β1Xt + ut

A regression between these variables gives:

```
EQ( 1) Modelling C by OLS (using Lecture6a.in7)
       The estimation sample is: 1 to 99

                    Coefficient  Std.Error  t-value  t-prob  Part.R^2
Constant                4.85755     0.1375     35.3   0.000    0.9279
Y                       1.00792   0.005081     198.   0.000    0.9975

sigma                0.564679    RSS                  30.9296673
R^2                  0.997541    F(1,97) = 3.935e+004 [0.000]**
log-likelihood       -82.8864    DW                         2.28
no. of observations        99    no. of parameters             2
```

But is this regression spurious? Or is there a genuine long run relationship?

Any linear combination of I(1) variables is, in general, itself I(1), so a regression between independent I(1) variables is typically spurious.

However, if there is a genuine long-run relationship, the errors have a tendency to disappear and return to zero, i.e. they are I(0).

If there exists a relationship between two non-stationary I(1) series, Y and X, such that the residuals of the regression

Yt = β0 + β1Xt + ut

are stationary, then the variables in question are said to be cointegrated: there is a long-run relationship towards which they always come back.

[Figure: time plots of the disequilibrium errors ut = Yt - β0 - β1Xt. In one panel the error shows no tendency to return to zero (no cointegration); in the other the error rarely drifts from zero (cointegration).]

If we have two independent non-stationary series, then we may find evidence of a relationship when none exists (the spurious regression problem).

One way to test whether there is a genuine relationship between non-stationary series is to check whether the disequilibrium errors return to zero.

If a long-run relationship exists, the errors should form a stationary series with zero mean.
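The contrast can be sketched with simulated data (assuming only `numpy`; all variable names are illustrative): residuals from a regression between independent random walks wander persistently, while residuals from a cointegrated pair mean-revert.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
wa, wb = np.cumsum(rng.standard_normal((2, n)), axis=1)  # two independent I(1) series
x = np.cumsum(rng.standard_normal(n))
y = 2.0 + x + rng.standard_normal(n)                     # cointegrated with x

def residuals(yv, xv):
    b1, b0 = np.polyfit(xv, yv, 1)      # static OLS: yv = b0 + b1*xv + u
    return yv - b0 - b1 * xv

u_spurious = residuals(wb, wa)
u_coint = residuals(y, x)

# A crude stationarity check: first-order autocorrelation of the residuals.
rho = lambda u: np.corrcoef(u[1:], u[:-1])[0, 1]
print(f"spurious     rho(u) = {rho(u_spurious):.3f}")  # typically near 1: no mean reversion
print(f"cointegrated rho(u) = {rho(u_coint):.3f}")     # typically well below 1: returns to zero
```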


Our regression between consumption and income looked impressive (R^2 = 0.998). But how can we distinguish between a genuine long-run relationship and a spurious regression? We need a test.

- After estimating the model, save the residuals from the static regression.
- (In PcGive, after running the regression click on Test and then Store Residuals.)
- Informally consider whether the residuals are stationary.

(1) Cointegrating Regression Durbin-Watson (CRDW) Test

- At the 5 per cent significance level with a sample size of 100, the critical value is 0.38.
- Ho: DW = 0 => no cointegration (i.e. DW statistic is less than 0.38)
- Ha: DW > 0 => cointegration (i.e. DW statistic is greater than 0.38)
- Ho: ut = ut-1 + et
- Ha: ut = ρut-1 + et, with ρ < 1
- N.B. This assumes that the disequilibrium errors ut can be modelled by a first-order AR process.
- Is this a valid assumption? It may require a more complicated model.

First Test: Cointegrating Regression Durbin-Watson (CRDW) Test

If the residuals are non-stationary, DW will go to 0 as the sample size increases. So "large" values of DW are taken as evidence for rejection of the null hypothesis of NO COINTEGRATION.
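The CRDW statistic is simply the ordinary Durbin-Watson statistic computed from the residuals of the static (cointegrating) regression. A minimal sketch with simulated data, assuming only `numpy`:

```python
import numpy as np

def durbin_watson(u):
    # DW = sum over t of (u_t - u_{t-1})^2, divided by sum of u_t^2
    return np.sum(np.diff(u) ** 2) / np.sum(u ** 2)

rng = np.random.default_rng(2)
x = np.cumsum(rng.standard_normal(99))
y = 4.9 + 1.0 * x + rng.standard_normal(99)  # a cointegrated pair by construction
b1, b0 = np.polyfit(x, y, 1)                 # static cointegrating regression
u = y - b0 - b1 * x                          # its residuals
dw = durbin_watson(u)
print(f"CRDW = {dw:.2f}")  # compare with the 5% critical value of 0.38
```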

```
EQ( 1) Modelling Y by OLS (using Lecture6a.in7)
       The estimation sample is: 1 to 99

                    Coefficient  Std.Error  t-value  t-prob  Part.R^2
Constant                4.85755     0.1375     35.3   0.000    0.9279
X                       1.00792   0.005081     198.   0.000    0.9975

sigma                0.564679    RSS                  30.9296673
R^2                  0.997541    F(1,97) = 3.935e+004 [0.000]**
log-likelihood       -82.8864    DW                         2.28
no. of observations        99    no. of parameters             2
```

CRDW test statistic = 2.28 >> 0.38 = 5% critical value.

This suggests cointegration, though it assumes the residuals follow an AR(1) model.

Second Test: Cointegrating Regression DF Test (CRDF)

1. Perform the cointegrating regression.

2. Save the residuals, ut.

3. Run the auxiliary regression: Δut = φut-1 + et

Cointegrating Regression Dickey-Fuller (CRDF) Test

- Δut = φut-1 + et
- Critical values (CV) are from MacKinnon (1991).
- Ho: φ = 0 => no cointegration (i.e. test statistic is greater than CV)
- Ha: φ < 0 => cointegration (i.e. test statistic is less than CV)

Cointegrating Regression Dickey-Fuller (CRDF) Test

- BETTER: use lagged differenced terms to avoid serial correlation.
- Δut = φut-1 + θ1Δut-1 + θ2Δut-2 + θ3Δut-3 + θ4Δut-4 + et
- Use F-tests of model reduction and also minimise the Schwarz Information Criterion.
- Critical values (CV) are from MacKinnon (1991).
- Ho: φ = 0 => no cointegration (i.e. test statistic is greater than CV)
- Ha: φ < 0 => cointegration (i.e. test statistic is less than CV)
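A sketch of the basic (no-lags) CRDF auxiliary regression on simulated data, assuming only `numpy`. The t-ratio on φ is the statistic that is compared with MacKinnon's critical values:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.cumsum(rng.standard_normal(99))
y = 4.9 + x + rng.standard_normal(99)   # cointegrated pair by construction
b1, b0 = np.polyfit(x, y, 1)            # step 1: cointegrating regression
u = y - b0 - b1 * x                     # step 2: save the residuals

du = np.diff(u)                         # dependent variable: Delta u_t
u1 = u[:-1]                             # regressor: u_{t-1}
phi = (u1 @ du) / (u1 @ u1)             # step 3: OLS without an intercept
resid = du - phi * u1
se = np.sqrt(resid @ resid / (len(du) - 1) / (u1 @ u1))
t = phi / se
print(f"phi = {phi:.3f}, t-ratio = {t:.1f}")  # compare with MacKinnon 5% CV (about -3.39)
```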

- Using CRDF we incorporate lagged dependent variables into our regression:
- Δut = φut-1 + θ1Δut-1 + θ2Δut-2 + θ3Δut-3 + θ4Δut-4 + et
- and then assess which lags should be included using model reduction tests and information criteria.

Progress to date:

```
Model        T  p       log-likelihood      SC      HQ     AIC
EQ( 2)      94  5  OLS      -74.238306  1.8212  1.7406  1.6859
EQ( 3)      94  4  OLS      -74.793519  1.7847  1.7202  1.6765
EQ( 4)      94  3  OLS      -74.797849  1.7364  1.6881  1.6553
EQ( 5)      94  2  OLS      -74.948145  1.6913  1.6591  1.6372
EQ( 6)      94  1  OLS      -75.845305  1.6621  1.6459  1.6350

Tests of model reduction (please ensure models are nested for test validity)
EQ( 2) --> EQ( 6): F(4,89) = 0.77392 [0.5450]
EQ( 3) --> EQ( 6): F(3,90) = 0.67892 [0.5672]
EQ( 4) --> EQ( 6): F(2,91) = 1.0254  [0.3628]
EQ( 5) --> EQ( 6): F(1,92) = 1.7730  [0.1863]
```

Consequently we choose Δut = φut-1 + et.

All model reduction tests are accepted, hence we move to the simplest model.
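In the same spirit as the table above, lag selection can be sketched by estimating the auxiliary regression with 0 to 4 lagged differences over a common sample and minimising an SC-style criterion. This uses simulated residuals and `numpy` only; the criterion formula below is one common variant, not necessarily PcGive's exact definition:

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.cumsum(rng.standard_normal(99))
y = 4.9 + x + rng.standard_normal(99)
b1, b0 = np.polyfit(x, y, 1)
u = y - b0 - b1 * x                      # residuals of the static regression
du = np.diff(u)                          # Delta u_t

max_lag = 4
T = len(du) - max_lag                    # common estimation sample across models
results = {}
for p in range(max_lag + 1):
    # Regressors: u_{t-1} plus p lagged differences, aligned on the common sample.
    cols = [u[max_lag:-1]]
    for j in range(1, p + 1):
        cols.append(du[max_lag - j:len(du) - j])
    X = np.column_stack(cols)
    dy = du[max_lag:]
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    rss = float(np.sum((dy - X @ beta) ** 2))
    k = p + 1                            # number of estimated coefficients
    results[p] = np.log(rss / T) + k * np.log(T) / T   # an SC-style criterion
    print(f"p = {p}: SC = {results[p]:.4f}")

best_p = min(results, key=results.get)
print(f"chosen number of lagged differences: {best_p}")
```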

Testing for Cointegration using the CRDF test

Step 1: Estimate the cointegrating regression

Yt = β0 + β1Xt + ut

```
EQ( 1) Modelling Y by OLS (using Lecture6a.in7)
       The estimation sample is: 1 to 99

                    Coefficient  Std.Error  t-value  t-prob  Part.R^2
Constant                4.85755     0.1375     35.3   0.000    0.9279
X                       1.00792   0.005081     198.   0.000    0.9975

sigma                0.564679    RSS                  30.9296673
R^2                  0.997541    F(1,97) = 3.935e+004 [0.000]**
log-likelihood       -82.8864    DW                         2.28
no. of observations        99    no. of parameters             2
```

And save the residuals.

Step 2: Use the estimated residuals in the auxiliary regression model Δut = φut-1 + et

```
EQ( 6) Modelling dresiduals by OLS (using Lecture6a.in7)
       The estimation sample is: 6 to 99

               Coefficient  Std.Error  t-value  t-prob  Part.R^2
residuals_1       -1.16140     0.1024    -11.3   0.000    0.5805

sigma            0.545133    RSS       27.6367834
log-likelihood   -75.8453    DW              1.95
```

Which means Δut = -1.161ut-1 + et, with a t-ratio of -11.3.

CRDF test statistic = -11.3 << -3.39 = 5% critical value from MacKinnon.

Hence we reject the null of no cointegration between X and Y.

- Advantages of the CRDF Test
- Engle and Granger (1987) compared alternative methods for testing for cointegration.
- (1) Critical values depend on the model used to simulate the data; CRDF was the least model-sensitive.
- (2) CRDF also has greater power (i.e. it is more likely to reject a false null) than the CRDW test.

- Disadvantage of the CRDF Test
- Although the test performs well relative to the CRDW test, there is still evidence that CRDF has low power in absolute terms.
- Hence we should show caution in interpreting the results.

OLS estimates with I(0) variables are consistent: as the sample size increases, they converge on their "true values".

However, if the true relationship between the variables includes dynamic terms,

Yt = θ0 + θ1Xt + θ2Yt-1 + θ3Xt-1 + ut

then static models estimated by OLS,

Yt = β0 + β1Xt + ut

will be biased or inconsistent.

Stock (1987) found that if Yt and Xt are cointegrated, then the OLS estimates of β0 and β1 will be consistent.

Cointegration and Superconsistency

Indeed, Stock went further and suggested that estimated coefficients from cointegrated regressions will converge at a faster rate than normal.

i.e. super consistent.

Coefficients from a cointegrated regression are super consistent. Therefore:

(i) simple static regressions don't necessarily give spurious results;

(ii) dynamic misspecification is not necessarily a problem.

Consequently we can estimate the simple regression

Yt = β0 + β1Xt + ut

even if there are important dynamic terms

Yt = θ0 + θ1Xt + θ2Yt-1+ θ3Xt-1 + ut

Cointegration and Superconsistency

However, superconsistency is a large sample result.

Coefficients may be biased in finite samples (i.e. typical sample periods) due to omitted lagged values of Yt and Xt.

Bias in static regressions is related to R^2: a high R^2 indicates that the bias will be smaller.

Testing for Cointegration: Summary

- To test whether two I(1) series are cointegrated, we examine whether the residuals are I(0).
- (a) First use informal methods to see if the residuals are stationary:
- (1) plot the time series of the residuals
- (2) plot the correlogram of the residuals
- (b) Then apply one of two formal tests for cointegration:
- (1) CRDW: Cointegrating Regression Durbin-Watson Test
- (2) CRDF: Cointegrating Regression Dickey-Fuller Test
