Presentation Transcript


  1. Econ 240C Lecture 15

  2. ARCH-GARCH Structure?

  3. Part I. Conditional Heteroskedasticity • An Example

  4. Producer Price Index for Finished Goods • April 1947-April 2003 • 1982=100 • Seasonally adjusted rate (SAR)

  5. Transformations • PPI is evolutionary • Take logarithms • Then difference • Obtain the fractional changes, i.e. the inflation rate for producer goods
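
A minimal sketch of this transformation in Python (the file name ppi.csv and its column layout are assumptions; any monthly PPI level series would do):

    import numpy as np
    import pandas as pd

    # PPI levels (1982=100), monthly, April 1947 - April 2003; source file is hypothetical
    ppi = pd.read_csv("ppi.csv", index_col=0, parse_dates=True).squeeze()

    # Log, then difference: the fractional change, i.e. producer-goods inflation
    dlnppi = np.log(ppi).diff().dropna()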

  6. Modeling dlnppi • Try an AR(2)

  7. Residuals from AR(2) Model

  8. Modeling dlnppi • Try an ARMA(1,1)
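
A sketch of the same specification with statsmodels rather than EVIEWS (an ARMA(1,1) is an ARIMA(1,0,1) with a constant; dlnppi comes from the transformation sketch above):

    from statsmodels.tsa.arima.model import ARIMA

    # ARMA(1,1) with a constant for producer-goods inflation
    arma11 = ARIMA(dlnppi, order=(1, 0, 1), trend="c").fit()
    print(arma11.summary())

    resid = arma11.resid   # residuals, used in the diagnostics that follow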

  9. Histogram of Residuals from ARMA(1,1) Model of dlnppi

  10. ARMA(1,1) Model of Producer Goods Inflation • The residuals from the ARMA(1,1) model are orthogonal but not normal • Are we done?
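
One way to check both claims, as a sketch (the Ljung-Box and Jarque-Bera tests are not named on the slide; they are stand-ins for the correlogram Q-statistics and the histogram's normality statistic):

    from scipy import stats
    from statsmodels.stats.diagnostic import acorr_ljungbox

    # Orthogonality: no remaining autocorrelation in the residuals
    print(acorr_ljungbox(resid, lags=[12]))

    # Normality: a large Jarque-Bera statistic (tiny p-value) flags the fat tails
    print(stats.jarque_bera(resid))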

  11. Part II. Examine Residuals • Trace of residuals • Trace of square of residuals

  12. Residsq=resid*resid

  13. Episodic variance • Not homoskedastic • So called heteroskedastic, conditional on dates or episodes when the variance kicks up • Hence the name “conditional heteroskedasticity”

  14. Clues • Check trace of residuals squared • Check correlogram of residuals squared

  15. Clues • Check the trace of the residuals squared • residuals can be obtained from the Actual, fitted, residuals table • Check the correlogram of the residuals squared • an EVIEWS option alongside the correlogram of residuals • Heteroskedasticity of residuals • Histogram of residuals • kurtotic (fat-tailed) residuals are a clue
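
Outside EVIEWS, the same clues can be generated as follows (a sketch; the ARCH-LM test is an addition not mentioned on the slide, but it formalizes the correlogram check):

    import matplotlib.pyplot as plt
    from statsmodels.graphics.tsaplots import plot_acf
    from statsmodels.stats.diagnostic import het_arch

    residsq = resid ** 2

    residsq.plot(title="Squared residuals")   # trace: episodic variance shows up as clusters
    plot_acf(residsq, lags=24)                # correlogram of squared residuals
    plt.show()

    # Engle's ARCH-LM test: (LM statistic, p-value, F statistic, F p-value)
    print(het_arch(resid, nlags=12))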

  16. How should we model the conditional heteroskedasticity?

  17. Part III: Modeling Conditional Heteroskedasticity • Robert Engle: UCSD • Autoregressive error variance model

  18. Modeling the error • Model the error e(t) as the product of two independent parts, wn(t) and [h(t)]^(1/2): e(t) = wn(t)*[h(t)]^(1/2) • wn(t) ~ N(0,1)

  19. Modeling the error • Assume that wn(t) is independent of [h(t)]^(1/2) • So the density f of wn(t)*[h(t)]^(1/2) is the product of two densities, g and k: f = g[wn(t)]*k{[h(t)]^(1/2)} • And expectations can be written as products of expectations • This is analogous to writing the probability P(A and B) as P(A)*P(B) when events A and B are independent

  20. Modeling the error • We would like the error e(t) to have the usual properties: mean zero, orthogonality, and a constant unconditional variance • E e(t) = E{[h(t)]^(1/2)*wn(t)} = E{[h(t)]^(1/2)}*E[wn(t)], the product of expectations because of independence • We may not know E{[h(t)]^(1/2)}, but we know E[wn(t)] = 0, so E e(t) = 0

  21. Modeling the error, e(t) • In a similar fashion, • E[e(t)*e(t-1)] = E({[h(t)]^(1/2)*wn(t)}*{[h(t-1)]^(1/2)*wn(t-1)}) • = E{[h(t)]^(1/2)*[h(t-1)]^(1/2)}*E[wn(t)*wn(t-1)] • We may not know the first expectation, but we know the second is zero since white noise is orthogonal, so e(t) is also orthogonal, i.e. E[e(t)*e(t-1)] = 0

  22. Modeling the error, e(t) • The unconditional variance of e(t) is E[e(t)^2] = E{[wn(t)]^2*h(t)} • Once again by independence, E[e(t)^2] = E[wn(t)^2]*E[h(t)] • The first expectation is 1, since white noise has variance one, and E h(t) = E{a0 + a1*[e(t-1)]^2} • So E[e(t)^2] = a0 + a1*E[e(t-1)^2] • And since E[e(t)^2] = E[e(t-1)^2] = Var e(t), • Var e(t) = a0/(1 - a1), a constant

  23. Modeling the error, e(t) • The conditional mean is the expected value of e(t) given information at time t-1: • E_{t-1} e(t) = E_{t-1}{wn(t)*[h(t)]^(1/2)} • And by independence, E_{t-1} e(t) = E_{t-1}[wn(t)]*E_{t-1}{[h(t)]^(1/2)} • Our best guess at time t-1 of the shock wn(t) next period is zero, so our best guess of the shock e(t) for next period is also zero

  24. Modeling the error, e(t) • The conditional variance of e(t) is E_{t-1}[e(t)^2] = E_{t-1}{[wn(t)]^2*h(t)}, and by independence • E_{t-1}[e(t)^2] = E_{t-1}[wn(t)^2]*E_{t-1}[h(t)] • The first conditional expectation is 1, and the second is E_{t-1}[h(t)] = E_{t-1}{a0 + a1*[e(t-1)]^2} = a0 + a1*E_{t-1}[e(t-1)^2] = a0 + a1*[e(t-1)]^2, since e(t-1) is already known at time t-1 • So E_{t-1}[e(t)^2] = a0 + a1*[e(t-1)]^2 = h(t) • i.e. the conditional variance is h(t), which depends on the squared error from the previous period: the autoregressive feature
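
A small simulation sketch of this ARCH(1) error process; the parameter values are assumptions chosen only for illustration, and the sample variance of e should land near a0/(1 - a1):

    import numpy as np

    rng = np.random.default_rng(0)
    T, a0, a1 = 10_000, 0.2, 0.5            # illustrative values with 0 < a1 < 1

    e = np.zeros(T)
    h = np.zeros(T)
    h[0] = a0 / (1 - a1)                    # start at the unconditional variance
    for t in range(1, T):
        h[t] = a0 + a1 * e[t - 1] ** 2      # conditional variance h(t)
        e[t] = np.sqrt(h[t]) * rng.standard_normal()   # e(t) = wn(t)*[h(t)]^(1/2)

    print(e.var(), a0 / (1 - a1))           # sample variance vs. a0/(1 - a1)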

  25. Generalizations • Autoregressive conditional heteroskedasticity, or ARCH, can be extended with more lagged terms: • h(t) = a0 + a1*[e(t-1)]^2 + a2*[e(t-2)]^2 + … • The conditional variance h(t) can also be modeled by adding lagged conditional variance terms: • h(t) = a0 + a1*[e(t-1)]^2 + a2*[e(t-2)]^2 + … + b1*h(t-1) + … • This extension was suggested by Bollerslev and is called GARCH, generalized ARCH

  26. Part IV: Estimation of ARCH and GARCH

  27. The correlogram of the squared residuals from the ARMA(1,1) model for dlnppi suggests that the squared residuals themselves have AR and MA structure

  28. Correlogram of squared residuals

  29. EVIEWS Equation Window: Estimation Method
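
For comparison, a sketch of GARCH(1,1) estimation in Python using the arch package; it fits the variance model to the ARMA(1,1) residuals in a second step, whereas EVIEWS estimates the mean and variance equations jointly, and the scaling by 100 is only to help the optimizer:

    from arch import arch_model

    # GARCH(1,1) fitted to the (scaled) ARMA(1,1) residuals from the sketch above
    am = arch_model(100 * resid, mean="Zero", vol="GARCH", p=1, q=1)
    garch_res = am.fit(disp="off")
    print(garch_res.summary())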

  30. Diagnostics • “Residuals”, the estimate of e(t) • Actual, fitted and residuals • GENR resgar = resid • “Standardized residuals”, the estimate of wn(t) = e(t)/[h(t)]^(1/2) • Correlogram of residuals • Histogram of residuals • Correlogram of residuals squared • Where do we find the estimate of h(t)? • In the PROCS menu of the equation window: make GARCH variance series
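
The corresponding objects from the arch fit sketched above (the attribute names belong to the arch package, not to EVIEWS):

    # Estimate of h(t): the conditional variance series
    # (the EVIEWS counterpart is the "make GARCH variance series" PROC)
    h_t = garch_res.conditional_volatility ** 2

    # Standardized residuals: estimate of wn(t) = e(t)/[h(t)]^(1/2)
    wn_t = garch_res.std_resid

    # Check them the same way as the raw residuals: correlogram, histogram, etc.
    print(wn_t.describe())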

  31. Actual, fitted, residuals from GARCH Model
