
Chapter 18 Econometrics



  1. Chapter 18 Econometrics
  This series of slides will cover a subset of Chapter 18:
  • Data and Operators
  • Autocorrelated Errors
  • Lagged Variables
  • Partial Adjustment

  2. Repeated Firm or Consumer Data
  • Time Structured Data - [y1, y2, …, yt, …, yT]
  • Error Structure - Not Gauss-Markov (σ²I)

  3. Backshift Operator
  The backshift operator, B, by definition produces xt-1 from xt:
  Bxt = xt-1
  Of course, one can also say BBxt = B^2 xt = xt-2
  In general, B^j xt = xt-j
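In code, the backshift operator is just a lag that shifts the series and leaves the first j positions empty. A minimal NumPy sketch (the function name backshift and the sample series are illustrative):

```python
import numpy as np

def backshift(x, j=1):
    """Apply B^j to a series: return x_{t-j}, with NaN where no lagged value exists."""
    x = np.asarray(x, dtype=float)
    out = np.full_like(x, np.nan)
    if j < len(x):
        out[j:] = x[:len(x) - j]
    return out

x = np.array([10.0, 11.0, 13.0, 12.0, 15.0])
print(backshift(x, 1))  # B x_t   -> [nan 10. 11. 13. 12.]
print(backshift(x, 2))  # B^2 x_t -> [nan nan 10. 11. 13.]
```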

  4. Autocorrelation
  [Figure: a response variable plotted against time.]
  Cov(yt, yt-1)?

  5. Table for Autocorrelation
  [Table: an 8 × 8 grid with y1, …, y8 labeling both the rows and the columns, for examining the covariances between observations.]

  6. Table for Autocorrelation
  [Table: the same y1, …, y8 grid, repeated from the previous slide.]

  7. Autocorrelated Error
  et = ρet-1 + εt
  εt ~ N(0, σε²I)

  8. Recursive Substitution in Time Series
  et = ρet-1 + εt
     = ρ(ρet-2 + εt-1) + εt
     = ρ[ρ(ρet-3 + εt-2) + εt-1] + εt

  9. Now We Leverage the Pattern
  et = ρ[ρ(ρet-3 + εt-2) + εt-1] + εt
     = εt + ρεt-1 + ρ²εt-2 + ρ³εt-3 + …
     = Σ ρ^j εt-j, summing over j = 0, 1, 2, …
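The recursive substitution can be checked numerically: building et from the recursion and building it from the expanded pattern produce the same series. A small simulation sketch, with an illustrative ρ = 0.7 and the starting value e0 = ε0:

```python
import numpy as np

rng = np.random.default_rng(0)
rho, T = 0.7, 400
eps = rng.normal(size=T)            # epsilon_t, independent N(0, 1) draws

# e_t from the recursion e_t = rho * e_{t-1} + eps_t, starting at e_0 = eps_0
e_rec = np.zeros(T)
e_rec[0] = eps[0]
for t in range(1, T):
    e_rec[t] = rho * e_rec[t - 1] + eps[t]

# e_t from the pattern e_t = eps_t + rho*eps_{t-1} + rho^2*eps_{t-2} + ...
e_pat = np.array([sum(rho**j * eps[t - j] for j in range(t + 1)) for t in range(T)])

print(np.max(np.abs(e_rec - e_pat)))   # essentially zero: the two forms agree
```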

  10. Time to Figure Out E(·)
  E(et) = E(εt + ρεt-1 + ρ²εt-2 + …) = E(εt) + ρE(εt-1) + ρ²E(εt-2) + … = 0
  since each εt has mean zero.

  11. And Now, Of Course, V(·)
  V(et) = E[et - E(et)]²
  V(et) = E[et²]
  The previous slide showed that E(et) = 0

  12. Now We Use the Pattern (Squared)
  V(et) = E[(εt + ρεt-1 + ρ²εt-2 + …)²]
  The cross-product terms drop out because the εs are independent, leaving only the squared terms:
  V(et) = E(εt²) + ρ²E(εt-1²) + ρ⁴E(εt-2²) + …

  13. A Big Mess, Right?
  V(et) = E(et²) = (1 + ρ² + ρ⁴ + …)σε²
  Uh-oh… an infinite series…

  14. Let’s Define the Infinite Series “s”
  s = 1 + 1/2 + 1/4 + 1/8 + …
  (1/2)s = 1/2 + 1/4 + 1/8 + 1/16 + …
  What is the difference between the first and second lines? s - (1/2)s = 1
  The same trick works for any geometric series whose ratio is smaller than one in absolute value: subtracting (ratio × s) from s leaves just the leading 1, so s = 1/(1 - ratio).

  15. Putting It Together
  Since 0 < ρ² < 1, applying that trick with ratio ρ² gives 1 + ρ² + ρ⁴ + … = 1/(1 - ρ²), so
  V(et) = σε² / (1 - ρ²)

  16. Applying the Same Logic to the Covariances
  For any pair of errors one time unit apart we have
  Cov(et, et-1) = ρσε² / (1 - ρ²)
  and in general
  Cov(et, et-s) = ρ^s σε² / (1 - ρ²)
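A quick simulation check of these variance and covariance formulas; the values of ρ, σε, and the sample size are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
rho, sigma_eps, T = 0.7, 1.0, 200_000

eps = rng.normal(0.0, sigma_eps, size=T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = rho * e[t - 1] + eps[t]      # e_t = rho * e_{t-1} + eps_t

v = sigma_eps**2 / (1.0 - rho**2)       # theoretical V(e_t)
print(v, e.var())                       # variance: theory vs. sample
for s in (1, 2, 3):
    sample_cov = np.cov(e[s:], e[:-s])[0, 1]
    print(rho**s * v, sample_cov)       # Cov(e_t, e_{t-s}): theory vs. sample
```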

  17. Instead of the Gauss-Markov Assumption (σ²I) we have
  V(e) = [σε² / (1 - ρ²)] Ω, where Ω is the T × T matrix with ρ^|t-s| in row t, column s
  So how do we estimate β now?
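The slide leaves the estimation question open. One standard route is generalized least squares using the Ω built from ρ^|t-s|; the sketch below assumes ρ is known, whereas in practice ρ would itself have to be estimated (feasible GLS). Function names and the simulated data are illustrative:

```python
import numpy as np

def ar1_omega(T, rho):
    """Correlation matrix with rho^|t-s| in row t, column s."""
    idx = np.arange(T)
    return rho ** np.abs(idx[:, None] - idx[None, :])

def gls(y, X, rho):
    """GLS estimate of beta when V(e) is proportional to the AR(1) Omega."""
    Omega_inv = np.linalg.inv(ar1_omega(len(y), rho))
    XtOi = X.T @ Omega_inv
    return np.linalg.solve(XtOi @ X, XtOi @ y)

# Illustrative check on simulated data
rng = np.random.default_rng(2)
T, rho, beta = 200, 0.7, np.array([1.0, 2.0])
X = np.column_stack([np.ones(T), rng.normal(size=T)])
eps = rng.normal(size=T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = rho * e[t - 1] + eps[t]
y = X @ beta + e
print(gls(y, X, rho))   # roughly recovers beta = [1, 2]
```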

  18. Lagged Independent Variables
  Consumer behavior and attitude do not immediately change:
  yt = β0 + xt-1β1 + et
  Or more generally:
  yt = β0 + xt-1β1 + xt-2β2 + ··· + et
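Building the lagged regressors is mostly bookkeeping. A short pandas sketch with made-up data and column names:

```python
import pandas as pd

df = pd.DataFrame({"y": [5.0, 5.5, 6.1, 6.0, 6.8, 7.2],
                   "x": [1.0, 1.2, 1.5, 1.4, 1.7, 1.9]})
df["x_lag1"] = df["x"].shift(1)   # x_{t-1}
df["x_lag2"] = df["x"].shift(2)   # x_{t-2}
df = df.dropna()                  # the first observations have no lagged values
print(df)
# y can then be regressed on x_lag1 and x_lag2 (e.g. with statsmodels OLS)
```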

  19. Koyck’s Scheme
  Koyck started with the infinite sequence
  yt = xtβ0 + xt-1β1 + xt-2β2 + ··· + et
  and assumed that the β values are all of the same sign

  20. i i i s s s 0 0 0 i i i Lagged effects can take on many forms: Koyck (and others) have come up with ways of estimating different shaped impacts (1) assuming that only s lag positions really matter, and that (2) the impact of x on y takes on some sort of curved pattern as above

  21. Further Assumptions
  • How many lags matter? In other words, how far back do we really need to go? Call that s.
  • Can we express the impact of those s lags with an even smaller number of unknowns? Any pattern can be approximated with a polynomial of degree r ≤ s (Almon’s Scheme). In Koyck’s Scheme, we will use a geometric rather than a polynomial pattern.

  22. We Rewrite the Model Slightly
  yt = β(w0xt + w1xt-1 + w2xt-2 + ···) + et
  where wi ≥ 0 for i = 0, 1, 2, ···, ∞ and Σ wi = 1

  23. Bring in the Backshift Operator and Assume a Geometric Pattern for the wi
  In backshift notation the lag sum becomes β(w0 + w1B + w2B² + ···)xt
  Now we assume that wi = (1 - λ)λ^i, 0 < λ < 1

  24. Given Those Assumptions
  β Σ (1 - λ)λ^i B^i xt = β(1 - λ)xt / (1 - λB)
  Anyone care to say how we got to this fraction? (The terms λ^i B^i form a geometric series in λB, and summing it gives 1/(1 - λB).)
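The fraction comes from summing a geometric series in λB, and the identity can be checked numerically: applying the weights (1 - λ)λ^i directly gives the same series as solving (1 - λB)zt = (1 - λ)xt recursively. A sketch with an illustrative λ:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, T = 0.6, 300
x = rng.normal(size=T)

# Direct geometric distributed lag: sum_i (1 - lam) * lam^i * x_{t-i}
direct = np.zeros(T)
for t in range(T):
    for i in range(t + 1):
        direct[t] += (1 - lam) * lam**i * x[t - i]

# The fraction (1 - lam)x_t / (1 - lam*B): solve (1 - lam*B)z_t = (1 - lam)x_t recursively
z = np.zeros(T)
z[0] = (1 - lam) * x[0]
for t in range(1, T):
    z[t] = lam * z[t - 1] + (1 - lam) * x[t]

print(np.max(np.abs(direct - z)))   # essentially zero
```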

  25. Substitute That into the Equation for yt
  yt = β(1 - λ)xt / (1 - λB) + et
  Multiplying both sides by (1 - λB):
  yt = λyt-1 + β(1 - λ)xt + et - λet-1
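In this transformed form only yt-1 and xt appear on the right-hand side, which suggests a regression with a lagged dependent variable. The sketch below simulates that equation with a plain white-noise disturbance for simplicity; the slides' actual error is et - λet-1, which complicates estimation, so the OLS step here is purely illustrative. All parameter values are made up:

```python
import numpy as np

rng = np.random.default_rng(4)
beta, lam, T = 3.0, 0.6, 5_000

x = rng.normal(size=T)
u = rng.normal(scale=0.5, size=T)     # simplification: white-noise disturbance
y = np.zeros(T)
y[0] = beta * (1 - lam) * x[0] + u[0]
for t in range(1, T):
    # transformed equation: y_t = lam*y_{t-1} + beta*(1-lam)*x_t + u_t
    y[t] = lam * y[t - 1] + beta * (1 - lam) * x[t] + u[t]

# OLS of y_t on [y_{t-1}, x_t] should roughly return lam and beta*(1 - lam)
Z = np.column_stack([y[:-1], x[1:]])
coef, *_ = np.linalg.lstsq(Z, y[1:], rcond=None)
print(coef)   # approximately [0.6, 1.2], i.e. lam = 0.6 and beta*(1 - lam) = 1.2
```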

  26. Adaptive Adjustment
  Define xt* as the expected level of x (prices, availability, quality, outcome)…
  So consumer behavior should look like
  yt = βxt* + et

  27. Updating Process
  Expectations are updated by a fraction of the discrepancy between the current observation and the previous expectation:
  xt* = xt-1* + γ(xt - xt-1*)
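A small sketch of this updating rule, with an illustrative adjustment fraction and a series of x that jumps from one level to another:

```python
import numpy as np

gamma = 0.4                              # adjustment fraction (illustrative)
x = np.array([10.0] * 5 + [20.0] * 10)   # observed x jumps from 10 to 20

x_star = np.zeros(len(x))
x_star[0] = x[0]                         # start the expectation at the first observation
for t in range(1, len(x)):
    # new expectation = previous expectation + fraction of the surprise
    x_star[t] = x_star[t - 1] + gamma * (x[t] - x_star[t - 1])

print(np.round(x_star, 2))               # the expectation adapts gradually toward 20
```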

  28. Redefine γ in Terms of a New Parameter λ
  Define λ = 1 - γ so that
  xt* = (1 - λ)xt + λxt-1*

  29. More Algebra
  xt* - λxt-1* = (1 - λ)xt
  (1 - λB)xt* = (1 - λ)xt
  xt* = (1 - λ)xt / (1 - λB)

  30. Back to the Model for yt
  yt = βxt* + et = β(1 - λ)xt / (1 - λB) + et
  We end up at the same place as slide 25
