Econometrics

Lecture Notes Hayashi, Chapter 6f

Large Sample Theory

Sample Mean
  • Given a serially correlated process {y_t}, what are the asymptotic properties of the sample mean?
  • What restrictions on a covariance-stationary process ensure consistency of the sample mean?
Law of Large Numbers
  • Let {y_t} be covariance stationary with mean μ and autocovariances {γ_j}. Then the sample mean ȳ_n ≡ (1/n) Σ_{t=1}^{n} y_t converges in mean square to μ, provided γ_j → 0 as j → ∞.
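A quick simulation makes the claim concrete. The sketch below is illustrative only: the parameter values (φ = 0.8, μ = 2) are assumptions, not from the slides. It draws a long AR(1) path, whose autocovariances γ_j = φ^j γ_0 vanish as j → ∞, and checks that the sample mean is close to μ despite the serial correlation:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, mu, n = 0.8, 2.0, 200_000  # illustrative values, not from the slides

# Simulate a covariance-stationary AR(1): y_t = mu + phi*(y_{t-1} - mu) + eps_t.
# Its autocovariances gamma_j = phi**j * gamma_0 vanish as j grows, which is
# exactly the restriction under which the sample mean is consistent for mu.
eps = rng.standard_normal(n)
y = np.empty(n)
y[0] = mu
for t in range(1, n):
    y[t] = mu + phi * (y[t - 1] - mu) + eps[t]

print(abs(y.mean() - mu))  # close to 0 for large n
```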
Law of Large Numbers
  • The long-run variance, the limit of Var(√n ȳ_n), is Σ_{j=-∞}^{∞} γ_j = γ_0 + 2 Σ_{j=1}^{∞} γ_j, provided {γ_j} is absolutely summable.
  • In terms of the autocovariance-generating function, the long-run variance is g_y(z) evaluated at z = 1, or 2π times the spectrum at frequency zero.
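The two expressions can be checked against each other for a concrete process. The sketch below assumes an AR(1) with illustrative parameters (φ = 0.6, σ² = 1), for which γ_j = σ² φ^|j| / (1 − φ²) and g_y(1) = σ²/(1 − φ)²:

```python
import numpy as np

phi, sigma2 = 0.6, 1.0  # illustrative AR(1) parameters

# For y_t = phi*y_{t-1} + eps_t the autocovariances are
# gamma_j = sigma2 * phi**|j| / (1 - phi**2).
gamma0 = sigma2 / (1 - phi**2)
j = np.arange(1, 2000)
lrv_sum = gamma0 + 2 * np.sum(gamma0 * phi**j)  # gamma_0 + 2 * sum_{j>=1} gamma_j

# Autocovariance-generating function g_y(z) = sigma2 / ((1 - phi*z)(1 - phi/z)),
# evaluated at z = 1:
lrv_agf = sigma2 / (1 - phi) ** 2

print(lrv_sum, lrv_agf)  # both approximately 6.25
```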
Central Limit Theorem
  • CLT for MA()Let yt = m + j=0,…yjet-j where {et} is independent white noise and j=0,…|yj|< (ergodic stationarity). Then
Central Limit Theorem
  • Gordin’s condition for an ergodic stationary process {y_t}:
    • E(y_t²) < ∞
    • E(y_t | y_{t-j}, y_{t-j-1}, …) →_{m.s.} 0 as j → ∞, which implies E(y_t) = 0. Conditional expectations (forecasts) approach the unconditional expectation as less and less information becomes available.
Central Limit Theorem
  • Gordin’s condition (continued):
    • Let I_t = (y_t, y_{t-1}, y_{t-2}, …) and write r_{tj} = E(y_t | I_{t-j}) − E(y_t | I_{t-j-1}). Then y_t = Σ_{j=0}^{∞} r_{tj} (a telescoping sum), and the condition requires Σ_{j=0}^{∞} [E(r_{tj}²)]^{1/2} < ∞.
Central Limit Theorem
  • Gordin’s condition explained:
    • r_{tj} is the revision of the expectation of y_t as the information set grows from I_{t-j-1} to I_{t-j}, and y_t − (r_{t0} + r_{t1} + … + r_{t,j-1}) →_{m.s.} 0 as j → ∞.
    • The telescoping sum indicates how the shocks represented by (r_{t0}, r_{t1}, …) influence the current value of y_t.
    • Shocks that occurred a long time ago must not have a disproportionately large influence. This condition restricts the extent of serial correlation in {y_t}.
Central Limit Theorem
  • Gordin’s condition (example):
    • y_t = φ y_{t-1} + ε_t, |φ| < 1, {ε_t} independent white noise with σ² = Var(ε_t).
    • E(y_t²) < ∞
    • E(y_t | y_{t-j}, y_{t-j-1}, …) = φ^j y_{t-j} →_{m.s.} 0 as j → ∞
    • r_{tj} = φ^j y_{t-j} − φ^{j+1} y_{t-j-1} = φ^j (y_{t-j} − φ y_{t-j-1}) = φ^j ε_{t-j}
    • y_t = Σ_{j=0}^{∞} r_{tj} (telescoping sum) is MA(∞)
    • Σ_{j=0}^{∞} [E(r_{tj}²)]^{1/2} = Σ_{j=0}^{∞} |φ|^j σ = σ/(1 − |φ|) < ∞
Central Limit Theorem
  • CLT for a zero-mean ergodic stationary process: Suppose {y_t} is stationary and ergodic and Gordin’s condition is satisfied. Then E(y_t) = 0, the autocovariances {γ_j} are absolutely summable, and √n ȳ_n →_d N(0, Σ_{j=-∞}^{∞} γ_j).
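As a sanity check, the sketch below simulates a zero-mean AR(1), which satisfies Gordin’s condition per the example above, and compares the Monte Carlo variance of √n ȳ_n with the long-run variance Σ_j γ_j = σ²/(1 − φ)². The parameter and replication choices are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
phi, n, reps = 0.5, 2_000, 2_000  # illustrative values

# Zero-mean AR(1) satisfies Gordin's condition, so sqrt(n)*ybar should be
# approximately N(0, sum_j gamma_j) = N(0, 1/(1 - phi)**2) with Var(eps) = 1.
eps = rng.standard_normal((reps, n))
y = np.zeros((reps, n))
for t in range(1, n):
    y[:, t] = phi * y[:, t - 1] + eps[:, t]

z = np.sqrt(n) * y.mean(axis=1)
print(z.var(), 1 / (1 - phi) ** 2)  # Monte Carlo variance vs. long-run variance
```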
Multivariate Sample Mean
  • The sample mean of a vector process {y_t}, ȳ_n ≡ (1/n) Σ_{t=1}^{n} y_t, converges in mean square to μ = E(y_t) if each diagonal element of Γ_j goes to zero as j → ∞.
Multivariate Sample Mean
  • The long-run covariance matrix of {y_t}, the limit of Var(√n ȳ_n), equals Σ_{j=-∞}^{∞} Γ_j if {Γ_j} is summable.
  • The long-run covariance matrix of {y_t} can be written as G_y(1) = 2π s_y(0) = Σ_{j=-∞}^{∞} Γ_j = Γ_0 + Σ_{j=1}^{∞} (Γ_j + Γ_j′)
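The decomposition Γ_0 + Σ_{j≥1}(Γ_j + Γ_j′) can be checked for a concrete process. The sketch below uses an illustrative VAR(1) (the matrices Φ and Ω are assumptions), for which Γ_j = Φ^j Γ_0 and the MA(∞) representation gives the closed form (I − Φ)⁻¹ Ω (I − Φ′)⁻¹ for the same matrix:

```python
import numpy as np

# Illustrative VAR(1): y_t = Phi @ y_{t-1} + eps_t with Var(eps_t) = Omega.
Phi = np.array([[0.5, 0.1],
                [0.0, 0.3]])
Omega = np.array([[1.0, 0.2],
                  [0.2, 1.0]])
I = np.eye(2)

# Gamma_0 solves the Lyapunov equation Gamma_0 = Phi Gamma_0 Phi' + Omega;
# iterate to convergence (the spectral radius of Phi is < 1).
Gamma0 = Omega.copy()
for _ in range(200):
    Gamma0 = Phi @ Gamma0 @ Phi.T + Omega

# Gamma_j = Phi**j Gamma_0 for j >= 1, so the long-run covariance matrix is
# Gamma_0 + sum_{j>=1} (Gamma_j + Gamma_j').
lrcov = Gamma0.copy()
Pj = Phi.copy()
for _ in range(200):
    Gj = Pj @ Gamma0
    lrcov += Gj + Gj.T
    Pj = Pj @ Phi

# MA(inf) closed form of the same matrix: (I - Phi)^{-1} Omega (I - Phi')^{-1}.
closed = np.linalg.inv(I - Phi) @ Omega @ np.linalg.inv(I - Phi.T)
print(np.max(np.abs(lrcov - closed)))  # close to 0
```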
Multivariate Sample Mean
  • If {y_t} is a vector MA(∞) with absolutely summable coefficient matrices and {ε_t} is vector independent white noise, then √n (ȳ_n − μ) →_d N(0, Σ_{j=-∞}^{∞} Γ_j).
Multivariate Sample Mean
  • Gordin’s condition on ergodic stationary process:
    • E(y_t y_t′) exists and is finite
    • E(y_t | y_{t-j}, y_{t-j-1}, …) →_{m.s.} 0 as j → ∞
    • Σ_{j=0}^{∞} [E(r_{tj}′ r_{tj})]^{1/2} is finite, where r_{tj} = E(y_t | y_{t-j}, y_{t-j-1}, …) − E(y_t | y_{t-j-1}, y_{t-j-2}, …)
Multivariate Sample Mean
  • Suppose Gordin’s condition holds for a vector ergodic stationary process {y_t}. Then E(y_t) = 0, {Γ_j} is absolutely summable, and √n ȳ_n →_d N(0, Σ_{j=-∞}^{∞} Γ_j).