
Second order stationary p.p. dN(t) = ∫ exp{iλt} dZ_N(λ) dt (Kolmogorov)


Presentation Transcript


  1. Second order stationary p.p. dN(t) = ∫ exp{iλt} dZ_N(λ) dt (Kolmogorov). Power spectrum: cov{dZ_N(λ), dZ_N(μ)} = δ(λ − μ) f_NN(λ) dλ dμ; remember Z is complex-valued. Bivariate p.p. process {M,N}: Coherency R_MN(λ) = f_MN(λ) / √{f_MM(λ) f_NN(λ)}, cp. r. Coherence |R_MN(λ)|², cp. r².

  2. The Empirical FT. What is the large sample distribution of the EFT?

  3. The complex normal.

  4. Theorem. Suppose the p.p. N is stationary and mixing; then the empirical FT d^T_N(λ) is asymptotically complex normal.

  5. Proof. Write dN(t) = ∫ exp{iλt} dZ_N(λ) dt (Kolmogorov); then d^T_N(λ) = ∫ δ^T(λ − α) dZ_N(α), with δ^T(λ) = ∫_0^T exp{−iλt} dt. Evaluate the first and second-order cumulants; bound the higher cumulants. The normal is determined by its moments.
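The object in the proof can be checked numerically. A minimal sketch (the helper name `empirical_ft` is hypothetical, not from the slides): the empirical FT of a point process with event times t_j is d^T_N(λ) = Σ_j exp{−iλ t_j}, and at λ = 0 it reduces to the count N(T).

```python
import numpy as np

def empirical_ft(times, lams):
    # d^T_N(lambda) = sum_j exp(-i lambda t_j), summed over the event times t_j
    times = np.asarray(times, dtype=float)
    lams = np.asarray(lams, dtype=float)
    return np.exp(-1j * np.outer(lams, times)).sum(axis=1)

# A Poisson process of rate 5 on [0, T] as a test case:
rng = np.random.default_rng(0)
T = 100.0
times = np.sort(rng.uniform(0.0, T, rng.poisson(5 * T)))
d = empirical_ft(times, np.array([0.0, 0.5, 1.0]))
# at lambda = 0 the EFT equals the number of points N(T)
print(d[0].real, len(times))
```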

  6. Consider

  7. Comments. Already used to study the rate estimate. Tapering modifies δ^T (reducing its side lobes). Get asymptotic independence for different frequencies. The frequencies 2πr/T are special, e.g. δ^T(2πr/T) = 0 for r ≠ 0. Also get asymptotic independence if one considers separate stretches. The p-vector version involves a p by p spectral density matrix f_NN(λ).
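The special role of the frequencies 2πr/T can be verified from the closed form δ^T(λ) = (1 − exp{−iλT})/(iλ). A small check (the helper name `delta_T` is an assumption):

```python
import numpy as np

def delta_T(lam, T):
    # closed form of delta^T(lambda) = int_0^T exp(-i lambda t) dt
    lam = np.asarray(lam, dtype=float)
    out = np.full(lam.shape, T, dtype=complex)      # value at lambda = 0 is T
    nz = lam != 0
    out[nz] = (1.0 - np.exp(-1j * lam[nz] * T)) / (1j * lam[nz])
    return out

T = 50.0
vals = delta_T(2 * np.pi * np.arange(1, 6) / T, T)
print(np.max(np.abs(vals)))  # essentially 0: delta^T vanishes at 2*pi*r/T, r != 0
```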

  8. Estimation of the (power) spectrum. An estimate whose limit is a random variable (hence not consistent).

  9. Some moments. The estimate is asymptotically unbiased. The final term drops out if λ = 2πr/T ≠ 0. Best to correct for the mean, i.e. work with the mean-corrected d^T_N(λ).

  10. Periodogram values are asymptotically independent, since the d^T values are asymptotically independent; the ordinates are approximately (scaled) exponentials. Use these to form estimates by smoothing.
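A sketch of forming an estimate this way, assuming a Poisson process so that the true spectrum is the constant rate/(2π) at λ ≠ 0 (the helper names are hypothetical):

```python
import numpy as np

def pp_periodogram(times, T, nfreq=200):
    # I^T(lambda) = |d^T(lambda)|^2 / (2 pi T) at the frequencies 2 pi r / T
    lams = 2 * np.pi * np.arange(1, nfreq + 1) / T
    d = np.exp(-1j * np.outer(lams, np.asarray(times, dtype=float))).sum(axis=1)
    return lams, np.abs(d) ** 2 / (2 * np.pi * T)

def smoothed_estimate(I, L):
    # average non-overlapping blocks of L adjacent ordinates
    m = (len(I) // L) * L
    return I[:m].reshape(-1, L).mean(axis=1)

rng = np.random.default_rng(1)
T, rate = 200.0, 10.0
times = np.sort(rng.uniform(0.0, T, rng.poisson(rate * T)))
lams, I = pp_periodogram(times, T)
fhat = smoothed_estimate(I, L=10)
print(fhat.mean(), rate / (2 * np.pi))  # both near rate/(2 pi)
```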

  11. Approximate marginal confidence intervals
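One common approximation (not necessarily the exact interval of the slides): after averaging L ordinates, var{log f̂} ≈ 1/L, so a rough 95% interval is symmetric on the log scale:

```python
import numpy as np

def log_ci(fhat, L, z=1.96):
    # var{log fhat} is approximately 1/L when L ordinates are averaged,
    # giving fhat * exp(-z/sqrt(L)) and fhat * exp(+z/sqrt(L)) as endpoints
    half = z / np.sqrt(L)
    return fhat * np.exp(-half), fhat * np.exp(half)

lo, hi = log_ci(np.array([1.6]), L=10)
print(lo, hi)
```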

  12. More on choice of L

  13. Approximation to bias

  14. Indirect estimate. Could approximate the p.p. by a 0-1 time series and use the t.s. estimate. Choice of cells?
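A sketch of the 0-1 approximation (the cell count here is an arbitrary choice, which is exactly the question the slide raises):

```python
import numpy as np

def binned_series(times, T, ncells):
    # 0-1 series: cell r is 1 when it contains at least one point
    idx = np.minimum((np.asarray(times, dtype=float) / T * ncells).astype(int),
                     ncells - 1)
    x = np.zeros(ncells)
    x[idx] = 1.0
    return x

rng = np.random.default_rng(2)
T = 100.0
times = np.sort(rng.uniform(0.0, T, rng.poisson(5 * T)))
x = binned_series(times, T, ncells=1000)
# the usual time-series discrete Fourier transform of the mean-corrected series
X = np.fft.rfft(x - x.mean())
```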

  15. Estimation of a finite dimensional parameter. Approximate likelihood (assuming the I^T values are independent exponentials).
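This approximate likelihood has the Whittle form Σ_r [log f_θ(λ_r) + I^T(λ_r)/f_θ(λ_r)]. A toy sketch with a constant spectrum, where the minimizer should equal the mean ordinate (synthetic data, hypothetical names):

```python
import numpy as np

def whittle_neg_loglik(theta, lams, I, spec):
    # minus log-likelihood treating the ordinates I(lambda_r) as
    # independent exponentials with means f_theta(lambda_r)
    f = spec(theta, lams)
    return np.sum(np.log(f) + I / f)

# hypothetical example: constant spectrum f_theta(lambda) = theta
spec = lambda theta, lams: np.full_like(lams, theta)
rng = np.random.default_rng(3)
lams = np.linspace(0.1, 5.0, 100)
I = rng.exponential(1.6, size=lams.size)      # synthetic ordinates, true f = 1.6
grid = np.linspace(0.5, 3.0, 501)
theta_hat = grid[int(np.argmin([whittle_neg_loglik(t, lams, I, spec)
                                for t in grid]))]
print(theta_hat, I.mean())  # for constant f the minimizer is the mean ordinate
```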

  16. Bivariate case.

  17. Cross-periodogram. Smoothed periodogram.
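A sketch of a smoothed cross-periodogram coherence estimate: by Cauchy-Schwarz the estimate lies in [0, 1], and for N a pure delay of M it should be near 1 (helper names and smoothing scheme are assumptions):

```python
import numpy as np

def eft(times, lams):
    return np.exp(-1j * np.outer(lams, np.asarray(times, dtype=float))).sum(axis=1)

def coherence(tM, tN, T, L=10, nfreq=200):
    # smoothed cross- and auto-periodograms, then |R_MN|^2 = |f_MN|^2/(f_MM f_NN)
    lams = 2 * np.pi * np.arange(1, nfreq + 1) / T
    dM, dN = eft(tM, lams), eft(tN, lams)
    smooth = lambda x: x[: (len(x) // L) * L].reshape(-1, L).mean(axis=1)
    fMN = smooth(dM * np.conj(dN) / (2 * np.pi * T))
    fMM = smooth(np.abs(dM) ** 2 / (2 * np.pi * T))
    fNN = smooth(np.abs(dN) ** 2 / (2 * np.pi * T))
    return np.abs(fMN) ** 2 / (fMM * fNN)

rng = np.random.default_rng(4)
T = 200.0
tM = np.sort(rng.uniform(0.0, T, rng.poisson(8 * T)))
tN = np.sort(np.clip(tM + 0.5, 0.0, T))   # N is (essentially) M delayed by 0.5
c = coherence(tM, tN, T)
print(c.min(), c.max())  # in [0, 1], and close to 1 for this pure delay
```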

  18. Complex Wishart

  19. Predicting N via M

  20. Plug-in estimates.

  21. Large sample distributions. var log|A^T(λ)| ≈ [|R(λ)|^{-2} − 1]/L; var arg A^T(λ) ≈ [|R(λ)|^{-2} − 1]/L.

  22. Networks. Partial spectra: trivariate (M,N,O). Remove the linear time invariant effect of M from N and O and examine the coherency of the residuals.

  23. Point process system. An operation carrying a point process, M, into another, N: N = S[M], S = {input, mapping, output}. Pr{dN(t) = 1 | M} = {μ + ∫ a(t − u) dM(u)} dt. System identification: determining the characteristics of a system from inputs and corresponding outputs {M(t), N(t); 0 ≤ t < T}. Like regression vs. bivariate.

  24. Realizable case. a(u) = 0 for u < 0. A(λ) is of special form, e.g. N(t) = M(t − τ) gives arg A(λ) = −λτ.
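A sketch of the plug-in transfer function estimate A^T(λ) = f̂_MN(λ)/f̂_MM(λ) for a pure-delay system, recovering arg A(λ) ≈ −λτ from the phase slope (the conventions and helper names here are assumptions, not the slides' notation):

```python
import numpy as np

def eft(times, lams):
    return np.exp(-1j * np.outer(lams, np.asarray(times, dtype=float))).sum(axis=1)

def transfer_estimate(tM, tN, T, L=10, nfreq=100):
    # plug-in A^T = fhat_MN / fhat_MM, with the conjugate on the input
    # transform so that dN is approximately A(lambda) dM
    lams = 2 * np.pi * np.arange(1, nfreq + 1) / T
    dM, dN = eft(tM, lams), eft(tN, lams)
    smooth = lambda x: x[: (len(x) // L) * L].reshape(-1, L).mean(axis=1)
    fMN = smooth(np.conj(dM) * dN)
    fMM = smooth(np.abs(dM) ** 2)
    return smooth(lams), fMN / fMM

rng = np.random.default_rng(5)
T, tau = 400.0, 1.0
tM = np.sort(rng.uniform(0.0, T - tau, rng.poisson(5 * (T - tau))))
tN = tM + tau                         # pure delay: N(t) = M(t - tau)
lam_mid, A = transfer_estimate(tM, tN, T)
# arg A(lambda) should be approximately -lambda * tau
phase_slope = np.polyfit(lam_mid, np.unwrap(np.angle(A)), 1)[0]
print(phase_slope)  # near -tau
```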

  25. Advantages of the frequency domain approach. Techniques for many stationary processes look the same; approximately i.i.d. sample values; assessing models (character of departure); time varying variant; ...
