
Meet the professor



Presentation Transcript


  1. Meet the professor • Friday, January 23 at SFU • 4:30 Beer and snacks reception

  2. Spatial Covariance NRCSE

  3. Valid covariance functions • Bochner’s theorem: the class of (continuous stationary) covariance functions is exactly the class of positive definite functions C, i.e. those satisfying Σᵢ Σⱼ aᵢ aⱼ C(sᵢ − sⱼ) ≥ 0 for every finite set of sites sᵢ and real numbers aᵢ • Why? Because Var(Σᵢ aᵢ Z(sᵢ)) = Σᵢ Σⱼ aᵢ aⱼ C(sᵢ − sⱼ) must be nonnegative
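A quick numerical illustration of positive definiteness (not from the slides; the sites, range parameter, and tolerance are assumed for the example): any covariance matrix built from a valid covariance function must be positive semi-definite, which we can check via its eigenvalues.

```python
import numpy as np

# Sketch: build the covariance matrix C(s_i - s_j) = exp(-|s_i - s_j|/phi)
# on 50 random 1-d sites and verify it is positive semi-definite,
# as Bochner's theorem requires of a valid covariance function.
rng = np.random.default_rng(0)
sites = rng.uniform(0, 10, size=50)
phi = 2.0                                     # illustrative range parameter
D = np.abs(sites[:, None] - sites[None, :])   # pairwise distances
C = np.exp(-D / phi)                          # exponential covariance matrix
eigvals = np.linalg.eigvalsh(C)
print(eigvals.min() >= -1e-10)                # True: no negative eigenvalues
```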

  4. Spectral representation • By the spectral representation, any isotropic continuous correlation on Rᵈ is of the form ρ(t) = E[exp(i t eᵀω)] for a random frequency vector ω (e any unit vector) • By isotropy, the expectation depends only on the distribution G of the radial part R = ‖ω‖ • Let Y be uniform on the unit sphere, independent of R. Then ρ(t) = ∫₀^∞ E[exp(i t r eᵀY)] dG(r)

  5. Isotropic correlation • The expectation over the sphere evaluates to Ω_d(u) = (2/u)^((d−2)/2) Γ(d/2) J_((d−2)/2)(u), where J_v(u) is a Bessel function of the first kind and order v • Hence ρ(t) = ∫₀^∞ Ω_d(tr) dG(r) • and in the case d = 2, ρ(t) = ∫₀^∞ J₀(tr) dG(r) (Hankel transform)

  6. The Bessel function J₀ [figure: J₀(t) plotted against t; its minimum value is ≈ –0.403]
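The minimum shown on the slide can be recovered numerically (a sketch; the grid is an arbitrary choice): J₀ attains its global minimum of about −0.403 near t ≈ 3.83, which is why −0.403 is the lower bound for isotropic correlations in the plane.

```python
import numpy as np
from scipy.special import j0

# Evaluate the Bessel function J0 on a fine grid and locate its minimum,
# which is approximately -0.403 (attained near t = 3.83).
t = np.linspace(0, 20, 200001)
m = j0(t).min()
print(round(float(m), 3))   # -0.403
```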

  7. The exponential correlation • A commonly used correlation function is ρ(v) = e^(−v/φ). Corresponds to a Gaussian process with continuous but not differentiable sample paths. • More generally, ρ(v) = c·1(v = 0) + (1 − c)e^(−v/φ) has a nugget c, corresponding to measurement error and spatial correlation at small distances. • All isotropic correlations are a mixture of a nugget and a continuous isotropic correlation.
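A minimal sketch of the nugget form (the parameter values c = 0.2 and φ = 1.5 are assumed for illustration): the nugget creates a discontinuity at distance zero, where the correlation is still 1, but drops to 1 − c for arbitrarily small positive distances.

```python
import numpy as np

def exp_corr_with_nugget(v, c=0.2, phi=1.5):
    """rho(v) = c*1{v=0} + (1-c)*exp(-v/phi); illustrative parameters."""
    v = np.asarray(v, dtype=float)
    return c * (v == 0) + (1 - c) * np.exp(-v / phi)

print(exp_corr_with_nugget(0.0))      # 1.0: full correlation at distance zero
print(exp_corr_with_nugget(1e-9) < 0.81)   # True: jumps down to ~(1-c) just off zero
```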

  8. The squared exponential • Using a Gaussian spectral measure G yields ρ(v) = e^(−(v/φ)²) • corresponding to an underlying Gaussian field with analytic paths. • This is sometimes called the Gaussian covariance, for no really good reason. • A generalization is the power(ed) exponential correlation function, ρ(v) = e^(−(v/φ)^κ), 0 < κ ≤ 2

  9. The spherical • ρ(v) = 1 − 1.5(v/φ) + 0.5(v/φ)³ for v ≤ φ, and 0 beyond the range φ • Corresponding variogram rises from the nugget at the origin to the sill at the range [figure: variogram with nugget, sill and range marked]
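The nugget/sill/range anatomy can be sketched in code (the parameter values are assumed, not from the slides): the variogram jumps to the nugget just off the origin, climbs, and flattens at nugget + sill once the distance exceeds the range.

```python
import numpy as np

def spherical_variogram(h, nugget=0.1, sill=0.9, rng=2.0):
    """Spherical variogram (illustrative parameters):
    gamma(h) = nugget + sill*(1.5*h/rng - 0.5*(h/rng)**3) for 0 < h <= rng,
    gamma(h) = nugget + sill for h > rng, and gamma(0) = 0."""
    h = np.asarray(h, dtype=float)
    g = np.where(h <= rng,
                 nugget + sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
                 nugget + sill)
    return np.where(h == 0, 0.0, g)

print(spherical_variogram(5.0))   # 1.0: nugget + sill, reached beyond the range
```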

  10. The Matérn class • ρ(v) = (2^(1−ν)/Γ(ν)) (v/φ)^ν K_ν(v/φ), where K_ν is a modified Bessel function of the third kind and order ν. It corresponds to a spatial field with ⌈ν⌉ − 1 continuous derivatives • ν = 1/2 is exponential; • ν = 1 is Whittle’s spatial correlation; • ν → ∞ yields the squared exponential.
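A sketch of the Matérn correlation using `scipy.special.kv` (the guard value at t = 0 is an implementation convenience, not part of the formula), verifying numerically that ν = 1/2 reproduces the exponential correlation:

```python
import numpy as np
from scipy.special import kv, gamma

def matern(t, nu=0.5, phi=1.0):
    """Matern correlation: (2**(1-nu)/Gamma(nu)) * (t/phi)**nu * K_nu(t/phi)."""
    t = np.asarray(t, dtype=float)
    u = np.where(t == 0, 1e-12, t / phi)   # avoid the K_nu singularity at 0
    r = 2 ** (1 - nu) / gamma(nu) * u ** nu * kv(nu, u)
    return np.where(t == 0, 1.0, r)

# nu = 1/2 gives exactly the exponential correlation e^(-t/phi):
t = np.linspace(0.1, 5, 50)
print(np.allclose(matern(t, nu=0.5), np.exp(-t)))   # True
```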

  11. Some other covariance/variogram families

  12. Estimation of variograms • Recall γ(h) = ½ Var(Z(x + h) − Z(x)) • Method of moments: γ̂(h) = half the average of all squared pairwise differences (Z(xᵢ) − Z(xⱼ))² at lag h, smoothed over lag bins • Problems: not necessarily a valid variogram • Not very robust

  13. A robust empirical variogram estimator • (Z(x) − Z(y))² is proportional to a χ²₁ variable for Gaussian data • The fourth root is variance stabilizing • Cressie and Hawkins: γ̄(h) = (mean of |Z(xᵢ) − Z(xⱼ)|^(1/2) over the N(h) pairs at lag h)⁴ / (2(0.457 + 0.494/N(h)))
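Both estimators can be sketched side by side for 1-d data (the sites, lag bins, and white-noise test field are all assumptions of the example; for white noise the variogram is flat at the process variance):

```python
import numpy as np

def empirical_variograms(z, x, lags):
    """Method-of-moments and Cressie-Hawkins variogram estimates,
    binned by lag; a sketch for irregularly spaced 1-d data."""
    d = np.abs(x[:, None] - x[None, :])    # pairwise distances
    dz = np.abs(z[:, None] - z[None, :])   # pairwise absolute differences
    iu = np.triu_indices(len(z), k=1)      # each pair once
    d, dz = d[iu], dz[iu]
    mom, ch = [], []
    for lo, hi in zip(lags[:-1], lags[1:]):
        sel = (d >= lo) & (d < hi)
        n = sel.sum()
        mom.append(0.5 * np.mean(dz[sel] ** 2))
        # Cressie-Hawkins: fourth power of the mean square-rooted
        # differences, with the bias correction 2*(0.457 + 0.494/n)
        ch.append(np.mean(np.sqrt(dz[sel])) ** 4 / (2 * (0.457 + 0.494 / n)))
    return np.array(mom), np.array(ch)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 200))
z = rng.standard_normal(200)               # white noise: flat variogram near 1
mom, ch = empirical_variograms(z, x, np.linspace(0, 2, 5))
print(np.all(mom > 0) and np.all(ch > 0))  # True
```

The robust estimator pays off when a few pairs contain outliers: squaring amplifies them, while the square-root transform damps them before averaging.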

  14. Least squares • Minimize the sum of squared differences between the empirical variogram and a parametric model γ(h; θ) • Alternatives: • fourth-root transformation • weighting by 1/γ(h; θ)² • generalized least squares
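A sketch of the weighted variant (model, target values, and starting point are all assumed; the target is a noiseless exponential variogram, so the fit should recover the generating parameters):

```python
import numpy as np
from scipy.optimize import minimize

# Weighted least-squares fit of an exponential variogram model
# gamma(h; s, phi) = s*(1 - exp(-h/phi)), weighting each lag by
# 1/gamma^2 (Cressie-style weights, ignoring pair counts).
h = np.linspace(0.2, 5, 20)
emp = 1.0 * (1 - np.exp(-h / 1.5))   # toy "empirical" values, s=1, phi=1.5

def wls(p):
    s, phi = np.exp(p)               # log-parameters keep s, phi positive
    g = s * (1 - np.exp(-h / phi))
    return np.sum((emp - g) ** 2 / g ** 2)

fit = np.exp(minimize(wls, x0=[0.0, 0.0], method="Nelder-Mead").x)
print(np.round(fit, 2))              # close to [1.0, 1.5]
```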

  15. Maximum likelihood • Z ~ N_n(μ1, Σ), Σ = [C(sᵢ − sⱼ; θ)] = α V(θ) • Maximize the log likelihood ℓ(μ, α, θ) = −½(n log 2πα + log|V(θ)| + (Z − μ1)ᵀ V(θ)⁻¹(Z − μ1)/α) • Given θ, the maximizers μ̂ and α̂ have closed forms, and θ̂ maximizes the profile likelihood

  16. A peculiar ML fit

  17. Some more fits

  18. All together now...

  19. Asymptotics • Increasing domain asymptotics: let region of interest grow. Station density stays the same • Bad estimation at short distances, but effectively independent blocks far apart • Infill asymptotics: let station density grow, keeping region fixed. • Good estimates at short distances. No effectively independent blocks, so technically trickier

  20. Stein’s result • Covariance functions C0 and C1 are compatible if their Gaussian measures are mutually absolutely continuous. Sample at {si, i=1,...,n}, predict at s (a limit point of the sampling points). Let ei(n) be the kriging prediction error at s under Ci, and V0 the variance under C0 of some random variable. • If lim_n V0(e0(n)) = 0, then V0(e1(n))/V0(e0(n)) → 1: prediction under the compatible but “wrong” covariance is asymptotically as good

  21. The Fourier transform

  22. Properties of Fourier transforms • Convolution: the transform of a convolution is the product of the transforms, F(f ∗ g) = F(f)·F(g) • Scaling: F(f(a·))(ω) = |a|⁻ᵈ F(f)(ω/a) • Translation: F(f(· − h))(ω) = e^(−iωᵀh) F(f)(ω)
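The convolution property has a direct discrete analogue that is easy to verify (a sketch; the sequences and length are arbitrary): the DFT of a circular convolution equals the pointwise product of the DFTs.

```python
import numpy as np

# Convolution property of the DFT: transforming, multiplying pointwise,
# and inverting gives the same result as direct circular convolution.
rng = np.random.default_rng(3)
a, b = rng.standard_normal(64), rng.standard_normal(64)
via_fft = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))
direct = np.array([np.sum(a * np.roll(b[::-1], k + 1)) for k in range(64)])
print(np.allclose(via_fft, direct))   # True
```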

  23. Parseval’s theorem • Relates space integration to frequency integration: the total squared variability can be computed in either domain. Decomposes variability across frequencies.
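The discrete version is a one-line check (the series and its length are arbitrary choices for the sketch): the sum of squares in space equals the sum of squared DFT magnitudes divided by N.

```python
import numpy as np

# Discrete Parseval: sum |z|^2 over space equals (1/N) sum |Z_hat|^2
# over frequency, so variability decomposes across frequencies.
z = np.random.default_rng(4).standard_normal(128)
zh = np.fft.fft(z)
print(np.isclose(np.sum(z ** 2), np.sum(np.abs(zh) ** 2) / 128))   # True
```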

  24. Aliasing • Observe the field at a lattice of spacing δ. Since exp(i(ω + 2πm/δ)kδ) = exp(iωkδ) for every integer m, the frequencies ω and ω′ = ω + 2πm/δ are aliases of each other, and indistinguishable. • The highest distinguishable frequency is π/δ, the Nyquist frequency.
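Aliasing is easy to demonstrate directly (spacing, base frequency, and m are assumed values): two cosines whose frequencies differ by a multiple of 2π/δ produce identical samples on the lattice.

```python
import numpy as np

# Two frequencies that differ by 2*pi*m/delta give identical samples
# on a lattice of spacing delta -- they are aliases of each other.
delta, m = 0.5, 3
t = np.arange(40) * delta                       # sampling lattice
omega = 1.7
z1 = np.cos(omega * t)
z2 = np.cos((omega + 2 * np.pi * m / delta) * t)
print(np.allclose(z1, z2))   # True: indistinguishable from the samples
```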

  25. Illustration of aliasing • Aliasing applet

  26. Spectral representation • Stationary processes can be written Z(s) = ∫ exp(iωᵀs) dY(ω) • The spectral process Y has stationary increments, with E|dY(ω)|² = dF(ω) for the spectral measure F • If F has a density f, it is called the spectral density.

  27. Estimating the spectrum • For a process observed on an n × n grid, estimate the spectrum by the periodogram I(ω) = (2πn)⁻² |Σ_s Z(s) exp(−iωᵀs)|² at the Fourier frequencies • Equivalent to the DFT of the sample covariance

  28. Properties of the periodogram • Periodogram values at Fourier frequencies (ωⱼ, ωₖ) are • uncorrelated • asymptotically unbiased • not consistent • To get a consistent estimate of the spectrum, smooth over nearby frequencies
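A sketch of the 2-d periodogram via the FFT (the grid size, white-noise field, and normalization convention are assumptions of the example): with the mean removed, the periodogram values average to the sample variance, i.e. the periodogram distributes the total variability over the Fourier frequencies.

```python
import numpy as np

# Periodogram of an n x n field via the 2-d FFT. With this normalization,
# the mean periodogram value equals the sample variance (a discrete
# Parseval identity), so the spectrum decomposes the variance.
n = 32
z = np.random.default_rng(5).standard_normal((n, n))
z = z - z.mean()                          # remove the sample mean
I = np.abs(np.fft.fft2(z)) ** 2 / n ** 2  # periodogram values
print(np.isclose(I.mean(), z.var()))      # True
```

Smoothing `I` over neighbouring frequencies (e.g. a small moving average in frequency space) is what turns this inconsistent raw estimate into a consistent one.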

  29. Some common isotropic spectra • Squared exponential: f(ω) ∝ exp(−φ²ω²/4) • Matérn: f(ω) ∝ (α² + ω²)^(−ν−d/2)

  30. A simulated process

  31. Thetford canopy heights • 39-year-old thinned commercial plantation of Scots pine in Thetford Forest, UK • Density 1000 trees/ha • 36 m × 120 m area surveyed for crown height • Focus on a 32 × 32 subset

  32. Spectrum of canopy heights

  33. Whittle likelihood • Approximation to the Gaussian negative log likelihood using the periodogram: ℓ_W(θ) = Σ [log f(ω; θ) + I(ω)/f(ω; θ)] • where the sum is over Fourier frequencies, avoiding 0, and f is the spectral density • Takes O(N log N) operations to calculate instead of O(N³)
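The constant-spectrum (white noise) case gives a quick sanity check of the Whittle objective (grid size, field, and normalization are assumed for the sketch): when f(ω) ≡ c, the Whittle estimate of c is the mean periodogram value.

```python
import numpy as np

# Whittle negative log likelihood: sum over nonzero Fourier frequencies
# of log f(w) + I(w)/f(w). For white noise f is a constant c, and the
# minimizer of M*log(c) + sum(I)/c is c = mean(I).
n = 32
z = np.random.default_rng(6).standard_normal((n, n))
z = z - z.mean()
I = np.abs(np.fft.fft2(z)) ** 2 / n ** 2
I = I.ravel()[1:]                         # avoid the zero frequency

def whittle_nll(c):
    return np.sum(np.log(c) + I / c)

c_hat = I.mean()                          # closed-form Whittle estimate here
print(whittle_nll(c_hat) < whittle_nll(2 * c_hat))   # True
```

Since the periodogram costs one FFT, evaluating this objective is O(N log N), versus the O(N³) determinant and solve in the exact Gaussian likelihood.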

  34. Using non-gridded data • Consider a smoothed version of the process, Y(x) = ∫ w(x − u) Z(u) du, for a kernel w • Then Y is stationary with spectral density f_Y(ω) = |ŵ(ω)|² f_Z(ω) • Viewing Y as a lattice process, it has the aliased spectral density Σ_m f_Y(ω + 2πm/δ)

  35. Estimation • Let Ȳ(x) be the average of the observations falling in J_x, where J_x is the grid square with center x and n_x is the number of sites in the square • Define the tapered periodogram of the cell averages Ȳ(x), with taper weights determined by the cell counts n_x • The Whittle likelihood, with this periodogram in place of I, is then approximately the likelihood of the non-gridded data

  36. A simulated example

  37. Estimated variogram
