
Laboratory in Oceanography: Data and Methods




  1. Laboratory in Oceanography: Data and Methods Empirical Orthogonal Functions MAR599, Spring 2009 Anne-Marie E.G. Brunner-Suzuki

  2. Motivation • Miles’s class • Distinguish patterns/noise • Reduce dimensionality • Prediction • Smoothing

  3. The Goal • Separate the data into time and space components • Filter out the noise and reveal “hidden” structure

  4. Matlab Example 1

  5. Data • t: time; x_t is one “map” at time t. • There are n timesteps and p different measurements at each timestep.

  6. Matlab Example 2 – artificial signal
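  The slide only names the demo; below is a minimal sketch of the kind of artificial signal one might build for it: two standing spatial patterns, each oscillating at its own period, plus noise. All names and parameter values are illustrative assumptions, not the original class example.

      nx = 50; nt = 200;                 % p = 50 spatial points, n = 200 timesteps
      x  = linspace(0, pi, nx);          % spatial coordinate (1 x nx)
      t  = (1:nt)';                      % time coordinate (nt x 1)
      mode1 = sin(x);                    % first standing pattern
      mode2 = sin(2*x);                  % second standing pattern
      X = cos(2*pi*t/20) * mode1 ...     % each term is (nt x 1)*(1 x nx) = nt x nx
        + 0.5*cos(2*pi*t/50) * mode2 ...
        + 0.2*randn(nt, nx);             % additive noise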

  7. Summary • EOF analysis lets us separate an ensemble of data into k different modes. Each mode has a ‘space’ (EOF = u) and a ‘time’ (EC = c) component. • Pre-treating the data (e.g. taking out the temporal/spatial mean) can be useful in finding “hidden” structures. • But all the information is contained in the data. • It is “just” a mathematical construct. We, the researchers, are responsible for finding appropriate explanations.

  8. Naming convention • Empirical Orthogonal Functions Analysis • Principal Component Analysis • Discrete Karhunen–Loève Functions • Hotelling transform • Proper orthogonal decomposition

  9. How to deal with gaps? • Ignore them; leave them be. • Introduce randomly generated data to fill the gaps, and test over M realizations. • Fill the gaps in each data series using e.g. optimal interpolation.

  10. Next time • Some math: • What happens inside the black box? • How do we know how many modes are significant? • Some problems and pitfalls • More advanced EOF • Matlab’s own function

  11. References • Preisendorfer • Storch • Hannachi

  12. Laboratory in Oceanography: Data and Methods Empirical Orthogonal Functions Part II MAR599, Spring 2009 Anne-Marie E.G. Brunner-Suzuki

  13. Pre-treating the data X: Shaping the data set: • X has n timesteps and p different measurements: [n,p] = size(X); • Use ‘reshape’ to convert from 3D to 2D: X = reshape(X3D, [nx*ny ntimes]); this gives space x time, so transpose to get time x space. • Remove the mean from the data, so each column (= timeseries) has zero mean: X = detrend(X,'constant'); A sketch of these steps follows below.
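  A minimal sketch of these pre-treatment steps, assuming the raw field X3D is stored as ny x nx x ntimes (the dimension order is an assumption):

      [ny, nx, ntimes] = size(X3D);
      X = reshape(X3D, [ny*nx, ntimes]);   % space x time
      X = X';                              % transpose to time x space (n x p)
      X = detrend(X, 'constant');          % zero time mean for every column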

  14. How to do it. • Form the covariance matrix: Cx = X’X / (n−1). • Solve the eigenvalue problem: Cx R = R Λ. • Λ is a diagonal matrix containing all the eigenvalues λ of Cx. • The columns r_i of R are the eigenvectors of Cx, each corresponding to its λ_i. • We pick the r_i to be our EOF patterns: R = EOFs. • We arrange the eigenvalues λ1 > λ2 > … > λp and reorder the r_i correspondingly.

  15. Eigenvectors & Eigenvalues • Cx R = R Λ • Here, R is a set of vectors, that are transformed by Cx into the same vectors except a multiplication factor Λ. R changes in length, but not in direction. • These R are called eigenvectors. The Λ are called eigenvalues. • Also, because Cx is hermitian (diagonally symmetric: Cx’=Cx) and Cx has rank p, there will be p eigenvectors. • Eigenvectors are always orthogonal.

  16. Together, all EOFs explain 100% of the variance; each mode explains part of the total variance. • All eigenvectors are orthogonal to each other; hence Empirical ORTHOGONAL Functions. • To see how the EOFs evolve in time, we compute the ‘expansion coefficients’ or amplitudes: EC_i = X EOF_i;

  17. In Matlab: • Shape your data into time x space. • Demean your data: X = detrend(X,'constant'); • Compute the covariance: Cx = cov(X); • Compute eigenvectors and eigenvalues: [EOFs, L] = eig(Cx); • Sort according to size: Matlab’s eig returns ascending order, so flip to descending. • Compute the EC: EC1 = X * EOFs(:,1); • Compute the variance explained: var_expl = diag(L)/trace(L); A consolidated sketch follows below.
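  Putting the steps together, a hedged end-to-end sketch of the covariance/eig route (X is the demeaned time x space matrix from slide 13; the variable names are assumptions):

      X  = detrend(X, 'constant');               % remove the time mean of each column
      Cx = cov(X);                               % p x p covariance matrix
      [EOFs, L] = eig(Cx);                       % eigenvectors / eigenvalues
      [lambda, idx] = sort(diag(L), 'descend');  % eig returns ascending order: flip
      EOFs = EOFs(:, idx);                       % reorder the patterns to match
      EC   = X * EOFs;                           % expansion coefficients (n x p)
      var_expl = lambda / sum(lambda);           % fraction of variance per mode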

  18. Normalization • Often the EOFs are normalized so that their highest value is 1 (or 100). Since X = EOF × EC, the EC must be rescaled correspondingly so that the reconstruction remains valid. A sketch is given below.
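  A sketch of one such normalization: scale each EOF so its largest absolute value is 1, and rescale the matching EC so the reconstruction X = EC * EOFs' is unchanged:

      s    = max(abs(EOFs), [], 1);                % 1 x p row of per-mode maxima
      EOFs = EOFs ./ repmat(s, size(EOFs,1), 1);   % scale the patterns down
      EC   = EC .* repmat(s, size(EC,1), 1);       % scale the amplitudes up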

  19. How to understand this? • Let’s assume we only have 2 samples, xa and ya, that evolve in time. • If all the observations were random, they would form a shapeless blob in this 2-D space; any regularities show up as directionalities in the blob. • EOF analysis aims to find these directionalities by defining a new coordinate system whose axes lie right along them.

  20. With p observations we have a p-dimensional space, and we want to find every cluster by laying a new coordinate system (basis) through the data. • The EOF method takes all the variability in a time-evolving field and breaks it into a few standing oscillations, each with a time series to go with it. The EC show how the EOF modes vary in time. A 2-D illustration follows below.
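  A 2-D illustration of this picture: a correlated “blob” of synthetic points, with the two eigenvectors laid through it as the new axes (all numbers illustrative):

      xa = randn(500, 1);
      ya = 0.8*xa + 0.4*randn(500, 1);          % ya partly follows xa: an elongated blob
      [R, L] = eig(cov([xa ya]));               % 2 x 2 problem: two eigenvectors
      plot(xa, ya, '.'); hold on
      quiver([0 0], [0 0], R(1,:), R(2,:), 2)   % new coordinate axes along the blob
      axis equal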

  21. A word about removing the mean • Removing the time mean has nothing to do with the process of finding eigenvectors, but it allows us to interpret Cx as a covariance matrix, and hence to understand our results. Strictly speaking, one can find EOFs without removing any mean.

  22. EOF via SVD • SVD: Singular Value Decomposition • It decomposes any n x p matrix X into the form X = U S V’, where • U is an n x n orthonormal matrix, • S is a diagonal n x p matrix with elements s_i,i on the diagonal, called the singular values, • the columns of U and V contain the left and right singular vectors of X.

  23. Connecting SVD and EOF • X is the demeaned data matrix as before; write Cx = X’X (dropping the 1/(n−1) factor, which rescales the eigenvalues but not the eigenvectors). • 1. Cx = X’X = (U S V’)’ (U S V’) = V S’ U’ U S V’ = V S’S V’ • 2. Cx = EOFs Λ EOFs’ (the eigenvalue problem rewritten) • Comparing 1. & 2.: • EOFs = V (at least almost) • Λ = S’S: the squared singular values are the eigenvalues. • The columns of V contain the eigenvectors of Cx = X’X; our EOFs. • The columns of U contain the eigenvectors of X X’, which are also the normalized time series.

  24. How to do it • Use SVD to find U, S and V such that X = U S V’. • Compute the eigenvalues of Cx from the singular values. • The eigenvectors of Cx are the column vectors of V. • We never have to actually compute Cx!

  25. In Matlab • Shape your data into time x space. • Demean your data: X = detrend(X,'constant'); • Perform the SVD: [U, S, V] = svd(X); • Compute the eigenvalues: EVal = diag(S).^2; • Compute the explained variance: expl_var = EVal/sum(EVal); • The EOFs are the column vectors of V: EOFs = V; • Compute the expansion coefficients: EC = U*S; A consolidated sketch follows below.
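  A consolidated sketch of the SVD route (again assuming X is the demeaned time x space matrix; svd(X,'econ') avoids forming the full n x n U):

      X = detrend(X, 'constant');          % demean each column
      [U, S, V] = svd(X, 'econ');          % X = U*S*V'
      EOFs = V;                            % columns of V are the spatial patterns
      EC   = U * S;                        % expansion coefficients
      EVal = diag(S).^2;                   % eigenvalues of X'X
      expl_var = EVal / sum(EVal);         % fraction of variance per mode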

  26. The two techniques • There are basically two techniques: • computing eigenvectors and eigenvalues of the covariance matrix, or • Singular Value Decomposition (SVD) of the data. • Both methods give the same results up to sign. Check it out (see the sketch below)! • However: • there are some differences in dimensionality, and • SVD is much faster, especially when your data exceed about 1000 x 1000 points.
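  A small check, on toy data, that the two routes agree up to the sign of each pattern:

      X = detrend(randn(100, 5), 'constant');   % toy data: n = 100, p = 5
      [U, S, V]  = svd(X, 'econ');              % SVD route
      [R, L]     = eig(X' * X);                 % eig route on Cx = X'X
      [lam, idx] = sort(diag(L), 'descend');    % eig sorts ascending: flip
      R = R(:, idx);
      disp(max(abs(abs(V(:)) - abs(R(:)))))     % ~0: same patterns up to sign
      disp(max(abs(diag(S).^2 - lam)))          % squared singular values = eigenvalues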

  27. Testing Domain Dependency • If the first EOF is unimodal and the second bimodal, the EOF analysis might be domain dependent. • Testing: • Split your domain into two sections (e.g. North and South). • Repeat the EOF analysis for each sub-domain. • Are the same results (unimodal and bimodal structures) obtained for each sub-domain? • If yes: the EOF analysis is domain dependent, and interpretation becomes difficult or impossible. • A possible solution are “rotated EOFs” (REOF): after an EOF analysis, some of the eigenvectors are rotated.

  28. EOF from Hannachi example • Winter (DJF) monthly SLP over the Northern Hemisphere (NH) from the NCEP/NCAR reanalysis, January 1948 to December 2000. • The mean annual cycle was removed.

  29. [Figure: leading EOF patterns of NH winter SLP. Positive contours solid, negative contours dashed; the EOFs have been multiplied by 100.]

  30. Selection Rules • Visual inspection of the eigenvalue spectrum. • A more formal criterion follows on the next slide.

  31. North’s Rule of Thumb • North et al. defined “typical errors” of the eigenvalues λ: δλ_k ≈ λ_k sqrt(2/n). • The corresponding “typical error” of an eigenvector ψ_k is dominated by its closest neighbor: δψ_k ≈ (δλ_k / (λ_j − λ_k)) ψ_j, where λ_j is the eigenvalue closest to λ_k. • n is the number of degrees of freedom, which is generally less than the number of data points. • If two modes are too close (δλ comparable to their spacing), they are called degenerate.
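  A sketch of the rule in code, assuming lambda is the descending vector of eigenvalues and nstar an estimate of the effective degrees of freedom (both assumed given):

      dlambda = lambda * sqrt(2/nstar);       % "typical error" of each eigenvalue
      gap     = -diff(lambda);                % spacing to the next (smaller) eigenvalue
      degenerate = dlambda(1:end-1) >= gap;   % true where neighboring modes overlap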

  32. Complex EOF • Allows analysis of propagating signals. • Analyzes a set of time series by creating a phase lag among them: a 90-degree phase shift is added in complex space using the Hilbert transform (a sketch follows below). • It is a cool technique, but pretty complex.
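  A minimal sketch of the idea, assuming the Signal Processing Toolbox function hilbert is available (it operates column-wise on a matrix):

      Xc = hilbert(detrend(X, 'constant'));   % 90-degree-shifted copy as imaginary part
      Cc = Xc' * Xc / (size(Xc,1) - 1);       % Hermitian covariance (' conjugates)
      [Rc, Lc] = eig(Cc);                     % complex EOFs carry amplitude and phase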

  33. Monte Carlo • Create surrogate data: a randomized data set obtained by scrambling the monthly maps in the time domain, in order to break the chronological order. • Compute the EOFs of the scrambled dataset and compare them with those of the original (see the sketch below).
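  A sketch of the scrambling step. One caveat: permuting whole maps in time leaves cov(X) unchanged, so here each time series is shuffled with its own permutation, which actually breaks the cross-covariances. M and the percentile threshold are assumptions:

      [n, p] = size(X);
      M = 100;                                   % number of surrogate realizations
      lam_surr = zeros(p, M);
      for m = 1:M
          Xs = X;
          for j = 1:p
              Xs(:, j) = X(randperm(n), j);      % independent shuffle per series
          end
          lam_surr(:, m) = sort(eig(cov(Xs)), 'descend');
      end
      thresh = prctile(lam_surr, 95, 2);         % modes above this are candidates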

  34. Matlab’s own functions • PRINCOMP: [COEFF, SCORE, latent] = princomp(X), i.e. [EOFs, EC, EigVal] = princomp(data); • The EOFs are columns, and so are the ECs. • PCACOV operates on a covariance matrix, not on the raw data: [COEFF, latent, explained] = pcacov(V), i.e. [EOFs, EigVal, expl_var] = pcacov(cov(data)); • I believe this uses SVD internally. A short usage sketch follows below.
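  A short usage sketch relating the toolbox output to the quantities above (X demeaned, time x space):

      [COEFF, SCORE, latent] = princomp(X);   % COEFF ~ EOFs, SCORE ~ ECs, latent ~ eigenvalues
      [C2, lat2, expl] = pcacov(cov(X));      % the same patterns from the covariance matrix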

  35. Assumptions we made • Orthogonality • Normally distributed data • High signal-to-noise ratio • Standing patterns only • “The mean” • Problems that might occur: • No physical interpretation possible • Degenerate modes • Domain dependency

  36. A warning from von Storch and Navarra: • “I have learned the following rule to be useful when dealing with advanced methods. Such methods are often needed to find a signal in a vast noisy space, i.e. the needle in the haystack. But after having the needle in our hand, we should be able to identify the needle by simply looking at it. Whenever you are unable to do so there is a good chance that something is rotten in the analysis.”

  37. References • R. W. Preisendorfer: Principal Component Analysis in Meteorology and Oceanography. Elsevier Science, 1988. • Hans von Storch and Francis W. Zwiers: Statistical Analysis in Climate Research. Cambridge University Press, 2002. • North, G. R., T. L. Bell, R. F. Cahalan, and F. J. Moeng: Sampling errors in the estimation of empirical orthogonal functions. Mon. Wea. Rev., 110, 699–706, 1982. • Hannachi, A., I. T. Jolliffe and D. B. Stephenson: Empirical orthogonal functions and related techniques in atmospheric science: A review. International Journal of Climatology, 27, 1119–1152, 2007.
