
Advances in Random Matrix Theory (stochastic eigenanalysis)

Discover the fascinating world of stochastic eigenanalysis and its applications to engineering and finance. Learn about the beautiful mathematics of random matrix theory and explore emerging computational algorithms and statistical techniques. Open questions and potential new applications await!




  1. Advances in Random Matrix Theory (stochastic eigenanalysis) Alan Edelman, MIT: Dept. of Mathematics; Computer Science & AI Laboratory

  2. Stochastic Eigenanalysis • Counterpart to stochastic differential equations • Emphasis on applications to engineering & finance • Beautiful mathematics: Random Matrix Theory, Free Probability • Raw material from: Physics, Combinatorics, Numerical Linear Algebra, Multivariate Statistics

  3. Scalars, Vectors, Matrices • Mathematics: notation = power & less ink! • Computation: use those caches! • Statistics: classical, multivariate, modern Random Matrix Theory • The Stochastic Eigenproblem: mathematics of probabilistic linear algebra, emerging computational algorithms, emerging statistical techniques • Ideas from numerical computation that stand the test of time are right for mathematics!

  4. Open Questions • Find new applications of spacing (or other) statistics • Cleanest derivation of Tracy-Widom? • “Finite” free probability? • Finite meets infinite • Muirhead meets Tracy-Widom • Software for stochastic eigenanalysis

  5. Wigner’s Semi-Circle • The classical & most famous random eigenvalue theorem • Let S = a random symmetric Gaussian matrix • MATLAB: A=randn(n); S=(A+A')/2; • S is known as the Hermite ensemble • The normalized eigenvalue histogram is a semicircle • Precise statements require n → ∞ scaling, etc.

  6. Wigner’s Semi-Circle (build of slide 5; annotation: randn(n) is an n × n matrix of iid standard normals)

  7. Wigner’s Semi-Circle (build of slide 5)
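The MATLAB experiment above translates directly to Python; a minimal NumPy sketch (the function name and the sqrt(n) normalization shown in the comment are mine, not from the talk):

```python
import numpy as np

def goe_eigenvalues(n, seed=None):
    """Eigenvalues of S = (A + A')/2, A an n x n matrix of iid N(0,1) entries."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n))
    s = (a + a.T) / 2
    return np.linalg.eigvalsh(s)

# With this normalization, eigenvalues / sqrt(n) fill the semicircle on
# [-sqrt(2), sqrt(2)] as n grows; histogram to see it:
# np.histogram(goe_eigenvalues(1000) / np.sqrt(1000), bins=50)
```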

  8. Wigner’s original proof • Compute E(tr A^(2p)) as n → ∞ • Terms with too many distinct indices have some matrix entry raised to the first power; these vanish since the entries have mean 0 • Terms with too few distinct indices are not numerous enough to matter as n → ∞ • When all is said and done, only a Catalan number is left for the moments: Cp = (2p choose p)/(p+1) • The semicircle is the only distribution whose 2p-th moments are the Catalan numbers
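A quick numerical check of the moment claim: the 2p-th moment of the semicircle density sqrt(4 − x²)/(2π) should equal the Catalan number Cp. A sketch (helper names are mine):

```python
import numpy as np
from math import comb

def catalan(p):
    """C_p = binom(2p, p) / (p + 1): 1, 1, 2, 5, 14, ..."""
    return comb(2 * p, p) // (p + 1)

def semicircle_moment(p, npts=400001):
    """Numerically integrate x^(2p) * sqrt(4 - x^2)/(2*pi) over [-2, 2]."""
    x = np.linspace(-2.0, 2.0, npts)
    f = np.sqrt(4.0 - x**2) / (2.0 * np.pi)
    dx = x[1] - x[0]
    return float(np.sum(x ** (2 * p) * f) * dx)
```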

  9. Finite versions of the semicircle (plots for n = 2, 3, 4, 5)

  10. Finite versions (plots for n = 2, 3, 4, 5). The area under each curve on (−∞, x) can be expressed as sums of probabilities that certain tridiagonal determinants are positive.

  11. Wigner’s Semi-Circle, general β • Real numbers (x): β=1 • Complex numbers (x+iy): β=2 • Quaternions (x+iy+jz+kw): β=4 • β=2½? (x+iy+jz?) • Defined through the joint eigenvalue density: const × ∏i<j |xi−xj|^β × ∏i exp(−xi²/2) • β = repulsion strength • β=0: “no interference,” spacings are Poisson • Classical research covered only β=1, 2, 4, missing the link to Poisson, continuous techniques, etc.

  12. Largest eigenvalue “convection-diffusion?”

  13. Haar or not Haar? “Uniform distribution on orthogonal matrices” • Gram-Schmidt, or [Q,R]=qr(randn(n))?

  14. Haar or not Haar? “Uniform distribution on orthogonal matrices” • Gram-Schmidt, or [Q,R]=qr(randn(n))?  Eigenvalues wrong!
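One standard fix for the wrong eigenvalues is the sign correction: multiply Q's columns by the signs of R's diagonal so that R has a positive diagonal, which makes the QR output Haar-distributed. A NumPy sketch (function name mine):

```python
import numpy as np

def haar_orthogonal(n, seed=None):
    """Haar-distributed Q from O(n): QR-factor a Gaussian matrix, then fix
    the sign ambiguity so that R has a positive diagonal."""
    rng = np.random.default_rng(seed)
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    return q * np.sign(np.diag(r))   # without this, eigenvalues come out wrong

# q.T @ q is (numerically) the identity; eigenvalues lie on the unit circle.
```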

  15. Longest Increasing Subsequence (n=4) (Baik-Deift-Johansson; Okounkov’s proof) Green: 4, Yellow: 3, Red: 2, Purple: 1
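The LIS lengths behind the Baik-Deift-Johansson statistics can be computed by patience sorting; a sketch (helper names mine) that also tallies the distribution over all 24 permutations of {1, 2, 3, 4}:

```python
from bisect import bisect_left
from collections import Counter
from itertools import permutations

def lis_length(seq):
    """Length of the longest increasing subsequence (patience sorting, O(n log n))."""
    piles = []
    for x in seq:
        i = bisect_left(piles, x)
        if i == len(piles):
            piles.append(x)   # start a new pile
        else:
            piles[i] = x      # place on the leftmost feasible pile
    return len(piles)

# Distribution of LIS lengths over all 24 permutations of {1, 2, 3, 4}
counts = Counter(lis_length(p) for p in permutations(range(4)))
```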

  16. Bulk spacing statistics • Bus wait times in Mexico • Energy levels of heavy atoms • Parked Cars in London • Zeros of Riemann zeta • Mice Brain Wave Spikes “convection-diffusion?” Telltale Sign: Repulsion + optimality
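Repulsion can be seen directly in nearest-neighbor eigenvalue spacings. A rough sketch (the restriction to the middle of the spectrum and the unit-mean normalization are my crude stand-in for proper unfolding):

```python
import numpy as np

def bulk_spacings(n=500, seed=None):
    """Nearest-neighbor eigenvalue spacings from the middle half of a GOE
    spectrum, normalized to unit mean (a crude stand-in for unfolding)."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n))
    ev = np.linalg.eigvalsh((a + a.T) / 2)   # sorted ascending
    mid = ev[n // 4 : 3 * n // 4]            # stay in the bulk
    s = np.diff(mid)
    return s / s.mean()

# Repulsion: tiny spacings are rare, unlike Poisson, whose density peaks at 0.
```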

  17. “What’s my β?” web page • Cy’s tricks: • Maximum likelihood estimation • Bayesian probability • Kernel density estimation • Epanechnikov kernel • Confidence intervals • http://people.csail.mit.edu/cychan/BetaEstimator.html

  18. Open Questions (repeated; see slide 4)

  19. Everyone’s favorite tridiagonal: the second-difference matrix tridiag(1, −2, 1), the discretization of (1/n²) d²/dx² on a grid with spacing 1/n

  20. Everyone’s favorite tridiagonal, plus noise: (1/n²) d²/dx² + β^(−1/2) dW, discretized as the second-difference matrix plus (βn)^(−1/2) times random noise

  21. Stochastic operator limit: d²/dx² − x + (2/√β) dW. Finite model: the β-Hermite tridiagonal matrix
      H_n^β ~ (1/√2) ×
        [ N(0,2)      χ_(n−1)β
          χ_(n−1)β    N(0,2)     χ_(n−2)β
              …           …          …
                      χ_2β       N(0,2)    χ_β
                                 χ_β       N(0,2) ],
      with H_n^β ≈ H_n^∞ + (2/√β) G_n. Cast of characters: Dumitriu, Sutton, Rider
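The tridiagonal model on this slide is easy to sample. A NumPy sketch (function name mine; normalization as on the slide: N(0,2) diagonal, χ off-diagonals, all over √2):

```python
import numpy as np

def beta_hermite(n, beta, seed=None):
    """Dumitriu-Edelman tridiagonal beta-Hermite matrix: N(0,2) on the
    diagonal, chi_{(n-1)beta}, ..., chi_beta off the diagonal, all over sqrt(2)."""
    rng = np.random.default_rng(seed)
    d = rng.normal(0.0, np.sqrt(2.0), n)
    df = beta * np.arange(n - 1, 0, -1)          # (n-1)*beta, ..., beta
    e = np.sqrt(rng.chisquare(df))               # chi = sqrt(chi-squared)
    h = np.diag(d) + np.diag(e, 1) + np.diag(e, -1)
    return h / np.sqrt(2.0)
```

For β = 1, 2, 4 this reproduces the GOE/GUE/GSE eigenvalue distributions with a matrix that is only tridiagonal, and it makes sense for any β > 0.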

  22. Open Questions (repeated; see slide 4)

  23. Is it really the random matrices? • The excitement is that the random matrix statistics are everywhere • Random matrices, properly tridiagonalized, are discretizations of stochastic differential operators! • Eigenvalues of SDOs are not as well studied • Deep down, I believe this is the important mechanism behind the spacings, not the random matrices! (See Brian Sutton’s thesis and Brian Rider’s papers: the connection to Schrödinger operators) • For other statistics, though, deep down it is the matrices

  24. Open Questions (repeated; see slide 4)

  25. Open Questions (repeated; see slide 4)

  26. Free Probability • Free Probability (name refers to “free algebras” meaning no strings attached) • Gets us past Gaussian ensembles and Wishart Matrices

  27. The flipping-coins example • Classical probability: a coin is +1 or −1 with p = 0.5 • x: −1 or +1 (50% / 50%); y: −1 or +1 (50% / 50%) • x+y: −2, 0, +2 (with probabilities ¼, ½, ¼)

  28. The flipping-coins example, free version • Replace the coins by matrices with eig(A): −1, +1 and eig(B): −1, +1 (50% / 50%) • The free analogue of the sum is eig(A + QBQ'), Q random orthogonal • Its spectrum spreads across (−2, +2) rather than sitting on the three classical atoms −2, 0, +2
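A sketch of the free version of the experiment (function name mine), with A = B = diag(±1), half each, and Q Haar orthogonal; the eigenvalues of A + QBQ' fill (−2, 2) instead of the three classical atoms:

```python
import numpy as np

def free_coin_sum(n, seed=None):
    """Eigenvalues of A + Q B Q' with A = B = diag(+-1) (half each) and Q Haar
    orthogonal. The spectrum spreads over (-2, 2) instead of {-2, 0, +2}."""
    rng = np.random.default_rng(seed)
    a = np.diag([1.0] * (n // 2) + [-1.0] * (n - n // 2))
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    q = q * np.sign(np.diag(r))                  # sign fix: Q is now Haar
    return np.linalg.eigvalsh(a + q @ a @ q.T)
```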

  29. From Finite to Infinite

  30. From Finite to Infinite: Gaussian (m=1)

  31. From Finite to Infinite: Gaussian (m=1), wiggly

  32. From Finite to Infinite: Gaussian (m=1), wiggly, Wigner

  33. Semi-circle law for different betas

  34. Open Questions (repeated; see slide 4)

  35. Matrix Statistics • Many worked out in the 1950s and 1960s • Muirhead, “Aspects of Multivariate Statistical Theory” • Are two covariance matrices equal? • Does my matrix equal this matrix? • Is my matrix a multiple of the identity? • Answers require computation of hypergeometric functions of matrix argument • Long thought computationally intractable

  36. The special functions of multivariate statistics • Hypergeometric functions of matrix argument • β=2: Schur polynomials • Other values: Jack polynomials • Orthogonal polynomials of matrix argument • Begin with a weight w(x) on an interval I • ∫ pκ(x) pλ(x) Δ(x)^β ∏i w(xi) dxi = δκλ, where Δ(x) = ∏i<j (xi − xj) • Jack polynomials are orthogonal for w=1 on the unit circle; analogs of x^m • Plamen Koev’s revolutionary computation • Dumitriu’s MOPS symbolic package

  37. Multivariate Orthogonal Polynomials & Hypergeometrics of Matrix Argument • The important special functions of the 21st century • Begin with a weight w(x) on an interval I • ∫ pκ(x) pλ(x) Δ(x)^β ∏i w(xi) dxi = δκλ • Jack polynomials are orthogonal for w=1 on the unit circle; analogs of x^m

  38. Smallest eigenvalue statistics • MATLAB: A=randn(m,n); hist of min(svd(A).^2) over repeated samples
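The slide's one-liner, repeated over many samples so the histogram makes sense; a NumPy sketch (function name mine):

```python
import numpy as np

def smallest_sv_squared(m, n, trials=2000, seed=None):
    """Monte Carlo samples of min(svd(randn(m, n)))^2, i.e. the smallest
    eigenvalue of the Wishart matrix A'A."""
    rng = np.random.default_rng(seed)
    out = np.empty(trials)
    for t in range(trials):
        a = rng.standard_normal((m, n))
        out[t] = np.linalg.svd(a, compute_uv=False).min() ** 2
    return out

# hist-style summary: np.histogram(smallest_sv_squared(5, 5), bins=30)
```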

  39. Multivariate Hypergeometric Functions

  40. Multivariate Hypergeometric Functions

  41. Open Questions (repeated; see slide 4)

  42. Plamen Koev’s clever idea

  43. Symbolic MOPS applications • A=randn(n); S=(A+A')/2; • trace(S^4), det(S^3)

  44. MOPS (Ioana Dumitriu): symbolic computation

  45. Random Matrix Calculator

  46. Encoding the semicircle: the algebraic secret • f(x) = sqrt(4−x²)/(2π) • Stieltjes transform: m(z) = ∫ (x−z)^(−1) f(x) dx • m(z) = (−z + i·sqrt(4−z²))/2 • Practical encoding: a polynomial L(m,z) ≡ m² + zm + 1 = 0 whose root m is the Stieltjes transform
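The encoding is easy to verify numerically: the claimed m(z) is a root of L(m, z) = m² + zm + 1 for every z (either branch of the square root satisfies the same quadratic; branch selection is kept simplistic here):

```python
import numpy as np

def m_semicircle(z):
    """m(z) = (-z + i*sqrt(4 - z^2))/2, a root of L(m, z) = m^2 + z*m + 1."""
    return (-z + 1j * np.sqrt(4 - z**2 + 0j)) / 2

# The algebraic encoding: L(m(z), z) vanishes identically.
for z in [0.5, 1.5 + 0.2j, -1.0 + 1.0j]:
    m = m_semicircle(z)
    assert abs(m**2 + z * m + 1) < 1e-12
```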

  47. The Polynomial Method • RMTool • http://arxiv.org/abs/math/0601389 • The polynomial method for random matrices • Eigenvectors as well!

  48. Plus (+) • X=randn(n,n); A=X+X': m²+zm+1=0 • Y=randn(n,2n); B=Y*Y': zm²+(2z−1)m+2=0 • A+B: m³+(z+2)m²+(2z−1)m+2=0

  49. Times (*) • X=randn(n,n); A=X+X': m²+zm+1=0 • Y=randn(n,2n); B=Y*Y': zm²+(2z−1)m+2=0 • A*B: m⁴z²−2m³z+m²+4mz+4=0

  50. Open Questions (repeated; see slide 4)
