
A PTAS for Computing the Supremum of Gaussian Processes





Presentation Transcript


  1. A PTAS for Computing the Supremum of Gaussian Processes Raghu Meka (IAS/DIMACS)

  2. Gaussian Processes (GPs) • Jointly Gaussian variables X1, …, Xn: • Any finite linear combination ∑ ai Xi is Gaussian

  3. Supremum of Gaussian Processes (GPs) Given jointly Gaussian X1, …, Xn, we want to study E[sup_i Xi]
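The quantity E[sup_i Xi] in vector form is E[max_i ⟨vi, g⟩] for a standard Gaussian g. A minimal sketch of the naive randomized baseline that the talk's deterministic PTAS improves on, assuming NumPy (the function name is illustrative):

```python
import numpy as np

def mc_sup_expectation(V, n_samples=20000, seed=0):
    """Naive Monte Carlo estimate of E[max_i <v_i, g>], g ~ N(0, I_d).

    V is an (n, d) array whose rows are the vectors v_1, ..., v_n.
    This randomized baseline is what the deterministic PTAS replaces.
    """
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((n_samples, V.shape[1]))  # rows are samples of g
    return (G @ V.T).max(axis=1).mean()               # avg of max_i <v_i, g>

# Example: for the standard basis of R^2 the target value is
# E[max(g1, g2)] = 1/sqrt(pi) ~= 0.564.
est = mc_sup_expectation(np.eye(2))
```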

  4. Why Gaussian Processes? Stochastic Processes Functional analysis Convex Geometry Machine Learning Many more!

  5. Cover Times of Graphs Cover time: the expected time for a random walk to visit every vertex — a fundamental graph parameter. Aldous-Fill 94: can the cover time be computed deterministically? • KKLV00: O((log log n)²) approximation • Feige-Zeitouni’09: FPTAS for trees
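For intuition about the parameter itself, the cover time of simple graphs can be estimated by direct simulation. A sketch assuming NumPy, using the n-cycle, whose exact mean cover time is the classical value n(n−1)/2 (function name and parameters are illustrative):

```python
import numpy as np

def mean_cover_time_cycle(n, trials=200, seed=0):
    """Empirical mean cover time of the simple random walk on the n-cycle.

    The exact value for the cycle is n(n-1)/2; the simulation
    approximates it.
    """
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(trials):
        pos, seen, steps = 0, {0}, 0
        while len(seen) < n:               # walk until every vertex is seen
            pos = (pos + rng.choice((-1, 1))) % n
            seen.add(pos)
            steps += 1
        total += steps
    return total / trials

# For n = 8 the exact mean cover time is 8 * 7 / 2 = 28.
est = mean_cover_time_cycle(8)
```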

  6. Cover Times and GPs • Transfer the problem to GPs (via the Gaussian free field of the graph) • Compute the supremum of the GP Thm (Ding, Lee, Peres 10): deterministic poly-time O(1)-approximation for the cover time. Thm (DLP10): proof of the Winkler-Zuckerman “blanket time” conjectures.

  7. Computing the Supremum Question (Lee10, Ding11): is there a PTAS for computing the supremum of GPs? Concretely: given vectors v1, …, vn ∈ R^d and ε > 0, compute a (1+ε)-factor approximation to E[max_i ⟨vi, g⟩], where g is a random Gaussian vector • Equivalent to specifying the covariance matrix • The vector form is more intuitive

  8. Computing the Supremum Question (Lee10, Ding11): given v1, …, vn and ε > 0, compute a (1+ε)-factor approximation to E[max_i ⟨vi, g⟩] • DLP10: O(1)-factor approximation • Known techniques can’t beat O(1): Talagrand’s majorizing measures

  9. Main Result Thm: There is a PTAS for computing the supremum of Gaussian processes. Thm: A PTAS for computing the cover time of bounded-degree graphs. Thm: Given v1, …, vn and ε > 0, a deterministic algorithm computing a (1+ε)-approximation to E[max_i ⟨vi, g⟩]. Key tool: comparison inequalities from convex geometry

  10. Outline of Algorithm 1. Dimension reduction • Slepian’s Lemma, Johnson-Lindenstrauss 2. Optimal eps-nets in Gaussian space • Kanter’s lemma, univariate to multivariate

  11. Dimension Reduction Idea: apply a Johnson-Lindenstrauss projection from the span of V ⊆ R^d to a lower-dimensional space W, and solve the problem in the projected space • Use deterministic JL constructions – EIO02, S02
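The projection step can be sketched as follows. The talk uses *deterministic* JL constructions (EIO02, S02); this sketch, assuming NumPy, uses the standard randomized Gaussian JL map only to illustrate the dimension-reduction step (the function name is illustrative):

```python
import numpy as np

def jl_project(V, k, seed=1):
    """Project the rows of V from R^d down to R^k with a random
    Gaussian matrix (the standard randomized JL map)."""
    d = V.shape[1]
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((k, d)) / np.sqrt(k)  # rows ~ N(0, I/k)
    return V @ A.T

# Pairwise distances ||v_i - v_j|| are preserved up to 1 +/- eps with
# k = O(log n / eps^2), and (by Slepian-type comparison) that controls
# how much E[sup_i <v_i, g>] can change under the projection.
```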

  12. Analysis: Slepian’s Lemma Problem: relate the supremum of the original process to the supremum of its projection

  13. Analysis: Slepian’s Lemma • Enough to solve the problem in the projected space W • Running time exponential in the reduced dimension is acceptable
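Slepian's comparison can also be checked numerically: with variances fixed, larger correlations give a smaller expected supremum. A sketch assuming NumPy; for two standard normals with correlation ρ the exact value is E[max] = √((1−ρ)/π) (the function name is illustrative):

```python
import numpy as np

def mc_emax(cov, n_samples=50000, seed=0):
    """Monte Carlo estimate of E[max_i X_i] for X ~ N(0, cov)."""
    rng = np.random.default_rng(seed)
    X = rng.multivariate_normal(np.zeros(len(cov)), cov, size=n_samples)
    return X.max(axis=1).mean()

# Slepian-style comparison on two standard normals:
#   rho = 0   -> E[max] = sqrt(1/pi)   ~= 0.564
#   rho = 0.5 -> E[max] = sqrt(0.5/pi) ~= 0.399
cov_indep = np.eye(2)                          # rho = 0
cov_corr = np.array([[1.0, 0.5], [0.5, 1.0]])  # rho = 0.5
```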

  14. Outline of Algorithm 1. Dimension reduction • Slepian’s Lemma, Johnson-Lindenstrauss 2. Optimal eps-nets in Gaussian space • Kanter’s lemma, univariate to multivariate

  15. Nets in Gaussian Space • Goal: given ε > 0, approximate E[max_i ⟨vi, g⟩] in time depending only on the dimension and ε • We solve the problem for all semi-norms

  16. Nets in Gaussian Space • Discrete approximations of the Gaussian distribution Main thm: an explicit ε-net for the Gaussian measure • Naive integer rounding needs very fine granularity, giving a much larger net • Dadush-Vempala’12: matching lower bound, so the construction is optimal

  17. Construction of eps-net • Simplest possible approach: go from univariate to multivariate What resolution is needed? How far out on the axes must the net extend? The naive choice of resolution is too fine.

  18. Construction of eps-net • Analyze a ‘step-wise’ approximator: even out the Gaussian mass within each interval of the grid

  19. Construction of eps-net • Take the univariate net and lift it to the multivariate setting by taking products Main Lemma: a resolution depending only on ε — independent of the dimension — suffices
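The univariate-grid-plus-product construction can be sketched as rounding each coordinate to a truncated grid. A minimal sketch assuming NumPy; `delta` and `R` are illustrative choices, not the parameters from the talk — the point is that the resolution need not shrink with the dimension:

```python
import numpy as np

def round_to_net(x, delta=0.1, R=4.0):
    """Round each coordinate to the univariate grid
    {-R, ..., -delta, 0, delta, ..., R}.

    Applying this coordinate-wise rounds a point of R^d to the lifted
    (product) multivariate net.
    """
    return np.clip(np.round(x / delta) * delta, -R, R)

# Each coordinate moves by at most delta/2 (plus a rare truncation
# beyond R), so the rounding error is small for any norm.
```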

  20. Dimension-Free Error Bounds Lem: For any norm, the error introduced by the lifted net is at most ε times the expected norm • Proof by “sandwiching” • Exploits convexity critically

  21. Analysis of Error Def: A symmetric distribution p is less peaked than q if p(K) ≤ q(K) for all symmetric convex sets K • Why interesting? Less peaked implies a larger expected value for any norm

  22. Sandwiching and Lifting Nets Fact: the step-wise approximator is less peaked than the Gaussian. Proof: evening out the mass spreads it away from the origin!

  23. Sandwiching and Lifting Nets Kanter’s Lemma (77): if p is less peaked than q and μ is unimodal, then the product p × μ is less peaked than q × μ. Cor: By Kanter’s lemma, the lifted (product) net is less peaked than the multivariate Gaussian. Cor: Upper bound on the expected norm over the net.

  24. Sandwiching and Lifting Nets • Def: a slightly scaled-down version of the Gaussian, which pushes mass towards the origin Fact: this scaled-down Gaussian is less peaked than the step-wise approximator. Proof: the inward push compensates for the earlier spreading.

  25. Sandwiching and Lifting Nets Kanter’s Lemma (77): if p is less peaked than q and μ is unimodal, then p × μ is less peaked than q × μ. Cor: By Kanter’s lemma, the scaled-down multivariate Gaussian is less peaked than the lifted net. Cor: Lower bound on the expected norm over the net.

  26. Sandwiching and Lifting Nets Combining both: the net distribution is sandwiched between the Gaussian and a slightly scaled-down Gaussian, which yields the dimension-free error bound.

  27. Outline of Algorithm 1. Dimension reduction • Slepian’s Lemma 2. Optimal eps-nets for Gaussians • Kanter’s lemma PTAS for Supremum

  28. Open Problems • FPTAS for computing the supremum? • Black-box algorithms? (the JL step looks at the points explicitly) • PTAS for the cover time of all graphs? • Conjecture of Ding, Lee, Peres 10

  29. Thank you
