
Nonparametric Divergence Estimators for Independent Subspace Analysis



Presentation Transcript


  1. Nonparametric Divergence Estimators for Independent Subspace Analysis Barnabás Póczos (Carnegie Mellon University, USA) Zoltán Szabó (Eötvös Loránd University, Hungary) Jeff Schneider (Carnegie Mellon University, USA) EUSIPCO‐2011 Barcelona, Spain Sept 2, 2011

  2. Outline • Goal: divergence estimation • Definitions, basic properties, motivation • The estimator • Theoretical results • Consistency • Experimental results • Mutual information estimation • Independent subspace analysis • Low-dimensional embedding of distributions

  3. Measuring divergences — KL, Tsallis, Rényi. (Illustration: Manchester United 07/08 player photos of Owen Hargreaves, Rio Ferdinand, and Cristiano Ronaldo; www.juhokim.com/projects.php)

  4. Density: a nuisance parameter. Density estimation is difficult. How should we estimate divergences? • Naïve plug-in approach using density estimation • Density estimators: histogram, kernel density estimation, k-nearest neighbors [D. Loftsgaarden & C. Quesenberry, 1965] • How can we estimate divergences directly?

  5. kNN density estimation. How good is this estimate? [D. Loftsgaarden and C. Quesenberry, 1965] [N. Leonenko et al., 2008]
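The kNN density estimate the slide refers to can be sketched in a few lines. A minimal 1-D illustration (function name and constants are mine, not from the deck): the estimate at x is p̂(x) = k / (n · 2Rₖ), where Rₖ is the distance from x to its k-th nearest sample point and 2Rₖ is the length (1-D "volume") of the ball of radius Rₖ.

```python
import math
import random

def knn_density_1d(x, sample, k):
    """k-NN density estimate at point x from a 1-D sample:
    p_hat(x) = k / (n * 2 * R_k), where R_k is the distance
    from x to its k-th nearest sample point."""
    n = len(sample)
    r_k = sorted(abs(s - x) for s in sample)[k - 1]
    return k / (n * 2.0 * r_k)

random.seed(0)
sample = [random.uniform(0.0, 1.0) for _ in range(5000)]
# True density of Uniform(0, 1) is 1 everywhere on [0, 1],
# so the estimate should land close to 1.
print(knn_density_1d(0.5, sample, k=50))
```

Larger k smooths the estimate (lower variance) at the price of more bias, the usual kNN trade-off.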

  6. Divergence Estimation

  7. The estimator: asymptotically unbiased. We need to prove that the 1 and −1 moments of the “normalized k-NN distances” behave as required; the normalized k-NN distances converge to the Erlang distribution. (Pictured: Agner Krarup Erlang.)
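The deck's estimator targets Rényi and Tsallis divergences; as an illustration of the same direct, density-free k-NN recipe, here is a minimal 1-D sketch of the closely related Wang–Kulkarni–Verdú KL estimator (function name mine). It never forms a density estimate: it only compares, for each point of the first sample, its k-NN distance within its own sample (ρₖ) against its k-NN distance into the other sample (νₖ).

```python
import math
import random

def knn_kl_divergence_1d(xs, ys, k=3):
    """Direct k-NN KL divergence estimate between two 1-D samples
    (Wang-Kulkarni-Verdu form, a sketch of the density-free idea):
      D(P||Q) ~ (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m/(n-1))
    with d = 1, rho_k(i) the k-NN distance of x_i within xs \ {x_i},
    and nu_k(i) its k-NN distance within ys."""
    n, m = len(xs), len(ys)
    total = 0.0
    for i, x in enumerate(xs):
        rho = sorted(abs(x - other) for j, other in enumerate(xs) if j != i)[k - 1]
        nu = sorted(abs(x - y) for y in ys)[k - 1]
        total += math.log(nu / rho)
    return total / n + math.log(m / (n - 1))

random.seed(1)
p  = [random.gauss(0.0, 1.0) for _ in range(500)]
q  = [random.gauss(3.0, 1.0) for _ in range(500)]
q2 = [random.gauss(0.0, 1.0) for _ in range(500)]
print(knn_kl_divergence_1d(p, q))   # clearly positive (true KL is 4.5)
print(knn_kl_divergence_1d(p, q2))  # near zero (identical distributions)
```

Consistency of exactly this kind of estimator is what the moment/Erlang argument on this slide is about.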

  8. Asymptotically unbiased. If we could move the limit inside the expectation… All we need is a condition that justifies this exchange of limit and expectation.
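The standard sufficient condition for this exchange (my wording of a textbook fact, matching the solution on the next slide): if Zₙ → Z almost surely and the family {Zₙ} is (asymptotically) uniformly integrable, then the expectations converge.

```latex
% Uniform integrability of (Z_n):
\sup_{n}\; \mathbb{E}\big[\,|Z_n|\,\mathbf{1}\{|Z_n| > K\}\,\big]
\;\xrightarrow[K\to\infty]{}\; 0 .
% Together with Z_n -> Z a.s., this yields
\mathbb{E}[Z_n] \;\longrightarrow\; \mathbb{E}[Z].
```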

  9. A little problem… Solution: asymptotic uniform integrability… which increases the paper length by another 20 pages…

  10. Results for divergence estimation (2D normal distributions)

  11. Results for MI estimation (rotated uniform distribution)
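Mutual information estimation reduces to divergence estimation, since MI(X; Y) = KL(joint ‖ product of marginals). A sketch of that reduction (names and the split/shuffle device are mine): one half of the paired sample plays the joint distribution, the other half with shuffled y-coordinates plays the product of the marginals, and the 2-D analogue of the direct k-NN KL estimator compares them.

```python
import math
import random

def knn_kl_2d(xs, ys, k=3):
    """2-D direct k-NN KL divergence estimate (d = 2), same
    density-free recipe as in the 1-D case."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    n, m = len(xs), len(ys)
    total = 0.0
    for i, x in enumerate(xs):
        rho = sorted(dist(x, other) for j, other in enumerate(xs) if j != i)[k - 1]
        nu = sorted(dist(x, y) for y in ys)[k - 1]
        total += 2.0 * math.log(nu / rho)  # d = 2
    return total / n + math.log(m / (n - 1))

def knn_mutual_information(pairs, k=3):
    """MI(X; Y) = KL(joint || product of marginals): first half of
    the sample = joint; second half with shuffled y's = product."""
    half = len(pairs) // 2
    joint, rest = pairs[:half], pairs[half:]
    ys = [y for _, y in rest]
    random.shuffle(ys)
    product = [(x, y) for (x, _), y in zip(rest, ys)]
    return knn_kl_2d(joint, product, k)

random.seed(2)
dependent, independent = [], []
for _ in range(600):
    x = random.gauss(0.0, 1.0)
    dependent.append((x, x + 0.1 * random.gauss(0.0, 1.0)))
    independent.append((random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)))
print(knn_mutual_information(dependent))    # clearly positive
print(knn_mutual_information(independent))  # near zero
```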

  12. Independent Subspace Analysis. Model: independent subspaces mixed by a 6×6 mixing matrix, observation X = AS. Goal: estimate A and S observing samples from X only.
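The ISA generative model on this slide can be sketched directly (the specific within-subspace dependence, s₂ ≈ s₁², is my hypothetical choice for illustration): three independent 2-D source subspaces are stacked into a 6-D source S, mixed by a 6×6 matrix A, and only X = AS is observed.

```python
import random

random.seed(3)

def sample_source(n):
    """6-D source with three independent 2-D subspaces: the two
    coordinates inside a subspace are dependent (s2 ~ s1**2 plus
    noise), while different subspaces are sampled independently."""
    cols = []
    for _ in range(n):
        col = []
        for _block in range(3):
            s1 = random.gauss(0.0, 1.0)
            s2 = s1 * s1 + 0.1 * random.gauss(0.0, 1.0)
            col += [s1, s2]
        cols.append(col)
    # transpose: rows = 6 source coordinates, columns = samples
    return [list(row) for row in zip(*cols)]

def matmul(a, b):
    """Plain matrix product for small lists-of-lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

S = sample_source(5)
A = [[random.gauss(0.0, 1.0) for _ in range(6)] for _ in range(6)]  # 6x6 mixing
X = matmul(A, S)          # the only thing ISA gets to observe
print(len(X), len(X[0]))  # prints: 6 5
```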

  13. Independent Subspace Analysis. Objective:
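The slide's objective formula did not survive extraction; a standard ISA objective from this literature (notation mine: demixing matrix W, estimated subspace components Yᵐ) minimizes the mutual information between the recovered subspaces:

```latex
\min_{W}\; I\!\left(Y^{1}, \dots, Y^{M}\right),
\qquad Y = W X,\quad Y = \big[\,Y^{1}; \dots; Y^{M}\,\big],
% For whitened data and orthogonal W this is equivalent to
% minimizing the sum of subspace entropies:
\min_{W\,:\,W W^{\top} = I}\; \sum_{m=1}^{M} H\!\left(Y^{m}\right).
```

This is where the divergence/MI estimators above plug in: each Yᵐ-term is estimated nonparametrically from samples.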

  14. Low-dimensional embedding of digits. Noisy USPS dataset.

  15. Embedding using raw image data

  16. Embedding using Rényi divergences

  17. Be careful, some mistakes are easy to make… We want to exchange the limit and the expectation; the Helly–Bray theorem alone does not justify this [a mistake appearing in Annals of Statistics].
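The warning can be made concrete (my paraphrase of the textbook statement): Helly–Bray guarantees convergence of integrals under weak convergence only for bounded continuous integrands, and the integrands arising from k-NN distance ratios are unbounded.

```latex
% Helly--Bray: if F_n \Rightarrow F (weak convergence), then
\int g \,\mathrm{d}F_n \;\longrightarrow\; \int g \,\mathrm{d}F
% for every BOUNDED continuous g.  Powers and logarithms of
% k-NN distance ratios are unbounded, so the theorem alone does
% not license moving the limit inside the expectation.
```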

  18. Some mistakes… The Fatou lemma (applied to the Erlang-distributed limits of the normalized distances) gives only a one-sided bound, so it is not enough either [mistakes appearing in Journal of Nonparametric Statistics, Problems of Information Transmission, IEEE Trans. on Information Theory]. We want convergence of the expectations; enough: asymptotic uniform integrability.
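For reference, the one-sided inequality in question (standard statement, for nonnegative Zₙ):

```latex
\mathbb{E}\Big[\liminf_{n\to\infty} Z_n\Big]
\;\le\;
\liminf_{n\to\infty}\, \mathbb{E}\big[Z_n\big].
% Fatou bounds the limit in one direction only, so by itself it
% cannot deliver \lim \mathbb{E}[Z_n] = \mathbb{E}[\lim Z_n].
```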

  19. Takeaways If you need to estimate divergences, then use me! • Consistent divergence estimator • Direct: no need to estimate densities • Simple: it needs only kNN based statistics • Can be used for mutual information estimation, independent subspace analysis, low-dimensional embedding Thanks for your attention! 

  20. Attic
