
Independent Component Analysis (ICA)


Presentation Transcript


  1. Independent Component Analysis (ICA) • Adapted from: "Independent Component Analysis: A Tutorial", Aapo Hyvärinen and Erkki Oja, Helsinki University of Technology

  2. Motivation • Example: the cocktail-party problem

  3. Motivation • 2 speakers, speaking simultaneously.

  4. Motivation • 2 microphones in different locations

  5. Motivation • aij ... depends on the distances of the microphones from the speakers

  6. Problem Definition • Recover the original signals from the recorded mixtures.

  7. Noise-free ICA model • Use a statistical "latent variables" model • Random variables sk instead of time signals • xj = aj1s1 + aj2s2 + ... + ajnsn, for all j • In matrix form: x = As, i.e. x = Σi ai si • ai ... basis functions • si ... independent components (ICs)
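To make the mixing model concrete, here is a minimal numerical sketch of x = As with two hypothetical sources; the signal shapes and mixing coefficients below are illustrative assumptions, not values from the tutorial.

    import numpy as np

    rng = np.random.default_rng(0)

    # Two hypothetical independent sources s1, s2 (e.g. two speakers),
    # sampled at 1000 time points; the waveforms are arbitrary choices.
    t = np.linspace(0, 1, 1000)
    s1 = np.sign(np.sin(2 * np.pi * 5 * t))   # square-wave-like source
    s2 = rng.uniform(-1, 1, size=t.shape)     # uniform-noise source
    S = np.vstack([s1, s2])                   # shape (n_sources, n_samples)

    # Assumed mixing matrix A; a_ij would depend on the microphone/speaker distances.
    A = np.array([[1.0, 0.5],
                  [0.7, 1.2]])

    # Observed mixtures x = A s, one row per microphone.
    X = A @ S
    print(X.shape)                            # (2, 1000)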

  8. Generative Model • The ICs s are latent variables => unknown • The mixing matrix A is also unknown • Task: estimate A and s using only the observable random vector x

  9. Restrictions • The si are statistically independent: p(y1, y2) = p(y1)p(y2) • Non-Gaussian distributions • Note: if only one IC is Gaussian, the estimation is still possible
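As a quick numerical illustration of what the factorization p(y1, y2) = p(y1)p(y2) implies (my own sketch, not from the slides): for independent variables, E{g1(y1)g2(y2)} = E{g1(y1)}E{g2(y2)} for any nonlinear functions g1, g2.

    import numpy as np

    rng = np.random.default_rng(1)
    y1 = rng.uniform(-1, 1, 100_000)          # independent, non-Gaussian samples
    y2 = rng.uniform(-1, 1, 100_000)

    g1, g2 = np.tanh, np.square               # two arbitrary nonlinear functions

    lhs = np.mean(g1(y1) * g2(y2))            # E{g1(y1) g2(y2)}
    rhs = np.mean(g1(y1)) * np.mean(g2(y2))   # E{g1(y1)} E{g2(y2)}
    print(lhs, rhs)                           # nearly equal for independent y1, y2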

  10. Solving the ICA model • Additional assumptions: • # of ICs = # of observable mixtures • => A is square and invertible • A is identifiable => estimate A • Compute W = A^-1 • Obtain the ICs from: s = Wx
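Continuing the earlier numerical sketch where A is known, the algebraic step W = A^-1, s = Wx looks like this; in the real problem A is unknown and W must be estimated, e.g. with FastICA (see the sketch further below).

    import numpy as np

    A = np.array([[1.0, 0.5],
                  [0.7, 1.2]])                # illustrative mixing matrix
    S = np.random.default_rng(0).uniform(-1, 1, size=(2, 1000))
    X = A @ S                                 # observed mixtures

    W = np.linalg.inv(A)                      # unmixing matrix W = A^-1
    S_hat = W @ X                             # recovered independent components
    print(np.allclose(S_hat, S))              # True: exact recovery when A is known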

  11. Ambiguities (I) • Can't determine the variances (energies) of the ICs • x = Σi (1/Ci) ai (Ci si) for any nonzero scalars Ci • Fix the magnitudes of the ICs by assuming unit variance: E{si^2} = 1 • Only the ambiguity of sign remains
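In LaTeX form, the scaling argument on this slide (my own rendering of the slide's garbled formula) reads:

    % Each IC s_i can be rescaled by any nonzero C_i if the corresponding
    % column a_i of A is rescaled inversely, so the variances are not identifiable:
    \mathbf{x} \;=\; \sum_i \mathbf{a}_i s_i
               \;=\; \sum_i \Bigl(\tfrac{1}{C_i}\,\mathbf{a}_i\Bigr)\bigl(C_i\, s_i\bigr).
    % Fixing E\{s_i^2\} = 1 removes the scale ambiguity; only the sign of each s_i remains free.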

  12. Ambiguities (II) • Can't determine the order of the ICs • Terms can be freely interchanged, because both s and A are unknown • x = A P^-1 P s • P ... permutation matrix

  13. Centering the variables • Simplifies the algorithm: • Assume that both x and s have zero mean • Preprocessing: x = x' - E{x'} • The ICs are also zero mean, because E{s} = A^-1 E{x} • After ICA: add A^-1 E{x'} back to the zero-mean ICs
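A minimal sketch of this preprocessing step (centering the observed mixtures, then adding the mean back after unmixing), reusing the illustrative A from the earlier sketches:

    import numpy as np

    A = np.array([[1.0, 0.5],
                  [0.7, 1.2]])                    # illustrative mixing matrix
    X_raw = A @ np.random.default_rng(0).uniform(0, 2, size=(2, 1000))

    mean_x = X_raw.mean(axis=1, keepdims=True)    # E{x'}
    X = X_raw - mean_x                            # centered mixtures: x = x' - E{x'}

    W = np.linalg.inv(A)                          # A is known here; normally W is estimated
    S_zero_mean = W @ X                           # zero-mean ICs
    S = S_zero_mean + W @ mean_x                  # add A^-1 E{x'} back after ICA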

  14. Noisy ICA model • x = As + n • A ... m×n mixing matrix • s ... n-dimensional vector of ICs • n ... m-dimensional random noise vector • Same assumptions as for the noise-free model

  15. General ICA model • Find a linear transformation s = Wx that makes the si as independent as possible • Maximize F(s): a measure of independence • No assumptions on the data • Problems: • how to define the measure of independence • strict independence is in general impossible

  16. Illustration (I) • 2 ICs with a distribution of zero mean and variance equal to 1 • Joint distribution of the ICs:

  17. Illustration (II) • Mixing matrix: • Joint distribution of the observed mixtures:
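The figures referred to on slides 16 and 17 are not in the transcript; the sketch below produces plots of the same kind, assuming uniformly distributed ICs and an arbitrarily chosen mixing matrix (both are my assumptions, not the slides' actual values).

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(2)

    # Two ICs, assumed uniform on [-sqrt(3), sqrt(3)] so that they have
    # zero mean and unit variance.
    S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 5000))

    # Illustrative mixing matrix (the slide's actual matrix is not in the transcript).
    A = np.array([[2.0, 3.0],
                  [2.0, 1.0]])
    X = A @ S

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4))
    ax1.scatter(S[0], S[1], s=2)
    ax1.set_title("Joint distribution of the ICs")
    ax2.scatter(X[0], X[1], s=2)
    ax2.set_title("Joint distribution of the mixtures")
    plt.show()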

  18. Other Problems • Blind Source/Signal Separation (BSS) • Cocktail-party problem (another definition) • Electroencephalography (EEG) • Radar • Mobile communication • Feature extraction • Image, audio, video, ... representation

  19. Principles of ICA Estimation • "Nongaussian is independent": central limit theorem • Measures of non-Gaussianity: • Kurtosis: kurt(y) = E{y^4} - 3(E{y^2})^2 (kurtosis = 0 for a Gaussian distribution) • Negentropy: J(y) = H(y_gauss) - H(y); a Gaussian variable has the largest entropy among all random variables of equal variance
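A small sketch of kurtosis as a non-Gaussianity measure (my own illustration): it is approximately zero for Gaussian samples, negative for the sub-Gaussian uniform distribution, and positive for the super-Gaussian Laplacian distribution.

    import numpy as np

    def kurtosis(y):
        """kurt(y) = E{y^4} - 3 (E{y^2})^2; zero for a Gaussian variable."""
        y = y - y.mean()
        return np.mean(y**4) - 3 * np.mean(y**2)**2

    rng = np.random.default_rng(3)
    n = 200_000
    print(kurtosis(rng.normal(size=n)))           # ~0  (Gaussian)
    print(kurtosis(rng.uniform(-1, 1, size=n)))   # <0  (sub-Gaussian)
    print(kurtosis(rng.laplace(size=n)))          # >0  (super-Gaussian)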

  20. Approximations of Negentropy (1)

  21. Approximations of Negentropy (2)
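The formulas on slides 20 and 21 did not survive the transcript; the following is a reconstruction of the standard approximations from the Hyvärinen & Oja tutorial (not copied from the slides themselves), for a zero-mean, unit-variance variable y.

    % Classical approximation based on cumulants (slide 20):
    J(y) \approx \tfrac{1}{12}\, E\{y^3\}^2 + \tfrac{1}{48}\,\mathrm{kurt}(y)^2

    % More robust approximation using nonquadratic functions G_i (slide 21),
    % where \nu is a standard Gaussian variable and k_i are positive constants:
    J(y) \approx \sum_i k_i \bigl( E\{G_i(y)\} - E\{G_i(\nu)\} \bigr)^2, \qquad
    G_1(u) = \tfrac{1}{a_1}\log\cosh(a_1 u), \quad
    G_2(u) = -\exp(-u^2/2)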

  22. The FastICA Algorithm
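The algorithm steps on this slide are not in the transcript. Below is a minimal one-unit sketch following the FastICA fixed-point update from the Hyvärinen & Oja tutorial, w <- E{z g(w^T z)} - E{g'(w^T z)} w on whitened data z, here with g = tanh; for real use, the FastICA package linked on slide 37 is the reference implementation.

    import numpy as np

    def whiten(X):
        """Center and whiten the mixtures so that E{zz^T} = I."""
        X = X - X.mean(axis=1, keepdims=True)
        d, E = np.linalg.eigh(np.cov(X))
        return E @ np.diag(d ** -0.5) @ E.T @ X

    def fastica_one_unit(Z, n_iter=200, seed=0):
        """Estimate one IC from whitened data Z with the tanh nonlinearity."""
        rng = np.random.default_rng(seed)
        w = rng.normal(size=Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            wx = w @ Z
            # Fixed-point update: w <- E{z g(w^T z)} - E{g'(w^T z)} w
            w_new = (Z * np.tanh(wx)).mean(axis=1) - (1 - np.tanh(wx) ** 2).mean() * w
            w_new /= np.linalg.norm(w_new)
            if abs(abs(w_new @ w) - 1) < 1e-10:   # converged (up to sign)
                return w_new
            w = w_new
        return w

    # Example on the illustrative 2-source mixture used earlier.
    rng = np.random.default_rng(4)
    S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(2, 10_000))
    X = np.array([[1.0, 0.5], [0.7, 1.2]]) @ S
    Z = whiten(X)
    w = fastica_one_unit(Z)
    s_hat = w @ Z                                 # one estimated IC (up to sign and order)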

  23. 4 Signal BSS demo (original)

  24. 4 Signal BSS demo (Mixtures)

  25. 4 Signal BSS demo (ICA)
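The demo figures on slides 23-25 are not reproduced in the transcript; a hypothetical reconstruction of such a 4-signal demo with the scikit-learn FastICA implementation (the source waveforms and mixing matrix are arbitrary choices of mine) could look like this.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(5)
    t = np.linspace(0, 8, 2000)

    # Four illustrative source signals: sine, square, sawtooth-like, noise.
    S = np.column_stack([
        np.sin(2 * t),
        np.sign(np.sin(3 * t)),
        (t % 1.0) - 0.5,
        0.2 * rng.laplace(size=t.size),
    ])

    A = rng.uniform(0.5, 1.5, size=(4, 4))    # random mixing matrix
    X = S @ A.T                               # mixtures, shape (n_samples, 4)

    ica = FastICA(n_components=4, random_state=0)
    S_hat = ica.fit_transform(X)              # estimated ICs (up to sign, scale, order)
    A_hat = ica.mixing_                       # estimated mixing matrix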

  26. FastICA demo (mixtures)

  27. FastICA demo (whitened)

  28. FastICA demo (step 1)

  29. FastICA demo (step 2)

  30. FastICA demo (step 3)

  31. FastICA demo (step 4)

  32. FastICA demo (step 5 - end)

  33. Other Algorithms for BSS • Temporal Predictability (TP) • TP of a mixture < TP of any source signal • Maximize TP to separate the signals • Also works on signals with a Gaussian PDF • CoBliSS • Works in the frequency domain • Uses only the covariance matrix of the observations • JADE

  34. Links 1 • Feature extraction (images, video) • http://hlab.phys.rug.nl/demos/ica/ • Aapo Hyvärinen: ICA (1999) • http://www.cis.hut.fi/aapo/papers/NCS99web/node11.html • ICA demo step-by-step • http://www.cis.hut.fi/projects/ica/icademo/ • Lots of links • http://sound.media.mit.edu/~paris/ica.html

  35. Links 2 • Object-based audio capture demos • http://www.media.mit.edu/~westner/sepdemo.html • Demo for BSS with "CoBliSS" (wav files) • http://www.esp.ele.tue.nl/onderzoek/daniels/BSS.html • Tomas Zeman's page on BSS research • http://ica.fun-thom.misto.cz/page3.html • Virtual Laboratories in Probability and Statistics • http://www.math.uah.edu/stat/index.html

  36. Links 3 • An efficient batch algorithm: JADE • http://www-sig.enst.fr/~cardoso/guidesepsou.html • Dr J. V. Stone: ICA and Temporal Predictability • http://www.shef.ac.uk/~pc1jvs/ • BSS with the Degenerate Unmixing Estimation Technique (papers) • http://www.princeton.edu/~srickard/bss.html

  37. Links 4 • Detailed information about ICA for scientists, engineers and industry • http://www.cnl.salk.edu/~tewon/ica_cnl.html • FastICA package for MATLAB • http://www.cis.hut.fi/projects/ica/fastica/fp.shtml • Aapo Hyvärinen • http://www.cis.hut.fi/~aapo/ • Erkki Oja • http://www.cis.hut.fi/~oja/
