
Blind Signal Separation in the Presence of Gaussian Noise


Presentation Transcript


  1. Blind Signal Separation in the Presence of Gaussian Noise Mikhail Belkin Luis Rademacher James Voss Ohio State University

  2. Cocktail Party Problem (Example) • Problem: n people speaking in a room with n microphones. • Each microphone captures a superposition of the speech signals. • Goal: Recover each person's speech.

  3. Independent Component Analysis (ICA) • Observe samples from a random vector x = As + η. • s is a latent random vector with independent coordinates. • The coordinate variables s_i are non-Gaussian. • Assumed E[s] = 0. • A is an n×n real matrix and is full rank. • The columns a_1, …, a_n of A form a spanning basis of R^n. • a_i acts in the direction of the latent signal s_i. • η is an additive noise term. • Traditionally assumed to be 0. • We model η as unknown Gaussian noise independent of s. • Goal: Recover A. (A synthetic instance of this model is sketched below.)
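
A minimal NumPy sketch of this generative model. The mixing matrix A, noise covariance Sigma, and uniform latent signals below are illustrative choices for demonstration, not values from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 100_000

# Illustrative full-rank mixing matrix and noise covariance (unknown to the algorithm).
A = rng.normal(size=(n, n))
Sigma = 0.3 * np.eye(n) + 0.1 * np.ones((n, n))

# Latent signals: independent, zero-mean, unit-variance, non-Gaussian (uniform here).
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(n, N))

# Observed samples: x = As + eta, with eta Gaussian and independent of s.
eta = rng.multivariate_normal(np.zeros(n), Sigma, size=N).T
X = A @ S + eta
```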

  4. Typical (noiseless) ICA Procedure • Step 1: Whiten the data to have covariance I (a sketch follows this slide). • The data is left-multiplied by a matrix W such that cov(Wx) = I. • Orthogonalizes the latent signals. • Recovers A up to a rotation matrix: WA = R (a rotation). • Step 2: Find the rotation. [Figure: data mixed from uniform distributions; raw data → whitened data → rotation recovered → demixed data.]
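
For reference, a sketch of the classical Step 1 under the noiseless model. This is plain covariance-based whitening; in the noisy model it would absorb cov(η) along with the signal, which is the point of the next slide:

```python
import numpy as np

def whiten(X):
    """Left-multiply by W = cov(X)^{-1/2} so that cov(WX) = I."""
    Xc = X - X.mean(axis=1, keepdims=True)
    cov = Xc @ Xc.T / Xc.shape[1]
    vals, vecs = np.linalg.eigh(cov)
    W = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return W @ Xc, W
```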

  5. Relaxing Step 1 • Whitening the latent signal is impossible in the noisy case. • Let g be a white, Gaussian r.v. • cov(As + g) = A·A^T + I vs. cov(As) = A·A^T: covariance-based whitening absorbs the noise along with the signal. • Definition: If WA = RD, where R is a rotation matrix and D is a diagonal scaling matrix, then W is a quasi-whitening matrix. • Recall the model: x = As + η. • Applying a quasi-whitening matrix W to the data orthogonalizes the latent components (a numerical check of the definition is sketched below). [Figures: whitened data vs. quasi-whitened data.]
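
The definition can be checked numerically: WA = RD with R orthogonal and D diagonal exactly when the columns of WA are mutually orthogonal. A small helper for this purpose (hypothetical, for illustration only):

```python
import numpy as np

def is_quasi_whitener(W, A, tol=1e-8):
    """WA = RD (R orthogonal, D diagonal) iff (WA)^T (WA) = D^2 is diagonal.
    Rotations vs. reflections are immaterial for the separation task."""
    G = (W @ A).T @ (W @ A)
    off_diagonal = G - np.diag(np.diag(G))
    return np.abs(off_diagonal).max() < tol
```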

  6. Related Work (efficient noisy ICA) • Aapo Hyvärinen (1999) discusses noisy ICA when the noise covariance is known. • Arie Yeredor (2000) provides a one-step solution to noisy ICA using the Hessian of the directional second characteristic function. • Arora, Ge, Moitra, and Sachdeva (2012) introduced quasi-whitening and provide an efficient noisy ICA algorithm for the special case where all latent signals have fourth cumulants of the same sign. • Hsu and Kakade (2012) state a one-step solution to noisy ICA using the Hessian of the directional fourth cumulant.

  7. Our Contribution • We introduce an efficient quasi-whitening algorithm for noisy ICA with latent signals of non-zero fourth cumulants (possibly of mixed sign). • Relies on multivariate cumulant tensors. • Compatible with variations of existing methods for Step 2, as sketched below. • κ4(u·x) = E[(u·x)^4] − 3·E[(u·x)^2]^2 (central moments, for centered x). • Restricting u to the unit sphere, the local maxima of |κ4(u·x)| over quasi-whitened data give the columns of the rotation matrix R.
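
The directional fourth cumulant in the last two bullets can be estimated directly from its central-moment form. A plug-in sketch, reusing the synthetic data X from the model example above:

```python
import numpy as np

def kappa4(z):
    """Plug-in fourth cumulant of a 1-D sample: E[z^4] - 3 E[z^2]^2 after centering."""
    z = z - z.mean()
    return np.mean(z ** 4) - 3 * np.mean(z ** 2) ** 2

# Directional use for Step 2: evaluate kappa4 of the projection u . x.
u = np.array([1.0, 0.0, 0.0])
print(kappa4(u @ X))  # X from the model sketch above
```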

  8. What are Cumulants? • Cumulants are functions of random variables, similar to moments. • They respect independence more nicely than moments do. • Low-order cumulants: mean, variance. • They have natural sample versions (k-statistics; a plug-in estimator is sketched below). • Let cum(X1, X2, X3, X4) denote the cross-cumulant between the random variables X1, X2, X3, and X4. • Let K4(x) denote the fourth-order cumulant tensor with entries [K4(x)]_ijkl = cum(x_i, x_j, x_k, x_l). • κ4(X) = cum(X, X, X, X) gives the fourth univariate cumulant.
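
A plug-in (biased, large-sample) estimate of the tensor K4(x), using the standard expansion of the fourth cross-cumulant of centered variables; the k-statistic version mentioned on the slide differs only in finite-sample corrections:

```python
import numpy as np

def cumulant4_tensor(X):
    """Estimate [K4(x)]_{ijkl} = E[x_i x_j x_k x_l] - E[x_i x_j]E[x_k x_l]
    - E[x_i x_k]E[x_j x_l] - E[x_i x_l]E[x_j x_k] from a sample X of shape (n, N)."""
    Xc = X - X.mean(axis=1, keepdims=True)
    N = Xc.shape[1]
    M4 = np.einsum('it,jt,kt,lt->ijkl', Xc, Xc, Xc, Xc) / N
    C2 = Xc @ Xc.T / N
    return (M4
            - np.einsum('ij,kl->ijkl', C2, C2)
            - np.einsum('ik,jl->ijkl', C2, C2)
            - np.einsum('il,jk->ijkl', C2, C2))
```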

  9. Properties of Multivariate Cumulants (stated for the fourth cumulant) • (Symmetry) cum(X1, X2, X3, X4) is invariant under permutation of its arguments. • (Multilinearity) Let Y be a random variable, and let α be a scalar: cum(αX1 + Y, X2, X3, X4) = α·cum(X1, X2, X3, X4) + cum(Y, X2, X3, X4). • (Independence) If the random vectors x and y are independent: cum(x_i + y_i, x_j + y_j, x_k + y_k, x_l + y_l) = cum(x_i, x_j, x_k, x_l) + cum(y_i, y_j, y_k, y_l). • Implies for x and y independent, K4(x + y) = K4(x) + K4(y). • (Vanishing Gaussians) K4(η) = 0 for Gaussian η. • Cumulants of order ≥ 3 are 0 for Gaussian random variables (checked numerically below).
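
The vanishing-Gaussians property can be sanity-checked with the estimator sketched under slide 8: the estimated K4 of a pure Gaussian sample is near zero, up to sampling noise.

```python
import numpy as np

# Reuses cumulant4_tensor from the slide 8 sketch.
rng_check = np.random.default_rng(1)
G = rng_check.multivariate_normal(np.zeros(3), np.eye(3) + 0.5, size=200_000).T
print(np.abs(cumulant4_tensor(G)).max())  # small; sampling noise only
```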

  10. Quasi-Whitening Algorithm • Definition: Let M be an n×n matrix. Then, define an operation of fourth-order tensors on matrices: T(M)_ij = Σ_{k,l} T_ijkl · M_kl. • The proposed algorithm (see the sketch below): 1. Estimate C1 = K4(x)(I) from samples. 2. Estimate C2 = K4(x)(C1^{-1}). 3. Compute a decomposition C2 = B·B^T. 4. Output B^{-1} as the quasi-whitening matrix.
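
A sketch of the whole procedure under this reconstruction, reusing cumulant4_tensor from the slide 8 example. The second contraction against C1^{-1} is what makes the final matrix positive definite even when the κ4(s_i) have mixed signs:

```python
import numpy as np

def tensor_on_matrix(T, M):
    """The slide's operation: T(M)_{ij} = sum_{k,l} T_{ijkl} M_{kl}."""
    return np.einsum('ijkl,kl->ij', T, M)

def quasi_whiten(X):
    """Two-pass quasi-whitening: C1 = K4(x)(I), C2 = K4(x)(C1^{-1}),
    factor C2 = B B^T, return B^{-1}. Needs enough samples for the estimated
    C2 to be positive definite; otherwise the Cholesky step fails."""
    K4 = cumulant4_tensor(X)          # from the slide 8 sketch
    n = X.shape[0]
    C1 = tensor_on_matrix(K4, np.eye(n))
    C2 = tensor_on_matrix(K4, np.linalg.inv(C1))
    C2 = (C2 + C2.T) / 2              # symmetrize away rounding error
    B = np.linalg.cholesky(C2)
    return np.linalg.inv(B)
```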

  11. Algorithm's Validity Lemma: Let C = K4(x)(M). Then C = A·D·A^T, where D is diagonal with D_ii = κ4(s_i)·(a_i^T M a_i). Proof: By independence and vanishing Gaussians, K4(x) = K4(As). By multilinearity, [K4(As)]_ijkl = Σ_m κ4(s_m)·A_im·A_jm·A_km·A_lm. Contracting against M gives C_ij = Σ_m κ4(s_m)·(a_m^T M a_m)·A_im·A_jm = [A·D·A^T]_ij. (Verified numerically below.)
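
The lemma can be verified at the population level, with no sampling: build K4(x) directly from multilinearity and vanishing Gaussians as K4 = Σ_m κ4(s_m)·a_m⊗a_m⊗a_m⊗a_m, then compare both sides for a random M. This reuses A and rng from the model sketch and tensor_on_matrix from the algorithm sketch; −1.2 is the fourth cumulant of the unit-variance uniform used there.

```python
import numpy as np

kurt = np.full(3, -1.2)  # kappa4 of uniform(-sqrt(3), sqrt(3))
K4_pop = np.einsum('m,im,jm,km,lm->ijkl', kurt, A, A, A, A)

M = rng.normal(size=(3, 3))
D = np.diag(kurt * np.einsum('im,ij,jm->m', A, M, A))  # D_ii = kappa4(s_i) a_i^T M a_i
assert np.allclose(tensor_on_matrix(K4_pop, M), A @ D @ A.T)
```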

  12. Algorithm's Validity Theorem: Let C = K4(x)(C1^{-1}) with C1 = K4(x)(I), as in the algorithm. Then: • C = A·D·A^T, where D is diagonal with D_ii = 1/||a_i||^2 > 0. • Let B·B^T be a decomposition of C; then B^{-1} is a quasi-whitening matrix. Proof sketch (of 2): • C is positive definite, so B is real valued. • C = (A·D^{1/2})·(A·D^{1/2})^T. • B = A·D^{1/2}·R^T for a rotation R. • B^{-1}·A = R·D^{-1/2}, giving that B^{-1} is a quasi-whitening matrix. (Verified numerically below.)
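
And the theorem, again at the population level with the exact tensor K4_pop from the previous check: the factor B^{-1} orthogonalizes the columns of A and preserves their norms.

```python
import numpy as np

C1 = tensor_on_matrix(K4_pop, np.eye(3))
C2 = tensor_on_matrix(K4_pop, np.linalg.inv(C1))
B = np.linalg.cholesky(C2)            # C2 = A D A^T with D_ii = 1/||a_i||^2 > 0
WA = np.linalg.inv(B) @ A             # = R D^{-1/2}
Gram = WA.T @ WA                      # = D^{-1}, i.e. diag(||a_i||^2)
assert np.allclose(Gram - np.diag(np.diag(Gram)), 0)
assert np.allclose(np.sqrt(np.diag(Gram)), np.linalg.norm(A, axis=0))
```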

  13. Main Result Theorem: Let 1 − δ give the probability of success, and let ε be an error parameter. Let C′ give the sample estimate of C, and let B′·B′^T be a decomposition of C′. Given polynomially many samples, with probability 1 − δ, B′^{-1} is an approximate quasi-whitening matrix such that: • The latent coordinates are approximately orthogonalized: for i ≠ j, |⟨B′^{-1}·a_i, B′^{-1}·a_j⟩| ≤ ε. • The latent coordinates are scaled: ||B′^{-1}·a_i|| ≈ ||a_i||. (An empirical check on the synthetic data follows.)
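
A finite-sample version of the same check on the synthetic data X from the model sketch, using quasi_whiten from the algorithm sketch: the Gram matrix of the quasi-whitened columns of A is approximately diagonal, with diagonal entries near the squared column norms.

```python
import numpy as np

W_hat = quasi_whiten(X)               # sample-based quasi-whitener
Gram_hat = (W_hat @ A).T @ (W_hat @ A)
print(np.round(Gram_hat, 2))          # ~ diagonal, with diagonal ~ ||a_i||^2
print(np.linalg.norm(A, axis=0) ** 2)
```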

  14. Quasi-Whitening Algorithm Restated • Provably efficient • Performs the relaxed Step 1 of noisy ICA (quasi-whitening). • Compatible with small variations on existing algorithms for Step 2 of ICA.

  15. Thank You Any Questions?
