
Independent Factor Analysis



Presentation Transcript


  1. Independent Factor Analysis H. Attias University of California

2. 1. Statistical Modeling and Blind Source Separation
• BSS (blind source separation) problem
• L′ sensors, L source signals
• Source signals: mutually independent
• Sources are not observable and are unknown
• Mixing process (linear) and noise: unknown
• Ordinary factor analysis
• Cannot perform BSS
• Gaussian model for p(x_j): 2nd-order statistics -> rotation-invariant in factor space (see the sketch after this list)
• Other approaches: projection pursuit, generalized additive models
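The rotation-invariance point deserves one line of math. A minimal sketch, assuming the linear-mixing notation used later in the deck (mixing matrix H, noise covariance Λ):

```latex
y = Hx + u, \quad x \sim \mathcal{N}(0, I)
\;\Rightarrow\;
p(y) = \mathcal{N}\!\left(y \mid 0,\; HH^\top + \Lambda\right).
```

For any orthogonal R, (HR^⊤)(HR^⊤)^⊤ = HH^⊤, so H and HR^⊤ produce the same sensor density: with Gaussian factors, the mixing matrix is identifiable only up to a rotation, which is why ordinary factor analysis cannot perform BSS.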

3. ICA
• Mixing is square (L′ = L), invertible, instantaneous, and noiseless
• Non-Gaussian p(x_j): not rotation-invariant; the maximum-likelihood mixing matrix is unique
• p(x_j): restricted
• Gradient-ascent maximization methods (a sketch follows this list)
• IFA
• p(x_j): non-Gaussian
• Generative model: independent sources
• EM method, 2 steps
• Learning the IF model: mixing matrix, noise covariance, source densities
• Source reconstruction
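To make the "gradient-ascent maximization" concrete, here is a minimal natural-gradient maximum-likelihood ICA update in NumPy. This particular rule (with a tanh score function as an assumed heavy-tailed source prior) is an illustration, not the specific method from the slides:

```python
import numpy as np

def ica_natural_gradient(Y, n_iter=500, lr=0.01, seed=0):
    """Square, noiseless ICA by natural-gradient likelihood ascent.

    Y: (L, T) array of zero-mean sensor signals.
    Returns the unmixing matrix W, so that X_hat = W @ Y.
    """
    rng = np.random.default_rng(seed)
    L, T = Y.shape
    W = np.eye(L) + rng.normal(scale=0.1, size=(L, L))
    for _ in range(n_iter):
        X = W @ Y                      # current source estimates
        g = np.tanh(X)                 # score function of the assumed prior
        # natural-gradient ascent on the log-likelihood:
        # dW ∝ (I - E[g(x) x^T]) W
        W += lr * (np.eye(L) - (g @ X.T) / T) @ W
    return W
```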

4. 2. Independent Factor Generative Model
• Noise: Gaussian, with covariance matrix Λ
• IF parameters: the mixing matrix H, the noise covariance Λ, and the source-density parameters
• Model sensor density: obtained by integrating the sources out of the joint density (written out below)
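The slide's equations did not survive the transcript. The following reconstruction uses the notation of the published IFA paper; treat it as a faithful sketch rather than a copy of the slide:

```latex
y = Hx + u, \qquad u \sim \mathcal{N}(0, \Lambda),
\qquad
W = \left\{ H,\ \Lambda,\ \{ w_{i,q_i}, \mu_{i,q_i}, \nu_{i,q_i} \} \right\},
p(y \mid W) = \int dx\; p(x \mid W)\, \mathcal{N}(y \mid Hx, \Lambda).
```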

5. Source Model: Factorial Mixture of Gaussians
• p(x_i): needs to be both general and tractable
• MOG (mixture-of-Gaussians) model
• q_i: works as a hidden state selecting which Gaussian generated x_i (see the density below)
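In the paper's notation (w, μ, ν are the mixing proportions, means, and variances of source i's component Gaussians; a reconstruction, since the slide's formula is missing):

```latex
p(x_i) = \sum_{q_i} w_{i,q_i}\, \mathcal{N}\!\left(x_i \mid \mu_{i,q_i}, \nu_{i,q_i}\right),
\qquad \sum_{q_i} w_{i,q_i} = 1.
```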

6. Strongly constrained
• The joint source density is itself a MOG over collective states q, but a strongly constrained one: modifying the mean and variance of a single source state q_i shifts a whole column of collective states q => "factorial MOG" (written out below)
• Sensor model
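The factorial structure, written out in the paper's notation (q = (q_1, …, q_L) is the collective hidden state; this is a reconstruction):

```latex
p(x) = \prod_i p(x_i) = \sum_q w_q\, \mathcal{N}(x \mid \mu_q, V_q),
\qquad
w_q = \prod_i w_{i,q_i},\quad
\mu_q = (\mu_{1,q_1}, \dots, \mu_{L,q_L})^\top,\quad
V_q = \mathrm{diag}(\nu_{1,q_1}, \dots, \nu_{L,q_L}).
```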

7. Generation of sensor signal y
• (i) Pick a unit q_i for each source i with probability w_{i,q_i}
• (ii) Draw each source x_i from the Gaussian selected by q_i, then mix and add sensor noise: y = Hx + u
• Top-down first-order Markov chain: q -> x -> y (a sampling sketch follows this list)
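A minimal sampler for this generative process, assuming the reconstructed model above (all parameter names are illustrative):

```python
import numpy as np

def sample_ifa(H, Lam, w, mu, nu, T=100, seed=0):
    """Sample T sensor vectors from the IF generative model.

    H:   (Lp, L) mixing matrix;  Lam: (Lp, Lp) noise covariance.
    w, mu, nu: (L, K) per-source mixture weights, means, variances.
    """
    rng = np.random.default_rng(seed)
    L, K = w.shape
    Lp = H.shape[0]
    Y = np.empty((Lp, T))
    for t in range(T):
        # (i) pick a hidden unit q_i for each source
        q = np.array([rng.choice(K, p=w[i]) for i in range(L)])
        # (ii) draw each source from its selected Gaussian
        x = rng.normal(mu[np.arange(L), q], np.sqrt(nu[np.arange(L), q]))
        # mix and add Gaussian sensor noise: y = Hx + u
        Y[:, t] = H @ x + rng.multivariate_normal(np.zeros(Lp), Lam)
    return Y
```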

8. Co-adaptive MOG
• Rotation and scaling of a whole line of states

9. 3. Learning the IF Model
• Error function and maximum likelihood
• Kullback-Leibler distance between the observed and model sensor densities (see below)
• Minimizing E is equivalent to maximizing the likelihood of the data
• Relation to the mean-square, point-by-point distance
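Written out (p° denotes the observed sensor density; a reconstruction in the paper's spirit):

```latex
E(W) = \mathrm{KL}\!\left(p^{\circ} \,\|\, p(\cdot \mid W)\right)
     = \int dy\; p^{\circ}(y) \log \frac{p^{\circ}(y)}{p(y \mid W)}.
```

The p°-dependent term does not involve W, so minimizing E maximizes ∫ dy p°(y) log p(y | W), which for an observed sample is (1/T) Σ_t log p(y_t | W), the average log-likelihood of the data.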

10. EM Algorithm
• (E) step: calculate the expected value of the complete-data log-likelihood under the posterior at the current parameters
• (M) step: maximize this expectation over the parameters, i.e., minimize the error (a standard formulation is given below)
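In standard EM notation (this formulation is a gloss, not copied from the slide; W′ is the current parameter estimate and (x, q) are the hidden sources and states):

```latex
\mathcal{F}(W, W') = \mathbb{E}_{p(x, q \mid y, W')}\!\left[ \log p(y, x, q \mid W) \right],
\qquad
\text{E-step: compute } \mathcal{F}(\cdot, W'),
\quad
\text{M-step: } W \leftarrow \arg\max_W \mathcal{F}(W, W').
```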

11. The new parameters are given by closed-form M-step equations (a reconstruction follows)
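The slide's equations are missing from the transcript. The updates below are reconstructed from the published IFA paper, so take the exact forms with caution; ⟨·⟩_t denotes a posterior expectation given y_t at the current parameters:

```latex
H = \Big( \sum_t y_t \langle x \rangle_t^\top \Big)
    \Big( \sum_t \langle x x^\top \rangle_t \Big)^{-1},
\qquad
\Lambda = \frac{1}{T} \sum_t \left( y_t y_t^\top - H \langle x \rangle_t\, y_t^\top \right),
w_{i,q_i} = \frac{1}{T} \sum_t p(q_i \mid y_t),
\qquad
\mu_{i,q_i} = \frac{\sum_t \langle x_i \rangle_{q_i, t}\; p(q_i \mid y_t)}
                   {\sum_t p(q_i \mid y_t)},
```

with an analogous update for the variances ν_{i,q_i} using second moments.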

12. 4. Recovering the Sources
• If the model is noise-free and the mixing is invertible, the sources are recovered exactly: x = H^{-1} y
• In general, 2 ways: the LMS estimator and the MAP estimator
• Both are non-linear functions of the data
• Each satisfies a different optimality criterion

13. LMS Estimator
• Minimizes the mean-square reconstruction error, whose solution is the posterior mean (see below)
• MAP Estimator
• Maximizes the source posterior p(x | y)
• Simple way: an iterative gradient-ascent method
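The two estimators written out (a reconstruction; η is an assumed step size for the gradient ascent):

```latex
\hat{x}^{\mathrm{LMS}}(y) = \mathbb{E}[x \mid y] = \int dx\; x\, p(x \mid y),
\hat{x}^{\mathrm{MAP}}(y) = \arg\max_x\, p(x \mid y),
\qquad
x \leftarrow x + \eta\, \frac{\partial}{\partial x}
  \big[ \log p(y \mid x) + \log p(x) \big].
```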

14. 5. IFA: Simulation Results
• 5-second-long speech, music, and synthesized signals

15. 6. IFA with Many Sources: Factorized Variational Approximation
• EM becomes intractable as the number of sources in the IF model increases (the E-step cost grows exponentially with the number of sources)
• The source of the intractability is the choice of p′ as the exact posterior
• Variational approach: feedforward probabilistic models
• Factorized posterior (see below)
• In the exact IF model, the sources conditioned on a data vector are correlated: the posterior covariance is non-diagonal
• In the factorized variational approximation, the sources are treated as independent even when conditioned on a data vector
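The factorization, in standard variational notation (the exact form on the slide is lost; this is the natural reconstruction):

```latex
p'(x, q \mid y) = \prod_i p'(x_i, q_i \mid y),
```

with the variational parameters fit by minimizing the KL distance between p′ and the exact posterior.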

  16. EM learning rule

17. Mean-Field Equations
• Learning rules for the remaining parameters are similarly derived by fixing W = W′ and solving the resulting fixed-point ("mean-field") equations (the generic form is sketched below)
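For reference, the generic mean-field result behind these equations (a standard variational fact, not copied from the slide): for a fully factorized posterior, each optimal factor satisfies

```latex
\log p'(x_i, q_i \mid y) =
\mathbb{E}_{\,\{x_j, q_j\}_{j \neq i}}\!\left[ \log p(y, x, q) \right] + \mathrm{const},
```

and these coupled equations are solved by iterating over the sources until convergence.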
