Independent Component Analysis & Blind Source Separation
Ata Kaban, The University of Birmingham

Overview. Today we learn about:
The cocktail party problem, also called 'blind source separation' (BSS)
Independent Component Analysis (ICA) for solving BSS
Other applications of ICA / BSS
The problem is 'blind' in the sense that we have to find quantities that are not directly observable.
x_i(t) = a_i1 s_1(t) + a_i2 s_2(t) + a_i3 s_3(t) + a_i4 s_4(t), where x_i(t) is the i-th observed mixture at time t, s_j(t) are the unobserved source signals, and a_ij are the unknown mixing coefficients
Also called the Blind Source Separation (BSS) problem
Ill-posed problem, unless assumptions are made!
The most common assumption is that the source signals are statistically independent: knowing the value of one of them gives no information about the others.
Methods based on this assumption are called Independent Component Analysis (ICA) methods. These are statistical techniques for decomposing a complex data set into independent parts.
It can be shown that under some reasonable conditions, if the ICA assumption holds, then the source signals can be recovered up to permutation and scaling.
Determine the source signals, given only the mixtures
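To make the setting concrete, here is a small simulation sketch of the mixing model above for two sources and two microphones. The waveforms and the mixing matrix A are made-up illustrative values, not data from the lecture.

```python
import numpy as np

# Two hypothetical source signals s1(t), s2(t) sampled at 500 points.
t = np.linspace(0, 1, 500)
s1 = np.sin(2 * np.pi * 5 * t)              # a sinusoid
s2 = np.sign(np.sin(2 * np.pi * 3 * t))     # a square wave
S = np.vstack([s1, s2])                     # sources, shape (2, 500)

# Unknown mixing matrix A with entries a_ij (made-up values).
A = np.array([[0.8, 0.3],
              [0.4, 0.7]])

# Observed microphone signals: x_i(t) = sum_j a_ij * s_j(t).
X = A @ S

# The BSS / ICA task: recover S (up to permutation and scaling)
# from X alone, without knowing A.
```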
There is one case when rotation doesn’t matter. This case cannot be solved by basic ICA.
Figure: example of a non-Gaussian density (solid) vs. a Gaussian density (dash-dotted)
Seek non-Gaussian sources for two reasons:
* identifiability
* interestingness: Gaussians are not interesting, since the superposition of independent sources tends to be Gaussian (see the numerical check below)
The unsolvable case is when both densities are Gaussian: a whitened Gaussian joint density is rotationally symmetric, so the mixing rotation cannot be identified.
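As a quick numerical illustration of the claim above that superpositions of independent sources tend towards Gaussian, here is a small sketch comparing the excess kurtosis (zero for a Gaussian) of a single uniform source with that of a sum of many independent uniform sources. The distributions and sample sizes are illustrative choices, not from the lecture.

```python
import numpy as np

def excess_kurtosis(x):
    # Simple non-Gaussianity score: 0 for a Gaussian.
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2)**2 - 3.0

rng = np.random.default_rng(0)
n = 200_000

single = rng.uniform(-1, 1, n)                            # one uniform source
superposition = rng.uniform(-1, 1, (20, n)).sum(axis=0)   # sum of 20 independent sources

print(excess_kurtosis(single))         # about -1.2: clearly non-Gaussian
print(excess_kurtosis(superposition))  # close to 0: nearly Gaussian
```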
0) Centring = make the signals zero mean
x_i ← x_i − E[x_i] for each i
1) Sphering = make the signals uncorrelated, i.e. apply a transform V to x such that Cov(Vx) = I // where Cov(y) = E[yy^T] denotes the covariance matrix
V = E[xx^T]^(-1/2) // the inverse matrix square root; can be computed using the 'sqrtm' function in MatLab
x ← Vx // for all t (time indices t dropped here)
// bold lowercase refers to a column vector; bold uppercase to a matrix
Purpose: to make the remaining computations simpler. Independent variables must be uncorrelated, so this necessary condition can be enforced before proceeding to the full ICA.
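A minimal sketch of steps 0) and 1) in NumPy/SciPy, assuming the mixtures are stored one signal per row of an array X; scipy.linalg.sqrtm plays the role of MatLab's sqrtm here.

```python
import numpy as np
from scipy.linalg import sqrtm

def centre_and_sphere(X):
    """X : (d, T) array, one mixture signal per row.

    Returns the centred, sphered signals and the sphering transform V,
    so that Cov(Vx) = I.
    """
    # 0) Centring: x_i <- x_i - E[x_i]
    X = X - X.mean(axis=1, keepdims=True)

    # 1) Sphering: V = E[x x^T]^(-1/2), then x <- V x
    cov = (X @ X.T) / X.shape[1]        # Cov(x) = E[x x^T] for centred x
    V = np.linalg.inv(sqrtm(cov)).real  # inverse matrix square root
    return V @ X, V
```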
This is based on the maximisation of an objective function G(.) that encodes an approximate non-Gaussianity measure.
Fixed Point Algorithm
Random init of W
Iterate until convergence:
Output: W, S
where g(.) is the derivative of G(.), W is the rotation transform sought, and Λ is a Lagrange multiplier enforcing that W is an orthogonal transform, i.e. a rotation
Solve by fixed point iterations
The effect of Λ is an orthogonal de-correlation
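The update equation itself is not reproduced in the text above; the sketch below uses the standard symmetric FastICA-style fixed-point update as an illustration, assuming the data X have already been centred and sphered as above, and taking G(u) = log cosh u, so g(u) = tanh(u) and g'(u) = 1 − tanh²(u). The symmetric orthogonalisation step plays the role of the Λ-induced orthogonal de-correlation. Names and defaults are illustrative assumptions, not the lecture's code.

```python
import numpy as np

def fastica_symmetric(X, n_iter=200, tol=1e-6, seed=0):
    """Symmetric fixed-point ICA iteration (FastICA-style sketch).

    X : (d, T) array of centred, sphered signals (one row per mixture).
    Returns the estimated un-mixing rotation W and the sources S = W @ X.
    """
    d, T = X.shape
    rng = np.random.default_rng(seed)

    def sym_orth(W):
        # Symmetric orthogonalisation: W <- (W W^T)^(-1/2) W,
        # i.e. the orthogonal de-correlation enforced by Λ.
        U, _, Vt = np.linalg.svd(W)
        return U @ Vt

    W = sym_orth(rng.standard_normal((d, d)))  # random init of W
    for _ in range(n_iter):
        WX = W @ X                      # current source estimates
        g = np.tanh(WX)                 # g(.)  = G'(.) for G(u) = log cosh u
        g_prime = 1.0 - g**2            # g'(.)
        # Fixed-point update: E[g(Wx) x^T] - diag(E[g'(Wx)]) W
        W_new = (g @ X.T) / T - np.diag(g_prime.mean(axis=1)) @ W
        W_new = sym_orth(W_new)
        # Converged when the new rows are (up to sign) the same as the old ones.
        if np.max(np.abs(np.abs(np.diag(W_new @ W.T)) - 1.0)) < tol:
            W = W_new
            break
        W = W_new
    return W, W @ X
```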
Blind source separation (Bell & Sejnowski, Te-Won Lee, Girolami, Hyvarinen, etc.)
Image denoising (Hyvarinen)
Medical signal processing – fMRI, ECG, EEG (Makeig)
Modelling of the hippocampus and visual cortex (Lorincz, Hyvarinen)
Feature extraction, face recognition (Marni Bartlett)
Compression, redundancy reduction
Watermarking (D Lowe)
Clustering (Girolami, Kolenda)
Time series analysis (Back, Valpola)
Topic extraction (Kolenda, Bingham, Kaban)
Scientific Data Mining (Kaban, etc)
In multivariate data, search for the direction along which the projection of the data is maximally non-Gaussian, i.e. has the most 'structure' (a toy search over directions is sketched below)
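A toy sketch of this search for two-dimensional whitened data: scan candidate unit directions and keep the one whose projection has the largest absolute excess kurtosis, used here as a simple non-Gaussianity score. The angle grid and the choice of score are illustrative assumptions.

```python
import numpy as np

def most_nongaussian_direction(X, n_angles=360):
    """X : (2, T) whitened data. Returns the unit direction w whose
    projection w^T x has the largest |excess kurtosis|, and that score."""
    best_w, best_score = None, -np.inf
    for theta in np.linspace(0, np.pi, n_angles, endpoint=False):
        w = np.array([np.cos(theta), np.sin(theta)])   # candidate direction
        p = w @ X                                      # 1-D projection
        p = p - p.mean()
        kurt = np.mean(p**4) / np.mean(p**2)**2 - 3.0  # excess kurtosis
        if abs(kurt) > best_score:
            best_w, best_score = w, abs(kurt)
    return best_w, best_score
```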
Decomposition using ICA