Modeling and Estimation of Dependent Subspaces

J. A. Palmer¹, K. Kreutz-Delgado², B. D. Rao², Scott Makeig¹

¹Swartz Center for Computational Neuroscience

²Department of Electrical and Computer Engineering

University of California San Diego

September 11, 2007

Outline
  • Previous work on adaptive source densities
  • Types of dependency
    • Variance dependency
    • Skew dependency
    • Non-radially symmetric dependency
  • Normal Variance-Mean Mixtures
  • Examples from EEG
Independent Source Densities

A general classification of sources: Sub- and Super-Gaussian

  • Super-Gaussian = more peaked than Gaussian, with heavier tails
  • Sub-Gaussian = flatter and more uniform, with shorter tails than Gaussian

[Figure: example density shapes: super-Gaussian, Gaussian, sub-Gaussian, and a density that is both sub- and super-Gaussian]
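A quick numerical illustration (mine, not from the talk): excess kurtosis is positive for super-Gaussian densities and negative for sub-Gaussian ones, so it serves as a simple proxy for this classification. The Laplacian and uniform distributions here are arbitrary stand-ins:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
n = 200_000

samples = {
    "Gaussian": rng.standard_normal(n),
    "Laplacian (super-Gaussian)": rng.laplace(size=n),
    "Uniform (sub-Gaussian)": rng.uniform(-1, 1, size=n),
}

for name, s in samples.items():
    # Excess kurtosis: 0 for Gaussian, > 0 super-Gaussian, < 0 sub-Gaussian.
    print(f"{name:28s} excess kurtosis = {kurtosis(s):+.2f}")
```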

Extended Infomax

The (independent) source models used in the Extended Infomax algorithm (Lee et al., 1999) are:

  • Super-Gaussian: logistic density
  • Sub-Gaussian: Gaussian mixture
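The density formulas on this slide did not survive extraction. As a hedged reconstruction, these are the forms standard in the extended-infomax literature, not necessarily the slide's exact parameterization:

```latex
% Hedged reconstruction of the extended infomax source models (Lee et al., 1999);
% the slide's exact parameterization may differ.
\[
  p_{\mathrm{super}}(s) = \tfrac{1}{4}\,\mathrm{sech}^2(s/2)
  \quad\text{(logistic density; score } -\tfrac{d}{ds}\log p(s) = \tanh(s/2)\text{)}
\]
\[
  p_{\mathrm{sub}}(s) = \tfrac{1}{2}\,N(s;\mu,\sigma^2) + \tfrac{1}{2}\,N(s;-\mu,\sigma^2)
\]
```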

Scale Mixture Representation
  • Gaussian Scale Mixtures (GSMs) are mixtures of zero-mean Gaussian densities with different variances:

    $p(x) = \int_0^\infty N(x;\,0,\,\xi)\, f(\xi)\, d\xi$

  • A random variable with a GSM density can be represented as the product of a standard normal random variable $Z$ and the square root of an arbitrary non-negative random variable $\xi$:

    $X = \xi^{1/2}\, Z$

[Figure: a Gaussian scale mixture density as an envelope of zero-mean Gaussians]

  • Sums of a random number of random variables lead to GSMs (Rényi)
  • Multivariate dependent densities can be modeled as the product of a non-negative scalar and a Gaussian random vector (see the sketch below):

    $\mathbf{x} = \xi^{1/2}\,\mathbf{z}$

Super-Gaussian Mixture Model
  • Generalization of the Gaussian mixture model to mixtures of super-Gaussian (Gaussian scale mixture) components
  • The update rules are similar to those of the Gaussian mixture model, but include variational parameters arising from the scale mixture representation
Gaussian Scale Mixture Examples 1
  • Generalized Gaussian, $0 < p < 2$:

    $f(x) \propto \exp(-|x|^p)$

  • The mixing density is related to the positive alpha-stable density $S_{p/2}$
Gaussian Scale Mixture Examples 2
  • Generalized Logistic, $\alpha > 0$
  • The mixing density is a generalized Kolmogorov density
Gaussian Scale Mixture Examples 3
  • Generalized Hyperbolic
  • The mixing density is the Generalized Inverse Gaussian:

    $f(\xi) \propto \xi^{\lambda-1} \exp\!\big(-\tfrac{1}{2}\,(\delta^2/\xi + \gamma^2 \xi)\big)$
Dependent Subspaces
  • Dependent sources are modeled by a Gaussian scale mixture, i.e. a Gaussian vector with a common scalar multiplier, yielding “variance dependence” (see the sketch below):

    $\mathbf{x} = \xi^{1/2}\,\mathbf{z}, \qquad \mathbf{z} \sim N(\mathbf{0},\, \mathbf{\Sigma})$
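A small sketch of the kind of dependence this induces (my construction, again with an arbitrary gamma mixing density): the two components are uncorrelated, yet their powers are strongly correlated:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

# A common scalar multiplier applied to an independent Gaussian pair.
xi = rng.gamma(shape=1.0, scale=1.0, size=n)   # arbitrary mixing density
z = rng.standard_normal((2, n))
x1, x2 = np.sqrt(xi) * z                       # variance-dependent sources

print("corr(x1, x2):    ", round(np.corrcoef(x1, x2)[0, 1], 3))        # ~ 0
print("corr(x1^2, x2^2):", round(np.corrcoef(x1**2, x2**2)[0, 1], 3))  # > 0
```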
Dependent Multivariate Densities
  • Multiply a Gaussian vector by a common scalar: $\mathbf{x} = \xi^{1/2}\,\mathbf{z}$
  • For a univariate GSM evaluated at $r = \|\mathbf{x}\|$:

    $p(r) = \int_0^\infty (2\pi\xi)^{-1/2} \exp\!\big(-r^2/(2\xi)\big)\, f(\xi)\, d\xi$

  • Taking derivatives of both sides pulls a factor of $\xi^{-1}$ into the integrand:

    $-\dfrac{1}{r}\,\dfrac{dp}{dr} = \int_0^\infty (2\pi\xi)^{-1/2}\, \xi^{-1} \exp\!\big(-r^2/(2\xi)\big)\, f(\xi)\, d\xi$

  • Thus, given a univariate GSM, we can form a multivariate GSM. Define the linear operator $V \equiv -\dfrac{1}{r}\,\dfrac{d}{dr}$; then, for odd $d$,

    $p_d(\mathbf{x}) = (2\pi)^{-(d-1)/2}\, \big(V^{(d-1)/2} p\big)(\|\mathbf{x}\|)$

  • Posterior moments can be calculated for EM; in the univariate case (verified numerically below),

    $E[\xi^{-1} \mid x] = \dfrac{(Vp)(x)}{p(x)} = -\dfrac{p'(x)}{x\, p(x)}$
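A quick numerical check of the posterior-moment identity (my verification, using an arbitrary unnormalized gamma mixing density; the normalizer cancels in the ratio):

```python
import numpy as np
from scipy.integrate import quad

# E[1/xi | x] should equal -p'(x) / (x p(x)) for any GSM.
a, x0, h = 1.5, 0.7, 1e-5
f = lambda xi: xi**(a - 1) * np.exp(-xi)   # unnormalized gamma mixing density
N = lambda x, xi: np.exp(-x**2 / (2 * xi)) / np.sqrt(2 * np.pi * xi)

p = lambda x: quad(lambda xi: N(x, xi) * f(xi), 0, np.inf)[0]
post = quad(lambda xi: N(x0, xi) / xi * f(xi), 0, np.inf)[0] / p(x0)
deriv = (p(x0 + h) - p(x0 - h)) / (2 * h)  # central-difference p'(x0)

print(post, -deriv / (x0 * p(x0)))         # the two values should agree
```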
Examples in R³

  • Given a univariate GSM $p(x)$, a dependent multivariate density in $R^3$ is given by:

    $p_3(\mathbf{x}) = -\dfrac{1}{2\pi}\, \dfrac{p'(r)}{r}, \qquad r = \|\mathbf{x}\|$

  • Example, Generalized Gaussian: $p(x) \propto \exp(-|x|^p)$ gives $p_3(\mathbf{x}) \propto \|\mathbf{x}\|^{p-2} \exp(-\|\mathbf{x}\|^p)$ (see the check below)
  • Example, Generalized Logistic: the same construction applied to the generalized logistic GSM
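As a sanity check on the generalized Gaussian example (my verification; the normalizing constant $p/(4\pi\Gamma(1+1/p))$ follows from a change of variables in the spherical integral):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

p = 1.2                                  # generalized Gaussian exponent
C = p / (4 * np.pi * gamma(1 + 1 / p))   # proposed normalizing constant

# p3(x) = C * r^(p-2) * exp(-r^p) with r = ||x||; integrate in spherical coords.
radial = lambda r: 4 * np.pi * r**2 * C * r**(p - 2) * np.exp(-r**p)
total, _ = quad(radial, 0, np.inf)
print(f"integral of p3 over R^3 = {total:.6f}")   # ~ 1.0
```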
Non-radial Symmetry
  • Use Generalized Gaussian vectors, with a common scalar multiplier, to model non-radially symmetric dependence (see the sketch below)
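A generative sketch of this idea (my construction; in particular, the $\xi^{1/p}$ scaling of the common multiplier is an assumption about the $\ell_p$-type mixture, not taken from the slides): the level sets of the generalized Gaussian vector are $\ell_p$-balls rather than spheres, so the resulting dependence is not radially symmetric:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 200_000, 1.0            # p = 1 gives Laplacian-like marginals

# Generalized Gaussian draws via the gamma trick: |z|^p ~ Gamma(1/p).
g = rng.gamma(shape=1 / p, scale=1.0, size=(2, n)) ** (1 / p)
z = np.where(rng.random((2, n)) < 0.5, -g, g)

# Common multiplier couples the two components; assumed xi^(1/p) scaling.
xi = rng.gamma(shape=1.0, scale=1.0, size=n)
x = xi ** (1 / p) * z

print("corr of powers:", round(np.corrcoef(x[0]**2, x[1]**2)[0, 1], 3))
```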
Generalized Hyperbolic
  • The Generalized Hyperbolic density (Barndorff-Nielsen, 1982) is a GSM whose mixing density is the Generalized Inverse Gaussian
  • The same mixing density can be used in a Generalized Gaussian scale mixture
  • In either case the posterior over the scale is Generalized Inverse Gaussian
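For reference, these are standard Generalized Inverse Gaussian facts rather than slide content: with density $f(\xi) \propto \xi^{\lambda-1}\exp(-\tfrac{1}{2}(\delta^2/\xi + \gamma^2\xi))$, the inverse moment used in EM is a ratio of modified Bessel functions, which the sketch below cross-checks numerically:

```python
import numpy as np
from scipy.special import kv        # modified Bessel function of the second kind
from scipy.integrate import quad

def gig_inv_moment(lam, delta, gam):
    """E[1/xi] for xi ~ GIG(lam, delta, gam): (gam/delta) K_{lam-1} / K_lam."""
    return (gam / delta) * kv(lam - 1, delta * gam) / kv(lam, delta * gam)

# Cross-check against direct numerical integration of the GIG density.
lam, delta, gam = -0.5, 1.3, 0.8
unnorm = lambda x: x**(lam - 1) * np.exp(-0.5 * (delta**2 / x + gam**2 * x))
Z, _ = quad(unnorm, 0, np.inf)
m, _ = quad(lambda x: unnorm(x) / x, 0, np.inf)
print(gig_inv_moment(lam, delta, gam), m / Z)   # the two should agree
```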
Hypergeneralized Hyperbolic
  • This yields the “Hypergeneralized Hyperbolic” density
  • The posterior moment needed for EM is a ratio of Bessel functions, as in the Generalized Hyperbolic case
Generalized Gaussian Scale Mixtures
  • More generally, evaluating a multivariate GSM at $x^{p/2}$ yields a non-radially symmetric density
  • Integrating this over $R^d$ gives the normalizing constant
  • Thus, given a multivariate GSM, we can formulate a multivariate GGSM
Skew Dependence
  • Skew is modeled with “location-scale mixtures”, i.e. normal variance-mean mixtures:

    $\mathbf{x} = \boldsymbol{\beta}\,\xi + \xi^{1/2}\,\mathbf{z}$
Skew Models
  • For a multivariate GSM: $p(\mathbf{x}) = \int_0^\infty N(\mathbf{x};\,\mathbf{0},\,\xi\mathbf{I})\, f(\xi)\, d\xi$
  • Now, for any vector $\boldsymbol{\beta}$, the tilted integrand $N(\mathbf{x};\,\mathbf{0},\,\xi\mathbf{I})\, \exp\!\big(\boldsymbol{\beta}^{\top}\mathbf{x} - \xi\|\boldsymbol{\beta}\|^2/2\big)$ still integrates to one over $\mathbf{x}$ and $\xi$
  • This can be written in the form of a normal variance-mean mixture
  • This is equivalent to the generative model $\mathbf{x} = \boldsymbol{\beta}\,\xi + \xi^{1/2}\,\mathbf{z}$ (see the sketch below)
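A generative sketch of the location-scale mixture (the exponential mixing density and the value of $\beta$ are arbitrary choices of mine): the common $\xi$ shifts the mean and scales the variance simultaneously, which produces skew:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(4)
n = 500_000

beta = 0.8                               # skew direction/strength (illustrative)
xi = rng.exponential(scale=1.0, size=n)  # arbitrary non-negative mixing variable
z = rng.standard_normal(n)

x = beta * xi + np.sqrt(xi) * z          # normal variance-mean mixture
print(f"sample skewness = {skew(x):+.3f}")   # > 0 for beta > 0
```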
EEG

  • brain sources
  • ocular sources
  • scalp muscle sources
  • external EM sources
  • heartbeat
Variance Dependency
  • Variance dependence can be estimated directly using 4th-order cross moments
  • Compute the covariance of source power, $\operatorname{cov}(s_i^2,\, s_j^2)$
  • This finds components whose activations are “active” at the same times, i.e. “co-modulated” (see the sketch below)
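A minimal estimator sketch (mine): for a components-by-time activation matrix, the covariance of squared activations flags co-modulated pairs:

```python
import numpy as np

def power_covariance(S):
    """Covariance of source power for a (components x time) activation matrix."""
    return np.cov(S**2)   # centered 4th-order cross moments of S

# Toy usage: a variance-dependent pair plus one independent source.
rng = np.random.default_rng(5)
t = 100_000
xi = rng.gamma(1.0, 1.0, size=t)
S = np.vstack([
    np.sqrt(xi) * rng.standard_normal(t),   # co-modulated pair
    np.sqrt(xi) * rng.standard_normal(t),
    rng.standard_normal(t),                 # independent source
])
print(np.round(power_covariance(S), 2))     # large (0,1) entry; small (0,2), (1,2)
```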
Mutual Information / Power Covariance

  • Most of the dependence in mutual information, summed over 50 lags, is captured by the covariance of power
  • Some pairs of sources are more sensitive to variance dependence than others
Marginal Histograms are “Sparse”

  • The marginal histograms are sparse, but the product density is approximately “radially symmetric”
  • Radially symmetric non-Gaussian densities are dependent
Conclusion
  • We described a general framework for modeling dependent sources
  • Estimation of the model parameters is carried out with the EM algorithm
  • The models include variance dependence, non-radially symmetric dependence, and skew dependence
  • The framework was applied to the analysis of EEG sources