
Application of Statistical Techniques to Neural Data Analysis

Aniket Kaloti

03/07/2006


Introduction

  • Levels of Analysis in Systems and Cognitive Neuroscience

    • Spikes: primary neural signals

    • Single cells and receptive fields

    • Multiple electrode recordings

    • fMRI

    • EEG and ERPs

[Figure: example receptive fields of a retinal ganglion cell and a visual cortical (V1) cell]


Receptive Field Estimation: A New Information Theoretic Method (Sharpee et al, 2004)

  • V1 cells of primary concern

  • Linear-Nonlinear (LN) model: estimate the linear (Wiener) filter, then estimate the static nonlinearity graphically

  • Classically, white-noise stimuli were used (a sketch follows below)

  • This works best for Gaussian stimulus ensembles

  • Natural Stimuli: non-Gaussian

From Simoncelli et al, 2003
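Below is a minimal numerical sketch (on simulated data, not from the paper) of the classical white-noise procedure the slide refers to: for a Gaussian white stimulus, the spike-triggered average recovers the linear (Wiener) filter of an LN cell, and the static nonlinearity can then be read off as a histogram ratio. All signals, filter shapes, and sizes below are invented for illustration.

import numpy as np

# Simulated LN cell driven by Gaussian white noise (illustrative data only).
rng = np.random.default_rng(0)
T, D = 50_000, 40                          # time bins, filter length
stim = rng.standard_normal(T)              # white-noise stimulus
true_k = np.exp(-np.arange(D) / 8.0) * np.sin(np.arange(D) / 3.0)
true_k /= np.linalg.norm(true_k)

# Stimulus window preceding each time bin (oldest sample first in each row).
X = np.stack([stim[t - D:t] for t in range(D, T)])
rate = np.maximum(X @ true_k[::-1], 0.0)   # LN model: linear filter + rectifier
spikes = rng.poisson(rate)

# Linear stage: the spike-triggered average (the Wiener filter for white noise).
sta = (spikes[:, None] * X).sum(axis=0) / spikes.sum()
sta /= np.linalg.norm(sta)

# Nonlinear stage, estimated "graphically" as P(spike | filter output).
proj = X @ sta
bins = np.linspace(proj.min(), proj.max(), 25)
counts, _ = np.histogram(proj, bins)
spike_counts, _ = np.histogram(proj, bins, weights=spikes)
nonlinearity = spike_counts / np.maximum(counts, 1)

print("filter recovery (correlation):", np.corrcoef(sta[::-1], true_k)[0, 1])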


The Model

  • Receptive field as a special dimension in the high-dimensional stimulus space

  • Hence, reduce dimensionality of the stimulus space conditioned on the neural response

  • To formulate this, define the density of stimuli conditioned on a spike (reconstructed below)

  • I_spike is the mutual information between the entire stimulus ensemble and the arrival of a spike

  • In practice, I_spike is computed with the time-averaged equation given below

Sharpee et al, 2004
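The equations on this slide were embedded as images and did not survive the transcript. A reconstruction consistent with the maximally informative dimensions formulation in Sharpee et al (2004), writing P(s) for the stimulus density, P(s | spike) for the spike-conditional density, r(t) for the firing rate and \bar{r} for its time average:

I_{\mathrm{spike}} = \int d\mathbf{s}\, P(\mathbf{s} \mid \mathrm{spike}) \log_2 \frac{P(\mathbf{s} \mid \mathrm{spike})}{P(\mathbf{s})}
\qquad\text{and, as a time average,}\qquad
I_{\mathrm{spike}} = \left\langle \frac{r(t)}{\bar{r}} \log_2 \frac{r(t)}{\bar{r}} \right\rangle_t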


Optimization Algorithm and Results

  • Finding “most informative” dimensions:

    • I_spike: the total mutual information carried by the spike train

    • If only a few dimensions of the stimulus space are relevant, then I_spike should equal I_v, the mutual information between the spike and the stimulus projection onto the relevant direction v

    • Estimate the pdfs of the projections onto a candidate direction v

    • Maximize I_v with respect to v to obtain the relevant dimension, i.e., the receptive field (a toy numerical sketch follows below)

  • Figure: comparison of the standard method with the present method, applied to the model shown on the previous slide
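The following is a toy numerical sketch of the optimization idea, not the authors' implementation (which uses gradient ascent with simulated annealing): estimate the projection densities with and without a spike by histogramming, and search for the direction v that maximizes I_v. The simulated data, dimensionality, and choice of optimizer are all assumptions made for the example.

import numpy as np
from scipy.optimize import minimize

# Simulated responses: spikes generated by projecting stimuli onto one
# hidden "relevant dimension" and rectifying (illustrative data only).
rng = np.random.default_rng(1)
T, D = 20_000, 10
X = rng.standard_normal((T, D))            # rows are stimulus vectors
true_v = rng.standard_normal(D)
true_v /= np.linalg.norm(true_v)
spikes = rng.poisson(np.maximum(X @ true_v, 0.0))

def info_per_spike(v, n_bins=20):
    """Histogram estimate of I_v, the information carried by the projection on v."""
    v = v / np.linalg.norm(v)
    x = X @ v
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    p_all, _ = np.histogram(x, edges)
    p_spk, _ = np.histogram(x, edges, weights=spikes)
    p_all = p_all / p_all.sum()
    p_spk = p_spk / p_spk.sum()
    ok = (p_spk > 0) & (p_all > 0)
    return np.sum(p_spk[ok] * np.log2(p_spk[ok] / p_all[ok]))

# Maximize I_v over directions v (gradient-free here purely for brevity).
res = minimize(lambda v: -info_per_spike(v), rng.standard_normal(D),
               method="Nelder-Mead", options={"maxiter": 5000})
v_hat = res.x / np.linalg.norm(res.x)
print("overlap with the true relevant dimension:", abs(v_hat @ true_v))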


Independent Component Analysis (ICA)

  • Blind source separation

  • Blind: input and transfer function unknown

  • Very ill-posed without further assumptions

    • f linear A, usually symmetric

    • s are independent (hence ICA)

    • Most commonly: n is zero

  • Independence: the joint density of the sources factorizes into a product of marginals

  • Independence: mutual information is zero

  • The problem: estimate the independent sources by inverting the mixing matrix A (a model summary follows the legend below)

Legend: x = observed signals, s = unknown sources, n = additive/observational noise, f = unknown mixing function
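The model equations were likewise images on the original slide; a standard formulation consistent with the bullets and the legend above (the linear, square-mixing case is an assumption of the sketch, not a quotation):

\mathbf{x}(t) = f\big(\mathbf{s}(t)\big) + \mathbf{n}(t),
\qquad \text{linear case: } \mathbf{x} = A\mathbf{s} + \mathbf{n}

p(s_1, \ldots, s_N) = \prod_{i=1}^{N} p_i(s_i)
\quad\Longleftrightarrow\quad I(s_1, \ldots, s_N) = 0

\hat{\mathbf{s}} = W\mathbf{x}, \qquad W \approx A^{-1}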


ICA Estimation Techniques

  • Basic idea: minimize mutual information between the components of s.

  • Maximum likelihood (ML) method

    • Write down the likelihood of the observed data under the model

    • Take the log-likelihood (written out below)

    • Evaluate it over a batch of T samples

    • Parametrize with the unmixing matrix W = A⁻¹

  • Maximize L with respect to W; this is equivalent to minimizing the mutual information between the estimated components
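The likelihood expressions themselves were not preserved in the transcript. For the noiseless linear model, the standard ICA log-likelihood over a batch of T samples, written in terms of the unmixing matrix W = A^{-1} with rows w_i^T and assumed source densities p_i, is

L(W) = \sum_{t=1}^{T} \sum_{i=1}^{N} \log p_i\!\big(\mathbf{w}_i^{\top} \mathbf{x}(t)\big) + T \log \lvert \det W \rvert

and maximizing it over W is, up to constants, the same as minimizing the mutual information between the estimated components.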


ICA Estimation (contd.)

  • Cumulant (moment) based methods: kurtosis, the fourth-order cumulant; mutual information can be approximated by expressions involving the kurtosis (a kurtosis-based sketch follows below)

  • Negentropy: the entropy difference between a Gaussian vector with the same covariance and the vector of interest; a measure of non-Gaussianity

  • Infomax ICA: maximize information transmission in a neural network
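A compact illustration of the cumulant-based idea (a sketch on invented data, not a quotation from the slides): after whitening the mixtures, directions of extremal kurtosis correspond to independent components, and the kurtosis-based fixed-point update used in FastICA finds one such direction.

import numpy as np

rng = np.random.default_rng(2)
T = 20_000
# Two non-Gaussian sources (hypothetical): heavy-tailed and uniform signals.
S = np.stack([rng.laplace(size=T), rng.uniform(-1.0, 1.0, size=T)])
A = np.array([[2.0, 1.0], [1.0, 1.5]])      # "unknown" mixing matrix
X = A @ S                                    # observed mixtures, shape (2, T)

# Whitening (zero mean, identity covariance): standard ICA preprocessing.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# Kurtosis-based fixed-point iteration (g(u) = u^3): each step pushes w toward
# a direction of extremal kurtosis, i.e. maximal non-Gaussianity.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(200):
    u = w @ Z
    w_new = (Z * u ** 3).mean(axis=1) - 3.0 * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(abs(w_new @ w) - 1.0) < 1e-9
    w = w_new
    if converged:
        break

recovered = w @ Z
print("correlation with each true source:",
      [round(abs(np.corrcoef(recovered, s)[0, 1]), 3) for s in S])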


Applications of ICA

  • EEG and ERP analysis

    • Infomax ICA is the most commonly applied technique; it yields temporally independent component time courses (a toy separation example follows below)

    • Independent components: can they tell us anything about the brain activity?

  • fMRI: spatially independent processes (?)

  • Speech separation

  • Natural images: independent components resemble V1-like receptive fields

Source: www.bnl.gov/neuropsychology/ERPs_al.asp
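As a concrete (toy) example of blind separation in the spirit of the EEG application, the sketch below mixes three invented source signals into three channels and unmixes them with scikit-learn's FastICA; real EEG pipelines (e.g. Infomax ICA in EEGLAB) differ in preprocessing and in how components are interpreted.

import numpy as np
from sklearn.decomposition import FastICA

# Three invented sources: an oscillation, a square-wave "artifact", and
# heavy-tailed noise, mixed into three observed channels.
rng = np.random.default_rng(3)
t = np.linspace(0.0, 10.0, 5000)
sources = np.c_[np.sin(7.0 * t),
                np.sign(np.sin(3.0 * t)),
                rng.laplace(size=t.size)]
channels = sources @ rng.standard_normal((3, 3)).T   # unknown mixing

ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(channels)   # rows: time points, columns: components
print(components.shape, ica.mixing_.shape) # estimated mixing matrix in ica.mixing_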


Other Techniques Applicable to Neural Science

  • Point process analysis of neural coding

  • Information-theoretic analysis of information coding in the nervous system

  • Principal component analysis applied to neural recordings and spike sorting (sketched below)

  • Recently developed nonlinear dimensionality reduction techniques such as Isomap, Hessian eigenmaps, and Laplacian eigenmaps, applied e.g. to face and object recognition
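A brief sketch of the PCA-for-spike-sorting idea mentioned above, on invented waveforms: snippets from two simulated units are projected onto their top principal components and clustered. The waveform shapes, noise level, and use of k-means are assumptions of the example.

import numpy as np
from sklearn.cluster import KMeans

# Invented spike waveforms from two units, 32 samples each.
rng = np.random.default_rng(4)
t = np.arange(32)
templates = np.stack([np.exp(-(t - 10) ** 2 / 8.0),
                      -0.8 * np.exp(-(t - 14) ** 2 / 20.0)])
labels_true = rng.integers(0, 2, size=400)
waveforms = templates[labels_true] + 0.1 * rng.standard_normal((400, 32))

# PCA via SVD of the mean-centered waveform matrix; keep the top two PCs.
centered = waveforms - waveforms.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt[:2].T

# Cluster the low-dimensional scores; clusters should correspond to units.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
agreement = max(np.mean(clusters == labels_true), np.mean(clusters != labels_true))
print("cluster/unit agreement:", agreement)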

