  1. NOISE and DELAYS in NEUROPHYSICS Andre Longtin Center for Neural Dynamics and Computation Department of Physics Department of Cellular and Molecular Medicine UNIVERSITY OF OTTAWA, Canada

  2. OUTLINE • Modeling single-neuron noise: leaky integrate-and-fire, quadratic integrate-and-fire, “transfer function” approach • Modeling response to signals • Information theory • Delayed dynamics

  3. MOTIVATION for STUDYING NOISE

  4. “Noise” in the neuroscience literature • As “internal”, resulting from the probabilistic gating of voltage-dependent ion channels • As “synaptic”, resulting from the stochastic nature of vesicle release at the synaptic cleft • As “cross-talk”, responses from indirectly stimulated neurons • As the maintained discharge of some neurons • As an input with many frequency components over a particular band, of similar amplitudes and scattered phases • As the resulting current from the integration of many independent excitatory and inhibitory synaptic events at the soma. Segundo et al., Origins and Self Organization, 1994

  5. Leaky Integrate-and-Fire with + and − Feedback (f = firing-rate function)
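As a concrete reference point, here is a minimal sketch of a noisy leaky integrate-and-fire neuron with a delayed, low-pass-filtered self-feedback current, integrated with Euler-Maruyama. All parameter names and values (g_fb, tau_fb, delay, and so on) are illustrative assumptions, not the constants used in the talk.

    import numpy as np

    def lif_with_feedback(T=2.0, dt=1e-4, tau_m=0.01, mu=1.2, sigma=0.3,
                          theta=1.0, v_reset=0.0, g_fb=-0.5, tau_fb=0.1, delay=0.02):
        """Noisy LIF neuron whose input includes a feedback current proportional
        to a low-pass filtered, delayed copy of its own spike train.
        Default g_fb < 0 gives negative feedback; g_fb > 0 gives positive feedback."""
        n = int(T / dt)
        d = int(delay / dt)                    # feedback delay in time steps
        v, a = v_reset, 0.0                    # membrane potential, feedback activity
        spikes = np.zeros(n)                   # 0/1 spike indicator per time step
        for i in range(n):
            fb = g_fb * a                      # feedback current
            v += (-v + mu + fb) / tau_m * dt + sigma * np.sqrt(dt / tau_m) * np.random.randn()
            if v >= theta:                     # threshold crossing: emit a spike and reset
                spikes[i] = 1.0
                v = v_reset
            a += -a / tau_fb * dt              # feedback activity decays between spikes...
            if i >= d and spikes[i - d]:
                a += 1.0                       # ...and jumps when a delayed spike arrives
        return spikes

    spikes = lif_with_feedback()
    print("mean firing rate:", spikes.sum() / 2.0, "Hz")   # T = 2 s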

  6. Firing Rate Functions: noise-free, or stochastic
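The equations on this slide are images; for reference, a standard pair of leaky integrate-and-fire firing-rate functions (an assumption about which forms were shown) is:

    % Noise-free LIF with constant input mu > theta (threshold theta, reset 0,
    % membrane time constant tau, absolute refractory period t_ref):
    f(\mu) = \left[ t_{\mathrm{ref}} + \tau \ln\frac{\mu}{\mu - \theta} \right]^{-1}

    % With Gaussian white noise of strength sigma (diffusion approximation,
    % Siegert/Ricciardi formula, reset potential V_r):
    f(\mu,\sigma) = \left[ t_{\mathrm{ref}} + \tau \sqrt{\pi}
        \int_{(V_r-\mu)/\sigma}^{(\theta-\mu)/\sigma}
        e^{u^{2}} \bigl(1 + \operatorname{erf}(u)\bigr)\, du \right]^{-1}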

  7. Noise-Induced Gain Control and Stochastic Resonance

  8. For Poisson input (Campbell’s theorem): mean conductance ~ mean input rate; standard deviation σ ~ sqrt(mean rate)
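Campbell’s theorem for a Poisson spike train of rate ν filtered through a synaptic kernel g(t) states (standard result, written out for reference):

    \langle G \rangle = \nu \int_{0}^{\infty} g(t)\, dt ,
    \qquad
    \sigma_G^{2} = \nu \int_{0}^{\infty} g^{2}(t)\, dt
    \quad\Rightarrow\quad
    \sigma_G \propto \sqrt{\nu}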

  9. NOISE smoothes out f-I curves

  10. WHICH QUADRATIC INTEGRATE-AND-FIRE MODEL? • Technically more difficult • Which variable to use? On the real line? On a circle?
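The two standard choices alluded to here are the quadratic integrate-and-fire equation on the real line and its change of variables to the theta neuron on the circle (textbook forms, included for reference):

    % Quadratic integrate-and-fire on the real line (spike when V -> +infinity, reset to -infinity):
    \tau \frac{dV}{dt} = V^{2} + I

    % The substitution V = \tan(\theta/2) maps this to the theta neuron on the circle:
    \tau \frac{d\theta}{dt} = 1 - \cos\theta + (1 + \cos\theta)\, I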

  11. Information-theoretic approaches • Linear encoding versus nonlinear processing • Rate code, long time constant, integrator • Time code, small time constant, coincidence detector (reliability) • Interspike interval code (ISI reconstruction) • Linear correlation coefficient • Coherence • Coding fraction • Mutual information • Challenge: biophysics of coding • Forget the biophysics? Use better (mesoscopic?) variables?

  12. Neuroscience 101 (continued): the basic random variables • Interspike intervals (ISI) • Spike train • Number of spikes in a time interval T • Raster plot
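The random variables listed here are usually defined as follows (standard definitions, not taken from the slide images):

    % Spike train as a sum of delta functions at the spike times t_k:
    x(t) = \sum_{k} \delta(t - t_k)

    % Interspike intervals and spike count in a window of length T:
    I_k = t_{k+1} - t_k ,
    \qquad
    N_T = \int_{0}^{T} x(t)\, dt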

  13. Information-Theoretic Calculations: Gaussian noise stimulus S → neuron (???) → spike train X • Coherence function • Mutual information rate
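The two quantities named on the slide are commonly computed as (standard expressions; the information rate is the Gaussian lower bound):

    % Stimulus-response coherence from the cross-spectrum and power spectra:
    C(f) = \frac{\lvert P_{SX}(f)\rvert^{2}}{P_{SS}(f)\, P_{XX}(f)}

    % Lower bound on the mutual information rate (bits per second):
    I_{LB} = -\int_{0}^{\infty} \log_{2}\bigl(1 - C(f)\bigr)\, df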

  14. Stimulus Protocol: study the effect of σ (stimulus contrast) and fc (stimulus bandwidth) on coding.

  15. Information Theory:

  16. Linear Response Calculation for the Fourier transform of the spike train: unperturbed spike train + susceptibility × signal. Spike-train spectrum = background spectrum + |transfer function|² × signal spectrum
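Written out, the linear-response ansatz and the resulting spike-train spectrum take the standard form (symbols are assumptions, since the slide equations are images):

    % Linear-response ansatz for the Fourier-transformed spike train:
    \tilde{x}(f) \approx \tilde{x}_{0}(f) + \chi(f)\, \tilde{s}(f)

    % Resulting spike-train power spectrum:
    P_{XX}(f) = P_{0}(f) + \lvert \chi(f) \rvert^{2}\, P_{SS}(f)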

  17. CV = interspike-interval standard deviation / interspike-interval mean

  18. Wiener-Khintchine theorem: the power spectrum is the Fourier transform of the autocorrelation. Integral of S over all frequencies = C(0) = signal variance; integral of C over all time lags = S(0) = signal intensity
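In equations, the theorem and the two sum rules stated on the slide read:

    S(f) = \int_{-\infty}^{\infty} C(\tau)\, e^{-i 2\pi f \tau}\, d\tau ,
    \qquad
    C(\tau) = \int_{-\infty}^{\infty} S(f)\, e^{\, i 2\pi f \tau}\, df

    \int_{-\infty}^{\infty} S(f)\, df = C(0) = \sigma^{2} ,
    \qquad
    \int_{-\infty}^{\infty} C(\tau)\, d\tau = S(0)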

  19. Signal Detection Theory: ROC curve:
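A minimal sketch of how an ROC curve is built from spike counts under the two stimulus conditions; the Poisson count distributions below are made-up placeholders used only to illustrate the construction.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical spike counts in a window T with the stimulus absent / present
    counts_noise  = rng.poisson(lam=8.0,  size=2000)   # placeholder distribution
    counts_signal = rng.poisson(lam=12.0, size=2000)   # placeholder distribution

    # Sweep a count threshold; each threshold gives one (false-alarm, hit) point on the ROC
    thresholds  = np.arange(0, max(counts_noise.max(), counts_signal.max()) + 2)
    false_alarm = np.array([(counts_noise  >= th).mean() for th in thresholds])
    hit_rate    = np.array([(counts_signal >= th).mean() for th in thresholds])

    # Area under the ROC curve, computed directly as the probability that a
    # "signal" count exceeds a "noise" count (ties counted as one half)
    gt  = (counts_signal[None, :] >  counts_noise[:, None]).mean()
    eq  = (counts_signal[None, :] == counts_noise[:, None]).mean()
    auc = gt + 0.5 * eq
    print(f"ROC points: {len(thresholds)},  ROC area = {auc:.3f}")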

  20. Information Theory: actual signal vs. reconstructed signal. The stimulus can be well characterized (electric field); this allows for detailed signal-processing analysis. Gabbiani et al., Nature 384:564-567 (1996); Bastian et al., J. Neurosci. 22:4577-4590 (2002); Krahe et al., J. Neurosci. 22:2374-2382 (2002)

  21. Linear Stimulus Reconstruction • Estimate the filter which, when convolved with the spike train, yields an estimated stimulus “closest” to the real stimulus • Spike train (zero mean) • Estimated stimulus • Mean square error • Optimal Wiener filter
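A minimal sketch of the reconstruction in the frequency domain: the optimal acausal Wiener filter is the stimulus/spike-train cross-spectrum divided by the spike-train spectrum, and the coding fraction summarizes reconstruction quality. Function and variable names, the segmenting scheme, and the regularizer are illustrative assumptions.

    import numpy as np

    def wiener_reconstruction(stimulus, spikes, dt, nseg=20):
        """Reconstruct the stimulus from a spike train with the optimal acausal
        Wiener filter H(f) = P_sx(f) / P_xx(f); spectra are averaged over nseg
        segments so the filter does not trivially overfit a single trial."""
        n = (len(stimulus) // nseg) * nseg
        s = (stimulus[:n] - stimulus[:n].mean()).reshape(nseg, -1)
        x = spikes[:n] / dt
        x = (x - x.mean()).reshape(nseg, -1)                 # zero-mean spike train
        S, X = np.fft.rfft(s, axis=1), np.fft.rfft(x, axis=1)
        P_sx = (S * np.conj(X)).mean(axis=0)                 # averaged cross-spectrum
        P_xx = (np.abs(X) ** 2).mean(axis=0) + 1e-12         # averaged spike-train spectrum
        H = P_sx / P_xx                                      # optimal Wiener filter
        s_est = np.fft.irfft(H[None, :] * X, n=s.shape[1], axis=1)
        mse = np.mean((s - s_est) ** 2)                      # mean square error
        coding_fraction = 1.0 - np.sqrt(mse) / s.std()       # 1 = perfect, 0 = chance
        return s_est.ravel(), coding_fraction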

  22. NOISE smoothes out f-I curves

  23. “stochastic resonance above threshold” Coding fraction versus noise intensity:

  24. Modeling Electroreceptors: the Nelson Model (1996). Input → high-pass filter → stochastic spike generator. The spike generator assigns 0 or 1 spike per EOD cycle: multimodal histograms

  25. Modeling Electroreceptors: the Extended LIFDT Model. Input → high-pass filter → LIFDT → spike train. Parameters: without noise, the receptor fires periodically (suprathreshold dynamics, no stochastic resonance)
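A minimal sketch of a leaky integrate-and-fire neuron with dynamic threshold (LIFDT): the threshold jumps by a fixed amount at every spike and relaxes back between spikes. Parameter names and values are illustrative assumptions rather than the constants of the extended model.

    import numpy as np

    def lifdt(inp, dt=1e-4, tau_v=0.005, tau_s=0.02, theta0=1.0, dtheta=0.3,
              sigma=0.1, v_reset=0.0):
        """LIF with dynamic threshold s: s jumps by dtheta at each spike and
        decays back to theta0 with time constant tau_s between spikes."""
        v, s = v_reset, theta0
        spikes = np.zeros(len(inp))
        for i, I in enumerate(inp):
            v += (-v + I) / tau_v * dt + sigma * np.sqrt(dt / tau_v) * np.random.randn()
            s += (theta0 - s) / tau_s * dt
            if v >= s:                     # spike: reset the voltage, raise the threshold
                spikes[i] = 1.0
                v = v_reset
                s += dtheta
        return spikes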

  26. Signal Detection: count spikes during an interval T (T = 255 msec)

  27. Regularisation: Fano factor and its asymptotic limit (Cox and Lewis, 1966)
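The Fano factor and the asymptotic limit referred to here are (standard definitions; the limit assumes a renewal spike train):

    % Fano factor of the spike count N_T in a window of length T:
    F(T) = \frac{\operatorname{Var}(N_T)}{\langle N_T \rangle}

    % Asymptotic limit for a renewal process (Cox and Lewis, 1966):
    \lim_{T \to \infty} F(T) = CV^{2} = \frac{\operatorname{Var}(I)}{\langle I \rangle^{2}}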

  28. (Circuit schematic) Sensory input → sensory neurons → ELL pyramidal cell → higher brain

  29. Higher Brain Feedback: Open- vs. Closed-Loop Architecture (higher-brain loop time td)

  30. The ELL, first stage of sensory processing: afferent input and delayed feedback from higher brain areas (delayed-feedback neural networks)

  31. (Data panels: Andre’s data; Jelte Bos’ data.) Longtin et al., Phys. Rev. A 41, 6992 (1990)

  32. If one defines the appropriate drift and diffusion terms, one gets a Fokker-Planck equation corresponding to the stochastic differential equation
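The standard correspondence between an Itô stochastic differential equation and its Fokker-Planck equation, presumably what the slide displays, is:

    % Ito SDE with drift a(x) and noise amplitude b(x):
    dx = a(x)\, dt + b(x)\, dW_t

    % Corresponding Fokker-Planck equation for the density P(x,t):
    \frac{\partial P}{\partial t}
      = -\frac{\partial}{\partial x}\bigl[ a(x)\, P \bigr]
        + \frac{1}{2} \frac{\partial^{2}}{\partial x^{2}}\bigl[ b^{2}(x)\, P \bigr]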

  33. One can apply Itô or Stratonovich calculus, as for SDEs. However, applicability is limited if there are complex eigenvalues or the system is strongly nonlinear.

  34. TWO-STATE DESCRIPTION: S = ±1, with two transition probabilities, obtained for example using the Kramers approach
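For a bistable potential U(x) driven by noise of intensity D, the Kramers escape rate out of a well has the standard form (included as a reference point; the slide’s own rates are on the images):

    r_K = \frac{\sqrt{\, U''(x_{\min})\, \lvert U''(x_{\max}) \rvert \,}}{2\pi}
          \exp\!\left( -\frac{\Delta U}{D} \right)

    % x_min: bottom of the well, x_max: top of the barrier, Delta U: barrier height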

  35. DETERMINISTIC DELAYED BISTABILITY Stochastic approach does not yet get the whole picture!

  36. Conclusions • NOISE: many sources, many approaches, exercise caution (Itô vs. Stratonovich) • INFORMATION THEORY: usually makes assumptions, and even when it doesn’t, ask whether the next cell cares • DELAYS: SDDEs have no Fokker-Planck equivalent → tomorrow: a linear-response-like theory

  37. OUTLOOK • Second order field theory for stochastic neural dynamics with delays • Figuring out how intrinsic neuron dynamics (bursting, coincidence detection, etc…) interact with correlated input • Figuring out interaction of noise and bursting • Forget about steady state! • Whatever you do, think of the neural decoder…
