
Kinetic Theory for the Dynamics of Fluctuation-Driven Neural Systems




  1. Kinetic Theory for the Dynamics of Fluctuation-Driven Neural Systems David W. McLaughlin Courant Institute & Center for Neural Science New York University http://www.cims.nyu.edu/faculty/dmac/ Toledo – June ‘06

  2. Happy Birthday, Peter & Louis

  3. Kinetic Theory for the Dynamics of Fluctuation-Driven Neural Systems In collaboration with: David Cai, Louis Tao, Michael Shelley, Aaditya Rangan

  4. Visual Pathway: Retina --> LGN --> V1 --> Beyond

  5. Integrate-and-Fire Representation
$\tau\,\partial_t v = -(v - V_R) - g\,(v - V_E)$
$\sigma\,\partial_t g = -g + \sum_l f\,\delta(t - t_l) + (S_a/N)\sum_{l,k}\delta(t - t_l^k)$
plus spike firing and reset: $v(t_k) = 1$; $v(t = t_k^+) = 0$
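As a concrete sketch of this integrate-and-fire dynamics, the forward-Euler loop below evolves a single conductance-driven neuron under a Poisson input train. Only the threshold (1) and reset (0) come from the slide; all other parameter values, and the restriction to one neuron, are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed for this sketch); threshold 1 and
# reset 0 follow the reset rule on the slide.
V_R, V_E, V_T = 0.0, 14.0 / 3.0, 1.0   # reset, excitatory reversal, threshold
tau, sigma = 20e-3, 5e-3               # membrane / synaptic time constants (s)
f, nu0 = 0.001, 200.0                  # conductance jump strength, input rate (Hz)
dt, T = 1e-4, 2.0                      # time step and total simulated time (s)

v, g = V_R, 0.0
spikes, g_sum = [], 0.0
n_steps = int(T / dt)
for step in range(n_steps):
    # each incoming Poisson spike makes g jump by f/sigma
    g += rng.poisson(nu0 * dt) * f / sigma
    # forward-Euler step of: tau v' = -(v - V_R) - g (v - V_E); sigma g' = -g
    v += (dt / tau) * (-(v - V_R) - g * (v - V_E))
    g -= (dt / sigma) * g
    g_sum += g
    if v >= V_T:                        # spike: record the time, reset v
        spikes.append(step * dt)
        v = V_R

g_mean = g_sum / n_steps                # time-averaged conductance, ~ nu0 * f
rate = len(spikes) / T                  # firing rate in Hz
```

With these numbers the mean conductance settles near $\nu_0 f$, while spikes (if any) are produced by conductance fluctuations rather than a suprathreshold mean, which is the fluctuation-driven regime the talk is about.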

  6. Nonlinearity from spike-threshold: Whenever V(x,t) = 1, the neuron "fires", the spike time is recorded, and V(x,t) is reset to 0

  7. The “primary visual cortex (V1)” is a “layered structure”, with O(10,000) neurons per square mm, per layer.

  8. Map of Orientation Preference O(10^4) neurons per mm^2, with both regular & random patterns of neurons’ preferences

  9. Lateral Connections and Orientation -- Tree Shrew Bosking, Zhang, Schofield & Fitzpatrick J. Neuroscience, 1997

  10. Line-Motion-Illusion LMI

  11. Coarse-Grained Asymptotic Representations Needed for “Scale-up”: larger lateral area; multiple layers

  12. First, tile the cortical layer with coarse-grained (CG) patches

  13. Coarse-Grained Reductions for V1 Average firing rate models [Cowan & Wilson (’72); …; Shelley & McLaughlin (’02)] Average firing rate of an excitatory (inhibitory) neuron, within the coarse-grained patch at location x in the cortical layer: m_α(x,t), α = E,I

  14. Cortical networks have very “noisy” dynamics: strong temporal fluctuations on the synaptic timescale; fluctuation-driven spiking

  15. Experimental Observation: Fluctuations in Orientation Tuning (cat data from Ferster’s lab) Ref: Anderson, Lampl, Gillespie & Ferster, Science, 1968–72 (2000)

  16. Fluctuation-driven spiking (very noisy dynamics, on the synaptic time scale) Solid: average (over 72 cycles); Dashed: 10 temporal trajectories

  17. To accurately and efficiently describe these networks requires that fluctuations be retained in a coarse-grained representation. • “Pdf” representations – $\rho_\alpha(v,g;x,t)$, α = E,I – will retain fluctuations • But will not be very efficient numerically • Needed – a reduction of the pdf representations which retains means & variances • Kinetic Theory provides this representation Ref: Cai, Tao, Shelley & McLaughlin, PNAS, pp 7757–7762 (2004)

  18. Kinetic Theory begins from PDF representations $\rho_\alpha(v,g;x,t)$, α = E,I • Knight & Sirovich • Nykamp & Tranchina, Neural Comp (2001) • Haskell, Nykamp & Tranchina, Network (2001)

  19. For convenience of presentation, I’ll sketch the derivation for a single CG patch, with 200 excitatory Integrate & Fire neurons • First, replace the 200 neurons in this CG cell by an equivalent pdf representation • Then derive kinetic theory from the pdf representation • The results extend to interacting CG cells which include inhibition, as well as different cell types such as “simple” & “complex” cells.

  20. N excitatory neurons (within one CG cell) • Random coupling throughout the CG cell; • AMPA synapses (with a short time scale σ)
$\tau\,\partial_t v_i = -(v_i - V_R) - g_i\,(v_i - V_E)$
$\sigma\,\partial_t g_i = -g_i + \sum_l f\,\delta(t - t_l) + (S_a/N)\sum_{l,k}\delta(t - t_l^k)$
plus spike firing and reset: $v_i(t_i^k) = 1$; $v_i\big(t = (t_i^k)^+\big) = 0$

  21. N excitatory neurons (within one CG cell) • Random coupling throughout the CG cell; • AMPA synapses (with time scale σ)
$\tau\,\partial_t v_i = -(v_i - V_R) - g_i\,(v_i - V_E)$
$\sigma\,\partial_t g_i = -g_i + \sum_l f\,\delta(t - t_l) + (S_a/N)\sum_{l,k}\delta(t - t_l^k)$
$\rho(g,v,t) \simeq N^{-1}\sum_{i=1}^{N} E\{\delta[v - v_i(t)]\,\delta[g - g_i(t)]\}$,
with the expectation E taken over the Poisson spike train $\{t_l\}$

  22. $\tau\,\partial_t v_i = -(v_i - V_R) - g_i\,(v_i - V_E)$, $\sigma\,\partial_t g_i = -g_i + \sum_l f\,\delta(t - t_l) + (S_a/N)\sum_{l,k}\delta(t - t_l^k)$
Evolution of the pdf $\rho(g,v,t)$, assuming (i) $N \gg 1$, and (ii) the total input to each neuron is a (modulated) Poisson spike train:
$\partial_t \rho = \tau^{-1}\,\partial_v\{[(v - V_R) + g\,(v - V_E)]\,\rho\} + \partial_g\{(g/\sigma)\,\rho\} + \nu_0(t)\,[\rho(v, g - f/\sigma, t) - \rho(v,g,t)] + N\,m(t)\,[\rho(v, g - S_a/(N\sigma), t) - \rho(v,g,t)]$,
where $\nu_0(t)$ = modulated rate of the incoming Poisson spike train; $m(t)$ = average firing rate of the neurons in the CG cell $= \int J^{(v)}(v,g;\rho)\big|_{v=1}\,dg$, and where the flux $J^{(v)}(v,g;\rho) = -\tau^{-1}\,[(v - V_R) + g\,(v - V_E)]\,\rho$

  23. $\partial_t \rho = \tau^{-1}\,\partial_v\{[(v - V_R) + g\,(v - V_E)]\,\rho\} + \partial_g\{(g/\sigma)\,\rho\} + \nu_0(t)\,[\rho(v, g - f/\sigma, t) - \rho(v,g,t)] + N\,m(t)\,[\rho(v, g - S_a/(N\sigma), t) - \rho(v,g,t)]$
For $N \gg 1$, $f \ll 1$, $\nu_0 f = O(1)$, expanding the jump terms to second order gives:
$\partial_t \rho = \tau^{-1}\,\partial_v\{[(v - V_R) + g\,(v - V_E)]\,\rho\} + \partial_g\{([g - G(t)]/\sigma)\,\rho\} + (\sigma_g^2/\sigma)\,\partial_{gg}\rho + \dots$
where $\sigma_g^2 = \nu_0(t)\,f^2/(2\sigma) + m(t)\,S_a^2/(2N\sigma)$ and $G(t) = \nu_0(t)\,f + m(t)\,S_a$
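The drift mean G(t) and variance σ_g² of the diffusion approximation depend only on the input rate, the population firing rate, and the synaptic parameters, so they transcribe directly into code. Function name and the example values below are my own, chosen purely for illustration.

```python
def conductance_stats(nu0, m, f, S_a, N, sigma):
    """Drift mean G(t) and variance sigma_g^2 of the conductance in the
    diffusion approximation:
      G       = nu0*f + m*S_a
      sigma_g2 = nu0*f^2/(2*sigma) + m*S_a^2/(2*N*sigma)
    """
    G = nu0 * f + m * S_a
    sigma_g2 = nu0 * f**2 / (2 * sigma) + m * S_a**2 / (2 * N * sigma)
    return G, sigma_g2

# Example (assumed values): 1000 Hz Poisson input, 10 Hz population rate,
# f = 0.01, S_a = 0.05, N = 100 neurons, sigma = 5 ms.
G, var = conductance_stats(1000.0, 10.0, 0.01, 0.05, 100, 0.005)
```

Note the 1/N scaling of the cortical contribution to the variance: for large patches the fluctuations are dominated by the external Poisson drive.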

  24. Kinetic Theory Begins from Moments
• $\rho(g,v,t)$
• $\rho^{(g)}(g,t) = \int \rho(g,v,t)\,dv$
• $\rho^{(v)}(v,t) = \int \rho(g,v,t)\,dg$
• $\mu_1^{(v)}(v,t) = \int g\,\rho(g,t \mid v)\,dg$, where $\rho(g,v,t) = \rho(g,t \mid v)\,\rho^{(v)}(v,t)$
Starting from $\partial_t \rho = \tau^{-1}\,\partial_v\{[(v - V_R) + g\,(v - V_E)]\,\rho\} + \partial_g\{([g - G(t)]/\sigma)\,\rho\} + (\sigma_g^2/\sigma)\,\partial_{gg}\rho + \dots$, first, integrating the $\rho(g,v,t)$ equation over $v$ yields:
$\sigma\,\partial_t \rho^{(g)} = \partial_g\{[g - G(t)]\,\rho^{(g)}\} + \sigma_g^2\,\partial_{gg}\rho^{(g)}$
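The moment definitions can be made concrete by estimating them from samples: given a cloud of (v_i, g_i) pairs standing in for ρ(g,v,t) at a fixed time, the marginals are histograms and μ_1^(v) is a per-bin conditional mean of g. The synthetic joint below is an assumption purely for illustration, built so that E[g | v] is known in closed form.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic joint samples: g depends linearly on v, so the conditional
# mean mu_1^(v)(v) = E[g | v] = 0.5 + 0.3 v exactly (illustrative choice).
v = rng.uniform(0.0, 1.0, 200_000)
g = 0.5 + 0.3 * v + 0.1 * rng.standard_normal(v.size)

# Marginal rho^(v) as a normalized histogram over v-bins.
bins = np.linspace(0.0, 1.0, 21)
rho_v, _ = np.histogram(v, bins=bins, density=True)

# Conditional first moment mu_1^(v): average of g within each v-bin.
idx = np.digitize(v, bins) - 1
mu1 = np.array([g[idx == k].mean() for k in range(len(bins) - 1)])

centers = 0.5 * (bins[:-1] + bins[1:])   # bin midpoints, for comparison
```

Here `rho_v` recovers the flat v-marginal and `mu1` tracks 0.5 + 0.3·v across the bins, which is exactly the conditional-mean object the kinetic theory evolves.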

  25. Fluctuations in g are Gaussian
$\sigma\,\partial_t \rho^{(g)} = \partial_g\{[g - G(t)]\,\rho^{(g)}\} + \sigma_g^2\,\partial_{gg}\rho^{(g)}$
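This claim can be checked by Monte Carlo: the ρ^(g) equation is the Fokker-Planck equation of an Ornstein-Uhlenbeck process, dg = -(g - G)/σ dt + sqrt(2σ_g²/σ) dW, whose stationary density is Gaussian with mean G and variance σ_g². A sketch with illustrative (assumed) constants:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative constants (assumptions for this sketch).
G, sig_g2, sigma = 1.0, 0.25, 5e-3    # drift mean, variance, synaptic time (s)
dt, n_steps, n_paths = 1e-4, 5000, 4000

# Euler-Maruyama for dg = -(g - G)/sigma dt + sqrt(2*sig_g2/sigma) dW,
# the SDE whose Fokker-Planck equation is the rho^(g) equation above.
g = np.full(n_paths, G)
for _ in range(n_steps):
    g += (-(g - G) / sigma * dt
          + np.sqrt(2.0 * sig_g2 / sigma * dt) * rng.standard_normal(n_paths))

# After many relaxation times sigma, the ensemble statistics should match
# the Gaussian prediction: mean G, variance sig_g2.
sample_mean, sample_var = g.mean(), g.var()
```

The simulated horizon (0.5 s) is 100 relaxation times σ, so the ensemble is effectively stationary and its mean and variance land on the Gaussian prediction up to sampling and O(dt) discretization error.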

  26. Integrating the $\rho(g,v,t)$ equation over $g$ yields:
$\partial_t \rho^{(v)} = \tau^{-1}\,\partial_v[(v - V_R)\,\rho^{(v)} + \mu_1^{(v)}\,(v - V_E)\,\rho^{(v)}]$
Integrating the $[g\,\rho(g,v,t)]$ equation over $g$ yields an equation for $\mu_1^{(v)}(v,t) = \int g\,\rho(g,t \mid v)\,dg$, where $\rho(g,v,t) = \rho(g,t \mid v)\,\rho^{(v)}(v,t)$

  27. t 1(v) = - -1[1(v) – G(t)] + -1{[(v – VR) + 1(v)(v-VE)] v 1(v)} +2(v)/ ((v)) v [(v-VE) (v)] + -1(v-VE) v2(v) where 2(v) = 2(v) – (1(v))2 . Closure: (i) v2(v) = 0; (ii) 2(v) = g2 One obtains:

  28. t (v) = -1v [(v – VR) (v)+ 1(v)(v-VE) (v)] t 1(v) = - -1[1(v) – G(t)] + -1{[(v – VR) + 1(v)(v-VE)] v 1(v)} + g2 / ((v)) v [(v-VE) (v)] Together with a diffusion eq for (g)(g,t):  t (g) =g {[g – G(t)]) (g)} + g2 gg (g)

  29. Fluctuation-Driven Dynamics [Figure: pdf of v, kinetic theory vs. I&F (solid) vs. Fokker-Planck, and firing rate (Hz), with the mean-driven limit showing hard thresholding; N = 75, σ = 5 msec, S = 0.05, f = 0.01]

  30. Bistability and Hysteresis • Network of simple, excitatory-only neurons, N = 16 • Relatively strong cortical coupling [Figure: mean-driven vs. fluctuation-driven firing-rate curves]

  31. Bistability and Hysteresis • Network of simple, excitatory-only neurons, N = 16 • Relatively strong cortical coupling [Figure: mean-driven case]

  32. Computational Efficiency • For statistical accuracy in these CG patch settings, Kinetic Theory is 10^3–10^5 times more efficient than I&F

  33. Realistic Extensions Extensions to coarse-grained local patches, to excitatory and inhibitory neurons, and to neurons of different types (simple & complex). The pdf then takes the form $\rho_{\alpha,\lambda}(v,g;x,t)$, where x is the coarse-grained label, α = E,I, and λ labels the cell type

  34. Three Dynamic Regimes of Cortical Amplification: (1) Weak cortical amplification: no bistability/hysteresis; (2) Near-critical cortical amplification; (3) Strong cortical amplification: bistability/hysteresis [Figure: excitatory cells shown]

  35. Firing rate vs. input conductance for 4 networks with varying pN: 25 (blue), 50 (magenta), 100 (black), 200 (red). Hysteresis occurs for pN = 100 and 200. Fixed synaptic coupling S_exc/pN

  36. Summary • Kinetic Theory is a numerically efficient (10^3–10^5 times more efficient than I&F) and remarkably accurate method for “scale-up” Ref: PNAS, pp 7757–7762 (2004) • Kinetic Theory introduces no new free parameters into the model, and has a large dynamic range, from the rapidly firing “mean-driven” regime to a fluctuation-driven regime. • Sub-networks of point neurons can be embedded within kinetic theory to capture spike-timing statistics, with a range from test neurons to fully interacting sub-networks. Ref: Tao, Cai, McLaughlin, PNAS (2004)

  37. Too good to be true? What’s missing? • First, the zeroth moment is more accurate than the first moment, as in many moment closures

  38. Too good to be true? What’s missing? • Second, again as in many moment closures, existence can fail (Tranchina et al., 2006). • That is, at low but realistic firing rates, the equations are too rigid to have steady-state solutions which satisfy the boundary conditions. • Diffusion (in v) fixes this existence problem by introducing boundary layers

  39. Too good to be true? What’s missing? • But a far more serious problem • Kinetic Theory does not capture detailed “spike-timing” information

  40. Why does the kinetic theory (a Boltzmann-type approach in general) not work?

  41. Too good to be true? What’s missing? • But a far more serious problem • Kinetic Theory does not capture detailed “spike-timing” statistics

  42. Too good to be true? What’s missing? • But a far more serious problem • Kinetic Theory does not capture detailed “spike-timing” statistics • And most likely the cortex works, on very short time scales, through neurons correlated by detailed spike timing. • Take, for example, the line-motion illusion

  43. Line-Motion-Illusion LMI

  44. • Direct ‘naïve’ coarse graining may not suffice: • Priming mechanism relies on recruitment • Recruitment relies on locally correlated cortical firing events • Naïve ensemble average destroys locally correlated events [Figure: stimulus, model voltage (space vs. time, 128 trials), model NMDA, at 0% vs. 40% ‘coarse’ graining]
