  1. GUM*02 tutorial session, UTSA, San Antonio, Texas. Large-scale realistic modeling of neuronal networks. Mike Vanier, Caltech

  2. Structure of the talk: • General network modeling issues • Details of how networks are modeled in GENESIS

  3. Part 1 • General network modeling issues • Details of how networks are modeled in GENESIS

  4. Why model networks? • Goal: understand the brain • network of networks • Networks implement computations • influence of NN theory • Networks are where the action is!

  5. Why avoid modeling networks? • networks are too complex • dozens of cell types • complex connectivities, interactions • we don’t understand neurons yet • not enough data • want to graduate quickly

  6. Roots of GENESIS • GENESIS: • GEneral • NEural • SImulation • System • network modeling was the original focus

  7. and yet... • most models are still either • single-neuron models • very small networks • or “abstract” network models • maybe a 10:1 ratio or worse • why is this?

  8. Network modeling is hard!!! • need accurate data on: • neuron models (ALL types) • connectivities • inputs • outputs • simplifications needed • scaling issues

  9. More typical scenario • data available for some neurons only • inhibitory neurons? • connectivities only vaguely known • inputs vaguely known if at all • outputs vaguely known if at all • why bother?

  10. Motivations “Abandon all hope, ye who enter here.” • more exploratory, less definitive • refine conceptual model of system • make implicit ideas about function explicit • figure out what data to collect

  11. The process • collect all the data you can!!! • build simplified neuron models • match to data • build model of inputs • build network model • match to data • graduate

  12. Example: piriform cortex • neuron types well established • little physiology for most • connection patterns known • inputs partially known • outputs mostly unknown

  13. Neuron types

  14. Simplification

  15. Physiology: pyramidal neurons [figure: model vs. real]

  16. Physiology: inhibitory neurons

  17. inputs [figure: spike rasters, ISI distribution]

  18. Connectivities 1 [figure: afferents]

  19. Connectivities 2

  20. now the “fun” begins... • pick a network phenomenon to model • PC: response to strong vs. weak shocks • independent of details of the olfactory bulb • relatively simple • adjust parameters to tune the model • leave neuron parameters alone • tune connectivities instead

  21. results? • see my talk tomorrow • hint: I graduated

  22. Part 2 • General network modeling issues • Details of how networks are modeled in GENESIS

  23. GENESIS basics • modeler creates simulation objects • objects send messages to each other • messages contain data • field values • most messages are sent each time step • or once per fixed interval • [spikes break this rule]

  24. neurons • compartmental models of neurons • neuron composed of compartments • compartments are isopotential • channels connect to compartments • voltage-dependent • calcium-dependent • synaptic

  25. setting up the neuron

      create neutral /neuron1
      create compartment /neuron1/soma
      setfield ^ \
          Em { Erest } \            // volts
          Rm { RM / area } \        // Ohms
          Cm { CM * area } \        // Farads
          Ra { RA * len / xarea }   // Ohms
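Slide 25 builds a single isopotential soma; the axial wiring implied by slide 24's "neuron composed of compartments" is not shown anywhere in the deck. A minimal, hedged sketch of adding one dendritic compartment: the /neuron1/dend path and its field values are placeholders, and the AXIAL/RAXIAL pattern below is the usual convention for asymmetric compartments.

      // Hypothetical second compartment; field values are placeholders.
      create compartment /neuron1/dend
      setfield ^ Em { Erest } Rm 1e8 Cm 1e-11 Ra 1e6

      // Axial coupling between soma and dendrite: the soma sends its Vm
      // to the dendrite (AXIAL), and the dendrite sends its axial
      // resistance and Vm back to the soma (RAXIAL).
      addmsg /neuron1/soma /neuron1/dend AXIAL Vm
      addmsg /neuron1/dend /neuron1/soma RAXIAL Ra Vm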

  26. spikes in genesis • spikegen object • monitors Vm of a compartment • when Vm crosses threshold, sends a SPIKE message to its destination • synchan object • receives SPIKE messages • stores spike times in a buffer • generates an alpha-function conductance when a spike arrives

  27. setting up the synchan

      create synchan /neuron1/syn
      setfield ^ \
          gmax 1.0e-9 \    // 1 nS
          Ek 0.0 \
          tau1 0.001 \     // rise time (sec)
          tau2 0.003       // fall time

      // Connect soma to synchan:
      addmsg /neuron1/soma /neuron1/syn VOLTAGE Vm
      addmsg /neuron1/syn /neuron1/soma CHANNEL Gk Ek
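For reference, the "alpha-function" conductance mentioned on slide 26 is shaped by the tau1/tau2 just set. Written generically (GENESIS's internal normalization to gmax is not shown here), a spike arriving at t = 0 produces a dual-exponential conductance

      g(t) ∝ exp(-t/tau2) - exp(-t/tau1),   t >= 0

which reduces to the classic alpha function g(t) ∝ (t/tau) exp(-t/tau) in the limit tau1 = tau2 = tau.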

  28. setting up the spikegen

      // Create and connect spike detector:
      create spikegen /neuron1/spike
      setfield ^ thresh -0.020 abs_refract 0.002
      addmsg /neuron1/soma /neuron1/spike INPUT Vm

  29. connecting two neurons

      // Assume we have neuron2 like neuron1
      addmsg /neuron1/spike /neuron2/syn SPIKE

      // Set synaptic weight and delay:
      setfield /neuron2/syn \
          synapse[0].weight 1.0 \
          synapse[0].delay 0.001   // 1 msec

      // That’s all there is to it!

  30. building networks • Why not just do this for all synapses? • 100-1000 neurons, 10,000-100,000 synapses... • gets pretty tedious • faster way: large-scale connection commands • volumeconnect [planarconnect] • volumedelay [planardelay] • volumeweight [planarweight]

  31. volumeconnect

      volumeconnect source_elements destination_elements \
          -relative \
          -sourcemask {box, ellipsoid} x1 y1 z1 x2 y2 z2 \
          -sourcehole {box, ellipsoid} x1 y1 z1 x2 y2 z2 \
          -destmask {box, ellipsoid} x1 y1 z1 x2 y2 z2 \
          -desthole {box, ellipsoid} x1 y1 z1 x2 y2 z2 \
          -probability p

  32. volumedelay

      volumedelay sourcepath [destination_path] \
          -fixed delay \
          -radial conduction_velocity \
          -add \
          -uniform scale \
          -gaussian stdev maxdev \
          -exponential mid max \
          -absoluterandom

  33. volumeweight

      volumeweight sourcepath [destination_path] \
          -fixed weight \
          -decay decay_rate max_weight min_weight \
          -uniform scale \
          -gaussian stdev maxdev \
          -exponential mid max \
          -absoluterandom
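Putting slides 31-33 together, a hedged sketch of wiring up a population with these commands. The population path /net/cell[], the spikegen/synchan locations, and every numerical value are assumptions for illustration, and the cells are assumed to already have x, y, z coordinates (e.g. assigned with createmap) so the geometric masks mean something:

      // Connect each cell's spike output to synchans on cells whose
      // somata fall within ~100 microns of it, with probability 0.1.
      // With -relative, the destmask is centered on each source cell.
      volumeconnect /net/cell[]/soma/spike /net/cell[]/dend/syn \
          -relative \
          -sourcemask box -1 -1 -1 1 1 1 \
          -destmask ellipsoid 0 0 0 100e-6 100e-6 100e-6 \
          -probability 0.1

      // Delays from radial distance at an assumed 0.5 m/s conduction
      // velocity, and a fixed weight of 1 on the connections just made:
      volumedelay /net/cell[]/soma/spike -radial 0.5
      volumeweight /net/cell[]/soma/spike -fixed 1.0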

  34. note on connection commands • mainly useful for simple cases • more realistic cases require more control • GENESIS script language makes it easy to write your own connection commands
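To make "write your own connection commands" concrete, a rough sketch of a hand-rolled probabilistic connection loop in the GENESIS script language. The population paths, the size, the probability, and the use of {rand 0 1} for a uniform random number are all assumptions for illustration, not from the slides:

      int n = 100        // population size (illustrative)
      float p = 0.1      // connection probability (illustrative)
      int i
      int j

      for (i = 0; i < n; i = i + 1)
          for (j = 0; j < n; j = j + 1)
              // skip self-connections; connect the rest with probability p
              if (i != j && {rand 0 1} < p)
                  addmsg /net/cell[{i}]/soma/spike /net/cell[{j}]/dend/syn SPIKE
              end
          end
      end

      // Weights and delays can then be set per synapse as on slide 29,
      // or with volumeweight / volumedelay afterwards.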

  35. output • Xodus • graphical output • dump neuron data to files • binary files readable by “xview”
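A hedged sketch of the "dump neuron data to files" route, using the standard asc_file object to record Vm as plain text; the paths, filename, field settings, and clock number are illustrative:

      create neutral /data
      create asc_file /data/soma_Vm
      setfield ^ filename "soma_Vm.txt" leave_open 1 flush 1

      // Send the soma's membrane potential to the file once per sample step:
      addmsg /neuron1/soma /data/soma_Vm SAVE Vm

      setclock 1 1.0e-4           // 0.1 ms sampling interval (illustrative)
      useclock /data/soma_Vm 1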

  36. conclusions • network modeling is • fun • fascinating • fundamental • frustrating! • NOT for the easily discouraged!
