
Modeling Small Neural Networks






Presentation Transcript


  1. Modeling Small Neural Networks Baktash Babadi baktash@ipm.ir SCS, IPM Fall 2004

  2. References • Koch & Segev (1998) Methods in Neuronal Modeling, 2nd ed., MIT Press • Abeles (1991) Corticonics: Neural Circuits of the Cerebral Cortex, Cambridge University Press • Ermentrout (1998) Neural networks as spatio-temporal pattern-forming systems, Rep. Prog. Phys. • …

  3. Studying the Small Neural Networks (1) • The dynamics of the individual neurons is not taken to be important • The strengths of the synaptic connections are the important parameters • The dynamical state of the network is the focus of attention

  4. Studying the Small Neural Networks (2) • Network architecture: [diagram: neurons Vi and Vj coupled by connection weights Wij / Jij]
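
Slide 4 gives only the diagram labels; as a hedged reading (an assumption, not a transcribed equation), the architecture is consistent with the standard firing-rate model that the steady-state and nullcline analyses on the later slides presuppose:

\tau \frac{dV_i}{dt} = -V_i + f\Big( \sum_j W_{ij} V_j + I_i \Big)

where V_i is the activity of neuron i, W_{ij} is the weight of the connection from neuron j to neuron i, f is a sigmoidal gain function, and I_i is an external input.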

  5. Single Neuron Network (1) • The neuron model: • Architecture: [diagram: a single neuron with a self-connection of weight W]

  6. Single Neuron Network (2) • The model: • Parameters: • Steady states:

  7. Single Neuron Network (3) • Steady states:

  8. Single Neuron Network (4) • Bistability:
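
Slides 6-8 list the model, its parameters, and its steady states without transcribed equations. The sketch below is a minimal illustration, assuming the rate equation above with a single self-connection of weight W and a sigmoidal gain (the gain function and all parameter values are illustrative assumptions): the steady states are the solutions of V = f(WV + I), and for sufficiently strong self-excitation three solutions appear (two stable, one unstable), which is the bistability of slide 8.

import numpy as np

def f(x, gain=8.0, theta=0.5):
    """Assumed sigmoidal gain function."""
    return 1.0 / (1.0 + np.exp(-gain * (x - theta)))

def steady_states(W, I=0.0, n_grid=20001):
    """Locate the steady states of tau dV/dt = -V + f(W*V + I)
    by finding the sign changes of g(V) = f(W*V + I) - V on a grid."""
    V = np.linspace(-0.5, 1.5, n_grid)
    g = f(W * V + I) - V
    crossings = np.where(np.sign(g[:-1]) != np.sign(g[1:]))[0]
    return V[crossings]

print(steady_states(W=0.5))   # weak self-excitation: one steady state
print(steady_states(W=1.5))   # strong self-excitation: three crossings (bistable)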

  9. Two Neuron Networks (1) • Architecture: [diagram: neurons V1 and V2 with self-connections W11, W22 and cross-connections W12, W21] • Equations:

  10. Two Neuron Networks (2) • Nullclines: • Shape of Nullclines: • Sigmoid: • Cubic
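
Slide 10's nullcline plots are not transcribed. The sketch below is a minimal illustration under the same assumed rate model and sigmoid as above: the V1-nullcline is the set of points where V1 = f(W11 V1 + W12 V2 + I1), and likewise for V2; tracing them as zero-level contours shows the sigmoid- or cubic-shaped curves whose intersections are the fixed points counted on slide 11. The weights and inputs are illustrative assumptions only.

import numpy as np
import matplotlib.pyplot as plt

def f(x, gain=8.0, theta=0.5):
    """Assumed sigmoidal gain function."""
    return 1.0 / (1.0 + np.exp(-gain * (x - theta)))

# Illustrative parameters: self-excitation plus mutual inhibition (assumptions).
W11, W12, W21, W22 = 1.5, -1.0, -1.0, 1.5
I1, I2 = 0.5, 0.5

V1, V2 = np.meshgrid(np.linspace(-0.2, 1.2, 400), np.linspace(-0.2, 1.2, 400))
dV1 = -V1 + f(W11 * V1 + W12 * V2 + I1)   # dV1/dt (up to the factor 1/tau)
dV2 = -V2 + f(W21 * V1 + W22 * V2 + I2)   # dV2/dt (up to the factor 1/tau)

# The zero-level contours are the nullclines; their intersections are the fixed points.
plt.contour(V1, V2, dV1, levels=[0.0], colors="C0")
plt.contour(V1, V2, dV2, levels=[0.0], colors="C1")
plt.xlabel("V1"); plt.ylabel("V2")
plt.show()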

  11. Two Neuron Networks (2) • Analyzed by Beer (1995), Ermentrout (1998): • Minimum number of fixed points: 1 • Maximum number of fixed points: 9 • The cubic nullclines have three branches (outer/inner/outer): • Intersections (fixed points): • Outer-outer: stable • Inner-inner: unstable • Inner-outer: saddle point • In general, 13 dynamical states are possible in the two neuron network.

  12. Two Neuron Networks (3) • Example 1: Two stable fixed points, one saddle point:

  13. Two Neuron Networks (4) • Example: 2 saddle points, 3 stable fixed points:

  14. Two Neuron Networks (5) • Example: 4 stable fixed points, 4 unstable fixed points, 1 saddle point:

  15. Two Neuron Networks (6) • Example: 4 stable fixed points, 4 unstable fixed points, 1 saddle point:

  16. Two Neuron Networks (7) • Example: Bistability

  17. Three Neuron Network (1) • Architecture: [diagram: three mutually coupled neurons V1, V2, V3] • Equations:

  18. Three Neuron Network (2) • Chaotic Behavior is possible:
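
Slides 17-18 give the architecture and the remark that chaos is possible, but no equations. The sketch below simply integrates a three-neuron version of the assumed rate model so that trajectories can be inspected; the weight matrix, inputs, and initial condition are illustrative assumptions, and whether a given setting is actually chaotic would have to be verified (e.g. by testing sensitivity to initial conditions), not taken from a single run.

import numpy as np

def f(x, gain=8.0, theta=0.5):
    """Assumed sigmoidal gain function."""
    return 1.0 / (1.0 + np.exp(-gain * (x - theta)))

def simulate(W, I, V0, tau=1.0, dt=0.01, steps=20000):
    """Forward-Euler integration of tau dV/dt = -V + f(W @ V + I)."""
    V = np.array(V0, dtype=float)
    trace = np.empty((steps, len(V)))
    for k in range(steps):
        V += (dt / tau) * (-V + f(W @ V + I))
        trace[k] = V
    return trace

# Illustrative cyclic excitation/inhibition pattern (values are assumptions).
W = np.array([[ 2.0, -1.5,  0.0],
              [ 0.0,  2.0, -1.5],
              [-1.5,  0.0,  2.0]])
I = np.array([0.4, 0.4, 0.4])

trace = simulate(W, I, V0=[0.1, 0.2, 0.3])
print(trace[-5:].round(3))   # inspect the tail of the trajectory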

  19. Netlets (1) • In netlets (Anninos et al., 1970) the number of neurons is high and the number of connections per neuron is low. • Neuron model: • Architecture: • The number of neurons (N) is high • Each neuron receives input from a small number of other neurons (n = 10)

  20. Netlets (2) • Dynamics: • All the synaptic weights are equal (W = 1). • Time is taken to be discrete: we observe the system at time steps equal to the synaptic delay. • If a neuron fires at time k, it is in its refractory period at time k+1. • Since the number of neurons is high, statistical methods should be used

  21. Netlets (3)

  22. Netlets (4)

  23. Netlets (5) • For N=10, W=1, th=5: [plot: network activity vs. time]

  24. Netlets (6) • For N=10, W=1, th=3: [plot: network activity vs. time]

  25. Netlets (7) • For N=10, W=1, th=1: [plot: network activity vs. time]

  26. Netlets (8) • The iterative map: [plots of αk+1 vs. αk for Th = 5, 3, 1]
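
Slides 21-26 show only the activity plots and the map. Below is a minimal reconstruction of the kind of iterative map that the assumptions of slide 20 lead to (equal weights W, discrete time, one-step refractoriness, statistically independent inputs): a neuron fires at step k+1 if it did not fire at step k and at least ceil(th/W) of its n inputs were active, which gives a binomial sum in the active fraction αk. The exact map on the slides may differ; this is a sketch under those assumptions.

from math import ceil, comb

def netlet_map(alpha, n=10, W=1.0, th=5.0):
    """One step of the netlet map: alpha_{k+1} = (1 - alpha_k) * P[input >= th],
    with the input modeled as Binomial(n, alpha_k) spikes of weight W."""
    m = ceil(th / W)
    p_fire = sum(comb(n, l) * alpha**l * (1.0 - alpha)**(n - l)
                 for l in range(m, n + 1))
    return (1.0 - alpha) * p_fire

def iterate(alpha0, th, steps=15):
    alphas = [alpha0]
    for _ in range(steps):
        alphas.append(netlet_map(alphas[-1], th=th))
    return alphas

for th in (5, 3, 1):   # the three thresholds of slides 23-25
    print(th, [round(a, 3) for a in iterate(0.3, th)])

With these parameters the activity dies out for th = 5 and is sustained for the lower thresholds.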

  27. Question • How does the dynamics of a netlet change if only a fraction c of the neurons that fired at time k (rather than all of them) remain in the refractory period at time k+1?

  28. Motor Pattern Generators • The first article in computational neuroscience: • Brown TG: On the nature of the fundamental activity of the nervous centres; together with an analysis of the conditioning of rhythmic activity in progression, and a theory of the evolution of function in the nervous system. J Physiol 1914, 48:18-46.

  29. Reflex Loops vs. Central Rhythm Generation • Are the rhythmic motor patterns due to: • Reflex loops? • Or a center that generates rhythms spontaneously?

  30. Central Pattern Generators (CPG) • Central pattern generators (CPGs) are neural networks that can endogenously (i.e. without rhythmic sensory or central input) produce rhythmic patterned outputs; these networks underlie the production of most rhythmic motor patterns (Marder and Calabrese, 1996; Stein et al., 1997). • The first modern evidence that rhythmic motor patterns are centrally generated was the demonstration that the locust nervous system, when isolated from the animal, could produce rhythmic output resembling that observed during flight.

  31. Fictive Movement • A fictive motor pattern is a pattern of motor neuron firing that would, if the motor neurons were still attached to their muscles, result in the motor pattern in question being produced.

  32. Mechanisms of Rhythm Generation • 1) Rhythms driven by pacemaker neurons • Vertebrate respiratory system • The pyloric circuit of the crustacean stomatogastric ganglion • 2) Rhythms resulting from the synaptic interactions of the neurons • Usually emerge from mutually inhibitory neurons (reciprocal inhibition) • Called “half-center oscillators”

  33. Half Center Oscillators • The mechanisms of transition between the active and inhibited phases: • Spike-frequency adaptation • Escape from inhibition
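
As a minimal sketch of the half-center mechanism, the code below uses a generic Matsuoka-style pair of rate units: each unit inhibits the other and carries a slow adaptation variable, so the active side fatigues and the suppressed side is released, producing alternation. The equations and parameter values are a textbook-style construction chosen for illustration, not the specific CPG models cited on these slides.

import numpy as np

def half_center(T=100.0, dt=0.01, s=1.0, a=2.5, b=2.5, tau_r=1.0, tau_a=12.0):
    """Two mutually inhibitory units with slow adaptation (assumed form):
    tau_r dx_i/dt = -x_i - a*y_j - b*v_i + s,  y_i = max(x_i, 0)
    tau_a dv_i/dt = -v_i + y_i
    """
    steps = int(T / dt)
    x = np.array([0.1, 0.0])
    v = np.zeros(2)
    out = np.empty((steps, 2))
    for k in range(steps):
        y = np.maximum(x, 0.0)
        dx = (-x - a * y[::-1] - b * v + s) / tau_r   # each unit is inhibited by the other
        dv = (-v + y) / tau_a                         # slow adaptation to the unit's own output
        x += dt * dx
        v += dt * dv
        out[k] = y
    return out

y = half_center()
print(y[::500].round(2))   # the two columns should burst in alternation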

  34. Cellular Properties of Neurons in CPGs • a) Spontaneous rhythmic firing • b) Plateau firing • c) Escape from inhibition • d) Post-inhibitory rebound • e) Delayed post-inhibitory rebound

  35. Example: Stomatogastric (STG) Nervous System of Crustaceans • Marder & Abbott 1997, Modeling small neural networks:

  36. Source

  37. Precise Firing Sequences (PFS) • Prut et al, 1998:

  38. Synfire Chains • The reproducibility of PFSs implies that there are synchronous pools of neurons in the cortex (Abeles 1991).

  39. The Notion of Synfire Chains • Based on anatomical and physiological data, Abeles (1991) proposed the synfire model, which is: • A locally feed-forward neural network • With convergent/divergent connections between successive pools [diagram: a chain of pools joined by links] • Propagation of synchronous patterns in synfire chains is an explanation for precise firing sequences

  40. Dynamics of Firing Patterns in Synfire Chains (1) • If the synfire notion is true, the neural activity in successive pools must tend to synchronize and remain synchronous. • Herrmann, Hertz, Prügel-Bennett (1996): • In a simple synfire model of non-leaky integrate-and-fire neurons, the firing patterns in successive pools tend to synchronize.

  41. Pulse Packets in Synfire Networks • In order to study spike synchronization in synfire networks, Aertsen et al. (1995) introduced the notion of a pulse packet: • A pulse packet is an index of the activity of a neuron pool and is defined by two parameters: • 1) The number of neurons in the pool that fire • 2) The standard deviation of their firing times
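
As a small illustration of the definition above, the snippet below builds one pulse packet from its two parameters (the number of spikes and their temporal spread) and reads them back; the Gaussian shape of the packet is an assumption made here for illustration.

import numpy as np

rng = np.random.default_rng(0)

def make_pulse_packet(a=80, sigma=1.0, t_center=20.0):
    """Draw `a` spike times (ms) with standard deviation `sigma` around t_center."""
    return rng.normal(t_center, sigma, size=a)

def packet_parameters(spike_times):
    """Recover the two pulse-packet parameters: spike count and temporal spread."""
    return len(spike_times), float(spike_times.std())

spikes = make_pulse_packet(a=80, sigma=1.0)
print(packet_parameters(spikes))   # roughly (80, 1.0)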

  42. Dynamics of Firing Patterns in Feed-forward Networks (1) • Diesmann, Gewaltig, Aertsen (1999) • Bistability in a phase-plane portrait: • Dense and highly synchronous pulse packets will propagate successfully. • Sparse and weakly synchronous ones will eventually dissipate.

  43. Dynamics of Firing Patterns in Feed-forward Networks (2) • Gewaltig, Diesmann, Aertsen (2001) • Survival probability of pulse packets in single trials:

  44. Dynamics of Firing Patterns in Synfire Chains (3) • Cateau & Fukai (2001): • Using Fokker-Planck equations, they confirmed the previous results analytically.

  45. Common Issues Among All the Mentioned Studies • The synaptic weights are uniform • The synfire activity ends in one of two scenarios: • Saturation: all the neurons in the final pools fire • Decay: no neuron in the final pools fires (apart from the background activity) • In either case there is a loss of information content

  46. Our Question • Is it possible to avoid full saturation and full decay by modifying the synaptic weights of a Synfire network?

  47. Analysis Assumptions: • Since we do not aim to study the synchronizing properties of the network, and only the number of firing neurons is of interest, the network is fed with a fully synchronous pattern. • No synaptic delay is taken into account.

  48. Reformulating the problem: • Assume that we feed the net with an input pattern containing n firing neurons. How should we set the weight parameters ( , ) to avoid full saturation (final n=50) or full decay (final n=0)?

  49. Method of Analysis • Iterative mapping

  50. The Iterative Map
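
Slide 50's content is not transcribed. The sketch below is a hedged illustration of the kind of pool-to-pool iterative map such an analysis typically uses, assuming pools of N = 50 neurons (as slide 48 suggests), a fully synchronous input (as in slide 47), and synaptic weights drawn from a Gaussian whose mean and spread are placeholder values rather than the parameters left blank on slide 48. Each neuron in the next pool sums n_k weights and fires if the sum reaches a threshold, so the expected activity of the next pool is n_{k+1} = N * P[sum >= theta]; iterating this shows the two outcomes of slide 45, full decay or full saturation, depending on the initial n.

from math import erf, sqrt

def next_n(n_k, N=50, w_mean=0.03, w_std=0.01, theta=1.0):
    """Expected number of firing neurons in the next pool, assuming each neuron
    sums n_k i.i.d. Gaussian weights N(w_mean, w_std^2) and fires when the sum
    reaches theta (all parameter values are illustrative placeholders)."""
    if n_k <= 0:
        return 0.0
    mu = n_k * w_mean
    sd = sqrt(n_k) * w_std
    p_fire = 0.5 * (1.0 - erf((theta - mu) / (sd * sqrt(2.0))))
    return N * p_fire

for n0 in (30.0, 40.0):
    n, traj = n0, []
    for _ in range(8):
        traj.append(round(n, 1))
        n = next_n(n)
    print(n0, "->", traj)   # with these placeholders, 30 decays toward 0 and 40 saturates toward 50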
