
7. Associators and synaptic plasticity



  1. 7. Associators and synaptic plasticity Fundamentals of Computational Neuroscience, T. P. Trappenberg, 2002. Lecture Notes on Brain and Computation Byoung-Tak Zhang Biointelligence Laboratory School of Computer Science and Engineering Graduate Programs in Cognitive Science, Brain Science and Bioinformatics Brain-Mind-Behavior Concentration Program Seoul National University E-mail: btzhang@bi.snu.ac.kr This material is available online at http://bi.snu.ac.kr/

  2. Outline • 7.1 Associative memory and Hebbian learning • 7.2 An example of learning associations • 7.3 The biochemical basis of synaptic plasticity • 7.4 The temporal structure of Hebbian plasticity: LTP and LTD • 7.5 Mathematical formulation of Hebbian plasticity • 7.6 Weight distributions • 7.7 Neuronal response variability, gain control, and scaling • 7.8 Features of associators and Hebbian learning

  3. 7.1 Associative memory and Hebbian learning • Finding the general principles of brain development is one of the major scientific quests in neuroscience • Not all characteristics of the brain can be specified by a genetic code • The number of genes would certainly be too small to specify all the details of the brain’s networks • It is also advantageous that not all brain functions are specified genetically, as this allows adaptation to particular circumstances in the environment • An important adaptation mechanism, thought to form the basis of building associations, is the adaptation of synaptic efficiencies (a learning algorithm)

  4. 7.1.1 Synaptic plasticity • Synaptic plasticity is a major key to adaptive mechanisms in the brain • Artificial neural networks use abstract forms of synaptic plasticity • Their learning rules are often not biologically realistic • They learn entirely from experience, so genetic coding would be of minimal importance in brain development • Synaptic plasticity provides a neural network with the mechanisms for • Self-organization • Associative abilities

  5. 7.1.2 Hebbian learning • Donald O. Hebb, The Organization of Behavior (1949) • “When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.” • Brain mechanisms and how they can be related to behavior • Cell assemblies • The details of synaptic plasticity • Experimental results and evidence • Hebbian learning

  6. 7.1.3 Associations • Computer memory • Information is stored in magnetic or other physical form • Recall works by looking up a memory address • Natural systems cannot work with such demanding precision • The human memory • Recalls vivid memories of events from small details • Learns associations • Triggers memories based on related information • Partial information alone can be sufficient to recall memories • Associative memory • The basis for many cognitive functions

  7. 7.1.4 The associative node / 7.1.5 The associative network Fig. 7.1 Associative node and network architecture. (A) A simplified neuron that receives a large number of inputs $r_i^{\rm in}$. The synaptic efficiency is denoted by $w_i$. The output of the neuron, $r^{\rm out}$, depends on the particular input stimulus. (B) A network of associative nodes. Each component of the input vector, $r_i^{\rm in}$, is distributed to each neuron in the network. However, the effect of the input can be different for each neuron, as each individual synapse can have a different efficiency value $w_{ij}$, where $j$ labels the neuron in the network.

  8. 7.2 An example of learning associations • (A) A binary threshold node, $r^{\rm out} = \Theta\left(\sum_i w_i r_i^{\rm in} - \theta\right)$, with threshold $\theta = 1.5$ (7.1) Fig. 7.2 Examples of an associative node that is trained on two feature vectors with a Hebbian-type learning algorithm that increases the synaptic strength by $\delta w = 0.1$ each time a presynaptic spike occurs in the same temporal window as a postsynaptic spike
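A minimal Python sketch of this training loop, assuming the threshold node of eq. 7.1; the two feature vectors, the initial weights, and the number of pairing trials are illustrative assumptions, not the book’s exact values:

```python
import numpy as np

theta, dw = 1.5, 0.1                  # threshold (eq. 7.1) and increment (Fig. 7.2)

def node_output(w, r_in):
    """Binary associative node (eq. 7.1): fires iff the weighted input exceeds theta."""
    return 1.0 if w @ r_in > theta else 0.0

ucs = np.array([1.0, 1.0, 0.0, 0.0])  # unconditioned stimulus (assumed pattern)
cs  = np.array([0.0, 0.0, 1.0, 1.0])  # conditioned stimulus (assumed pattern)
w   = np.array([1.0, 1.0, 0.0, 0.0])  # initial weights already favor the UCS pathway

print(node_output(w, cs))             # 0.0: the CS alone does not yet drive the node
for _ in range(10):                   # pair the two stimuli; the UCS fires the node
    r_in = ucs + cs
    if node_output(w, r_in):          # postsynaptic spike ...
        w += dw * r_in                # ... strengthen all coincidentally active synapses
print(node_output(w, cs))             # 1.0: the association has been imprinted
```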

  9. 7.2.1 Hebbian learning in the conditioning framework • The mechanisms of an associative neuron • The first stimulus is already effective in eliciting a response of the neuron before learning • Unconditioned stimulus (UCS) • Based on the random initial weight distribution • For the second input, the response of the neuron changes during learning • Conditioned stimulus (CS) • Fig. 7.3 Different models of associative nodes resembling the principal architecture found in biological nervous systems such as (A) cortical neurons in mammalian cortex

  10. 7.2.2 Alternative plasticity schemes Fig. 7.3 Different models of associative nodes resembling the principal architecture found in biological nervous systems such as (B) Purkinje cells in the cerebellum, which have strong input from climbing fibers through many hundreds or thousands of synapses. In contrast, the model shown in (C), which utilizes specific input to a presynaptic terminal as is known to exist in invertebrate systems, would have to supply the UCS to all synapses simultaneously in order to achieve the same kind of result as the previous two models. Such architectures are unlikely to play an important role in cortical processing.

  11. 7.2.3 Issues around synaptic plasticity • Store information with associative learning • Imprinting an event-response pattern • Recall the response from partial information about the event • Synaptic plasticity is thought to be the underlying principle behind associative memory • The learning rules must be formulated more precisely • With synaptic potentiation alone, the synaptic efficiencies would become ever larger, so that the response of the node becomes less specific to the input pattern • Synaptic depression is therefore also required

  12. 7.3 The biochemical basis of synaptic plasticity • Activity-dependent synaptic plasticity • The co-activation of pre- and postsynaptic neurons • Backfiring (back-propagating action potentials) • The basis of signaling the postsynaptic state • NMDA receptors • Open when the postsynaptic membrane becomes depolarized • Allow influx of calcium ions • The excess of intracellular calcium can thus indicate the co-activation of pre- and postsynaptic activity • Long-lasting synaptic changes • Lifelong memories • The phosphorylation of proteins

  13. 7.4 The temporal structure of Hebbian plasticity: LTP and LTD / 7.4.1 Experimental example of Hebbian plasticity • Results of experiments with varying pre- and postsynaptic conditions • EPSC: excitatory postsynaptic current Fig. 7.4 (A) Relative EPSC amplitudes between glutamatergic neurons in hippocampal slices. A strong postsynaptic stimulation was introduced at t = 0 for 1 minute, which induced spiking of the postsynaptic neuron. The postsynaptic firing was induced in relation to the onset of an EPSC that resulted from the stimulation of a presynaptic neuron at 1 Hz. The squares mark the results when the postsynaptic firing times followed the onset of EPSCs within a short time window of 5 ms. The enhancement of synaptic efficiencies demonstrates LTP. The circles mark the results when the postsynaptic neuron was fired 5 ms before the onset of the EPSC. The reduction of synaptic efficiencies demonstrates LTD.

  14. 7.4.2 LTP and LTD • Long-term potentiation (LTP): the amplification of synaptic efficiency • Long-term depression (LTD): the reduction of synaptic efficiency • Whether such synaptic changes can persist for the lifetime of an organism is unknown • Such forms of synaptic plasticity support the basic model of association • LTP can enforce an associative response to a presynaptic firing pattern that is temporally linked to postsynaptic firing • LTD can facilitate the unlearning of presynaptic input that is not consistent with postsynaptic firing • The basis of the mechanisms of associative memories

  15. 7.4.3 Time window of Hebbian plasticity • The crucial temporal relation is probed by varying the time between pre- and postsynaptic spikes Fig. 7.4 (B) The relative changes in EPSC amplitudes are shown for various time windows between the onset of an EPSC induced by presynaptic firing and the time of induction of spikes in the postsynaptic neuron

  16. 7.4.4 Variation of temporal Hebbian plasticity • Asymmetric and symmetric forms of Hebbian plasticity Fig. 7.5 Several examples of the schematic dependence of synaptic efficiencies on the temporal relations between pre- and postsynaptic spikes

  17. 7.4.5 Dependence of synaptic changes on initial strength • Does the size of a synaptic change depend on the strength of the synapse? • The absolute change of synaptic efficiency in LTD is proportional to the initial synaptic efficiency • The relative changes of EPSC amplitudes for LTP are largest for small initial EPSC amplitudes • LTD: $\Delta w \propto -w$ : multiplicative (7.2) • LTP: $\Delta w \propto \mathrm{const}$ : additive (7.3) Fig. 7.6 Dependence of LTP and LTD on the magnitude of the EPSCs before synaptic plasticity is induced.

  18. 7.5 Mathematical formulation of Hebbian plasticity • Synaptic plasticity is described by a change of weight values • The weight values are not static but can change over time • The variation of weight values after time steps $\Delta t$ is written in a discrete fashion as $w_{ij}(t + \Delta t) = w_{ij}(t) + \Delta w_{ij}(t)$ (7.4) • The weight changes depend on various factors • Activity-dependent synaptic plasticity • Depends on the firing times of the pre- and postsynaptic neurons • The strength of a synapse can vary only within some interval

  19. 7.5.1 Hebbian learning with spiking neurons • The dependence of the synaptic changes on spike timing (LTP: “+”, LTD: “−”): $\Delta w_{ij} = \epsilon \left[ f_+(w_{ij}) K_+(t^{\rm post} - t^{\rm pre}) \Theta(t^{\rm post} - t^{\rm pre}) - f_-(w_{ij}) K_-(t^{\rm pre} - t^{\rm post}) \Theta(t^{\rm pre} - t^{\rm post}) \right]$ (7.5) • Kernel function of exponential form: $K_\pm(x) = e^{-|x|/\tau_\pm}$ (7.6) • $\Theta$: threshold (Heaviside) function that restricts LTP and LTD to the correct domains, $\Theta(x) = 1$ for $x > 0$ and $0$ otherwise (7.7) • Amplitude factor $f_\pm$: 1. Additive rule with absorbing boundaries, $f_\pm(w) = 1$ (7.8); 2. Multiplicative rule with a more graded nonlinearity when approaching the boundaries, $f_+(w) = w^{\max} - w$, $f_-(w) = w - w^{\min}$ (7.9)
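A short Python sketch of these spike-timing rules; the time constant tau, the amplitude eps, and the weight bounds are illustrative placeholders, not values from the book:

```python
import numpy as np

tau, eps, w_min, w_max = 0.02, 0.01, 0.0, 1.0   # illustrative parameters

def kernel(x):
    """Exponential plasticity window (eq. 7.6)."""
    return np.exp(-abs(x) / tau)

def f_plus_add(w):  return 1.0                  # additive rule (eq. 7.8); clip weights
def f_minus_add(w): return 1.0                  # afterwards to get absorbing boundaries
def f_plus_mul(w):  return w_max - w            # multiplicative rule (eq. 7.9): changes
def f_minus_mul(w): return w - w_min            # shrink gracefully near the boundaries

def delta_w(w, t_pre, t_post, f_plus=f_plus_mul, f_minus=f_minus_mul):
    """Weight change for a single pre/post spike pair (eq. 7.5)."""
    dt = t_post - t_pre
    if dt > 0:                                  # pre before post -> LTP
        return eps * f_plus(w) * kernel(dt)
    return -eps * f_minus(w) * kernel(dt)       # pre after post -> LTD
```

With the additive factors the update is independent of the current weight, so explicit clipping to [w_min, w_max] is needed; the multiplicative factors shrink the updates as a weight approaches either boundary.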

  20. 7.5.2 Hebbian learning in rate models • Rate models describe the average behavior of neurons or cell assemblies • They cannot incorporate spike timing • The plasticity depends on the average correlation of pre- and postsynaptic firing: $\Delta w_{ij} = f_1 (r_i - f_2)(r_j - f_3) - f_4 w_{ij}$ : Hebbian plasticity rule (7.10) • $r_i$: firing rate of the postsynaptic node $i$ • $r_j$: firing rate of the presynaptic node $j$ • $f_1$: learning rate • $f_2$ and $f_3$: plasticity thresholds • $f_4$: weight decay • $\Delta w_{ij} = f_1 (r_i - f_2)(r_j - f_3)$ : Hebbian rule without decay term (7.11) • With the plasticity thresholds set to the mean firing rates, the average change of synaptic weights is proportional to the covariance of the pre- and postsynaptic firing (cross-correlation function): $\langle \Delta w_{ij} \rangle \propto \mathrm{Cov}(r_i, r_j) = \langle (r_i - \langle r_i \rangle)(r_j - \langle r_j \rangle) \rangle$ (7.12)
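A sketch of the rule without decay (eq. 7.11) in Python, with the plasticity thresholds set to the time-averaged rates so the accumulated update equals the covariance of eq. 7.12; the function name and the test data are assumptions for illustration:

```python
import numpy as np

def hebb_covariance(R_post, R_pre, eps=0.01):
    """Accumulated update of eq. 7.11 with f2, f3 set to the mean rates.
    R_post: (T, N_post) firing rates over T steps; R_pre: (T, N_pre)."""
    d_post = R_post - R_post.mean(axis=0)
    d_pre = R_pre - R_pre.mean(axis=0)
    return eps * d_post.T @ d_pre / len(R_post)   # eps * Cov(r_i, r_j), eq. 7.12

# sanity check: the rule recovers numpy's (biased) covariance matrix
rng = np.random.default_rng(0)
R = rng.exponential(1.0, size=(1000, 5))
print(np.allclose(hebb_covariance(R, R, eps=1.0), np.cov(R.T, bias=True)))   # True
```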

  21. 7.6 Weight distributions • Synaptic efficiencies change continuously as long as learning rules are applied • Problem • Rapid changes of weights can lead to instabilities in the system, to which the neuron would have to adapt • A neuron should roughly maintain its mean firing rate • Solution • The overall weight distribution stays relatively constant • The behavior of Hebbian models depends on the form of the weight distribution Fig. 7.7 Distribution of fluorescence intensities of synapses from a spinal neuron that were labeled with fluorescent antibodies, which can be regarded as an estimate of the synaptic efficiencies.

  22. 7.6.1 Example of a weight distribution in a rate model • Rate models of recurrent networks trained with the Hebbian training rule on random patterns have Gaussian-distributed weight components Fig. 7.8 Normalized histograms of weight values from simulations of a simplified neuron (sigma node) simulating average firing rates after training with the basic Hebbian learning rule 7.11 on exponentially distributed random patterns. A fit of a Gaussian distribution to the data is shown as a solid line.
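The near-Gaussian shape can be reproduced with a few lines: summing the contributions of many independent random patterns drives the weight components toward a Gaussian by the central limit theorem. The network size, pattern count, and learning rate below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, eps = 200, 500, 0.01
R = rng.exponential(1.0, size=(P, N))   # exponentially distributed random patterns
dR = R - R.mean(axis=0)                 # rule 7.11 with thresholds at the mean rates
W = eps * dR.T @ dR                     # weights accumulated over all P patterns

w_off = W[~np.eye(N, dtype=bool)]       # off-diagonal components, as in Fig. 7.8
print(w_off.mean(), w_off.std())        # a histogram of w_off is close to Gaussian
```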

  23. 7.6.2 Change of synaptic characteristics • Dale’s principle: a neuron makes either excitatory or inhibitory synapses • The synapses of a presynaptic neuron cannot change their specific characteristics • In the simulations above we did not restrict the synapses to be either inhibitory or excitatory • Weight values could cross the boundary between positive and negative values, which is physiologically unrealistic • However, simulations with such constraints produce similar results for the distribution of the weight matrix components • Therefore, it is common to relax this biological detail (Dale’s principle) in simulations

  24. 7.6.3 Examples with spiking neurons • Asymmetric Hebbian rules for spiking neurons Fig. 7.9 (A) Average firing rate (decreasing curve) and Cv, the coefficient of variation (increasing and fluctuating curve), of an IF-neuron that is driven by 1000 excitatory Poisson spike trains while the synaptic efficiencies are changed according to an additive Hebbian rule with asymmetric Gaussian plasticity windows. (B) Distribution of weight values after 5 minutes of simulated training time (which is similar to the distribution after 3 minutes). The weights were limited to the range 0–0.015. The distribution has two maxima, one at each boundary of the allowed interval.
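A compact, time-stepped sketch of such a simulation, assuming a leaky IF-neuron and an additive rule with exponential (rather than Gaussian) plasticity windows; all parameter values are illustrative, chosen only to show the qualitative bimodal outcome:

```python
import numpy as np

rng = np.random.default_rng(2)
N, dt, steps = 1000, 0.001, 100_000            # 1000 inputs, 1 ms steps, 100 s
tau_m, v_th = 0.02, 1.0                        # membrane time constant and threshold
rate, w_max = 10.0, 0.015                      # input rate (Hz); bound from the caption
tau_stdp, a_plus, a_minus = 0.02, 1e-4, 1.05e-4   # slight LTD bias stabilizes the rate

w = rng.uniform(0.0, w_max, N)                 # synaptic efficiencies
x_pre, x_post, v = np.zeros(N), 0.0, 0.0       # plasticity traces, membrane potential
for _ in range(steps):
    pre = rng.random(N) < rate * dt            # Poisson input spikes this step
    x_pre *= np.exp(-dt / tau_stdp)
    x_post *= np.exp(-dt / tau_stdp)
    x_pre[pre] += 1.0
    v += -v * dt / tau_m + w[pre].sum()        # leaky integration of the input
    w[pre] -= a_minus * x_post                 # pre after post -> LTD
    if v >= v_th:                              # postsynaptic spike
        v = 0.0
        x_post += 1.0
        w += a_plus * x_pre                    # pre before post -> LTP
    np.clip(w, 0.0, w_max, out=w)              # additive rule: absorbing boundaries

print(np.histogram(w, bins=10, range=(0.0, w_max))[0])   # mass piles up at both ends
```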

  25. 7.7 Neuronal response variability, gain control, and scaling / 7.7.1 Variability and gain control • The firing time of the IF-neuron is mainly determined by the average input current • This statement can be quantified with the cross-correlation function between pre- and postsynaptic spike trains, $C(\tau) = \langle (r^{\rm pre}(t) - \langle r^{\rm pre} \rangle)(r^{\rm post}(t + \tau) - \langle r^{\rm post} \rangle) \rangle$ (7.13) Fig. 7.10 Average cross-correlation function between presynaptic Poisson spike trains and the postsynaptic spike train (averaged over all presynaptic spike trains) in a simulation of an IF-neuron with 1000 input channels. The spike trains that lead to the results shown by stars were generated with each weight value fixed to 0.015. The cross-correlations are consistent with zero when considered within the variance indicated by the error bars. The squares represent results from simulations of the IF-neuron driven by the same presynaptic spike trains as before, but with the weight matrix after Hebbian learning shown in Fig. 7.9. Some presynaptic spike trains caused postsynaptic spiking, with a positive peak in the average cross-correlation functions when the presynaptic spikes precede the postsynaptic spike. No error bars are shown for this curve for clarity.
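A discrete sketch of eq. 7.13 for binned spike trains (the function and its arguments are assumptions); applying it between each presynaptic train and the postsynaptic train and averaging reproduces the kind of curves shown in Fig. 7.10:

```python
import numpy as np

def cross_correlation(pre, post, max_lag):
    """Mean-corrected cross-correlation of two binned spike trains (eq. 7.13).
    Positive lag means the postsynaptic activity follows the presynaptic spike."""
    pre = pre - pre.mean()
    post = post - post.mean()
    lags = np.arange(-max_lag, max_lag + 1)
    c = np.empty(len(lags))
    for i, k in enumerate(lags):
        if k >= 0:
            c[i] = np.mean(pre[:len(pre) - k] * post[k:])   # pre(t) * post(t + k)
        else:
            c[i] = np.mean(pre[-k:] * post[:k])             # pre(t) * post(t - |k|)
    return lags, c
```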

  26. 7.7.2 Synaptic scaling • The overall synaptic efficiencies depend on the average postsynaptic firing rate • This is crucial to keep the neuron in the regime of high variability • Keeps neurons sensitive for information processing in the nervous system • Many experiments have demonstrated that • Synaptic efficiencies are scaled by the average postsynaptic activity • The threshold at which LTP is induced can depend on the time-averaged recent activity of the neuron • Weight normalization • Weight decay
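A minimal sketch of multiplicative synaptic scaling, applied on a slow timescale; the specific update rule and the rate eta are assumptions for illustration, not a model from the book:

```python
import numpy as np

def scale_synapses(w, avg_rate, target_rate, eta=0.01):
    """Scale all efficiencies up or down together so the time-averaged
    postsynaptic rate drifts toward its target; relative strengths, and
    hence the stored associations, are preserved."""
    return w * (1.0 + eta * (target_rate - avg_rate) / target_rate)

w = np.array([0.2, 0.5, 0.1])
print(scale_synapses(w, avg_rate=20.0, target_rate=10.0))   # hyperactive: scale down
```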

  27. 7.7.3 Oja’s rule and principal component • Weight normalization through heterosynaptic depression: $\Delta w_i = \epsilon\, r^{\rm out} (r_i^{\rm in} - r^{\rm out} w_i)$ (7.14) Fig. 7.11 Simulation of a linear node trained with Oja’s rule on training examples (indicated by the dots) drawn from a two-dimensional probability distribution with mean zero. The weight vector with initial conditions indicated by the cross converges to the weight vector (thick arrow), which has length |w| = 1 and points in the direction of the first principal component.
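A sketch of this simulation in Python; the covariance of the input distribution, the learning rate, and the sample count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
C = np.array([[3.0, 1.0], [1.0, 1.0]])          # anisotropic, zero-mean input data
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

w, eta = rng.normal(size=2), 0.005              # random initial weight vector
for x in X:
    y = w @ x                                   # linear node output
    w += eta * y * (x - y * w)                  # Oja's rule (eq. 7.14)

pc1 = np.linalg.eigh(C)[1][:, -1]               # analytic first principal component
print(np.linalg.norm(w))                        # ~1: the rule normalizes |w|
print(w, pc1)                                   # w is parallel to pc1 (up to sign)
```

Expanding the update as $\epsilon (y\,x - y^2 w)$ makes the structure visible: a pure Hebbian term plus a decay on all weights proportional to the squared output, which is the heterosynaptic depression that keeps $|w|$ bounded.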

  28. 7.7.4 Short-term synaptic plasticity and neuronal gain control • Short-term synaptic plasticity • Cortical neurons typically show a transient response with a decreasing firing rate to a constant input current • Short-term synaptic depression (STD) • The computational consequences of short-term depression can be manifold • It allows a neuron to respond strongly to input that has not influenced the neuron recently and therefore has a high novelty value • Rapid spike trains that would exhaust the neuron are weakened
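The novelty-detection effect can be illustrated with a depression-only synapse in the style of the Tsodyks-Markram model; the parameters U and tau_rec below are illustrative values, not the book’s:

```python
import numpy as np

def depressing_synapse(spikes, dt, U=0.4, tau_rec=0.8):
    """Each presynaptic spike releases a fraction U of the available
    resource x; x recovers toward 1 with time constant tau_rec, so a
    rapid spike train produces successively weaker responses."""
    x, response = 1.0, []
    for s in spikes:                    # binary spike train, one entry per step
        x += dt * (1.0 - x) / tau_rec   # slow recovery of the synaptic resource
        r = U * x if s else 0.0         # effective response to this spike
        x -= r
        response.append(r)
    return np.array(response)

train = np.zeros(2000)
train[::50] = 1.0                       # sustained 20 Hz input at dt = 1 ms
resp = depressing_synapse(train, dt=0.001)
print(resp[train > 0][:5])              # decreasing amplitudes: recent input is weakened
```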

  29. 7.8 Features of associators and Hebbian learning • Pattern completion and generalization • Recall from partial input • The output node responds to all patterns with a certain similarity to the trained pattern • Prototypes and extraction of central tendencies • The ability to extract central tendencies • Noise reduction • Graceful degradation • The loss of some components of the system should not make the system fail completely • Fault tolerance

  30. 7.8.4 Biologically faithful learning rules • The associative Hebbian learning rules are biologically faithful models: • Unsupervised • No specific learning signal is required (in contrast to supervised and reinforcement learning) • A self-organization rule • Local • Only presynaptic and postsynaptic observables are required to change the synaptic weight values • This benefits from true parallel distributed processing • Online • The learning rule does not require storage of firing patterns or network parameters

  31. Conclusion • What is associative memory? • Biochemical mechanisms of synaptic plasticity • Hebbian learning rule • Synaptic plasticity • Temporal structure of Hebbian plasticity • LTP and LTD • Weight distributions • Gain control, synaptic scaling, Oja’s rule and PCA • Associators and Hebbian learning • The Hebbian learning rule is a biologically faithful learning rule • Unsupervised, local, online
