
Memory






Presentation Transcript


  1. Memory

  2. Hopfield Network • Content addressable • Attractor network

  3. Hopfield Network

  4. Hopfield Network • General Case: • Lyapunov function
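The Hebbian storage rule and the Lyapunov (energy) argument can be sketched in code. Everything below (network size, number of patterns, corruption level) is an illustrative choice, not taken from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: store P random binary patterns in an N-unit Hopfield network
# with the standard Hebbian rule W = (1/N) sum_p x_p x_p^T.
N, P = 100, 3
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)  # no self-connections

def energy(s):
    # Lyapunov (energy) function E = -1/2 s^T W s; it never increases
    # under asynchronous updates, so the dynamics settle into attractors.
    return -0.5 * s @ W @ s

def recall(s, sweeps=5):
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):  # asynchronous updates
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Content addressability: corrupt 15% of one pattern, then recall.
probe = patterns[0].copy()
flip = rng.choice(N, size=15, replace=False)
probe[flip] *= -1
out = recall(probe)
print("overlap with stored pattern:", (out @ patterns[0]) / N)  # typically 1.0 at this low load
print("energy:", energy(probe), "->", energy(out))
```

Recall works here because the load P/N is far below the classical capacity; the energy printout illustrates the Lyapunov property, with each asynchronous flip only ever lowering the energy.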

  5. Neurophysiology

  6. Mean Field Approximation

  7. Null Cline Analysis • What are the fixed points? [Figure: two-population circuit of excitatory (E) and inhibitory (I) units with coupling strengths CE and CI]

  8. Null Cline Analysis • What are the fixed points?
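The fixed-point question can also be answered numerically. The sketch below uses a Wilson-Cowan-style E-I rate model with invented parameters (the lecture's actual equations and constants are not shown in this transcript); relaxing the dynamics from a grid of initial conditions recovers the stable fixed points that the null-cline intersections predict:

```python
import numpy as np

# A minimal Wilson-Cowan-style E-I rate model with sigmoidal activation.
# All parameter values are illustrative. Null clines are the curves
# dE/dt = 0 and dI/dt = 0; fixed points are their intersections.
def f(x):
    return 1.0 / (1.0 + np.exp(-x))

cEE, cEI, cIE = 10.0, 8.0, 1.5  # recurrent excitation, I->E, E->I couplings

def dstate(E, I):
    dE = -E + f(cEE * E - cEI * I)
    dI = -I + f(cIE * E)
    return dE, dI

# Locate the stable fixed points by relaxing the dynamics from a grid of
# initial conditions and collecting the distinct points of convergence
# (the unstable fixed point between them cannot be found this way).
fixed = []
for E0 in np.linspace(0, 1, 8):
    for I0 in np.linspace(0, 1, 8):
        E, I = E0, I0
        for _ in range(2000):
            dE, dI = dstate(E, I)
            E, I = E + 0.05 * dE, I + 0.05 * dI
        if not any(abs(E - Ef) + abs(I - If) < 1e-3 for Ef, If in fixed):
            fixed.append((E, I))

print("stable fixed points:", [(round(E, 3), round(I, 3)) for E, I in fixed])
```

With these constants the model is bistable: a low-rate rest state and a high-rate active state, separated by an unstable fixed point on the saddle between them.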

  9. Null Cline Analysis [Figure: null clines in the E-I plane, marking an unstable fixed point and a stable fixed point]

  10. Null Cline Analysis [Figure: null clines in the E-I plane]

  11. Null Cline Analysis [Figure: null clines in the E-I plane]

  12. Null Cline Analysis [Figure: null clines in the E-I plane]

  13. Null Cline Analysis [Figure: null clines in the E-I plane]

  14. Null Cline Analysis [Figure: excitatory null cline in the E-I plane, with two stable branches and one unstable branch]

  15. Null Cline Analysis [Figure: null clines in the E-I plane]

  16. Null Cline Analysis [Figure: null clines in the E-I plane with a stable fixed point]

  17. Null Cline Analysis [Figure: null clines in the E-I plane]

  18. Null Cline Analysis [Figure: null clines in the E-I plane]

  19. Null Cline Analysis [Figure: inhibitory null cline and excitatory null cline in the E-I plane; their intersections are the fixed points]

  20. Binary Memory [Figure: E-I circuit with couplings CE and CI, and the corresponding null clines]

  21. Binary Memory • Storing: decrease inhibition (CI) [Figure: E-I circuit and shifted null clines]

  22. Binary Memory • Storing: back to rest [Figure: E-I circuit and null clines]

  23. Binary Memory • Reset: increase inhibition [Figure: E-I circuit and shifted null clines]

  24. Binary Memory • Reset: back to rest [Figure: E-I circuit and null clines]
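The store/reset sequence of slides 20-24 can be sketched with a toy E-I rate model. All constants below are invented, and "decreasing" or "increasing" inhibition is implemented as a transient external input to the inhibitory population:

```python
import numpy as np

# Illustrative E-I rate model in the spirit of the null-cline slides.
# The rest state is held down by tonic inhibition, so transiently
# suppressing the I population stores a "1" (the network jumps to the
# high-rate state and stays there), and transiently boosting it resets.
def f(x):
    return 1.0 / (1.0 + np.exp(-x))

cEE, cEI, cIE = 10.0, 8.0, 1.5  # invented coupling strengths

def run(E, I, ext_I, steps, dt=0.05):
    for _ in range(steps):
        E += dt * (-E + f(cEE * E - cEI * I))
        I += dt * (-I + f(cIE * E + ext_I))
    return E, I

E, I = 0.0, 0.5                 # start near rest
E, I = run(E, I, 0.0, 2000)     # settle at the resting fixed point
rest_E = E
E, I = run(E, I, -5.0, 1000)    # store: suppress inhibition...
E, I = run(E, I, 0.0, 2000)     # ...release: high state persists
stored_E = E
E, I = run(E, I, +5.0, 1000)    # reset: boost inhibition...
E, I = run(E, I, 0.0, 2000)     # ...release: back to rest
reset_E = E
print(f"rest {rest_E:.3f}, stored {stored_E:.3f}, after reset {reset_E:.3f}")
```

The memory is held by the network state, not by the transient input: both pulses end well before the final readouts, yet the stored state survives until it is actively reset.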

  25. Networks of Spiking Neurons • Problems with the previous approach: • Spiking neurons have monotonic I-f curves (which saturate, but only at very high firing rates) • How do you store more than one memory? • What is the role of spontaneous activity?

  26. Networks of Spiking Neurons

  27. Networks of Spiking Neurons [Figure: f-I curve, firing rate R(Ij) as a function of input current Ij]

  28. Networks of Spiking Neurons

  29. Networks of Spiking Neurons • A memory network must be able to store a value in the absence of any input:

  30. Networks of Spiking Neurons

  31. Networks of Spiking Neurons [Figure: recurrent drive cR(Ii) plotted against current Ii, with afferent input Iaff]

  32. Networks of Spiking Neurons • With a non-saturating activation function and no inhibition, the neurons must be spontaneously active for the network to admit a nonzero stable state. [Figure: cR(Ii) plotted against Ii, crossing the identity line at I2*]

  33. Networks of Spiking Neurons • To get several stable fixed points, we need inhibition. [Figure: cR(Ii) against Ii, with stable fixed points separated by an unstable fixed point near I2*]
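The self-consistency argument can be illustrated numerically. In the sketch below, R is an assumed sigmoidal effective rate curve (standing in for the combined effect of recurrent excitation and inhibition) and c, I_aff are invented constants; the point is only that the line y = I can cross y = I_aff + c·R(I) three times, giving two stable fixed points separated by an unstable one:

```python
import numpy as np

# Self-consistency sketch for a recurrent network: at steady state the
# current must reproduce itself, I = I_aff + c*R(I). R and all constants
# here are illustrative assumptions, not the lecture's values.
def R(I):
    return 100.0 / (1.0 + np.exp(-(I - 5.0)))  # effective firing rate (Hz)

c, I_aff = 0.08, 0.5

def G(I):
    return I_aff + c * R(I)  # current generated when the current is I

# Fixed points are zeros of G(I) - I; locate them by sign changes on a grid.
grid = np.linspace(0.0, 12.0, 100001)
h = G(grid) - grid
crossings = grid[:-1][np.sign(h[:-1]) != np.sign(h[1:])]
print("fixed points near I =", np.round(crossings, 2))
# Stability under tau dI/dt = -I + G(I): a crossing is stable where
# G'(I*) < 1, so the outer two are stable and the middle one is unstable.
```

With a purely non-saturating R and no inhibition there is at most one such crossing, which is the problem the slide raises.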

  34. Networks of Spiking Neurons • Clamping the input: inhibitory Iaff [Figure: effect of an inhibitory afferent input Iaff on the fixed points of cR(Ii) vs. Ii]

  35. Networks of Spiking Neurons • Clamping the input: excitatory Iaff [Figure: cR(Ii) against Ii with the fixed point I2* shifted by the excitatory afferent input]

  36. Networks of Spiking Neurons [Figure: f-I curve, firing rate R(Ij) as a function of input current Ij]

  37. Networks of Spiking Neurons • Major problem: the memory state has a high firing rate and the resting state is at zero. In reality, there is spontaneous activity at 0-10 Hz and the memory state is around 10-20 Hz (not 100 Hz). • Solution: you don’t want to know (but it involves a careful balance of excitation and inhibition)…

  38. Line Attractor Networks • Continuous attractor: line attractor or N-dimensional attractor • Useful for storing analog values • Unfortunately, it’s virtually impossible to get a neuron to store a value proportional to its activity

  39. Line Attractor Networks • Storing analog values: difficult with this scheme… [Figure: cR(Ii) plotted against Ii]

  40. Line Attractor Networks • Implication for transmitting rate and integration… [Figure: cR(Ii) plotted against Ii]

  41. Line Attractor Networks • Head direction cells [Figure: activity (0-100) vs. preferred head direction (-100 to 100 deg), showing a bump of activity]

  42. Line Attractor Networks • Attractor network with population code • Translation invariant weights [Figure: activity vs. preferred head direction (deg)]

  43. Line Attractor Networks • Computing the weights:

  44. Line Attractor Networks • The problem with the previous approach is that the weights tend to oscillate. Instead, we minimize: • The solution is:
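The slide's exact cost function and closed-form solution are not reproduced in this transcript. A common regularized least-squares version (an assumption, not necessarily the lecture's) asks every shifted bump profile to be a fixed point of the weights while penalizing the weight norm, which smooths out the oscillatory profiles that an exact pseudo-inverse produces:

```python
import numpy as np

# Hedged reconstruction: choose W so that every shifted bump profile is
# (approximately) a fixed point, W r_psi ~ r_psi, with a ridge penalty on W.
N = 64
theta = np.linspace(-np.pi, np.pi, N, endpoint=False)
shifts = theta  # one desired bump per preferred direction
# R[:, k] is a von-Mises-shaped bump of activity centred on shifts[k]
# (the bump width 0.3 is an illustrative choice).
R = np.exp((np.cos(theta[:, None] - shifts[None, :]) - 1.0) / 0.3)

lam = 0.1
# Minimize ||W R - R||^2 + lam ||W||^2  =>  W = R R^T (R R^T + lam I)^{-1}
G = R @ R.T
W = G @ np.linalg.inv(G + lam * np.eye(N))

# Each stored bump is then nearly reproduced by one pass through the weights,
err = np.linalg.norm(W @ R - R) / np.linalg.norm(R)
print(f"relative fixed-point error: {err:.3f}")
# and because the bumps tile the ring uniformly, W comes out circulant:
# W[i, j] depends only on (i - j) mod N, i.e. translation invariant.
```

The ridge term trades a small fixed-point error for a smooth weight profile; with lam = 0 the expression reduces to the exact pseudo-inverse solution that the slide criticizes.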

  45. Line Attractor Networks • Updating the memory: a bias in the weights, an integrator of velocity, etc.

  46. Line Attractor Networks • How do we know that the fixed points are stable? With symmetric weights, the network has a Lyapunov function (Cohen, Grossberg 1982):

  47. Line Attractor Networks • Line attractor: the set of stable points forms a line in activity space. • Limitations: Requires symmetric weights… • Neutrally stable along the attractor: unavoidable drift

  48. Memorized Saccades [Figure: fixation point and two flashed targets T1 and T2]

  49. Memorized Saccades [Figure: retinal vectors R1, R2 from fixation to targets T1, T2, with saccades S1, S2] S1 = R1, S2 = R2 - S1
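The remapping rule on the slide (S1 = R1, S2 = R2 - S1) is plain vector arithmetic; the target coordinates below are made up for illustration:

```python
import numpy as np

# Double-step saccade logic: two targets flash at retinal locations R1 and
# R2 before any eye movement. The first saccade is S1 = R1; since the eyes
# have then moved by S1, the second saccade must be remapped, S2 = R2 - S1,
# rather than the raw retinal vector R2.
R1 = np.array([10.0, 0.0])   # deg, retinal vector to target T1 (invented)
R2 = np.array([10.0, 8.0])   # deg, retinal vector to target T2 (invented)

S1 = R1
S2 = R2 - S1                 # compensate for the first eye movement
print("S1 =", S1, "S2 =", S2)
```

Note that S1 + S2 = R2: the two saccades together still land the eye on the second target, which is exactly what the remapping guarantees.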

  50. Memorized Saccades [Figure: targets T1, T2 with saccade vectors S1 and S2]
