
Associative Memory by Recurrent Neural Networks with Delay Elements

This study investigates the associative memory model in recurrent neural networks (RNNs) with delayed synapses, a feature of real neural systems. While computer simulations offer insight, they are limited both in the number of neurons and in the number of delay steps they can handle. We therefore take a theoretical, analytical approach: starting from macrodynamical equations whose computational complexity is O(L⁴t), we derive macroscopic steady-state equations using the discrete Fourier transformation, which allows storage capacity to be discussed quantitatively even in the large-delay limit. The theory shows good agreement with simulation results, enhancing our understanding of memory dynamics in delayed networks.


Presentation Transcript


  1. Associative Memory by Recurrent Neural Networks with Delay Elements. Seiji MIYOSHI (Kobe City College of Tech., JAPAN), Hiro-Fumi YANAI (Ibaraki Univ., JAPAN), Masato OKADA (RIKEN BSI / ERATO KDB, JAPAN). miyoshi@kobe-kosen.ac.jp www.kobe-kosen.ac.jp/~miyoshi/

  2. Background • It is very important to analyze the associative memory model with delayed synapses, because the synapses of real neural systems seem to have delays. • Computer simulation is a powerful method; however, there is a limit on the number of neurons, and simulating a network with many delay steps is practically impossible. • A theoretical, analytical approach is therefore indispensable for research on delayed networks. • The Yanai-Kim theory, based on statistical neurodynamics, shows good agreement with computer simulation, but its computational complexity is O(L⁴t).

  3. Objective • To derive macroscopic steady-state equations by using the discrete Fourier transformation • To discuss storage capacity quantitatively even in the large-L limit (L: length of delay)

  4. Model Recurrent Neural Network with Delay Elements
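The slide presents the network diagram only; as a rough sketch of the dynamics it depicts (the function name, array shapes, and the newest-first layout are assumptions of mine, not from the talk), one discrete synchronous update of an N-neuron network whose synapses carry delays 0 through L-1 could look like:

```python
import numpy as np

def update(x_history, J):
    """One synchronous step of a recurrent network with delay elements.

    x_history : list of the last L state vectors, newest first
                (x(t), x(t-1), ..., x(t-L+1)), each in {-1, +1}^N.
    J         : array of shape (L, N, N); J[l] is the synaptic matrix
                attached to the branch with delay l.
    """
    # internal potential: contributions summed over all delay branches
    h = sum(J[l] @ x_history[l] for l in range(len(x_history)))
    return np.sign(h)  # discrete synchronous updating rule
```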

  5. Discrete Synchronous Updating Rule • Correlation Learning for Sequence Processing • Overlap Model
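A minimal sketch of correlation learning for sequence processing, under the common convention that the delay-l branch associates each stored pattern with its (l+1)-step successor so that every delayed path pushes the network along the same cyclic sequence; the talk does not show its exact coefficients, so the 1/N normalization here is an assumption:

```python
import numpy as np

def correlation_learning(patterns, L):
    """Correlation (Hebbian) learning for sequence processing.

    patterns : array (p, N) of +/-1 patterns forming the cyclic
               sequence xi^0 -> xi^1 -> ... -> xi^{p-1} -> xi^0.
    Returns J of shape (L, N, N); the delay-l branch pairs the
    pattern delayed by l steps with the sequence's next pattern.
    """
    p, N = patterns.shape
    J = np.zeros((L, N, N))
    for l in range(L):
        for mu in range(p):
            J[l] += np.outer(patterns[(mu + l + 1) % p], patterns[mu]) / N
    return J
```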

  6. Macrodynamical Equations by Statistical Neurodynamics (Yanai & Kim, 1995; Miyoshi, Yanai & Okada, 2002)

  7. Initial Condition of the Network • One-Step Set: only the states of the neurons are set explicitly; the states of the delay elements are set to zero. • All-Steps Set: the states of all neurons and all delay elements are set close to the stored pattern sequences. If they are set to the stored pattern sequences themselves, this is the optimum initial condition.
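The two initialization schemes could be sketched as follows (the function names and the newest-first history layout are my own conventions, not from the talk):

```python
import numpy as np

def init_one_step(xi, L):
    """One-step set: only the neurons receive the first stored pattern;
    the L-1 delay elements behind them start at zero."""
    return [xi[0]] + [np.zeros(xi[0].size) for _ in range(L - 1)]

def init_all_steps(xi, L):
    """All-steps set (the optimum condition when exact): neurons and
    delay elements hold consecutive patterns of the stored cyclic
    sequence, newest first."""
    p = len(xi)
    return [xi[-l % p] for l in range(L)]
```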

  8. Dynamical Behaviors of Recall Process (All-Steps Set Initial Condition). Loading rate α = 0.5, length of delay L = 3. Theory vs. Simulation (N = 2000)

  9. Dynamical Behaviors of Recall Process (All-Steps Set Initial Condition). Loading rate α = 0.5, length of delay L = 2. Theory vs. Simulation (N = 2000)

  10. Loading Rate α vs. Steady-State Overlap m. Theory vs. Simulation (N = 500)

  11. Length of Delay L vs. Critical Loading Rate α_C

  12. Macrodynamical Equations by Statistical Neurodynamics (Yanai & Kim, 1995; Miyoshi, Yanai & Okada, 2002) • Good agreement with computer simulation • Computational complexity is O(L⁴t)

  13. Macroscopic Steady State Equations • Accounting for Steady State • Parallel Symmetry in terms of Time Steps • Discrete Fourier Transformation
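The key step can be sketched as follows (a schematic illustration of the convolution theorem, not the talk's exact equations): in a steady state the overlap is the same at every time step (parallel symmetry), and the recall dynamics sum over delays as a convolution, which the discrete Fourier transformation turns into a product:

```latex
h_t = \sum_{l=0}^{L-1} J^{(l)} x_{t-l}
\quad\xrightarrow{\ \text{DFT}\ }\quad
\tilde h(\omega) = \tilde J(\omega)\,\tilde x(\omega),
\qquad
\tilde J(\omega) = \sum_{l=0}^{L-1} J^{(l)} e^{-i\omega l}.
```

Because the frequency components decouple, steady-state quantities can be evaluated mode by mode instead of tracking all cross-delay correlations, which is why the resulting complexity does not formally depend on L.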

  14. Loading Rate α vs. Steady-State Overlap m

  15. Loading Rate α vs. Steady-State Overlap m. Theory vs. Simulation (N = 500)

  16. Loading Rate α vs. Steady-State Overlap

  17. Storage Capacity of the Delayed Network: Storage Capacity = 0.195 L
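As a quick numeric illustration of the large-L result on this slide (the 0.195 L formula is from the talk; the helper function is hypothetical), with loading rate α = p/N the longest storable pattern sequence is roughly:

```python
def max_sequence_length(N, L, alpha_c_per_delay=0.195):
    """Approximate longest storable sequence of patterns: the critical
    loading rate grows as alpha_C = 0.195 * L, and alpha = p / N."""
    return int(alpha_c_per_delay * L * N)

# e.g. N = 1000 neurons with L = 10 delay steps store on the order of
# 0.195 * 10 * 1000 = 1950 sequential patterns
```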

  18. Conclusions • The Yanai-Kim theory (macrodynamical equations for the delayed network) is re-derived. → Its computational complexity is O(L⁴t), which makes it intractable to discuss macroscopic properties in the large-L limit. • Steady-state equations are derived by using the discrete Fourier transformation. → Their computational complexity does not formally depend on L. → The phase transition points agree with those under the optimum initial conditions, that is, with the storage capacities. • The storage capacity is 0.195 L in the large-L limit.
