This study investigates an associative memory model in recurrent neural networks (RNNs) with delayed synapses, a feature of real neural systems. Computer simulation offers insight but is limited to modest numbers of neurons and delay steps, and the existing macrodynamical theory has computational complexity O(L⁴t). We instead take a theoretical, analytical approach: using the discrete Fourier transformation, we derive macroscopic steady-state equations whose complexity does not formally depend on the delay length L, allowing storage capacity to be discussed quantitatively even in the large-L limit. The theory shows good agreement with simulation results, enhancing understanding of memory dynamics in delayed networks.
Associative Memory by Recurrent Neural Networks with Delay Elements
Seiji MIYOSHI (Kobe City College of Tech., JAPAN), Hiro-Fumi YANAI (Ibaraki Univ., JAPAN), Masato OKADA (RIKEN BSI, ERATO KDB, JAPAN)
miyoshi@kobe-kosen.ac.jp www.kobe-kosen.ac.jp/~miyoshi/
Background • It is important to analyze associative memory models with delayed synapses, since synapses in real neural systems appear to have delays. • Computer simulation is a powerful method. However, there is a limit on the number of neurons, and simulating a network with large delay steps is realistically impossible. • A theoretical, analytical approach is therefore indispensable for research on delayed networks. • The Yanai-Kim theory, based on statistical neurodynamics, shows good agreement with computer simulation, but its computational complexity is O(L⁴t).
Objective • To derive macroscopic steady-state equations by using the discrete Fourier transformation • To discuss storage capacity quantitatively even in the large-L limit (L: length of delay)
Model Recurrent Neural Network with Delay Elements
Discrete Synchronous Updating Rule • Correlation Learning for Sequence Processing • Overlap Model
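The model above can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes a sequence-processing correlation rule of the form J^l_ij = (1/N) Σ_μ ξ_i^(μ+1) ξ_j^(μ−l) (indices cyclic in μ) and a discrete synchronous update x(t+1) = sgn(Σ_l J^l x(t−l)); the sizes N, L, P are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, P = 200, 3, 30            # neurons, delay steps, stored patterns (illustrative)

# Random bipolar pattern sequence xi^0 ... xi^{P-1}, recalled cyclically
xi = rng.choice([-1, 1], size=(P, N))

# Correlation learning for sequence processing (assumed form):
# J^l_{ij} = (1/N) * sum_mu xi^{mu+1}_i * xi^{mu-l}_j, indices mod P
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(P):
        J[l] += np.outer(xi[(mu + 1) % P], xi[(mu - l) % P])
J /= N

def step(history):
    """Discrete synchronous update; history[l] holds x(t - l)."""
    h = sum(J[l] @ history[l] for l in range(L))
    return np.where(h >= 0, 1, -1)

# All-steps-set (optimum) initial condition: x(t - l) = xi^{L-1-l}
history = [xi[(L - 1 - l) % P].copy() for l in range(L)]
for t in range(5):
    x_new = step(history)
    history = [x_new] + history[:-1]
    # Overlap with the pattern expected at this step
    m = (x_new @ xi[(L + t) % P]) / N
```

At this small loading rate the network recalls the sequence, so the overlap m stays close to 1 throughout the run.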
Macrodynamical Equations by Statistical Neurodynamics — Yanai & Kim (1995); Miyoshi, Yanai & Okada (2002)
Initial Condition of the Network • One-step-set initial condition: only the states of the neurons are set explicitly; the states of the delay elements are set to zero. • All-steps-set initial condition: the states of all neurons and all delay elements are set close to the stored pattern sequences. If they are set to the stored pattern sequences themselves, this is the optimum initial condition.
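The two initialization schemes can be written out explicitly. A minimal sketch, assuming the network state is a list of L slots (the neurons plus the delay elements), with illustrative names:

```python
import numpy as np

N, L = 100, 4
rng = np.random.default_rng(1)
xi0 = rng.choice([-1, 1], size=N)    # a stored pattern (illustrative)

# One-step-set: only the neurons are set explicitly;
# all delay elements start at zero
one_step = [xi0.copy()] + [np.zeros(N, dtype=int) for _ in range(L - 1)]

# All-steps-set: neurons and every delay element hold stored patterns.
# Using the true stored sequence in each slot gives the optimum
# initial condition (a single pattern is shown here for brevity).
all_steps = [xi0.copy() for _ in range(L)]
```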
Dynamical Behaviors of Recall Process • All-steps-set initial condition • Loading rate α = 0.5, length of delay L = 3 • Theory vs. simulation (N = 2000)
Dynamical Behaviors of Recall Process • All-steps-set initial condition • Loading rate α = 0.5, length of delay L = 2 • Theory vs. simulation (N = 2000)
Loading Rate α vs. Steady-State Overlap m • Theory vs. simulation (N = 500)
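The overlap m plotted here measures recall quality: m = (1/N) Σ_i ξ_i x_i, which is 1 for perfect recall and near 0 for a random state. A minimal numeric example (sizes illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 1000
xi = rng.choice([-1, 1], size=N)        # stored pattern
x = xi.copy()
flip = rng.choice(N, size=100, replace=False)
x[flip] *= -1                           # corrupt 10% of the bits

m = (xi @ x) / N                        # overlap = 1 - 2 * (fraction flipped)
```

With 10% of the bits flipped, the overlap is 1 − 2(0.1) = 0.8.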
Macrodynamical Equations by Statistical Neurodynamics — Yanai & Kim (1995); Miyoshi, Yanai & Okada (2002) • Good agreement with computer simulation • Computational complexity is O(L⁴t)
Macroscopic Steady State Equations • Assume a steady state • Exploit the parallel symmetry in terms of time steps • Apply the discrete Fourier transformation
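The role of the Fourier step can be illustrated with the circular convolution theorem: once the steady state is assumed periodic in time, a sum over delay steps (a circular convolution) turns into an independent product at each frequency, which is how the explicit dependence on the delay structure drops out. This toy example is not the paper's derivation; the sequences m and k are arbitrary stand-ins:

```python
import numpy as np

rng = np.random.default_rng(2)
P = 8                                  # period of the steady-state sequence (illustrative)
m = rng.standard_normal(P)             # a periodic macroscopic quantity
k = rng.standard_normal(P)             # a delay kernel with the same period

# Circular convolution over time steps...
conv = np.array([sum(k[l] * m[(t - l) % P] for l in range(P)) for t in range(P)])

# ...becomes a plain per-frequency product after the DFT
lhs = np.fft.fft(conv)
rhs = np.fft.fft(k) * np.fft.fft(m)
```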
Loading Rate α vs. Steady-State Overlap m • Theory vs. simulation (N = 500)
Storage Capacity of Delayed Network Storage Capacity = 0.195 L
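As a back-of-envelope consequence of α_c ≈ 0.195 L, the number of storable patterns grows linearly in both L and N. A small sketch, assuming the loading rate is defined as α = P/N (the helper name is illustrative):

```python
def max_patterns(L, N):
    """Approximate number of storable patterns from alpha_c ≈ 0.195 * L,
    assuming the loading rate is defined as alpha = P / N."""
    return 0.195 * L * N

# For example, L = 10 delay steps and N = 1000 neurons give
# roughly 0.195 * 10 * 1000 = 1950 patterns.
```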
Conclusions • The Yanai-Kim theory (macrodynamical equations for the delayed network) is re-derived. → Its computational complexity is O(L⁴t), which makes it intractable to discuss macroscopic properties in the large-L limit. • Steady-state equations are derived by using the discrete Fourier transformation. → Their computational complexity does not formally depend on L. → Phase transition points agree with those under the optimum initial conditions, that is, the storage capacities! • Storage capacity is 0.195 L in the large-L limit.