
### A Dynamic System Analysis of Simultaneous Recurrent Neural Network

Dr. Gursel Serpen and Mr. Yifeng Xu

Electrical Engineering and Computer Science Department

University of Toledo

Toledo, Ohio, USA

Introduction

- Motivation for research
- Simultaneous Recurrent Neural Network (SRN)
- Local stability of the SRN
- Global stability of the SRN

Werbos, Paul: many recent articles, some unpublished!

Serpen, G., et al., "The Simultaneous Recurrent Neural Network for Addressing the Scaling Problem in Static Optimization," International Journal of Neural Systems, Vol. 11, No. 5, 2001, pp. 477-487.

FOR MORE INFO...

Motivation

- The Simultaneous Recurrent Neural (SRN) network has been shown to have the potential to address large-scale static optimization problems: it locates relatively high-quality solutions.
- The SRN is trainable, which implies that it can learn from prior search attempts (a Hopfield net cannot do this!).
- The computational complexity of SRN simulations is much lower than that of the Hopfield network and its derivatives.

Research Goals

- Understand the stability and convergence properties of the SRN dynamics.
- Establish stable dynamics following initialization.
- Establish stability while training the SRN with a fixed-point algorithm (recurrent backprop).
- Apply the SRN to (large-scale) static optimization problems.

Research Goals - detailed

Initialization of weights to guarantee the existence of at least one fixed point in the state space of the SRN dynamics.

Stability as weight matrices are being modified while learning with a fixed-point algorithm, i.e., recurrent backpropagation.

Assessing the computational power of the SRN as a static optimizer for large-scale problem instances.

Simultaneous Recurrent Neural Network

Topology: a feedforward network whose output is fed back to its input through a delayed feedback path. A typical propagation delay exists on the feedback path, as dictated by physical constraints.
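The relaxation loop this topology describes can be sketched in code. This is a minimal sketch, assuming a discrete-time update, a high-gain logistic activation, and the weight-matrix names U and V used later in the talk; the slides do not specify these details:

```python
import numpy as np

def sigmoid(x, gain=10.0):
    # High-gain logistic activation (assumed; the slides say only
    # "high-gain neurons", not the exact activation function).
    return 1.0 / (1.0 + np.exp(-gain * x))

def relax_srn(U, V, z0, max_iters=500, tol=1e-8):
    """Iterate the feedforward network through its delayed feedback
    path until the output vector z settles (a fixed point) or the
    iteration budget runs out."""
    z = np.asarray(z0, dtype=float)
    for _ in range(max_iters):
        y = sigmoid(U @ z)        # hidden layer driven by delayed output
        z_next = sigmoid(V @ y)   # output layer, fed back on the next step
        if np.max(np.abs(z_next - z)) < tol:
            return z_next, True
        z = z_next
    return z, False
```

Running `relax_srn` with random U and V from several random initial states is essentially how the simulation study later in the talk probes the attractor structure.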

Analysis of SRN Dynamics

Local stability analysis

SRN dynamics linearized at hypercube corners (that are also equilibrium points)

(Local) stability conditions for hypercube corner equilibrium points

A theorem, with its proof presented elsewhere.

Equilibrium Points in State Space of SRN Dynamics

SRN dynamics

Output layer

Hidden layer

SRN dynamics in matrix form in terms of z:
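The equations for these slides are not preserved in the transcript; a form consistent with the surrounding discussion (one hidden layer, one output layer, no external input, activation f, weight matrices U and V) is sketched below as an assumption, not as the authors' exact formulation:

```latex
% Hidden and output layer updates (assumed discrete-time form):
\mathbf{y}(t)   = f\big(U\,\mathbf{z}(t)\big) \quad \text{(hidden layer)}
\qquad
\mathbf{z}(t+1) = f\big(V\,\mathbf{y}(t)\big) \quad \text{(output layer)}

% SRN dynamics in matrix form in terms of z alone:
\mathbf{z}(t+1) = f\big(V\, f(U\,\mathbf{z}(t))\big)
```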

Equilibrium Points of SRN Dynamics

Hypercube Corners

z_k = 1 or z_k = 0, for k = 1, 2, …, K

Points interior to the hypercube, on its surface, and on its edges

Stability of SRN Dynamics - Linearized

Eigenvalues of the SRN dynamics linearized at the hypercube corner equilibrium points, for k = 1, 2, …, K.
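The linearization can be illustrated numerically. A minimal sketch, assuming the discrete-time map z → f(V f(U z)) with a logistic activation; for a discrete map, local stability requires every Jacobian eigenvalue to lie inside the unit circle:

```python
import numpy as np

def corner_jacobian_eigs(U, V, z, gain=10.0):
    """Eigenvalues of the map z -> f(V f(U z)) linearized at state z.
    At a hypercube corner with high gain, the net inputs sit far from
    zero, f' is nearly 0 there, and the eigenvalues shrink toward the
    origin, making the corner locally stable."""
    f = lambda x: 1.0 / (1.0 + np.exp(-gain * x))
    df = lambda x: gain * f(x) * (1.0 - f(x))  # derivative of the logistic
    h = U @ z                  # hidden-layer net input
    o = V @ f(h)               # output-layer net input
    # Chain rule: J = diag(f'(o)) V diag(f'(h)) U
    J = np.diag(df(o)) @ V @ np.diag(df(h)) @ U
    return np.linalg.eigvals(J)
```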

Stability of SRN Dynamics - linearized

Set of inequalities derived from the eigenvalues through the stability condition for the equilibrium points: one set of inequalities applies to the components with z_k = 0, and another to the components with z_k = 1.

A Stability Theorem for SRN Dynamics

For any given hypercube corner in the state space of SRN dynamics,

which is configured for static combinatorial optimization, i.e. with high-gain neurons, one hidden layer, one output layer, no external input, and operating in associative memory mode,

it is possible to define the weight matrices U and V to establish that hypercube corner as a stable equilibrium point in a local sense.
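The theorem is existential, and the slides do not show a construction; one simple illustrative construction (an assumption for illustration, not the authors' proof) builds both weight matrices from the bipolar form of the target corner, so that each neuron's net input has the sign that holds it at its corner value:

```python
import numpy as np

def weights_for_corner(z_star, scale=5.0):
    """Build U and V that make the corner z_star a stable fixed point of
    z -> f(V f(U z)).  b = 2*z_star - 1 is the corner in bipolar form;
    the outer product b b^T gives every neuron a net input whose sign
    matches its target value (assumes a square SRN and a corner with at
    least one component equal to 1)."""
    b = 2.0 * np.asarray(z_star, dtype=float) - 1.0
    W = scale * np.outer(b, b)
    return W, W   # same matrix for U and V in this sketch

def step(U, V, z, gain=10.0):
    # One pass around the delayed-feedback loop with logistic neurons.
    f = lambda x: 1.0 / (1.0 + np.exp(-gain * x))
    return f(V @ f(U @ z))
```

Starting near the target corner, repeated application of `step` settles onto the corner, consistent with the theorem's claim of local stability.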

Simulation Study

Case 1 - 2X4 SRN

Methodology: create 75 instances of U & V, then observe the set of stable equilibrium points through multiple random initializations.

| Observed stable behavior | Count |
| --- | --- |
| Fixed points (hypercube corners) | 48 |
| Limit cycles (cycle length of 2) | 12 |
| Stable non-hypercube points | 19 |

*All 16 hypercube corners appeared as stable equilibrium points across the 75 instances of weight matrices.
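The methodology above can be sketched as follows; the attractor-classification rule, the discrete map z → f(V f(U z)), and the "hidden × output" reading of the 2X4 network size are assumptions for illustration:

```python
import numpy as np

def classify_attractor(U, V, z0, gain=10.0, iters=300, tol=1e-4):
    """Iterate z -> f(V f(U z)) from z0 and classify where it ends up:
    'fixed point' if the last two states agree, 'limit cycle (2)' if
    the trajectory alternates with period 2, otherwise 'other'."""
    f = lambda x: 1.0 / (1.0 + np.exp(-gain * x))
    z = np.asarray(z0, dtype=float)
    tail = []
    for _ in range(iters):
        z = f(V @ f(U @ z))
        tail.append(z.copy())
        if len(tail) > 3:
            tail.pop(0)           # keep only the last three states
    if np.max(np.abs(tail[-1] - tail[-2])) < tol:
        return 'fixed point'
    if np.max(np.abs(tail[-1] - tail[-3])) < tol:
        return 'limit cycle (2)'
    return 'other'

def survey(hidden, outputs, instances=75, inits=20, seed=0):
    """Tally attractor types over random weight instances and restarts,
    mirroring the 75-instance methodology of the simulation study."""
    rng = np.random.default_rng(seed)
    counts = {'fixed point': 0, 'limit cycle (2)': 0, 'other': 0}
    for _ in range(instances):
        U = rng.normal(size=(hidden, outputs))
        V = rng.normal(size=(outputs, hidden))
        for _ in range(inits):
            counts[classify_attractor(U, V, rng.uniform(size=outputs))] += 1
    return counts
```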

Case 2 - 2X100 SRN

Case 3 - 5X10000 SRN

Case 4 - 5X25000 SRN

Global Stability of SRN?

- The SRN paradigm is closely related to a number of globally asymptotically stable recurrent neural network algorithms!
- BAM is a special case of SRN.
- ART 1 & 2 cores are similar topologies.
- Significant simulation-based empirical studies conducted by authors suggest global stability.
- However, no Lyapunov function has been found yet!

Conclusions

- A theorem demonstrates that there exist real forward and backward weight matrices for the SRN dynamics which will induce stability of any given hypercube corner as an equilibrium point.
- This theoretical finding was also validated by extensive simulation-based empirical studies, some of which were reported in a recent journal article:

Serpen, G., et al., "The Simultaneous Recurrent Neural Network for Addressing the Scaling Problem in Static Optimization," International Journal of Neural Systems, Vol. 11, No. 5, 2001, pp. 477-487.

Thank You!

- Questions?

This research has been funded in part by the US National Science Foundation grant ECS-9800247.
