Adaptive Hopfield Network

Presentation Transcript


  1. Adaptive Hopfield Network Dr. Gürsel Serpen, Associate Professor, Electrical Engineering and Computer Science Department, University of Toledo, Toledo, Ohio, USA

  2. Presentation Topics • Motivation for research • Classical Hopfield network (HN) • Adaptation – Gradient Descent • Adaptive Hopfield Network (AHN) • Static Optimization with AHN • Results and Conclusions FOR MORE INFO... Serpen et al., upcoming journal article (Inshallah!), http://www.eecs.utoledo.edu/~serpen

  3. Motivation • The classical Hopfield neural network (HN) has been shown to have the potential to address a very large spectrum of static optimization problems. • The classical HN is NOT trainable, which implies that it can NOT learn from prior search attempts. • A hardware realization of the Hopfield network is very attractive for real-time, embedded computing environments. • Is there a way (e.g., training or adaptation) to incorporate the experience gained from prior search attempts into the network dynamics (weights) to help the network focus on promising regions of the overall search space?

  4. Research Goals • Propose gradient-descent based procedures to “adapt” the weights and constraint weighting coefficients of the HN. • Develop an indirect procedure to define “pseudo” values for desired neuron outputs (much like the way desired output values are derived for hidden-layer neurons in an MLP). • Develop space-efficient schemes to store the symmetric weight matrix (upper/lower triangular) for large-scale problem instances (see the storage sketch after this slide). • Apply (through simulation) the adaptive HN algorithm to (large-scale) static optimization problems.
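
  As an illustration of the third goal, a minimal Python sketch of upper-triangular storage for a symmetric weight matrix (the class and method names are hypothetical placeholders, not taken from the authors' implementation):

      import numpy as np

      class SymmetricWeights:
          """Store only the upper triangle of a symmetric N x N weight
          matrix, using n*(n+1)/2 entries instead of n*n."""
          def __init__(self, n):
              self.n = n
              self.data = np.zeros(n * (n + 1) // 2)

          def _index(self, i, j):
              # Map (i, j) into the flattened upper triangle; the matrix
              # is symmetric, so swap indices when i > j.
              if i > j:
                  i, j = j, i
              return i * self.n - i * (i - 1) // 2 + (j - i)

          def get(self, i, j):
              return self.data[self._index(i, j)]

          def set(self, i, j, value):
              self.data[self._index(i, j)] = value

  For a 100-city TSP instance mapped onto 10,000 neurons (as in slide 15), this halves the roughly 10^8-entry weight matrix, which is exactly the computational cost issue raised there.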

  5. Classical Hopfield Net Dynamics [Slide equations: number of neurons; neuron dynamics; sigmoid function]
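
  The equations themselves did not survive the transcript; the labels point to the standard continuous Hopfield dynamics, which in conventional notation (a reconstruction, not necessarily the slide's exact form) read:

      du_i/dt = -u_i/\tau + \sum_{j=1}^{N} w_{ij} x_j + I_i,    x_i = f(u_i) = 1 / (1 + e^{-u_i/\lambda}),

  where N is the number of neurons, u_i the net input and x_i the output of neuron i, w_{ij} the symmetric interconnection weights, I_i an external bias, and f a sigmoid with gain parameter \lambda.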

  6. Weights (Interconnection) – Redefined [Slide equations: generic Liapunov function; decomposed weights defined]
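
  A standard reconstruction of what these labels refer to: the generic Liapunov (energy) function of the classical HN, and a decomposition of the weights into per-constraint terms scaled by the constraint weighting coefficients c_k adapted later (the superscript-(k) notation is an assumption here):

      E = -(1/2) \sum_i \sum_j w_{ij} x_i x_j - \sum_i I_i x_i,
      w_{ij} = \sum_k c_k w_{ij}^{(k)},    I_i = \sum_k c_k I_i^{(k)}.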

  7. Adaptive Hopfield Net: Block Diagram

  8. Adaptive Hopfield Net Pseudocode (see the Python sketch below) • Initialization • Initialize network constraint weighting coefficients. • Initialize weights. • Initialize Hopfield net neuron outputs (randomly). • Adaptive Search • Relaxation: relax Hopfield dynamics until convergence to a fixed point. • Adaptation: relax the adjoint network until convergence to a fixed point. • Update weights. • Update constraint weighting coefficients. • Termination Criteria • If not satisfied, continue with Adaptive Search.
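
  A minimal Python sketch of that outer loop (every method name below is a hypothetical placeholder for the corresponding step above, not the authors' code):

      def adaptive_hopfield_search(net, max_epochs=100):
          # Initialization
          net.initialize_coefficients()
          net.initialize_weights()
          net.initialize_outputs_randomly()
          for epoch in range(max_epochs):
              # Adaptive Search: relaxation
              net.relax_to_fixed_point()
              # Adaptation: adjoint relaxation, then parameter updates
              net.relax_adjoint_to_fixed_point()
              net.update_weights()            # recurrent-backprop step
              net.update_coefficients()       # gradient descent on c_k
              # Termination criteria
              if net.termination_criteria_met():
                  break
          return net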

  9. Hopfield Network Relaxation
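
  The relaxation equations on this slide are not preserved; a common discrete-time realization, which simply iterates the dynamics of slide 5 until the outputs stop changing, might look like this (a sketch under that assumption):

      import numpy as np

      def relax(w, bias, x, gain=1.0, tol=1e-6, max_iters=10000):
          """Iterate Hopfield dynamics until convergence to a fixed point."""
          for _ in range(max_iters):
              u = w @ x + bias                          # net input per neuron
              x_new = 1.0 / (1.0 + np.exp(-u / gain))   # sigmoid outputs
              if np.max(np.abs(x_new - x)) < tol:       # fixed point reached
                  return x_new
              x = x_new
          return x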

  10. Adaptation of Weights: Adjoint Hopfield Network [Slide equations: adjoint network dynamics]
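
  In recurrent backpropagation (Pineda 1987; Almeida 1987), the adjoint network relaxes error signals z_i to a fixed point of a linearized companion system; a standard form of its dynamics (a reconstruction, up to notational variants, since the slide's equations are lost) is

      dz_i/dt = -z_i + \sum_j f'(u_j) w_{ji} z_j + e_i,

  where e_i is the error assigned to neuron i (derived from the “pseudo” desired outputs of slide 4) and u_j, f are as in slide 5.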

  11. Adaptation of Weights: Recurrent Backprop [Slide equation: weight update rule]
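
  The recurrent-backprop weight update presumably takes the standard gradient-descent form (again a reconstruction): with learning rate \eta, adjoint fixed point z, and neuron outputs x,

      \Delta w_{ij} = \eta \, f'(u_i) \, z_i \, x_j.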

  12. Adaptation: Constraint Weighting Coefficients [Slide equations: gradient-descent adaptation rule; error function (problem-specific and redefined)]
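
  The generic gradient-descent rule for the constraint weighting coefficients, with E the problem-specific error function named on the slide and \eta_c a separate learning rate (a reconstruction):

      c_k(t+1) = c_k(t) - \eta_c \, \partial E / \partial c_k.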

  13. Adaptation: Constraint Weighting Coefficients [Slide equations: partial derivative (readily computable); final form of coefficient update rule]
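
  Why that partial derivative is readily computable: given the weight decomposition w_{ij} = \sum_k c_k w_{ij}^{(k)} from slide 6, the chain rule gives (a plausible reconstruction of the slide's result)

      \partial E / \partial c_k = \sum_i \sum_j (\partial E / \partial w_{ij}) \, w_{ij}^{(k)},

  so the coefficient update can reuse the weight gradients already computed via the adjoint network.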

  14. Mapping a Static Optimization Problem [Slide equations: generic partial; problem-specific partial]
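
  As a concrete instance of such a mapping, the classical Hopfield–Tank formulation of the traveling salesman problem (the benchmark of the next slide) encodes an n-city tour as an n x n array of neuron outputs x_{Xi} (city X at tour position i) and penalizes constraint violations and tour length:

      E = (A/2) \sum_X \sum_i \sum_{j \ne i} x_{Xi} x_{Xj}
        + (B/2) \sum_i \sum_X \sum_{Y \ne X} x_{Xi} x_{Yi}
        + (C/2) (\sum_X \sum_i x_{Xi} - n)^2
        + (D/2) \sum_X \sum_{Y \ne X} \sum_i d_{XY} x_{Xi} (x_{Y,i+1} + x_{Y,i-1}),

  where d_{XY} is the inter-city distance and the penalty coefficients A, B, C, D play the role of the constraint weighting coefficients adapted above.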

  15. Simulation Study • Traveling Salesman Problem • Preliminary work at this time • Instances of up to 100 cities simulated • Computing resources: Ohio Supercomputing Center • Preliminary findings suggest that the theoretical framework is sound and the projections are valid • Computational cost (weight matrix size) poses a significant challenge for simulation purposes; an ongoing research effort • Currently in progress

  16. Conclusions • An adaptation mechanism that modifies the constraint weighting coefficients and weights of the classical Hopfield network was proposed. • A mathematical characterization of the adaptive Hopfield network was presented. • Preliminary simulation results suggest that the proposed adaptation mechanism is effective in guiding the Hopfield network towards high-quality feasible solutions of large-scale static optimization problems. • We are also exploring the incorporation of a computationally viable stochastic search mechanism to further improve the quality of solutions computed by the adaptive Hopfield network while preserving its parallel computation capability.

  17. Thank You! • Questions? We gratefully acknowledge the computing resources grant provided by the Ohio Supercomputing Center (USA), which facilitated the simulation study. We appreciate the support provided by the Kohler Internationalization Awards Program at the University of Toledo in facilitating this conference presentation.
