
Feedback Networks and Associative Memories



Presentation Transcript


  1. Feedback Networks and Associative Memories 虞台文, Intelligent Multimedia Lab, Graduate Institute of Computer Science and Engineering, Tatung University

  2. Content • Introduction • Discrete Hopfield NNs • Continuous Hopfield NNs • Associative Memories • Hopfield Memory • Bidirectional Memory

  3. Feedback Networks and Associative Memories: Introduction (Intelligent Multimedia Lab, Tatung University)

  4. Feedforward/Feedback NNs • Feedforward NNs • The connections between units do not form cycles. • Usually produce a response to an input quickly. • Most feedforward NNs can be trained using a wide variety of efficient algorithms. • Feedback or recurrent NNs • There are cycles in the connections. • In some feedback NNs, each time an input is presented, the NN must iterate for a potentially long time before it produces a response. • Usually more difficult to train than feedforward NNs.

  5. Supervised-Learning NNs • Feedforward NNs • Perceptron • Adaline, Madaline • Backpropagation (BP) • Artmap • Learning Vector Quantization (LVQ) • Probabilistic Neural Network (PNN) • General Regression Neural Network (GRNN) • Feedback or recurrent NNs • Brain-State-in-a-Box (BSB) • Fuzzy Cognitive Map (FCM) • Boltzmann Machine (BM) • Backpropagation through time (BPTT)

  6. Unsupervised-Learning NNs • Feedforward NNs • Learning Matrix (LM) • Sparse Distributed Associative Memory (SDM) • Fuzzy Associative Memory (FAM) • Counterpropagation (CPN) • Feedback or Recurrent NNs • Binary Adaptive Resonance Theory (ART1) • Analog Adaptive Resonance Theory (ART2, ART2a) • Discrete Hopfield (DH) • Continuous Hopfield (CH) • Discrete Bidirectional Associative Memory (BAM) • Kohonen Self-organizing Map/Topology-preserving map (SOM/TPM)

  7. The Hopfield NNs • In 1982, Hopfield, a Caltech physicist, mathematically tied together many of the ideas from previous research. • A fully connected, symmetrically weighted network where each node functions both as input and output node. • Used for • Associative memories • Combinatorial optimization

  8. Associative Memories • An associative memory is a content-addressable structure that maps a set of input patterns to a set of output patterns. • Two types of associative memory: autoassociative and heteroassociative. • Auto-association • retrieves a previously stored pattern that most closely resembles the current pattern. • Hetero-association • the retrieved pattern is, in general, different from the input pattern not only in content but possibly also in type and format.

  9. Associative Memories [Figure: auto-association recalls the stored pattern 'A' from a degraded 'A'; hetero-association recalls 'Waterfall' from the cue 'Niagara']

  10. Optimization Problems • The cost of an optimization problem is associated with the energy function of a Hopfield network. • The cost must be expressible in quadratic form. • The network settles into a local, satisfactory solution; it does not select a solution from an explicit candidate set. • It finds local optima, not global ones.
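
As an illustrative identification (the coefficients c_{ij} and b_i are hypothetical, not from the slides), a quadratic cost is matched to the Hopfield energy term by term:

```latex
C(\mathbf{v}) = \sum_{i<j} c_{ij} v_i v_j + \sum_i b_i v_i ,
\qquad
E(\mathbf{v}) = -\tfrac{1}{2}\sum_i \sum_j w_{ij} v_i v_j - \sum_i I_i v_i
\quad\Longrightarrow\quad
w_{ij} = -c_{ij}, \quad I_i = -b_i ,
```

so letting the network relax to a minimum of E minimizes the cost.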

  11. Feedback Networks and Associative Memories: Discrete Hopfield NNs (Intelligent Multimedia Lab, Tatung University)

  12. The Discrete Hopfield NNs [Figure: n fully interconnected neurons 1, ..., n; neuron i receives external input I_i, produces output v_i, and is connected to neuron j with weight w_{ij}]

  13. The Discrete Hopfield NNs [Figure as on slide 12, with the weight constraints w_{ij} = w_{ji} (symmetric) and w_{ii} = 0 (no self-feedback)]

  14. The Discrete Hopfield NNs [Figure repeated from slide 13: w_{ij} = w_{ji}, w_{ii} = 0]

  15. State Update Rule • Asynchronous mode: one neuron is selected and updated at a time. • Update rule: H_k(t+1) = \sum_j w_{kj} v_j(t) + I_k, \quad v_k(t+1) = \mathrm{sgn}\bigl(H_k(t+1)\bigr). Is the network stable?
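
A minimal sketch of this asynchronous update in Python/NumPy, assuming bipolar states v_i in {-1, +1}; the function name and the sgn(0) = +1 convention are choices for illustration:

```python
import numpy as np

def async_update(W, I, v, k):
    """One asynchronous step: only neuron k is recomputed.

    H_k(t+1) = sum_j w_kj * v_j(t) + I_k
    v_k(t+1) = sgn(H_k(t+1))
    """
    H_k = W[k] @ v + I[k]
    v = v.copy()
    v[k] = 1 if H_k >= 0 else -1  # sgn(0) = +1, matching the case table on slide 20
    return v
```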

  16. Energy Function E = -\frac{1}{2}\sum_i \sum_j w_{ij} v_i v_j - \sum_i I_i v_i. Fact: E is lower bounded (upper bounded). If E is monotonically decreasing (increasing), the system is stable.
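
A matching sketch of the energy and a run-to-stability loop, reusing the async_update sketch above (names again illustrative):

```python
import numpy as np

def energy(W, I, v):
    """E(v) = -1/2 * v^T W v - I^T v."""
    return -0.5 * v @ W @ v - I @ v

def run_to_stability(W, I, v, seed=0):
    """Apply asynchronous updates until a full sweep changes nothing.

    Every accepted flip lowers E, and E is bounded below,
    so the loop terminates at a stable state.
    """
    rng = np.random.default_rng(seed)
    while True:
        changed = False
        for k in rng.permutation(len(v)):
            v_new = async_update(W, I, v, k)
            changed = changed or v_new[k] != v[k]
            v = v_new
        if not changed:
            return v
```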

  17. The Proof Suppose that at time t + 1, the kth neuron is selected for update: v_k(t+1) = \mathrm{sgn}\bigl(H_k(t+1)\bigr), \qquad H_k(t+1) = \sum_j w_{kj} v_j(t) + I_k.

  18. The Proof For every i \neq k, the value v_i is not changed at time t + 1, so all terms of E that do not involve v_k cancel in \Delta E = E(t+1) - E(t).

  19. The Proof Collecting the terms of E that involve v_k (using w_{ij} = w_{ji} and w_{kk} = 0): \Delta E = -\bigl(v_k(t+1) - v_k(t)\bigr) H_k(t+1) = -\Delta v_k \, H_k(t+1).

  20. The Proof Case analysis (with \Delta v_k = v_k(t+1) - v_k(t)):
  v_k(t) = -1, H_k(t+1) \ge 0: v_k(t+1) = +1, \Delta E = -2 H_k(t+1) \le 0
  v_k(t) = -1, H_k(t+1) < 0: v_k(t+1) = -1, \Delta E = 0
  v_k(t) = +1, H_k(t+1) \ge 0: v_k(t+1) = +1, \Delta E = 0
  v_k(t) = +1, H_k(t+1) < 0: v_k(t+1) = -1, \Delta E = 2 H_k(t+1) < 0
  In every case \Delta E \le 0, and E is bounded below, so the network is stable.
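
The case analysis can be spot-checked numerically with the sketches above, using a random symmetric, zero-diagonal weight matrix (all names assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
A = rng.normal(size=(n, n))
W = A + A.T                       # symmetric: w_ij = w_ji
np.fill_diagonal(W, 0.0)          # w_ii = 0
I = rng.normal(size=n)
v = rng.choice([-1, 1], size=n)

for _ in range(200):
    k = rng.integers(n)
    v_next = async_update(W, I, v, k)
    # Delta E = -(v_k(t+1) - v_k(t)) * H_k(t+1) <= 0 in every case
    assert energy(W, I, v_next) <= energy(W, I, v) + 1e-12
    v = v_next
```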

  21. Feedback Networks and Associative Memories: Continuous Hopfield NNs (Intelligent Multimedia Lab, Tatung University)

  22. vi=a(ui) Ii wi1 1 v1 wi2 ui v2 1 . . . vn win ui=a1(vi) ui gi Ci 1 vi 1 vi vi The Neuron of Continuous Hopfield NNs

  23. The Dynamics Summing the currents at neuron i gives C_i \frac{du_i}{dt} = \sum_j w_{ij} v_j - G_i u_i + I_i, \qquad G_i = g_i + \sum_j w_{ij}, \qquad v_i = a(u_i).
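
A forward-Euler sketch of these dynamics (the activation a(u) = tanh(lam * u) and all parameter names are assumptions for illustration):

```python
import numpy as np

def simulate(W, g, C, I, u0, lam=1.0, dt=1e-3, steps=10_000):
    """Integrate C_i du_i/dt = sum_j w_ij v_j - G_i u_i + I_i
    with v_i = tanh(lam * u_i) and G_i = g_i + sum_j w_ij."""
    G = g + W.sum(axis=1)
    u = u0.astype(float)
    for _ in range(steps):
        v = np.tanh(lam * u)
        u = u + dt * (W @ v - G * u + I) / C
    return np.tanh(lam * u)
```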

  24. The Continuous Hopfield NNs [Figure: n such neurons fully interconnected; neuron i has state u_i, capacitance C_i, conductance g_i, external current I_i, and output v_i]

  25. The Continuous Hopfield NNs [Figure as on slide 24] Is the network stable?

  26. Equilibrium Points • Consider the autonomous system \dot{\mathbf{y}} = \mathbf{f}(\mathbf{y}). • Equilibrium points \mathbf{y}^* satisfy \mathbf{f}(\mathbf{y}^*) = \mathbf{0}.

  27. Lyapunov Theorem The system is asymptotically stable if there exists a positive-definite function E(\mathbf{y}) such that \frac{dE(\mathbf{y})}{dt} < 0 along every trajectory (except at the equilibrium). Such an E(\mathbf{y}) is called an energy function.
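
A one-dimensional illustration (not from the slides): for \dot{y} = -y, take E(y) = \tfrac{1}{2}y^2:

```latex
E(y) = \tfrac{1}{2}y^2 > 0 \quad (y \neq 0), \qquad
\frac{dE}{dt} = y\,\dot{y} = -y^2 < 0 \quad (y \neq 0),
```

so E qualifies as an energy function and the equilibrium y = 0 is asymptotically stable.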

  28. Lyapunov Energy Function For the continuous Hopfield network, take E = -\frac{1}{2}\sum_i \sum_j w_{ij} v_i v_j + \sum_i G_i \int_0^{v_i} a^{-1}(v)\,dv - \sum_i I_i v_i. [Figure: the network as on slide 24]

  29. u=a1(v) vi=a(ui) 1 1 v ui 1 1 v 1 1 Lyapunov Energy Function I1 I2 I3 In w1n w3n w2n wn3 w13 w23 w12 w32 wn2 w31 w21 wn1 u1 u2 u3 un C1 g1 C2 g2 C3 g3 Cn gn . . . . . . v1 v2 v3 vn v1 v2 v3 vn

  30. Stability of Continuous Hopfield NNs Differentiating E along the dynamics, \frac{\partial E}{\partial v_i} = -\sum_j w_{ij} v_j + G_i u_i - I_i = -C_i \frac{du_i}{dt}, so \frac{dE}{dt} = \sum_i \frac{\partial E}{\partial v_i}\frac{dv_i}{dt} = -\sum_i C_i \frac{du_i}{dt}\frac{dv_i}{dt}.

  31. u=a1(v) 1 v 1 Dynamics Stability of Continuous Hopfield NNs > 0

  32. Stability of Continuous Hopfield NNs [Figure: the network as on slide 24] Since \frac{dE}{dt} \le 0 and E is bounded below, the continuous Hopfield network is stable.

  33. Basins of Attraction [Figure: the state space partitioned into basins of attraction]

  34. Basins of Attraction [Figure continued]

  35. Local/Global Minima [Figure: energy landscape with local and global minima]

  36. Feedback Networks and Associative Memories: Associative Memories (Intelligent Multimedia Lab, Tatung University)

  37. Associative Memories • Also called content-addressable memory. • Autoassociative Memory • Hopfield Memory • Heteroassociative Memory • Bidirectional Associative Memory (BAM)

  38. Associative Memories Stored patterns: an autoassociative memory stores pairs (\mathbf{x}^i, \mathbf{x}^i); a heteroassociative memory stores pairs (\mathbf{x}^i, \mathbf{y}^i).

  39. Feedback Networks and Associative Memories: Associative Memories. Hopfield Memory; Bidirectional Memory (Intelligent Multimedia Lab, Tatung University)

  40. Hopfield Memory Fully connected, with a 12×10 array of neurons and 14,400 weights.

  41. Memory Association [Figure: stored patterns and a recall example]

  42. Memory Association [Figure: a further recall example from the stored patterns]

  43. Memory Association [Figure: stored patterns and a recall example] How to store patterns?

  44. The Storage Algorithm Suppose the set of stored patterns is \{\mathbf{x}^1, \ldots, \mathbf{x}^p\}, each of dimension n. What should the weight matrix be, \mathbf{W} = ?

  45. The Storage Algorithm Outer-product (Hebbian) rule: \mathbf{W} = \sum_{i=1}^{p} \mathbf{x}^i (\mathbf{x}^i)^T - p\,\mathbf{I}_n, equivalently w_{jk} = \sum_{i=1}^{p} x_j^i x_k^i for j \neq k and w_{jj} = 0.
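
A sketch of this outer-product rule in Python (X stacks the p stored bipolar patterns as rows; subtracting p\,\mathbf{I}_n zeroes the diagonal because each x_j^2 = 1):

```python
import numpy as np

def store(X):
    """W = sum_i x^i (x^i)^T - p * I_n  (outer-product / Hebbian rule).

    X has shape (p, n) with entries in {-1, +1}; the result is
    symmetric with a zero diagonal.
    """
    p, n = X.shape
    return X.T @ X - p * np.eye(n)
```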

  46. Analysis Suppose that \mathbf{x} = \mathbf{x}^i, one of the stored patterns. Then \mathbf{W}\mathbf{x}^i = (n - p)\,\mathbf{x}^i + \sum_{j \neq i} \mathbf{x}^j \bigl((\mathbf{x}^j)^T \mathbf{x}^i\bigr); if the stored patterns are mutually orthogonal, the crosstalk term vanishes and \mathbf{x}^i is a stable state.
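
A quick check with two orthogonal patterns (hand-picked for the illustration), using the store sketch above, confirming that each stored pattern is a fixed point of the update:

```python
import numpy as np

X = np.array([[ 1, -1,  1, -1,  1, -1],
              [ 1,  1, -1, -1,  1,  1]])   # orthogonal: x^1 . x^2 = 0
W = store(X)

for x in X:
    H = W @ x                               # no external input here
    recalled = np.where(H >= 0, 1, -1)
    print(np.array_equal(recalled, x))      # True: W x = (n - p) x
```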

  47. Example

  48. 2 2 1 2 3 4 Example

  49. 1 1 1 1 2 2 1 2 3 4 1 1 1 1 1 1 1 1 Example E=4 Stable E=0 E=4

  50. 1 1 1 1 2 2 1 2 3 4 1 1 1 1 1 1 1 1 Example E=4 Stable E=0 E=4
