
Computational Intelligence Winter Term 2011/12



Presentation Transcript


  1. Computational Intelligence Winter Term 2011/12 Prof. Dr. Günter Rudolph Lehrstuhl für Algorithm Engineering (LS 11) Fakultät für Informatik TU Dortmund

  2. Plan for Today • Bidirectional Associative Memory (BAM) • Fixed Points • Concept of Energy Function • Stable States = Minimizers of Energy Function • Hopfield Network • Convergence • Application to Combinatorial Optimization

  3. Bidirectional Associative Memory (BAM): Network Model
  [Figure: two fully connected layers x1, x2, ..., xn and y1, y2, ..., yk with bidirectional weighted edges w11, ..., wnk]
  • fully connected
  • bidirectional edges
  • synchronized: step t: data flow from x to y; step t + 1: data flow from y to x
  • bipolar inputs ∈ {-1, +1}
  x, y: row vectors; W: weight matrix; W′: transpose of W
  start: y(0) = sgn(x(0) W)
  then: x(1) = sgn(y(0) W′), y(1) = sgn(x(1) W), x(2) = sgn(y(1) W′), ...
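
A minimal numpy sketch of these synchronized dynamics (the tie-breaking convention sgn(0) = +1 is an assumption; the slides do not fix it):

```python
import numpy as np

def sgn(v):
    # bipolar sign; tie-breaking convention sgn(0) = +1 (assumption)
    return np.where(v >= 0, 1, -1)

def bam_recall(x, W, max_steps=100):
    """Synchronized BAM: alternate x -> y and y -> x until a fixed point."""
    y = sgn(x @ W)                     # y(0) = sgn(x(0) W)
    for _ in range(max_steps):
        x_new = sgn(y @ W.T)           # x(t+1) = sgn(y(t) W')
        y_new = sgn(x_new @ W)         # y(t+1) = sgn(x(t+1) W)
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break                      # (x, y) is a fixed point
        x, y = x_new, y_new
    return x, y
```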

  4. Bidirectional Associative Memory (BAM): Fixed Points
  Definition: (x, y) is a fixed point of the BAM iff y = sgn(x W) and x′ = sgn(W y′). □
  Set W = x′ y (note: x is a row vector, so x′ y is an outer product). Then:
  y = sgn(x W) = sgn(x (x′ y)) = sgn((x x′) y) = sgn(‖x‖² y) = y, since ‖x‖² > 0 does not alter the sign;
  x′ = sgn(W y′) = sgn((x′ y) y′) = sgn(x′ (y y′)) = sgn(x′ ‖y‖²) = x′, since ‖y‖² > 0 does not alter the sign.
  Theorem: If W = x′ y, then (x, y) is a fixed point of the BAM. □
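
The theorem is easy to check numerically; a small sketch (the vector sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.choice([-1, 1], size=6)    # bipolar row vector x
y = rng.choice([-1, 1], size=4)    # bipolar row vector y
W = np.outer(x, y)                 # W = x' y

# x W = (x x') y = ||x||^2 y and W y' = x' (y y') = ||y||^2 x',
# so taking signs recovers y and x exactly:
assert np.array_equal(np.sign(x @ W), y)
assert np.array_equal(np.sign(W @ y), x)
```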

  5. Bidirectional Associative Memory (BAM): Concept of Energy Function
  given: BAM with W = x′ y ⇒ (x, y) is a stable state of the BAM
  starting point x(0) ⇒ y(0) = sgn(x(0) W) ⇒ excitation e′ = W (y(0))′
  ⇒ if sgn(e′) = (x(0))′, then (x(0), y(0)) is a stable state; this is true if e′ is close to x(0)
  [Figure: example x(0) = (1, 1) and excitation e′ in the plane; recall the cosine: a small angle between e′ and x(0) means a large cos ∠(x(0), e′)]

  6. Bidirectional Associative Memory (BAM): Concept of Energy Function
  required: small angle between e′ = W (y(0))′ and x(0)
  ⇒ a larger cosine of the angle indicates greater similarity of the vectors
  ⇒ ∀ e′ of equal size: try to maximize x(0) e′ = ‖x(0)‖ · ‖e‖ · cos ∠(x(0), e), where both norms are fixed ⇒ cos ∠(x(0), e) → max!
  ⇒ maximize x(0) e′ = x(0) W (y(0))′
  ⇒ identical to minimizing -x(0) W (y(0))′
  Definition: The energy function of the BAM at iteration t is E(x(t), y(t)) = -x(t) W (y(t))′. □
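
In code, this energy is a one-liner (a sketch following the definition above):

```python
import numpy as np

def bam_energy(x, W, y):
    # E(x, y) = -x W y'
    return -(x @ W @ y)
```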

  7. Bidirectional Associative Memory (BAM): Stable States
  Theorem: An asynchronous BAM with an arbitrary weight matrix W reaches a steady state in a finite number of updates.
  Proof: The BAM is asynchronous ⇒ select a neuron at random from the left or right layer, compute its excitation, and change its state if necessary (the states of the other neurons are not affected).

  8. Bidirectional Associative Memory (BAM): Stable States (continued)
  Let b′ = W y′ be the vector of excitations of the left layer. Suppose neuron i of the left layer has changed ⇒ sgn(xi) ≠ sgn(bi) ⇒ xi was updated to x̃i = -xi.
  The resulting change in energy is E(x̃, y) - E(x, y) = -(x̃i - xi) bi = 2 xi bi < 0, since xi and bi have opposite signs.
  Use an analogous argumentation if a neuron of the right layer has changed.
  ⇒ every update (change of state) decreases the energy function
  ⇒ since the number of different bipolar vectors is finite, updating stops after a finite number of updates. q.e.d.
  Remark: The dynamics of the BAM get stable in a local minimum of the energy function!
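
A sketch of the asynchronous dynamics used in the proof, assuming numpy; the sweep order is my choice, the slides only require that neurons are selected at random:

```python
import numpy as np

def bam_async(x, y, W, rng):
    """Asynchronous BAM: update one neuron at a time until no neuron changes."""
    n, m = len(x), len(y)
    while True:
        changed = False
        for idx in rng.permutation(n + m):   # visit every neuron in random order
            if idx < n:                      # neuron idx of the left layer
                b = W[idx, :] @ y            # its excitation b_idx = (W y')_idx
                new = 1 if b >= 0 else -1
                if new != x[idx]:
                    x[idx] = new             # each flip decreases E = -x W y'
                    changed = True
            else:                            # neuron idx - n of the right layer
                j = idx - n
                e = x @ W[:, j]              # its excitation (x W)_j
                new = 1 if e >= 0 else -1
                if new != y[j]:
                    y[j] = new
                    changed = True
        if not changed:                      # a full quiet sweep = steady state
            return x, y
```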

  9. Hopfield Network
  special case of a BAM, but proposed earlier (1982)
  characterization:
  • n neurons, fully connected
  • symmetric weight matrix W = W′, no self-loops (→ zero main diagonal entries)
  • thresholds θ; neuron i fires if its excitation is larger than θi
  • neurons preserve their state until selected at random for an update
  energy of state x: E(x) = -½ x W x′ + θ x′
  transition: select index k at random; the new state is x̃ with x̃i = xi for i ≠ k and x̃k = sgn(ek - θk), where ek = Σi wik xi
  [Figure: fully connected network of three neurons 1, 2, 3]
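
A direct transcription of energy and transition rule, assuming numpy; ties (ek = θk) are again resolved towards +1 by assumption:

```python
import numpy as np

def hopfield_energy(x, W, theta):
    # E(x) = -1/2 x W x' + theta x'
    return -0.5 * (x @ W @ x) + theta @ x

def hopfield_step(x, W, theta, k):
    """Update neuron k in place: fire (+1) iff its excitation exceeds theta_k."""
    e_k = W[:, k] @ x                  # w_kk = 0, so x_k does not excite itself
    x[k] = 1 if e_k - theta[k] >= 0 else -1
    return x
```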

  10. Hopfield Network: Convergence
  Theorem: A Hopfield network converges to a local minimum of the energy function after a finite number of updates. □
  Proof: Assume that xk has been updated ⇒ x̃k = -xk and x̃i - xi = 0 for i ≠ k.
  E(x̃) - E(x) = -½ Σi Σj wij (x̃i x̃j - xi xj) + Σi θi (x̃i - xi), where all summands with i ≠ k and j ≠ k vanish.

  11. Hopfield Network: Convergence (continued)
  Collecting the remaining summands (rename j to i; recall W = W′ and wkk = 0):
  E(x̃) - E(x) = -(x̃k - xk) Σi wik xi + θk (x̃k - xk) = -(x̃k - xk) (ek - θk), with excitation ek = Σi wik xi.
  Since xk changed its sign, x̃k - xk = -2 xk, hence E(x̃) - E(x) = 2 xk (ek - θk) < 0:
  ek - θk > 0 if xk < 0 and vice versa, because the update flips xk towards sgn(ek - θk). q.e.d.
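
The monotonicity argument can be sanity-checked empirically; a self-contained sketch on a random symmetric instance (sizes and distributions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10
W = rng.normal(size=(n, n))
W = (W + W.T) / 2                          # symmetric weight matrix
np.fill_diagonal(W, 0.0)                   # no self-loops
theta = rng.normal(size=n)
x = rng.choice([-1.0, 1.0], size=n)

energy = lambda x: -0.5 * (x @ W @ x) + theta @ x
last = energy(x)
for _ in range(1000):
    k = int(rng.integers(n))
    x[k] = 1.0 if W[:, k] @ x - theta[k] >= 0 else -1.0
    assert energy(x) <= last + 1e-12       # an update never increases E
    last = energy(x)
print("final energy:", last)
```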

  12. Hopfield Network: Application to Combinatorial Optimization
  Idea (a generic driver is sketched below):
  • express the combinatorial optimization problem as an objective function with x ∈ {-1, +1}ⁿ
  • rearrange the objective function to look like a Hopfield energy function
  • extract the weights W and thresholds θ from this energy function
  • initialize a Hopfield net with these parameters W and θ
  • run the Hopfield net until it reaches a stable state (= local minimizer of the energy function)
  • the stable state is a local minimizer of the combinatorial optimization problem
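
A minimal run-until-stable driver for this recipe (a sketch; hopfield_minimize is a hypothetical name, and the random initial state is an assumption):

```python
import numpy as np

def hopfield_minimize(W, theta, rng, x0=None):
    """Asynchronous updates until no neuron changes: a local minimizer of E."""
    n = len(theta)
    x = rng.choice([-1.0, 1.0], size=n) if x0 is None else x0.astype(float)
    stable = False
    while not stable:
        stable = True
        for k in rng.permutation(n):       # sweep all neurons in random order
            new = 1.0 if W[:, k] @ x - theta[k] >= 0 else -1.0
            if new != x[k]:
                x[k] = new
                stable = False             # a flip occurred: sweep again
    return x
```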

  13. Hopfield Network: Example I: Linear Functions
  f(x) = a x′ → min with x ∈ {-1, +1}ⁿ is already in Hopfield normal form: W = 0 and θ = a.
  Evidently: ek = 0 for all k ⇒ x̃k = sgn(-ak), i.e. each neuron settles to its final state the first time it is selected
  ⇒ fixed point reached after Θ(n log n) iterations on average (each of the n indices must be drawn at least once: the coupon collector's problem)
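
An empirical check of the Θ(n log n) claim (a sketch under the coupon-collector reading above; n and the trial count are arbitrary):

```python
import numpy as np

# With W = 0 and theta = a, neuron k jumps to sgn(-a_k) the first time it is
# selected and never changes again, so the number of iterations to reach the
# fixed point is the number of draws until every index appeared at least once.
rng = np.random.default_rng(3)
n, trials = 100, 500
total = 0
for _ in range(trials):
    seen, steps = set(), 0
    while len(seen) < n:
        seen.add(int(rng.integers(n)))
        steps += 1
    total += steps
print("mean:", total / trials)             # expectation n*H_n = Theta(n log n)
```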

  14. Hopfield Network: Example II: MAXCUT
  given: graph with n nodes and symmetric weights ωij = ωji, ωii = 0, on the edges
  task: find a partition V = (V0, V1) of the nodes such that the weighted sum of edges with one endpoint in V0 and one endpoint in V1 becomes maximal
  encoding: ∀ i = 1, ..., n: yi = 0 ⇔ node i in set V0; yi = 1 ⇔ node i in set V1
  objective function: f(y) = Σi<j ωij (yi (1 - yj) + (1 - yi) yj) → max!
  preparations for applying a Hopfield network:
  step 1: conversion to a minimization problem
  step 2: transformation of variables
  step 3: transformation to "Hopfield normal form"
  step 4: extract coefficients as weights and thresholds of the Hopfield net

  15. Hopfield Network: Example II: MAXCUT (continued)
  step 1: conversion to a minimization problem ⇒ multiply the function by -1 ⇒ E(y) = -f(y) → min!
  step 2: transformation of variables ⇒ yi = (xi + 1) / 2 with xi ∈ {-1, +1}
  ⇒ yi (1 - yj) + (1 - yi) yj = (1 - xi xj) / 2, hence E(x) = ½ Σi<j ωij xi xj + c,
  where c = -½ Σi<j ωij is a constant value (does not affect the location of the optimal solution)

  16. Hopfield Network: Example II: MAXCUT (continued)
  step 3: transformation to "Hopfield normal form":
  E(x) = ½ Σi<j ωij xi xj + c = -½ Σi Σj wij xi xj + c, with wij = -ωij / 2 for i ≠ j and wii = 0
  step 4: extract coefficients as weights and thresholds of the Hopfield net: W = (wij) and θ = 0′
  remark: since the Hopfield net only reaches a local minimizer of the energy function, the stable state yields a locally maximal cut, not necessarily a global one (a sketch of the whole pipeline follows below)
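
Putting steps 1 to 4 together on a small random instance (a sketch; the instance is invented, and it reuses the hypothetical hopfield_minimize driver sketched after slide 12):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
omega = rng.integers(0, 5, size=(n, n)).astype(float)  # invented edge weights
omega = np.triu(omega, 1)
omega = omega + omega.T                    # symmetric, zero main diagonal

W = -omega / 2.0                           # step 3/4: w_ij = -omega_ij / 2
theta = np.zeros(n)                        # step 4: all thresholds are zero

x = hopfield_minimize(W, theta, rng)       # driver from the slide-12 sketch
V1 = {i for i in range(n) if x[i] > 0}     # back-transform: y_i = (x_i + 1)/2
cut = sum(omega[i, j] for i in range(n) for j in range(i + 1, n)
          if (i in V1) != (j in V1))
print("V1 =", V1, "cut weight =", cut)     # a locally maximal cut
```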
