

  1. Lecture 2: ASSOCIATIONS, RULES, AND MACHINES. Victor Eliashberg, Consulting Professor, Stanford University, Department of Electrical Engineering. Slide 1

  2. SCIENTIFIC / ENGINEERING APPROACH. [Figure: the external world W and the sensorimotor devices D form the external system (W,D); the computing system B, simulating the work of the human nervous system, together with D forms a human-like robot (D,B).] “When you have eliminated the impossible, whatever remains, however improbable, must be the truth.” (Sherlock Holmes) Slide 2

  3. ZERO-APPROXIMATION MODEL. [Figure: state transition s(ν) → s(ν+1).] Slide 3

  4. BIOLOGICAL INTERPRETATION. [Figure: subsystems AM and AS; labels: working memory, episodic memory, and mental imagery; motor control.] Slide 4

  5. PROBLEM 1: LEARNING TO SIMULATE THE TEACHER. [Figure: system AM coupled to the Teacher; labels: symbol read, move, type symbol, current state of mind, next state of mind, NM, sel.] This problem is simple: system AM needs to learn a manageable number of fixed rules. Slide 5
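
The slide's labels suggest rules of the Turing-machine-like form (current state of mind, symbol read) → (symbol to type, move, next state of mind). A minimal Python sketch of AM as such a learned rule table (the rule format and all names are illustrative assumptions, not the author's exact model):

    # Minimal sketch: AM as a finite table of fixed rules demonstrated by the Teacher.
    # Rule format (state, symbol read) -> (symbol to type, move, next state) is an
    # illustrative assumption based on the slide's labels.
    class AM:
        def __init__(self):
            self.rules = {}  # (state, symbol) -> (symbol_to_type, move, next_state)

        def observe(self, state, symbol, action):
            """Store one association shown by the Teacher."""
            self.rules[(state, symbol)] = action

        def act(self, state, symbol):
            """Reproduce the Teacher's action for a previously seen situation."""
            return self.rules[(state, symbol)]

    am = AM()
    am.observe("q0", "a", ("b", "right", "q1"))  # Teacher demonstrates one rule
    print(am.act("q0", "a"))                     # -> ('b', 'right', 'q1')

Because the Teacher exhibits only a manageable number of such rules, the table stays small, which is what makes problem 1 simple.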

  6. PROBLEM 2: LEARNING TO SIMULATE THE EXTERNAL SYSTEM. This problem is hard: the number of fixed rules needed to represent a RAM with n locations explodes exponentially with n. [Figure: external system; labels: y, 1, 2, NS.] NOTE. System (W,D) shown in slide 3 has the properties of a random access memory (RAM). Slide 6
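
To make the explosion concrete (a back-of-the-envelope illustration that is not in the slides, assuming one-bit locations): a RAM with n locations has 2^n distinct memory states, so a flat table of state-free stimulus-response rules must grow on that order.

    # Illustration only (assumption: one-bit locations; the slide does not fix the alphabet).
    for n in (8, 16, 32, 64):
        print(f"n = {n:2d} locations -> {2**n:.3e} memory states")
    # n = 64 already gives ~1.8e19 states, far beyond any explicit rule table.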

  7. Programmable logic array (PLA): a logic implementation of a local associative memory (solves problem 1 from slide 5) Slide 7
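
The transcript keeps only this slide's title, so here is a hedged sketch of the idea: in a PLA, an AND plane matches the input against stored product terms (the rule conditions) and an OR plane drives the outputs (the rule actions). The terms, widths, and values below are illustrative assumptions, not the slide's circuit:

    # Hedged PLA sketch: AND plane = stored conditions, OR plane = stored actions.
    # Patterns use 0/1 for required bits and None for "don't care".
    AND_PLANE = [            # one product term per stored rule (illustrative)
        (1, 0, None),        # term 0 fires when x = (1, 0, *)
        (None, 1, 1),        # term 1 fires when x = (*, 1, 1)
    ]
    OR_PLANE = [             # output bits driven by each term
        (1, 0),              # term 0 -> y = (1, 0)
        (0, 1),              # term 1 -> y = (0, 1)
    ]

    def pla(x):
        fired = [all(b is None or b == xi for b, xi in zip(term, x))
                 for term in AND_PLANE]                      # AND plane
        return tuple(int(any(f and o for f, o in zip(fired, col)))
                     for col in zip(*OR_PLANE))              # OR plane (wired OR)

    print(pla((1, 0, 0)))  # -> (1, 0): only term 0 matches
    print(pla((0, 1, 1)))  # -> (0, 1): only term 1 matches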

  8. BASIC CONCEPTS FROM THE AREA OF ARTIFICIAL NEURAL NETWORKS Slide 8

  9. Typical neuron. A neuron is a very specialized cell. There are several types of neurons, with different shapes and different types of membrane proteins. A biological neuron is a complex functional unit; however, it is helpful to start with a simple artificial neuron (next slide). Slide 9

  10. Neuron as a first-order linear threshold element. Inputs: $x_k \in R'$; parameters: $g_1, \dots, g_m \in R'$; output: $y \in R'$, where $R'$ is the set of non-negative real numbers. Equations:

$\tau \, \dfrac{du}{dt} + u = \sum_{k=1}^{m} g_k x_k$  (1)

$y = L(u)$  (2)

where

$L(u) = u$ if $u > 0$; $L(u) = 0$ otherwise.  (3)

Notation: $x_k$ is the k-th component of the input vector; $g_k$ is the gain (weight) of the k-th synapse; $s = \sum_{k=1}^{m} g_k x_k$ is the total postsynaptic current; $u$ is the postsynaptic potential; $\tau$ is the time constant of the neuron; $y$ is the neuron output. Slide 10
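
A minimal sketch of equations (1)-(3) as a discrete-time simulation (forward-Euler step; the step size, gains, and input values are illustrative assumptions):

    # Forward-Euler simulation of the first-order linear threshold neuron:
    # tau * du/dt + u = sum_k g[k] * x[k],  y = L(u) = max(u, 0).
    def neuron_step(u, x, g, tau, dt=0.01):
        s = sum(gk * xk for gk, xk in zip(g, x))  # total postsynaptic current
        u = u + dt / tau * (s - u)                # u relaxes toward s with time constant tau
        y = max(u, 0.0)                           # threshold nonlinearity L(u)
        return u, y

    u = 0.0
    g = [0.5, 1.0, 0.25]          # synaptic gains (illustrative)
    x = [1.0, 0.2, 0.0]           # constant input vector (illustrative)
    for t in range(500):
        u, y = neuron_step(u, x, g, tau=0.1)
    print(round(y, 3))            # converges to s = 0.5*1.0 + 1.0*0.2 = 0.7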

  11. Input synaptic matrix, input long-term memory (ILTM), and DECODING (computing similarity).

$s_i = \sum_{k=1}^{m} g^x_{ik} x_k, \quad i = 1, \dots, n$  (1)

An abstract representation of (1): $f_{dec}: X \times G^x \to S$  (2)

Notation: $x = (x_1, \dots, x_m)$ are the signals from input neurons (not shown); $g^x = (g^x_{ik})$, $i = 1, \dots, n$, $k = 1, \dots, m$, is the matrix of synaptic gains -- we postulate that this matrix represents input long-term memory (ILTM); $s = (s_1, \dots, s_n)$ is the similarity function. Slide 11
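
In matrix form, equation (1) is a single matrix-vector product. A small sketch (shapes and values are illustrative assumptions):

    import numpy as np

    # DECODING, eq. (1): similarity s = Gx @ x.
    Gx = np.array([[1.0, 0.0, 0.0],   # row i = stored input pattern i (ILTM)
                   [0.0, 1.0, 1.0]])
    x = np.array([0.9, 0.1, 0.0])     # current input vector
    s = Gx @ x                        # s[i] = sum_k Gx[i, k] * x[k]
    print(s)                          # -> [0.9 0.1]; row 0 is most similar to x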

  12. Layer with inhibitory connections as the mechanism of the winner-take-all (WTA) choice. [Figure: neurons with potentials $u_1, \dots, u_n$ and outputs $d_1, \dots, d_n$ receive similarities $s_1, \dots, s_n$ and a shared inhibitory signal $x_{inh}$; parameters $\tau$, $\alpha$, $\beta$, $q$; the layer's dynamic equations (1)-(3) are not recoverable from the transcript.] Note. Small white and black circles represent excitatory and inhibitory synapses, respectively. Procedural representation:

RANDOM CHOICE: $i_{win} : \{\, i \mid s_i = \max_j s_j > 0 \,\}$  (4)

if (i == iwin) di = 1; else di = 0;  (5)

where ":" denotes random equally probable choice. Slide 12
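
The procedural representation (4)-(5) translates directly into code; a minimal sketch:

    import random

    # Procedural WTA, eqs. (4)-(5): pick one winner uniformly at random among
    # the indices with maximal positive similarity, then one-hot encode it.
    def wta(s):
        smax = max(s)
        if smax <= 0:
            return [0] * len(s)                  # no winner if no similarity is positive
        candidates = [i for i, si in enumerate(s) if si == smax]
        iwin = random.choice(candidates)         # ":" = random equally probable choice
        return [1 if i == iwin else 0 for i in range(len(s))]

    print(wta([0.9, 0.1]))        # -> [1, 0]
    print(wta([0.5, 0.5, 0.0]))   # -> [1, 0, 0] or [0, 1, 0], chosen at random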

  13. Output synaptic matrix, output long-term memory (OLTM), and ENCODING (data retrieval).

$y_k = \sum_{i=1}^{n} g^y_{ki} d_i, \quad k = 1, \dots, p$  (1)

An abstract representation of (1): $f_{enc}: D \times G^y \to Y$  (2)

Notation: $d = (d_1, \dots, d_n)$ are the signals from the WTA layer (see previous slide); $g^y = (g^y_{ki})$, $k = 1, \dots, p$, $i = 1, \dots, n$, is the matrix of synaptic gains -- we postulate that this matrix represents output long-term memory (OLTM); $y = (y_1, \dots, y_p)$ is the output vector. Slide 13
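
Since $d$ arrives one-hot from the WTA layer, equation (1) amounts to selecting one stored column. A small sketch (values are illustrative assumptions):

    import numpy as np

    # ENCODING, eq. (1): retrieval y = Gy @ d. With a one-hot d from the WTA
    # layer, this selects the column of Gy (OLTM) tied to the winning neuron.
    Gy = np.array([[0.2, 0.7],        # column i = stored output pattern i (OLTM)
                   [0.8, 0.3]])
    d = np.array([1, 0])              # one-hot winner from the WTA layer
    y = Gy @ d
    print(y)                          # -> [0.2 0.8], the output stored for winner 0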

  14. A neural implementation of a local associative memory (solves problem 1 from slide 5) (WTA.EXE). [Figure: addressing by content; labels: DECODING, S21(i,j), input long-term memory (ILTM), N1(j), RANDOM CHOICE, output long-term memory (OLTM), ENCODING, retrieval.] Slide 14

  15. A functional model of the previous network [7],[8],[11] (WTA.EXE). [Figure: the model's equations (1)-(5), not recoverable from the transcript.] Slide 15

  16. Representation of local associative memory in terms of three “one-step” procedures: DECODING, CHOICE, ENCODING Slide 17
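
Putting the three one-step procedures together, a hedged end-to-end sketch mirroring slides 11-13 (the class and method names are assumptions for illustration, not the author's API):

    import random

    # One-step local associative memory: DECODING -> CHOICE -> ENCODING.
    class LocalAssociativeMemory:
        def __init__(self):
            self.Gx, self.Gy = [], []          # ILTM rows, OLTM columns

        def learn(self, x, y):
            """Store one (x, y) association as a new ILTM row / OLTM column."""
            self.Gx.append(list(x))
            self.Gy.append(list(y))

        def recall(self, x):
            s = [sum(g * xi for g, xi in zip(row, x))
                 for row in self.Gx]                                       # DECODING
            smax = max(s)
            iwin = random.choice([i for i, si in enumerate(s) if si == smax])  # CHOICE
            return self.Gy[iwin]                                           # ENCODING

    mem = LocalAssociativeMemory()
    mem.learn([1, 0, 0], [0.2, 0.8])
    mem.learn([0, 1, 1], [0.7, 0.3])
    print(mem.recall([1, 0, 0]))   # -> [0.2, 0.8]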

  17. HOW CAN WE SOLVE THE HARD PROBLEM 2 from slide 6? Slide 18
