Presentation Transcript



Memory



Hopfield Network

  • Content addressable

  • Attractor network
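A minimal sketch of what "content addressable" and "attractor" mean in practice, assuming the standard binary Hopfield construction (the function names, pattern count, and network size below are illustrative, not taken from the slides): Hebbian outer-product weights store a set of ±1 patterns, and asynchronous updates pull a corrupted cue back toward the nearest stored pattern.

```python
import numpy as np

def store_patterns(patterns):
    """Hebbian (outer-product) weights for +/-1 patterns, zero diagonal."""
    n_units = patterns.shape[1]
    W = patterns.T @ patterns / n_units
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Lyapunov (energy) function of network state s."""
    return -0.5 * s @ W @ s

def recall(W, cue, steps=2000, seed=1):
    """Asynchronous updates: each step moves one unit downhill in energy."""
    rng = np.random.default_rng(seed)
    s = cue.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Content addressability: a corrupted cue retrieves the full stored pattern.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 100))   # two random 100-unit memories
W = store_patterns(patterns)
cue = patterns[0].copy()
cue[:20] *= -1                                   # flip 20% of the bits
recovered = recall(W, cue)
print("overlap with stored pattern:", np.mean(recovered == patterns[0]))
print("energy decreased:", energy(W, recovered) < energy(W, cue))
```

Recall settles because the quantity computed by energy() never increases under asynchronous updates, which is the Lyapunov-function argument invoked a couple of slides below.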



Hopfield Network



Hopfield Network

  • General Case:

  • Lyapunov function
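For reference, the standard energy (Lyapunov) function of a Hopfield network with symmetric weights w_ij, binary states s_i, and thresholds theta_i is the usual textbook form below (an assumed form, not necessarily the exact expression on the slide):

```latex
E(\mathbf{s}) \;=\; -\tfrac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j \;+\; \sum_i \theta_i s_i
```

With symmetric weights and asynchronous updates s_i <- sign(sum_j w_ij s_j - theta_i), E never increases, so the dynamics settle into local minima of E, which are (ideally) the stored patterns.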



Neurophysiology



Mean Field Approximation
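As a sketch of the kind of mean-field rate model the following null-cline analysis assumes (one common form, written with the coupling constants CE and CI that appear on the next slide; the gain f, the time constants, and the background drives hE, hI are my notation and may differ from the slide's equations):

```latex
\tau_E \frac{dE}{dt} = -E + f\!\left(C_E E - C_I I + h_E\right),
\qquad
\tau_I \frac{dI}{dt} = -I + f\!\left(C_E E - C_I I + h_I\right)
```

Setting each time derivative to zero defines the two null clines analyzed on the next slides; the fixed points are the intersections of those curves.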


Null Cline Analysis

  • What are the fixed points?

[Figure: recurrent circuit of excitatory (E) and inhibitory (I) populations with coupling constants CE and CI]
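A small numerical sketch of the null-cline recipe, using the illustrative mean-field equations above (all parameter values are made up; depending on the couplings the null clines can cross once or three times, and the three-crossing case is the bistable regime exploited later for binary memory):

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative parameters, not taken from the slides.
CE, CI, hE, hI = 12.0, 10.0, -2.0, -4.0
f = lambda x: 1.0 / (1.0 + np.exp(-x))       # sigmoidal gain
logit = lambda y: np.log(y / (1.0 - y))      # its inverse

# Excitatory null cline: dE/dt = 0  <=>  E = f(CE*E - CI*I + hE), solved for I.
I_exc = lambda E: (CE * E + hE - logit(E)) / CI

# Inhibitory null cline: dI/dt = 0  <=>  I = f(CE*E - CI*I + hI), solved numerically.
def I_inh(E):
    return brentq(lambda I: -I + f(CE * E - CI * I + hI), 0.0, 1.0)

# Fixed points are the intersections of the two null clines.
gap = lambda E: I_exc(E) - I_inh(E)
Es = np.linspace(1e-4, 1.0 - 1e-4, 4000)
g = np.array([gap(E) for E in Es])
for k in np.where(np.sign(g[:-1]) != np.sign(g[1:]))[0]:
    E0 = brentq(gap, Es[k], Es[k + 1])
    print("fixed point: E = %.3f, I = %.3f" % (E0, I_inh(E0)))
```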



Null Cline Analysis

  • What are the fixed points?


Null Cline Analysis

[Figure: E-I phase plane with a stable fixed point and an unstable fixed point (axes: E, I)]


Null Cline Analysis

[Figure: successive build-up frames of the null clines in the E-I plane (axes: E, I)]


Null Cline Analysis

[Figure: null cline in the E-I plane with stable branches separated by an unstable branch (axes: E, I)]




Null Cline Analysis

[Figure: E-I phase plane highlighting a stable fixed point (axes: E, I)]


Null Cline Analysis

[Figure: further build-up frames of the phase plane (axes: E, I)]


Null Cline Analysis

[Figure: the excitatory and inhibitory null clines in the E-I plane; the fixed points are their intersections (axes: E, I)]


Binary Memory

[Figure: E-I circuit with couplings CE and CI, and the corresponding E-I phase plane (axes: E, I)]


Binary Memory

Storing: decrease inhibition (CI).

[Figure: E-I circuit (CE, CI) and the phase-plane trajectory while inhibition is reduced (axes: E, I)]


Binary Memory

Storing: back to rest.

[Figure: E-I circuit (CE, CI) and the phase plane after inhibition returns to its resting value (axes: E, I)]


Binary Memory

Reset: increase inhibition.

[Figure: E-I circuit (CE, CI) and the phase-plane trajectory while inhibition is increased (axes: E, I)]


Binary Memory

Reset: back to rest.

[Figure: E-I circuit (CE, CI) and the phase plane after inhibition returns to its resting value (axes: E, I)]
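A toy simulation of this store/reset protocol, using a single bistable rate unit in place of the full E-I circuit (the form of the unit, every coefficient, and the pulse timings are invented for the sketch): the drive b lumps afferent input minus tonic inhibition, so "storing" is modelled as transiently raising b (less inhibition) and "reset" as transiently lowering it (more inhibition).

```python
import numpy as np

# One bistable rate unit: tau*dI/dt = -I + 3*I_+^2 - 2*I_+^3 + b.
relu = lambda I: np.maximum(I, 0.0)

def drive(t):
    if  5.0 <= t < 10.0: return  0.30   # store: inhibition transiently lowered
    if 20.0 <= t < 25.0: return -0.30   # reset: inhibition transiently raised
    return 0.02                         # resting drive (bistable regime)

dt, tau = 0.01, 1.0
ts = np.arange(0.0, 40.0, dt)
I, trace = 0.0, []
for t in ts:                            # forward-Euler integration
    F = 3.0 * relu(I)**2 - 2.0 * relu(I)**3 + drive(t)
    I += dt * (-I + F) / tau
    trace.append(I)

for t_probe in (4.0, 15.0, 35.0):       # before store, after store, after reset
    print("I(t = %4.1f) = %.3f" % (t_probe, trace[int(t_probe / dt)]))
```

The probes should show a low rate before the store pulse, a high rate that persists after inhibition has returned to rest (the memory), and a low rate again after the reset pulse, mirroring the four phase-plane snapshots above.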



Networks of Spiking Neurons

  • Problems with the previous approach:

    • Spiking neurons have monotonic I-f curves (which saturate, but only at very high firing rates)

    • How do you store more than one memory?

    • What is the role of spontaneous activity?



Networks of Spiking Neurons


Networks of Spiking Neurons

[Figure: firing-rate function R(Ij) as a function of input current Ij]



Networks of Spiking Neurons



Networks of Spiking Neurons

  • A memory network must be able to store a value in the absence of any input:



Networks of Spiking Neurons


Networks of Spiking Neurons

[Figure: recurrent feedback cR(Ii) plotted against the total input Ii, together with the afferent input Iaff]


Networks of Spiking Neurons

  • With a non-saturating activation function and no inhibition, the neurons must be spontaneously active for the network to admit a nonzero stable state:

[Figure: cR(Ii) versus Ii, with a fixed point at I2*]


Networks of Spiking Neurons

  • To get several stable fixed points, we need inhibition:

[Figure: cR(Ii) versus Ii with stable fixed points (one at I2*) separated by an unstable fixed point]
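A lumped, one-dimensional caricature of this argument (not the slides' actual model): fold the expansive, non-saturating excitatory feedback, an inhibitory feedback that grows faster at high rates, and a small afferent drive into one effective recurrence F(I). The fixed points of tau*dI/dt = -I + F(I) are where F crosses the identity line, and a crossing is stable exactly when F'(I*) < 1.

```python
import numpy as np
from scipy.optimize import brentq

# Effective recurrence: expansive excitation minus faster-growing inhibition
# plus a small afferent drive. All coefficients are made up for the sketch.
relu = lambda I: np.maximum(I, 0.0)
F  = lambda I: 3.0 * relu(I)**2 - 2.0 * relu(I)**3 + 0.02
dF = lambda I: 6.0 * relu(I) - 6.0 * relu(I)**2

G = lambda I: F(I) - I                      # zero at a fixed point
Is = np.linspace(0.0, 1.5, 3000)
g = G(Is)
for k in np.where(np.sign(g[:-1]) != np.sign(g[1:]))[0]:
    I0 = brentq(G, Is[k], Is[k + 1])
    kind = "stable" if dF(I0) < 1.0 else "unstable"
    print("fixed point at I = %.3f (%s)" % (I0, kind))
```

With these numbers the scan reports a low stable state, an unstable crossing, and a high stable state; drop the inhibitory (cubic) term and the feedback never bends back below the identity line, so no second stable state exists.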


Networks of Spiking Neurons

  • Clamping the input: inhibitory Iaff

[Figure: effect of an inhibitory afferent input Iaff on the fixed points (axis: Ii)]


Networks of Spiking Neurons

  • Clamping the input: excitatory Iaff

[Figure: effect of an excitatory afferent input Iaff on cR(Ii) and the fixed point I2* (axis: Ii)]


Networks of Spiking Neurons

[Figure: firing-rate function R(Ij) as a function of input current Ij]



Networks of Spiking Neurons

  • Major problem: the memory state has a high firing rate and the resting state is at zero. In reality, spontaneous activity is around 0-10 Hz and the memory state is around 10-20 Hz (not 100 Hz).

  • Solution: you don’t want to know (but it involves a careful balance of excitation and inhibition)…



Line Attractor Networks

  • Continuous attractor: line attractor or N-dimensional attractor

  • Useful for storing analog values

  • Unfortunately, it’s virtually impossible to get a neuron to store a value proportional to its activity


Line Attractor Networks

  • Storing analog values: difficult with this scheme…

[Figure: cR(Ii) versus Ii]


Line Attractor Networks

Implications for transmitting rates and for integration…

[Figure: cR(Ii) versus Ii]


Line Attractor Networks

  • Head direction cells

[Figure: head-direction cell activity as a function of preferred head direction, from -100 to 100 deg]


Line Attractor Networks

  • Attractor network with a population code

  • Translation-invariant weights

[Figure: bump of population activity as a function of preferred head direction (deg)]
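A small sketch of why translation-invariant weights give a whole family of bump states rather than a single one (the connectivity profile and constants are illustrative): with circulant weights and a pointwise nonlinearity, the update commutes with rotations of the ring, so if one bump is a fixed point then every rotated copy is too. That continuum of fixed points is the line attractor, and it is also why nothing opposes slow drift along it (the limitation noted a few slides below).

```python
import numpy as np

# Ring of N neurons with preferred directions theta_i and circulant weights
# w_ij that depend only on theta_i - theta_j: local excitation plus broad
# inhibition. Profile and constants are illustrative, not the slides'.
N = 64
theta = np.linspace(-np.pi, np.pi, N, endpoint=False)
diff = theta[:, None] - theta[None, :]
W = 2.0 * np.exp(np.cos(diff) - 1.0) / N - 0.5 / N

relu = lambda x: np.maximum(x, 0.0)
step = lambda r, h=0.1: relu(W @ r + h)      # one update of the rate dynamics

bump = relu(np.cos(theta))                   # a bump of activity centred at 0
rotate = lambda r, k=10: np.roll(r, k)       # the same bump shifted by k cells

# Equivariance: rotating the state and then updating equals updating and then
# rotating, because W is circulant and relu acts element-wise.
print(np.max(np.abs(step(rotate(bump)) - rotate(step(bump)))))  # ~1e-16
```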



Line Attractor Networks

  • Computing the weights:



Line Attractor Networks

  • The problem with the previous approach is that the weights tend to oscillate. Instead, we minimize:

  • The solution is:
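One standard regularized least-squares formulation (an assumed form, not necessarily the one on the slide): collect the desired activity profiles, e.g. the bump at every position, as the columns of a matrix F and choose weights that reproduce them while penalizing large weights,

```latex
\min_{W}\; \lVert W F - F \rVert_{F}^{2} + \lambda \lVert W \rVert_{F}^{2}
\quad\Longrightarrow\quad
W = F F^{\mathsf T}\left(F F^{\mathsf T} + \lambda I\right)^{-1}
```

The ridge term lambda damps the poorly constrained weight components, which is the usual cure for the oscillatory solutions mentioned above.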



Line Attractor Networks

  • Updating the memory: a bias in the weights, integration of a velocity signal, etc.



Line Attractor Networks

  • How do we know that the fixed points are stable? With symmetric weights, the network has a Lyapunov function (Cohen & Grossberg, 1983):
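For rate dynamics of the form tau*du_i/dt = -u_i + sum_j w_ij g(u_j) + h_i with symmetric weights and a monotonically increasing gain g, one standard form of this Lyapunov function (the usual continuous-Hopfield expression, assumed here rather than copied from the slide) is

```latex
L \;=\; -\tfrac{1}{2}\sum_{i,j} w_{ij}\, r_i r_j \;-\; \sum_i h_i r_i \;+\; \sum_i \int_{0}^{r_i} g^{-1}(\rho)\, d\rho,
\qquad r_i = g(u_i)
```

and dL/dt <= 0 along trajectories, so the network can only descend; the stable states are local minima of L.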



Line Attractor Networks

  • Line attractor: the set of stable points forms a line in activity space.

  • Limitations: Requires symmetric weights…

  • Neutrally stable along the attractor: unavoidable drift


Memorized Saccades

[Figure: double-step saccade task display with fixation crosses (+) and two targets, T1 and T2]


Memorized Saccades

[Figure: double-step saccade geometry with retinal target vectors R1 and R2, targets T1 and T2, and saccades S1 and S2]

S1 = R1, S2 = R2 - S1
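As a worked check of this remapping (numbers invented for illustration): if the first target lies 10 deg to the right of fixation (R1 = 10 deg) and the second lies 4 deg to the right (R2 = 4 deg), then S1 = R1 = 10 deg, and because the eye has already moved by S1 the memorized second saccade must be S2 = R2 - S1 = -6 deg, i.e. 6 deg leftward.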


Memorized Saccades

[Figure: build-up of the double-step geometry, showing R1, R2, T1, T2 and the executed saccades S1, S2]


Memorized Saccades

[Figure: panels A and B, population activity as a function of horizontal and vertical retinal position (deg), with a shift labelled -DE]



Neural Integrator

  • Oculomotor theory

  • Evidence integrator for decision making

  • Transmitting rates in multilayer networks

  • Maximum likelihood estimator



Semantic Memory

  • Memory for words is sensitive to semantics (not just spelling).

  • Experiment: Subjects are first trained to remember a list of words. A few hours later, they are presented with a list of words and they have to pick the ones they were supposed to remember. Many mistakes involve words semantically related to the remembered words.



Semantic Memory

  • Usual solution: semantic networks (nodes: words, links: semantic similarities) and spreading activation

  • Problem 1: the same word can have several meanings (e.g. bank). This is not captured by a semantic network.

  • Problem 2: some interactions between words are negative, even when the words have no semantic relationship (e.g. doctor and hockey).





Semantic Memory

  • Bayesian approach (Griffiths, Steyvers, Tenenbaum, Psych Rev 06)

  • Documents are bags of words (word ordering is ignored).

  • Generative model for documents: each document has a gist, which is a mixture of topics; a topic in turn defines a probability distribution over words.
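A hedged sketch of this generative story in code (the vocabulary, the two topics, and every probability below are invented for illustration and are not the paper's): for each word, draw a topic from the document's gist, then draw the word from that topic's distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab  = ["bank", "money", "loan", "meadow", "river", "sheep"]
topics = {                          # P(w | z) for two made-up topics
    "finance":     [0.35, 0.30, 0.25, 0.02, 0.03, 0.05],
    "countryside": [0.10, 0.05, 0.02, 0.30, 0.28, 0.25],
}

def generate_document(gist, n_words=8):
    """gist maps each topic to its mixing proportion P(z | g)."""
    names = list(gist.keys())
    words = []
    for _ in range(n_words):
        z = names[rng.choice(len(names), p=[gist[t] for t in names])]  # topic for this word
        w = vocab[rng.choice(len(vocab), p=topics[z])]                 # word given the topic
        words.append(w)
    return words

# A mostly-finance gist versus a mostly-countryside gist.
print(generate_document({"finance": 0.9, "countryside": 0.1}))
print(generate_document({"finance": 0.2, "countryside": 0.8}))
```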


Semantic Memory

  • Bayesian approach

  • Generative model for a document

[Diagram: gist (g) → topics (z) → words (w)]



Semantic Memory

  • z = topics: finance, English countryside, etc.

  • Gist: a mixture of topics; P(z|g) gives the mixing proportions.

  • Some documents might be 0.9 finance, 0.1 English countryside (e.g. the wheat market):

    P(z=finance|g1)=0.9, P(z=countryside|g1)=0.1

  • Others might be 0.2 finance, 0.8 English countryside (e.g. the Lloyds CEO buys a mansion):

    P(z=finance|g2)=0.2, P(z=countryside|g2)=0.8




Semantic Memory

  • Topic z1 = finance

  • Words, P(w|z1): 0.01 bank, 0.008 money, 0.0 meadow, …

  • Topic z2 = English countryside

  • Words, P(w|z2): 0.001 bank, 0.001 money, 0.002 meadow, …



  • The gist is shared within a document, but the topics can vary from one sentence (or even one word) to the next.



Semantic Memory

  • Problem: we only observe the words, not the topics or the gist…

  • How do we know how many topics and how many gists to use to account for a corpus of words, and how do we estimate their probabilities?

  • To pick the number of topics and gists: Chinese restaurant process, Dirichlet process, and hierarchical Dirichlet process; MCMC sampling.

  • Use techniques like EM to learn the probabilities of the latent variables (topics and gists).

  • However, a human is still needed to label the topics…


Semantic Memory

[Figure: words clustered into Topic 1, Topic 2, and Topic 3]




Semantic Memory

  • Problems we may want to solve:

  • Prediction, P(wn+1|w): what is the next word?

  • Disambiguation, P(z|w): what is the mixture of topics in this document?

  • Gist extraction, P(g|w): what is the probability distribution over gists?
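As a sketch of how these quantities follow from the generative model (my notation; the sums over the gist become integrals if g is continuous): the posterior over gists comes from the observed words, and prediction marginalizes over both the gist and the next word's topic.

```latex
P(g \mid \mathbf{w}) \;\propto\; P(g)\prod_{i=1}^{n}\sum_{z_i} P(w_i \mid z_i)\, P(z_i \mid g),
\qquad
P(w_{n+1} \mid \mathbf{w}) \;=\; \sum_{g}\sum_{z} P(w_{n+1} \mid z)\, P(z \mid g)\, P(g \mid \mathbf{w})
```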



Semantic Memory

  • What we need is a representation of P(w,z,g)


Semantic Memory

[Diagram: gist (g) → topics (z) → words (w)]

  • P(w, z, g) is given by the generative model.
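Concretely, for a document of n words with per-word topic assignments z_i (the gist g is shared across the document while each word gets its own topic, as noted above), the joint distribution factorizes as

```latex
P(\mathbf{w}, \mathbf{z}, g) \;=\; P(g)\,\prod_{i=1}^{n} P(z_i \mid g)\, P(w_i \mid z_i)
```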



Semantic Memory

  • Explains semantic interference in list-memory experiments: the predictive distribution will tend to favor words that are semantically related to the studied words through the topics and gists.

  • Captures the fact that a given word can have different meanings (topics and gists) depending on the context.


[Figure: next-word prediction example with Finance and Countryside topics, showing the word being observed and the predicted next word; "money" is less likely to be seen if the topic is countryside]

