Hierarchical spatio-temporal memory for machine learning based on laminar minicolumn structure

Janusz A. Starzyk, Yinyin Liu

Ohio University, Athens, OH

Fig. 3 LTM cell (symbol neurons SN, primary neurons PN, excitation links W_in and W_PN, inhibition links, storage neurons, LTM cell output P, link strengths for storing and retrieving “ARRAY”)

Fig. 4 Hierarchical LTM (LTM cells and STM at levels h-1 and h, WTA, playback, sensor input)

Fig. 5 LTM cell with minicolumns (layers 2/3, 4, 5, 6; lateral association, lateral inhibition, feedback from the higher level, signal flow to the next level, from/to STM, input activation for “A”, “R”, “Y”)

Fig. 6 STM architecture (storage neurons, playback neurons, read pointer, write/erase pointer, store and retrieve of “ARRAY” between short-term and long-term memory)

LONG-TERM MEMORY

  • LTM cell:
  • One long-term memory (LTM) cell stores one particular sequence, whose length determines the number of required PNs (Fig. 3). Cells can be combined into a hierarchy (Fig. 4).
  • Symbol neurons (SN) excite primary neurons (PN) through the weights W_in (Eq. 1).
  • PNs are interconnected to induce temporal association and model the dynamics (Eq. 2).
  • The overall LTM cell output is given by Eq. 3.
  • LTM cell learning:
  • Through competition, the winning LTM cell with the maximum output signal strength stores the sequence by adjusting its weights.
  • LTM cell recall:
  • The signal strength on the LTM output neuron represents the match between the sequence stored in this LTM cell and the input sequence (a minimal sketch of these mechanics follows this list).
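
Since equations (1)-(3) are not reproduced in this transcript, the following Python sketch only illustrates the described mechanics: symbol neurons excite PNs through W_in, PN-to-PN links enforce the temporal order, the cell output measures the match with the stored sequence, and the cell that wins the competition stores the sequence by adjusting its weights. The class name, the activation rules, and the integer symbol encoding are illustrative assumptions, not the authors' equations.

```python
import numpy as np

class LTMCell:
    """One LTM cell storing a single symbol sequence (illustrative stand-in)."""

    def __init__(self, n_symbols, max_len):
        self.W_in = np.zeros((max_len, n_symbols))  # SN -> PN excitation weights (role of Eq. 1)
        self.W_pn = np.zeros((max_len, max_len))    # PN -> PN temporal-association weights (role of Eq. 2)
        self.length = 0

    def store(self, sequence):
        """Learning: the winning cell stores the sequence by setting its weights."""
        self.length = len(sequence)
        for t, s in enumerate(sequence):
            self.W_in[t, s] = 1.0                   # PN t responds to the t-th symbol
            if t > 0:
                self.W_pn[t, t - 1] = 1.0           # PN t is primed by PN t-1

    def output(self, sequence):
        """Recall: output strength measures the match with the stored sequence (role of Eq. 3)."""
        pn = np.zeros(self.W_in.shape[0])
        for t, s in enumerate(sequence[: len(pn)]):
            gate = 1.0 if t == 0 else min(self.W_pn[t] @ pn, 1.0)  # the previous PN must have fired
            pn[t] = self.W_in[t, s] * gate
        return pn.sum() / max(self.length, 1)

# A bank of cells competes; the cell with the maximum output wins and learns.
cells = [LTMCell(n_symbols=26, max_len=8) for _ in range(4)]
seq = [0, 17, 17, 0, 24]                            # "A", "R", "R", "A", "Y"
winner = max(cells, key=lambda c: c.output(seq))
winner.store(seq)
print(winner.output(seq))                           # 1.0: perfect match after storage
```
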
SHORT-TERM MEMORY

  • STM cell:
  • A universal playback machine (Fig. 6)
  • Its size is limited, as in human STM
  • Storage neurons for writing
  • Playback neurons for reading
  • Write/erase pointers and read pointers
  • The pointers move in a closed loop so that storage is reused
  • Writing to the STM cell:
  • A signal from the layer 6 neuron of an active minicolumn activates a column in the STM
  • Storage neurons in the STM cell fire only when they receive two activations (from the input and the write pointer)
  • Reading from the STM cell:
  • The read pointer disinhibits the playback neurons
  • Playback neurons fire when they are activated by storage neurons and the read-pointer inhibition on them is not active, i.e., they are disinhibited (a minimal sketch of this pointer mechanism follows this list)
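
A minimal sketch of the pointer mechanics described above, treating the STM cell as a fixed-size ring of slots addressed by a write/erase pointer and a read pointer. The class name, the capacity of 7, and the reduction of storage and playback neurons to plain slots are illustrative assumptions.

```python
class STMCell:
    """STM as a playback machine: a ring of storage slots with two pointers (illustrative)."""

    def __init__(self, capacity=7):
        self.slots = [None] * capacity        # stand-in for the storage/playback neuron columns
        self.write_ptr = 0                    # write/erase pointer
        self.read_ptr = 0                     # read pointer

    def write(self, symbol):
        # A storage slot is written only at the coincidence of the input and the write pointer,
        # so exactly one column stores the incoming symbol at each step.
        self.slots[self.write_ptr] = symbol
        self.write_ptr = (self.write_ptr + 1) % len(self.slots)   # pointers loop, so storage is reused

    def read(self):
        # The read pointer releases one playback position at a time, replaying the stored order.
        symbol = self.slots[self.read_ptr]
        self.read_ptr = (self.read_ptr + 1) % len(self.slots)
        return symbol

stm = STMCell()
for s in "ARRAY":
    stm.write(s)
print("".join(stm.read() for _ in range(5)))   # plays back "ARRAY" in order
```
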
INTRODUCTION

  • Spatio-temporal memories are fundamental to self-organization and learning in bio-inspired systems.
  • Short-term memory (STM) and long-term memory (LTM) are the two major types of memory in neurobiological research on the human brain.
  • They occupy different regions of the human brain and have different structural organization.
  • They interact with each other.
  • Input information goes through the STM so that it can be stored in the LTM.
  • Information from the LTM is retrieved to the STM, where it is updated and new associations are created (Fig. 1).

Fig. 1 Interaction between LTM and STM

  • The layered, uniform structure of identical processing units, postulated by Mountcastle as a minicolumn organization [1][2], supports the building of biological intelligence in the human neocortex.
  • Neurons in different layers of the minicolumns are proposed to have specific functions in the interaction between STM and LTM.
  • When retrieving information from LTM to STM, a particular layer of neurons receives stimulation from the LTM.
  • When storing information from STM to LTM, stimulation from the STM activates the minicolumns corresponding to the elements of a sequence.
  • Activation coming from STM or LTM is differentiated from the real environment input by a different level of signal strength.
  • LTM based on minicolumns:
  • When the sequence comes from the real input:
  • Minicolumns representing the symbols “A”, “R”, “Y” are found through competition.
  • Signal flow (Fig. 5): input → layers 6/4 of the winning columns → layer 2 of the winning columns → PNs → LTM cell output
  • This strongly stimulates the PNs in the LTM cell.
  • Strong activation of the layer 2 neurons of the winning minicolumns helps their layer 6 neurons win the local competition → PNs are connected with layer 6 using Hebbian learning.
  • The output of the LTM cell enters a layer 6 neuron on the higher LTM level, so that “ARRAY” can be combined with other possible sequences to build a complex sequence memory.
  • When the sequence comes from STM:
  • Signal flow (Fig. 5): input → layer 6 of the winning columns → PNs → LTM cell output
  • STM stimulation does not flow up the minicolumn, so it does not overlap with the real sensory input.
  • This only slightly stimulates the PNs in the LTM cell.
  • By comparing the level of stimulation, the LTM is able to differentiate the recalled information from the real sensory input (a minimal sketch of this comparison follows this list).
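
A minimal sketch of the comparison described above: real sensory input travels the full path (layers 6/4, then layer 2, then the PNs) and stimulates the PNs strongly, while an STM-driven recall reaches the PNs only through layer 6 and stimulates them weakly. The numeric strengths and the 0.5 threshold are illustrative assumptions, not values from the poster.

```python
# Two signal paths into the LTM cell's PNs, with illustrative strengths.
STRONG = 1.0   # real input: layers 6/4 -> layer 2 -> PNs (full path up the minicolumn)
WEAK = 0.4     # STM recall: layer 6 -> PNs only (does not flow up the minicolumn)

def pn_stimulation(source):
    return STRONG if source == "sensor" else WEAK

def classify(level, threshold=0.5):
    # The LTM compares the stimulation level to tell recalled information from real input.
    return "real sensory input" if level > threshold else "recalled from STM"

for source in ("sensor", "stm"):
    print(source, "->", classify(pn_stimulation(source)))
```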

MINICOLUMN STRUCTURE

  • In this work, the laminar minicolumn structure with multiple layers of neurons, proposed and studied in the visual cortex by Grossberg [3] (Fig. 2), is used to implement the fundamental learning mechanism of the spatio-temporal memory. It has several characteristics:
  • Lateral inhibition among layer 4 neurons and a selective circuit between layers 6/4 → perceptual grouping and local winners’ domination
  • Feedback from layer 2 to layer 6, folded feedback 2 → 6 → 4, and feedback from the higher-level layer 6 → ambiguity resolution and input selection
  • Feedback does not propagate forward → network stability (a minimal sketch of one update step follows this list)

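A minimal sketch of one update step over an array of minicolumns following the qualitative rules above: layer 2 output and higher-level feedback converge on layer 6 (folded feedback), the 6-to-4 path is treated as modulatory so that feedback alone cannot fire layer 4 (feedback does not propagate forward), and lateral inhibition in layer 4 lets the local winner dominate. The gains and the hard winner-take-all are illustrative assumptions, not the circuit equations from [3].

```python
import numpy as np

def minicolumn_step(bottom_up, layer2_feedback, higher_level_feedback):
    """One illustrative update over an array of minicolumns (one value per column)."""
    # Folded feedback: layer 2 output and higher-level feedback converge on layer 6.
    layer6 = layer2_feedback + higher_level_feedback
    # The 6 -> 4 path is modulatory: feedback biases layer 4 but cannot fire it on its
    # own, so feedback does not propagate forward (network stability).
    drive = bottom_up * (1.0 + layer6)
    # Lateral inhibition among layer 4 neurons: the local winner dominates.
    layer4 = np.where(drive >= drive.max(), drive, 0.0)
    # Layer 2/3 groups the selected columns and feeds back to layer 6.
    layer2 = layer4
    return layer2, layer4, layer6

x = np.array([0.2, 0.9, 0.3])                 # bottom-up input over three columns
top = np.array([0.0, 0.0, 2.5])               # higher-level feedback biasing the third column

_, l4, _ = minicolumn_step(np.zeros(3), np.zeros(3), top)
print(l4)                                     # feedback alone activates nothing: [0. 0. 0.]
_, l4, _ = minicolumn_step(x, np.zeros(3), top)
print(l4)                                     # with input present, feedback shifts the winner to the third column
```
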
CONCLUSIONS

  • In this work, the laminar minicolumn is used to build the proposed STM and LTM structures. The STM is built as a playback machine that stores and recalls a given sequence without making any associations. The sequential LTM built on minicolumns can store and recall a sequence by associating symbols, and it is able to differentiate the real environment input from the recalled information.
  • The proposed memory models operate efficiently and stably, are biologically plausible, and have a number of desirable properties for building self-organizing, hierarchical hardware structures.

BIBLIOGRAPHY

[1] Edward G. Jones, “Microcolumns in the Cerebral Cortex,” Proc. of the National Academy of Sciences of the United States of America, vol. 97(10), 2000, pp. 5019-5021.

[2] V. B. Mountcastle, “Response Properties of Neurons of Cat’s Somatic Sensory Cortex to Peripheral Stimuli,” J. Neurophysiol., vol. 20, 1957, pp. 374-407.

[3] S. Grossberg, “How Does the Cerebral Cortex Work? Learning, Attention, and Grouping by the Laminar Circuits of Visual Cortex,” Technical Report CAS/CNS-97-023, 1998.

Fig. 2 Laminar minicolumn
