Tutorial: Plasticity Revisited -Motivating New Algorithms Based On Recent Neuroscience Research

Tsvi Achler, MD/PhD

Approximate Outline and References for Tutorial

Department of Computer Science

University of Illinois at Urbana-Champaign, Urbana, IL 61801, U.S.A.

Outline

  • Plasticity is observed in many forms. We review experiments and controversies.
    • Intrinsic ‘membrane plasticity’
    • Synaptic
    • Homeostatic ‘feedback plasticity’
    • 'Systems': in combination, membrane and feedback plasticity can imitate synaptic plasticity
  • What does this mean for NN algorithms?
Outline: NN Algorithms

Algorithms: Synaptic Plasticity · Lateral Inhibition · Feedback Inhibition

Common computational issues:

  • Explosion in connectivity
  • Explosion in training
  • How can nature solve these problems with the plasticity mechanisms outlined?
Intrinsic 'Membrane' Plasticity
  • Ion channels responsible for activity, spikes
  • ‘Plastic’ ion channels found in membrane
  • Voltage sensitive channel types:
    • (Ca++, Na+, K+)
  • Plasticity independent of synapse plasticity

Review:

Daoudal, G. and D. Debanne (2003). "Long-Term Plasticity of Intrinsic Excitability: Learning Rules and Mechanisms." Learn. Mem. 10: 456-465.

Synaptic Plasticity Hypothesis
  • Bulk of studies
  • Synapse changes with activation
  • Motivated by Hebb 1949
  • Supported by Long Term Potentiation / Depression (LTP/LTD) experiments

Review:

Malenka, R. C. and M. F. Bear (2004). "LTP and LTD: an embarrassment of riches." Neuron 44(1): 5-21.

LTP/LTD Experiment Protocol

[Figure: pre-synaptic and post-synaptic electrodes in brain tissue; pre-synaptic amplitude A50 drives the post-synaptic cell to fire 50% of the time]

  • Establish ‘pre-synaptic’ cell
  • Establish ‘post-synaptic’ cell
  • Raise pre-synaptic activity to amplitude A50, at which the post-synaptic cell fires 50% of the time
  • Induction: high frequency high voltage spike train on both pre & post electrodes
  • Plasticity: any changes when A50 is applied


Plasticity: Change in Post with A50
  • LTP : increased activity with A50
  • LTD : decreased activity with A50
  • Can last minutes, hours, or days
    • Limited by how long recording is viable
Strongest Evidence
  • Systems w/minimal feedback:
  • Motor, Musculature & tetanic stimulation
  • Sensory/muscle junction of Aplysia gill siphon reflex
  • Early Development: Retina → Ocular Dominance Columns

Basic relations between pre- and post-synaptic cells, and the basic mechanisms of synaptic plasticity, are still controversial. Why so difficult? Connections change with activity.

Variable Evidence

Cortex, Thalamus, Sensory Systems & Hippocampus

  • Basic mechanisms still controversial

60 years and 13,000 papers in PubMed

  • It is difficult to establish/control when LTP or LTD occurs
LTP vs. LTD Criteria Are Variable

  • Pre-Post spike timing: (Bi & Poo 1998; Markram et al. 1997)
    • Pre-synaptic spike before post → LTP
    • Post-synaptic spike before pre → LTD
  • First spike in burst most important (Froemke & Dan 2002)
  • Last spike most important (Wang et al. 2005)
  • Frequency most important: ↑ Freq → LTP (Sjöström et al. 2001; Tzounopoulos et al. 2004)
  • Spikes are not necessary (Golding et al. 2002; Lisman & Spruston 2005)
  • The criteria to induce LTP or LTD are also a current subject of debate (Froemke et al, 2006). Some studies find that if the presynaptic neuron is activated within tens of milliseconds before the postsynaptic neuron (pre-post), LTP is induced. The reversed order of firing (post-pre) results in LTD (ie Bi & Poo 1998; Markram et al. 1997). In other studies, timing of the first spike (Froemke & Dan 2002) or the last spike (Wang et al. 2005) in each burst is found to be dominant in determining the sign and magnitude of LTP. Yet other studies show that synaptic modification is frequency dependent and that high-frequency bursts of pre- and postsynaptic spikes lead to LTP, regardless of the relative spike timing (Sjöström et al. 2001; Tzounopoulos et al. 2004). Even other studies show that somatic spikes are not even necessary for the induction of LTP and LTD (Golding et al. 2002; Lisman & Spruston 2005).
  • In addition, it is unclear if these mechanisms drive single synapse changes as predicted by synaptic plasticity because physical changes in synapses show variability as well.
  • Synaptic Change
  • Activity-dependent changes in synaptic spine morphology have been reported in the hippocampus (see Yuste et al. 2001 for review), including localized changes to single synapses with caged-glutamate sub-spike stimulation (Mazrahi et al., 2004). However, changes in synapses can also occur with: estrus cycle (Woolley et al 1990), irradiation (Brizzee 1980), hibernation (Popov et al 1992), exercise (e.g. Fordyce & Wehner 1993), epilepsy (Multani et al., 1994) and synaptic blockade using Mg+ (Kirkov & Harris, 1999).
  • Also the synaptic morphology may not coincide with synaptic/behavioral function (Yuste et al. 2001).
  • Furthermore brain regions responsible for recognition processing may display different characteristics. For example synaptic changes with experience in the mouse barrel cortex appear to be more variable than the hippocampus (Trachtenberg et al 2002).
  • The strongest evidence supporting the synaptic plasticity hypothesis has been reported in the gill withdrawal reflex of the marine mollusk Aplysia (Antonov, Antonova & Kandel, 2003). However, the changes occur between sensory and motor neurons, not between processes responsible for recognizing stimuli. It may be the case that motor learning occurs via synaptic plasticity while recognition processing occurs through recurrent feedback. Furthermore, even if synaptic plasticity is found in the sensory cortex, it may be specific to pre-motor processing. Such pre-motor processes may co-exist with recognition circuits in the same regions.
Many Factors Affect LTP & LTD
  • Voltage-sensitive channels, e.g. NMDA
  • Cell signaling cascades, e.g. via Ca++
  • Protein dependent components
  • Fast/slow
  • Synaptic tagging

Review:

Malenka, R. C. and M. F. Bear (2004). "LTP and LTD: an embarrassment of riches." Neuron 44(1): 5-21.

Studies of Morphology Unclear

Synapse Morphology and density studies:

    • Spine changes ≠ Function changes
  • Many other causes of changes in spines:
    • Estrus, Exercise, Hibernation, Epilepsy, Irradiation

Review:

Yuste, R. and T. Bonhoeffer (2001). "Morphological changes in dendritic spines associated with long-term synaptic plasticity." Annu Rev Neurosci 24: 1071-89.

Many Components & Variability
  • Indicates a system is complex
    • involving more than just the recorded pre-synaptic and postsynaptic cells
  • Means NN learning algorithms are difficult to justify
  • But the system regulates itself

Review of LTP & LTD variability:

Froemke, Tsay, Raad, Long & Dan (2006). J Neurophysiol 95(3): 1620-9.

Homeostatic Plasticity

Self-regulating plasticity. Networks adapt to:
  • Channel blockers
  • Genetic expression of channels

Adaptation to Blockers

[Figure: pre-synaptic and post-synaptic electrodes recording cells in a culture dish]

  • Establish baseline recording
  • Bathe culture in a channel blocker (2 types): either ↑ or ↓ firing frequency
  • Observe system changes after ~1 day
  • Washing out the blocker causes the reverse phenomena

Homeostatic Adaptation to Blockers

The pre-/post-synaptic pair displays a feedback-inhibition response:

  ↑ Frequency → ↓ Synaptic Strength
  ↓ Frequency → ↑ Synaptic Strength
  Frequency × Strength = Baseline

Turrigiano & Nelson (2004)
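The adaptation rule can be illustrated with a toy simulation (my own sketch, not code from the tutorial; `target`, `drive`, and `strength` are illustrative names): after a blocker lowers the pre-synaptic drive, synaptic strength slowly scales up until firing returns to its homeostatic baseline, matching '↓ Frequency → ↑ Synaptic Strength'.

```python
# Toy model of homeostatic scaling: firing frequency = drive * strength,
# and strength drifts slowly to pull frequency back to a set point.
target = 5.0       # homeostatic baseline firing frequency
strength = 1.0     # synaptic strength
drive = 5.0        # pre-synaptic drive

drive *= 0.5       # channel blocker bath: frequency drops
for _ in range(1000):                    # slow adaptation over ~1 day
    freq = drive * strength
    strength += 0.01 * (target - freq)   # lowered frequency -> raised strength

print(round(drive * strength, 1))        # frequency is back near baseline: 5.0
```

Washing out the blocker (restoring `drive`) would transiently raise the frequency and drive `strength` back down, the reverse phenomenon described above.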

Homeostatic Adaptation to Expression

[Figure: cells expressing different numbers and types of channels]

Cells with different numbers & types of channels show the same electrical properties.

Marder & Goaillard (2006)

Homeostatic Summary
  • Adapts networks to a homeostatic baseline
  • Utilizes feedback-inhibition (regulation)
Feedback Inhibition

[Figure: feedback loop between pre-synaptic and post-synaptic cells]

Feedback is found ubiquitously throughout the brain.

Feedback Throughout Brain

Thalamus & Cortex:

http://arken.umb.no/~compneuro/figs/LGN-circuit.jpg

LaBerge, D. (1997) "Attention, Awareness, and the Triangular Circuit". Consciousness and Cognition, 6, 149-181.
http://psyche.cs.monash.edu.au/v4/psyche-4-07-laberge.html

Feedback and Pre-Synaptic Inhibition Found in Many Forms

[Figure from Aroniadou-Anderjaska, Zhou, Priest, Ennis & Shipley 2000; modified from Chen, Xiong & Shepherd (2000)]

An overwhelming amount of feedback inhibition; the evidence takes many forms:
  • Feedback loops
  • Tri-synaptic connections
  • Antidromic activation
  • NO (nitric oxide)
  • Adjustment of pre-synaptic processes in homeostatic plasticity

These regulatory mechanisms suggest pre-synaptic feedback.

Summary
  • Homeostatic Plasticity requires and maintains Feedback Inhibition
'Systems' Plasticity

Feedback inhibition combined with intrinsic plasticity can be indistinguishable from synaptic plasticity.

Many Cells Are Always Present in Plasticity Experiments

  • Pre- & post-synaptic cells are never in isolation
  • Studies: in vivo, brain slices, cultures (only viable with 1000's of cells)
  • Changes in neuron resting activity are tolerated

[Figure: pre-synaptic and post-synaptic electrodes in a culture dish]

Feedback Inhibition Network

[Figure: network of interconnected neurons with small distributed changes (∆↑, ∆↓) across connections]

  • LTP protocol: find pre-synaptic and post-synaptic cells
  • Increase pre-synaptic cell activity until the recorded post-synaptic cell fires 50%
  • Then learning is induced artificially by activating both neurons together
  • Pre-synaptic cells connect to many post-synaptic cells, but this is rarely considered
  • Induction can affect all connected post-synaptic cells
  • With pre-synaptic inhibition, immeasurably small changes across all connected neurons cause a big change in the recorded neuron
  • Yet only the two recorded cells and the synapse between them are considered

Simulation: Up to 26 Cell Interaction

[Figure: normalized activity (scale 0-1) vs. resting ∆ value, with a baseline of all neurons at 0.01]

  • Immeasurably small changes of all connected neurons cause a big change in the recorded neuron
  • LTP = positively bias the recorded cell; alternatively, LTD = negatively bias the recorded cell

Significance

Experiments cannot distinguish between synaptic plasticity and feedback inhibition

  • Membrane voltage Vm allowed to change by ~6 mV
  • 0.01 on the normalized scale = ∆Vm of ~0.3 mV
  • Thus membrane effects are not likely to be seen
  • Pre-synaptic cells connect to >> 26 cells
    • The effect is much more pronounced in real networks
Regulatory Feedback Plasticity
  • Feedback Inhibition + Intrinsic Plasticity are indistinguishable in current experiments from Synaptic Plasticity theory
  • Why have ‘apparent’ synaptic plasticity?
  • Feedback Inhibition is important for processing simultaneous patterns
2. Algorithms

Synaptic Plasticity · Lateral Inhibition · Feedback Inhibition

Challenges in Neural Network Understanding

[Figure: three architectures over inputs x1-x4 and outputs Y1-Y4: neural-network weights (w11…w44), lateral connections (lw12, lw13, lw23), and input (regulatory) feedback via I1-I4]

  • Neural networks: weights
  • Lateral connections: connectivity explosion
  • Regulatory feedback: limited cognitive intuition, large network problems
  • Could feedback dynamics be vital? There is strong evidence of feedback; replace weights with binary bidirectional connections.

Lateral Connectivity

[Figure: outputs Y1-Y3 over inputs x1-x4 with lateral connections between representations]

Combinatorial Explosion in Connectivity

  • Symbolic logic is based on direct connections: a connection is required to logically relate representations
  • Millions of representations are possible; every representation cannot be connected to all others in the brain
  • Can lead to an implausible number of connections and variables
  • What would a weight variable (e.g. 0.8) between two arbitrary representations mean?

Challenges in Neural Network Understanding

[Figure: neural-network weights, lateral connections, and input-feedback architectures over inputs x1-x4 and outputs Y1-Y4]

  • Lateral connections: connectivity explosion
  • Weights: combinatorial training

Superposition Catastrophe

Weights: Training Difficulty Given Simultaneous Patterns

  • Teach A B C … Z separately
  • Test multiple simultaneous letters (e.g. A D, A B, G E)
  • Not taught with simultaneous patterns → will not recognize simultaneous patterns
  • Teaching simultaneous patterns is a combinatorial problem

Weights: Training Difficulty Given Simultaneous Patterns

The 'Superposition Catastrophe' (Rosenblatt 1962): teach A B C … Z separately, then test multiple simultaneous letters. One can try to avoid it by segmenting each pattern individually, but segmentation often requires recognition, or is not possible.

Composites Common

  • Natural scenarios (cluttered rainforest)
  • Scenes
  • Noisy 'cocktail party' conversations
  • Odorant or taste mixes

Segmentation is not trivial, and not possible in most modalities (it may itself require recognition).

Segmenting Composites

New scenario: if the image can't be segmented, the composite must be interpreted.

[Figure: composite scenes ('Building'; 'Chick' + 'Frog' presented simultaneously) as binary feature vectors in feature space]

Letters are learned individually (train), then recognition is tested on a composite (test):

  A = (0, 1, 0, 0, 1)
  B = (1, 1, 0, 0, 0)
  C = (0, 1, 0, 1, 1)

  Simultaneous A + B + B + C = (2, 4, 0, 1, 2)

Recognition given simultaneous: A, Bx2, and C

Challenges in Neural Network Understanding (summary)

  • Lateral connections: connectivity explosion
  • Weights: combinatorial training
  • Feedback inhibition: avoids combinatorial issues and interprets composites
Feedback Inhibition: Self-Regulatory Feedback

Control theory perspective — every output inhibits only its own inputs:
  • Gain control mechanism for each input
  • Massive feedback to inputs
  • Avoids optimized weight parameters
  • Training establishes binary relationships
  • Testing iteratively evaluates input use

Neuroscience perspective:
[Figure: outputs ya, yb feed back onto their inputs x1, x2 through inhibitory cells I1, I2]

Equations Used

Notation:
  Xb  raw input activity at input b
  Ib  input cell b (activity after feedback)
  Qb  shunting inhibition at input b
  C   collection of all output cells; Ca is cell a
  Na  the set of input connections to cell Ca; na the number of processes in Na
  I   collection of all inputs
  Mb  the set of recurrent feedback connections to input Ib; mb the number of connections in Mb
  P   primary inputs (not affected by shunting inhibition)

Equations: Feedback Inhibition

Shunted input:  Ib = Xb / Qb

where Xb is the raw input activity and Qb the feedback (shunting inhibition) at input b.

Equations

Output:      Ya(t + Δt) = (Ya(t) / na) · Σ_{i ∈ Na} Ii
Inhibition:  Ib = Xb / Qb
Feedback:    Qb = Σ_{j ∈ Mb} Yj

(Ya output activity; Xb raw input activity; Ib input after feedback; Qb feedback; na number of connections of Ya)

Example network: ya receives x1 only, yb receives x1 and x2, so Q1 = ya + yb and Q2 = yb.

Equations (iterated)

The same three equations are applied repeatedly until the network settles: no oscillations, no chaos.

For the example network: I1 = x1 / Q1 = x1 / (ya + yb), and I2 = x2 / Q2 = x2 / yb.
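The three update equations can be transcribed directly into code. A minimal sketch in plain Python (my own illustration, not code from the tutorial): `W[a][b]` is 1 if output a uses input b, all outputs start at mid activity, and a small `eps` guards against division by zero.

```python
def regulatory_feedback(W, x, steps=5000, y0=0.5, eps=1e-12):
    """Iterate: Q_b = sum_{j in M_b} Y_j;  I_b = X_b / Q_b;
    Y_a <- (Y_a / n_a) * sum_{i in N_a} I_i."""
    n_out, n_in = len(W), len(x)
    y = [y0] * n_out                       # initial output activities
    n = [sum(row) for row in W]            # n_a: number of inputs per output
    for _ in range(steps):
        Q = [sum(W[a][b] * y[a] for a in range(n_out)) for b in range(n_in)]
        I = [x[b] / (Q[b] + eps) for b in range(n_in)]
        y = [y[a] / n[a] * sum(W[a][b] * I[b] for b in range(n_in))
             for a in range(n_out)]
    return y

# Example network from the slides: ya uses x1 only; yb uses x1 and x2.
W = [[1, 0],
     [1, 1]]
print(regulatory_feedback(W, [1.0, 1.0]))  # ya -> ~0, yb -> ~1
print(regulatory_feedback(W, [2.0, 1.0]))  # ya -> ~1, yb -> ~1
```

With both features active, the more encompassing representation yb takes over at steady state, as in the worked example later in the deck.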

Feedback Inhibition: Simple Connectivity

[Figure: binary bidirectional connections between input nodes x1-x4 (via I1-I4) and output nodes Y1-Y4]

  • All links have the same strength (removes the source of training problems)
  • A new node only connects to its own inputs (removes the source of connectivity problems)
  • Inputs have positive real values indicating intensity

Feedback Inhibition Allows Modular Combinations

Network configuration: outputs y1 ('P') and y2 ('R') over inputs x1, x2; 'P' uses feature x1 only, 'R' uses features x1 and x2. The algorithm interprets composite patterns.

Steady state (y1, y2) given inputs (x1, x2):
  if x1 ≥ x2:  (y1, y2) = (x1 − x2, x2)
  if x1 ≤ x2:  (y1, y2) = (0, (x1 + x2) / 2)

  Inputs (x1, x2)    Outputs (y1, y2)    Interpretation
  1, 0               1, 0                'P'
  1, 1               0, 1                'R'
  2, 2               0, 2                2 Rs
  2, 1               1, 1                P & R
  1, ¼               ¾, ¼
  ⅓, 1               0, ⅔

  • Supports non-binary inputs
  • Behaves as if there were an inhibitory connection, yet there is no direct connection between x2 & y1
  • Inputs can simultaneously support both outputs, e.g. x1 = 1 and x2 = ½ gives (y1, y2) = (½, ½)
Feedback Inhibition Algorithm: How it Works — Iterative Evaluation

[Figure: outputs Y1, Y2 over inputs x1, x2 via feedback cells I1, I2]

  • Forward connections: binary and uniform, normalized so that each output's weights sum to 1 (w_xy = 1/N)

Feedback Inhibition Algorithm: How it Works

[Figures: activity passes forward from inputs x1, x2 (via I1, I2) to outputs Y1, Y2, then back from outputs to inputs, alternating]

Feedback Inhibition Algorithm: How it Works (worked example)

Both features are active (x1 = 1, x2 = 1):
  1. Initially both outputs become active
  2. I1 gets twice as much inhibition as I2 (both Y1 and Y2 feed back onto I1)
  3. This affects Y1 more than Y2
  4. This separation continues iteratively

Steady State

Iteration continues until the most encompassing representation predominates: with both features active, Y2 ('R') approaches 1 while Y1 approaches 0.

[Figure: graph of dynamics — output activity (0-1) vs. simulation time T = 0 to 5]

Demonstration: Applying Learned Information to New Scenarios
  • Nonlinear: mathematical analysis difficult
    • demonstrated via examples
  • Teach patterns separately
  • Test novel pattern combinations
  • Requires decomposition of composite
  • Letter patterns are used for intuition
Superposition Catastrophe: Teach Single Patterns Only

Applying learned information to new scenarios:

  • Learn A B C … Z separately: 26 nodes, one per letter
  • Test on modular combinations of patterns

Train (feature space):
  A = (0, 1, 0, 0, 1)
  B = (1, 1, 0, 0, 0)
  C = (0, 1, 0, 1, 1)
  D = (1, 0, 1, 0, 1)
  E = (1, 1, 0, 1, 1)
  … (26 nodes)

Test: simultaneous A + B + B + C = (2, 4, 0, 1, 2)

Recognition given simultaneous: A, Bx2, and C

This Defines the Network

Nothing is changed or re-learned further.

  • Comparison networks are trained & tested with the same patterns:
    • Neural networks (NN)* — representing synaptic plasticity
    • Lateral inhibition (winner-take-all with ranking of winners)
  • * Waikato Environment for Knowledge Analysis (WEKA) repository tool for the most recent and best algorithms
Tests: Increasingly Complex
  • 26 patterns presented one at a time
    • All methods recognize 100%
  • Choose 2 letters, present simultaneously
    • Either: union (logical 'or') the features, or add the features
  • Choose 4 letters, present simultaneously
    • Either add or 'or' the features
    • Include repeats in the add case (e.g. 'A+A+A+A')

Example vectors presented to the networks:
  A   = (0, 1, 0, 0, 1)
  B   = (1, 1, 0, 0, 0)
  A+B = (1, 2, 0, 0, 1)
  A|B = (1, 1, 0, 0, 1)

Test-set sizes: 325 two-letter combinations; 14,950 four-letter combinations; 456,976 four-letter combinations including repeats.
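The combination counts quoted across these slides follow from the binomial coefficient; the 456,976 figure matches 26^4, i.e. ordered four-letter presentations allowing repeats (my reading of "including repeats"):

```python
from math import comb

print(comb(26, 2))   # 325      two-letter combinations
print(comb(26, 4))   # 14950    four-letter combinations
print(comb(26, 5))   # 65780    five-letter combinations
print(comb(26, 8))   # 1562275  eight-letter combinations
print(26 ** 4)       # 456976   ordered four-letter presentations with repeats
```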

Two Patterns Simultaneously

  • Train 26 nodes; test with 2 simultaneous patterns (e.g. A B)
  • Do the 2 top-ranked nodes match?

[Figure: % of 325 combinations with 0/2, 1/2, 2/2 letters correctly classified, for synaptic plasticity, lateral inhibition, and feedback inhibition]

% combinations = (number of x correct matches) / (number of combinations)

Figure 5: NN with two letter retraining

Superposition Catastrophe: Simultaneous Patterns

Four-pattern union, e.g. A|C|D|E = (1, 1, 1, 1, 1), from A = (0,1,0,0,1), C = (0,1,0,1,1), D = (1,0,1,0,1), E = (1,1,0,1,1).

Not taught with simultaneous patterns → will not recognize simultaneous patterns; teaching simultaneous patterns is a combinatorial problem.

Union of Four Patterns

  • Same 26 nodes; test with 4 patterns (e.g. A B C D)
  • Do the 4 top nodes match?

[Figure: % of 14,950 combinations with 0/4 … 4/4 letters correctly classified, for synaptic plasticity, lateral inhibition, and feedback inhibition]

Union of Five Patterns

  • Same 26 nodes; test with 5 patterns (e.g. A B C D E)
  • Do the 5 top nodes match?

[Figure: % of 65,780 combinations with 0/5 … 5/5 letters correctly classified, for synaptic plasticity, lateral inhibition, and feedback inhibition]

Superposition Catastrophe: Pattern Addition

Adding (rather than 'or'-ing) features improves feedback inhibition performance further.

Four-pattern sum, e.g. A+C+D+E = (2, 3, 1, 2, 4), from A = (0,1,0,0,1), C = (0,1,0,1,1), D = (1,0,1,0,1), E = (1,1,0,1,1).

Addition of Four Patterns

  • Same 26 nodes; test with 4 added patterns (e.g. A+B+C+D, K+S+A+V, X+C+O+M)
  • Do the 4 top nodes match?

[Figure: % of 14,950 combinations with 0/4 … 4/4 letters correctly classified, for synaptic plasticity, lateral inhibition, and pre-synaptic (feedback) inhibition]

Addition of Eight Patterns

  • Same 26 nodes; test with 8 patterns (e.g. A+G+B+L+C+D+X+E)
  • Do the 8 top nodes match?

[Figure: % of 1,562,275 combinations with 0/8 … 8/8 letters correctly classified, for synaptic plasticity, lateral inhibition, and feedback inhibition]

Superposition Catastrophe: With Addition, the Feedback Algorithm Can Count

  • Repeated patterns are reflected by the value of the corresponding nodes

Test: A + B + B + C = (2, 4, 0, 1, 2), with A = (0,1,0,0,1), B = (1,1,0,0,0), C = (0,1,0,1,1).

Node values at steady state: A=1, B=2, C=1, D→Z=0.

100% correct over 456,976 combinations.
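The counting claim can be checked against the update equations: with composite input A + 2B + C, the node values A=1, B=2, C=1 form a fixed point. A sketch using only the three patterns shown here (an assumption — the full demo runs all 26 letter nodes):

```python
# Feature vectors from the slides
patterns = {"A": [0, 1, 0, 0, 1], "B": [1, 1, 0, 0, 0], "C": [0, 1, 0, 1, 1]}
counts = {"A": 1.0, "B": 2.0, "C": 1.0}   # claimed steady-state node values
names = list(patterns)
# composite input A + 2B + C = (2, 4, 0, 1, 2)
x = [sum(counts[p] * patterns[p][b] for p in names) for b in range(5)]

def step(y):
    """One update of the regulatory feedback equations."""
    Q = [sum(patterns[p][b] * y[p] for p in names) for b in range(5)]
    I = [x[b] / Q[b] if Q[b] > 0 else 0.0 for b in range(5)]
    n = {p: sum(patterns[p]) for p in names}
    return {p: y[p] / n[p] * sum(patterns[p][b] * I[b] for b in range(5))
            for p in names}

after = step(counts)
print(all(abs(after[p] - counts[p]) < 1e-9 for p in names))  # True: fixed point
```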


Tested on Random Patterns
  • 50 randomly generated patterns
  • From 512 features
  • 4 presented at a time
  • 6,250,000 combinations (including repeats)
  • 100% correct including count

Computer starts getting slow

Insight: What if Conventional Algorithms are Trained for this Task?

The composite vectors themselves would need labels:
  A+C+D+E = (2, 3, 1, 2, 4)  — this vector is 'A', 'C', 'D' & 'E' together
  A+B     = (1, 2, 0, 0, 1)  — this vector is 'A' & 'B' together

Insight: Training is Not Practical

  • Teach pairs of the 26 letters: 325 combinations
  • Teach triples: 2,600 combinations
  • Quadruples: 14,950
  • Training complexity increases combinatorially
  • Furthermore, ABCD can be misinterpreted as AB & CD, or ABC & D

Insight: Training Difficulty Given Simultaneous Patterns

Feedback inhibition inference seems to avoid this problem.

Known as the 'Superposition Catastrophe' (Rosenblatt 1962; Rachkovskij & Kussul 2001)

Binding Problem: Simultaneous Representations

Chunking features: computer algorithms have similar problems with much simpler representations.

Resolving Pattern Interactions

Simultaneous representations cause the binding problem.

[Figure: outputs y1 'Wheels', y2 'Barbell', y3 'Car Chassis' over inputs x1, x2, x3]

Given a composite such as 'AvA', all patterns are matched, unless the network is explicitly trained otherwise. However, it is a binding error to call this a barbell.

Binding Comparison

[Figure: vector activity (0-1) of y1 'Wheels', y2 'Barbell', y3 'Car Chassis' for synaptic plasticity, lateral inhibition, and feedback inhibition]

Conventional approaches require training data to predict binding combinations.

Resolving Pattern Interactions: Binding as a Network-Wide Solution

  Inputs (x1, x2, x3)   Outputs (y1, y2, y3)   Interpretation
  1, 0, 0               1, 0, 0                Wheels
  1, 1, 0               0, 1, 0                Barbell
  1, 1, 1               1, 0, 1                Car (chassis + wheels), not barbell

The most precise output configuration predominates.
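The table rows can be checked as fixed points of the update equations under an assumed wiring (my assumption, consistent with the table): 'Wheels' uses x1, 'Barbell' uses x1 and x2, and 'Car Chassis' uses x2 and x3.

```python
# Assumed binary connections: output -> features used
W = {"wheels": [1, 0, 0], "barbell": [1, 1, 0], "car": [0, 1, 1]}
names = list(W)

def step(x, y):
    """One update of the regulatory feedback equations."""
    Q = [sum(W[p][b] * y[i] for i, p in enumerate(names)) for b in range(3)]
    I = [x[b] / Q[b] if Q[b] > 0 else 0.0 for b in range(3)]
    return [y[i] / sum(W[p]) * sum(W[p][b] * I[b] for b in range(3))
            for i, p in enumerate(names)]

table = [([1, 0, 0], [1, 0, 0]),   # wheels alone
         ([1, 1, 0], [0, 1, 0]),   # barbell
         ([1, 1, 1], [1, 0, 1])]   # car chassis + wheels, not barbell
for x, y in table:
    assert all(abs(a - b) < 1e-9 for a, b in zip(step(x, y), y))
print("all table rows are steady states")
```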

Network Under Dynamic Control

  • Recognition is inseparable from attention
  • Feedback: an automatic way to access inputs
  • 'Symbolic' control via bias

Resolving Pattern Interactions: Symbolic Effect of Bias

Is a barbell present? Interested in y2, so bias y2 = 0.15.

  Inputs (x1, x2, x3)   Outputs (y1, y2, y3)   Interpretation
  1, 1, 1               0.02, 0.98, 0.71       Barbell

With the bias, the barbell interpretation predominates rather than the most precise output configuration.

Summary
  • Feedback inhibition combined with intrinsic plasticity generates a 'systems' plasticity that looks like synaptic plasticity
  • Feedback inhibition gives algorithms more flexibility with simultaneous patterns
  • Brain processing and learning are still unclear; a paradigm shift is likely needed
Acknowledgements

Eyal Amir

Cyrus Omar, Dervis Vural, Vivek Srikumar

Intelligence Community Postdoc Program & National Geospatial-Intelligence Agency

HM1582-06--BAA-0001