Simple learning in connectionist networks

Learning associations in connectionist networks:
Associationism (James, 1890)
Association by contiguity
Generalization by similarity
Connectionist implementation:
Represent items as patterns of activity, where similarity is measured in terms of overlap/correlation of patterns
Represent contiguity of items as simultaneous presence of their patterns over two groups of units (A and B)
Adjust weights on connections between units in A and units in B so that the pattern on A tends to cause the corresponding pattern on B
As a result, when we next present the same or similar pattern on A, it tends to produce the corresponding pattern on B (perhaps somewhat weakened or distorted)
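The three steps above can be sketched with a toy NumPy example. Everything here is illustrative: the group sizes, the +/-1 pattern coding, and the learning rate are all assumed, not taken from the text.

```python
import numpy as np

# Hypothetical toy example: two groups of 4 units each, patterns coded as +/-1.
a = np.array([ 1, -1,  1, -1])    # pattern on group A
b = np.array([ 1,  1, -1, -1])    # pattern on group B

eta = 0.25                        # learning rate (assumed value)
W = np.zeros((4, 4))              # weights from A-units to B-units

# Hebbian update: strengthen W[j, i] when A-unit i and B-unit j are co-active
W += eta * np.outer(b, a)

# Presenting the trained pattern on A reproduces b (scaled by eta * |a|^2)
recall = W @ a                    # = eta * (a . a) * b = 1.0 * b here

# A similar pattern (one unit flipped) yields the same pattern, weakened
a_similar = np.array([1, -1, 1, 1])
recall_similar = W @ a_similar    # = eta * (a . a_similar) * b = 0.5 * b
```

Note how generalization by similarity falls out of the dot product: the more the test pattern overlaps the stored one, the stronger (and less distorted) the output on B.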
What Hebb actually said:
When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.
The minimal version of the Hebb rule:
When there is a synapse between cell A and cell B, increment the strength of the synapse whenever A and B fire together (or in close succession).
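The minimal rule can be written as a few lines of code for a single synapse. The function name, the binary firing variables, and the increment size are all hypothetical choices for illustration.

```python
# Minimal Hebb rule for one synapse, assuming binary firing (True/False):
# increment the weight only on steps where A and B fire together.
def hebb_update(w, a_fires, b_fires, increment=0.1):
    """Return the new synaptic strength after one time step."""
    if a_fires and b_fires:
        return w + increment
    return w

w = 0.0
spikes = [(1, 1), (1, 0), (0, 1), (1, 1)]   # (A fired?, B fired?) per step
for a_fires, b_fires in spikes:
    w = hebb_update(w, a_fires, b_fires)
# A and B fired together on two of the four steps, so w grew by 2 increments
```

Note that the weight only ever grows under this minimal rule; variants that also decrease weights (e.g. covariance rules) address that, but are beyond this slide.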
The minimal Hebb rule as implemented in a network: Δw_AB = ε · a_A · a_B
Averaged over many pattern presentations, the weight between two units ends up being proportional to the correlation of their activities.
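A small simulation makes the correlation claim concrete. The three-unit setup, the 90% agreement rate, and the learning rate are assumed for illustration: after many Hebbian steps, each weight approaches the correlation of its two units' activities.

```python
import numpy as np

rng = np.random.default_rng(0)   # hypothetical toy simulation

# Draw many +/-1 activity patterns for three units: unit 1 agrees with
# unit 0 on 90% of trials; unit 2 always disagrees with unit 0.
n = 10_000
u0 = rng.choice([-1.0, 1.0], size=n)
u1 = np.where(rng.random(n) < 0.9, u0, -u0)
u2 = -u0
acts = np.stack([u0, u1, u2], axis=1)

# Apply the minimal Hebb rule once per pattern with a small learning rate.
eta = 1.0 / n
W = np.zeros((3, 3))
for x in acts:
    W += eta * np.outer(x, x)

# Each weight now approximates the correlation of its two units' activities:
# W[0, 1] is near +0.8 (90% agreement), W[0, 2] is exactly -1.
```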
[Figure: face perception units responding selectively, e.g. to Brad Pitt, to Postle, or to neither]
With Hebbian learning, many different patterns can be stored in a single configuration of weights.
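This superposition property can be sketched by storing two associations in a single weight matrix. The patterns below are invented for illustration; they are chosen so that the two A-patterns are orthogonal, which lets each cue retrieve its own associate without interference.

```python
import numpy as np

# Two orthogonal A-patterns, each paired with a different B-pattern (toy data).
a1 = np.array([ 1,  1,  1,  1]);  b1 = np.array([ 1, -1,  1, -1])
a2 = np.array([ 1, -1,  1, -1]);  b2 = np.array([-1, -1,  1,  1])

eta = 0.25
# Both associations are stored in the same configuration of weights.
W = eta * (np.outer(b1, a1) + np.outer(b2, a2))

# Because a1 . a2 = 0, the cross-term vanishes and each cue
# retrieves exactly its own associate.
out1 = W @ a1    # = eta * |a1|^2 * b1 = b1
out2 = W @ a2    # = eta * |a2|^2 * b2 = b2
```

When the stored A-patterns are not orthogonal, the cross-terms do not vanish and recall is distorted by interference between the stored associations, which is one cost of this kind of superpositional storage.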