
The Emergentist Approach To Language As Embodied in Connectionist Networks


Presentation Transcript


  1. The Emergentist Approach To Language As Embodied in Connectionist Networks
  James L. McClelland, Stanford University

  2. Some Simple Sentences
  The man liked the book. The boy loved the sun. The woman hated the rain.

  3. The Standard Approach: Units and Rules
  Levels: sentences, clauses and phrases, words, morphemes, phonemes
  S -> NP VP
  VP -> Tense V (NP)
  Tense V -> V+{past}
  V -> like, love, hate…
  N -> man, boy, sun…
  man -> /m/ /ae/ /n/
  {past} -> ed
  ed -> /t/ or /d/ or /^d/ (depends on context)
  (A toy implementation of such rules is sketched below.)
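To make the units-and-rules picture concrete, here is a minimal Python sketch that expands the grammar above top-down to generate simple sentences. The determiner rule (NP -> the N) and the e-final spelling adjustment for the past suffix are assumptions added so the sketch runs; random choice stands in for a speaker's selection.

```python
# A toy top-down expansion of the rewrite rules above. The NP rule with
# 'the' and the e-final spelling rule are assumptions added for
# runnability; they are not given on the slide.
import random

RULES = {
    "S":      [["NP", "VP"]],
    "NP":     [["the", "N"]],
    "VP":     [["TenseV", "NP"]],
    "TenseV": [["V+past"]],
    "V":      [["like"], ["love"], ["hate"]],
    "N":      [["man"], ["boy"], ["sun"], ["book"], ["rain"]],
}

def expand(symbol):
    """Rewrite a symbol by its rules until only words remain."""
    if symbol == "V+past":                    # V -> V+{past}, {past} -> ed
        verb = expand("V")
        return verb + ("d" if verb.endswith("e") else "ed")
    if symbol not in RULES:
        return symbol                         # a terminal: already a word
    return " ".join(expand(s) for s in random.choice(RULES[symbol]))

print(expand("S"))   # e.g. 'the boy loved the rain'
```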

  4. What happens with exceptions?

  5. Standard Approach to the Past Tense
  • We form the past tense by using a (simple) rule.
  • If an item is an exception, the rule is blocked, so we say 'took' instead of 'taked'.
  • If you've never seen an item before, you use the rule.
  • If an item is an exception but you forget the exceptional past tense, you apply the rule.
  • Predictions:
    • Regular inflection of 'nonce forms':
      This man is blinging. Yesterday he …
      This girl is tupping. Yesterday she …
    • Over-regularization errors: goed, taked, bringed
  (This account is sketched in code below.)
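A minimal Python sketch of the rule-with-blocking account, to make the predictions concrete. The tiny exception table and the spelling-based suffix test are illustrative stand-ins, not the theory's actual lexical entries or phonology.

```python
# Rule with blocking: look the item up in a stored list of exceptions;
# if it is absent (or forgotten), the regular rule applies.
EXCEPTIONS = {"take": "took", "go": "went", "bring": "brought"}

def past_tense(verb):
    if verb in EXCEPTIONS:            # a stored exception blocks the rule
        return EXCEPTIONS[verb]
    if verb.endswith("e"):            # crude stand-in for /t/-/d/-/^d/ allomorphy
        return verb + "d"
    return verb + "ed"                # the regular rule is the default

print(past_tense("bling"))   # 'blinged': nonce forms are inflected regularly
print(past_tense("take"))    # 'took': the exception wins
del EXCEPTIONS["take"]       # simulate forgetting the exceptional form
print(past_tense("take"))    # 'taked': over-regularization
```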

  6. The Emergentist Approach
  • Language (like perception, etc.) arises from the interactions of neurons, each of which operates according to a common set of simple principles of processing, representation, and learning.
  • Units and rules are useful to approximately describe what emerges from these interactions, but they have no mechanistic or explanatory role in language processing, language change, or language learning.

  7. An Emergentist Theory: Natural Selection
  • No grand design
  • Organisms produce offspring with random differences (mating helps with this)
  • Forces of nature favor those best suited to survive
  • Survivors leave more offspring, so their traits are passed on
  • The full range of the animal kingdom, including all the capabilities of the human mind, emerges from these very basic principles

  8. An Emergentist/Connectionist Approach to the Past Tense
  • Knowledge is in connections
  • Experience causes connections to change
  • Sensitivity to regularities emerges

  9. The RM Model
  • Learns from verb [root, past tense] pairs: [like, liked]; [love, loved]; [carry, carried]; [take, took]
  • Present and past are represented as patterns of activation over units that stand for phonological features.

  10. Examples of 'wickelfeatures' in the verb 'baked'
  • Starts with a nasal followed by a vowel
  • Has a long vowel preceded by a nasal and followed by a stop
  • Ends with a dental stop preceded by a velar stop
  • Ends with an unvoiced sound preceded by another unvoiced sound
  (A simplified extraction sketch follows.)
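As a rough illustration of this coding idea, the sketch below extracts context-sensitive triples over a word's phonemes, with '#' marking word boundaries. The one-feature-per-phoneme table is an invented stand-in; the actual RM model coarse-codes wickelphones into a much richer scheme of phonological feature triples.

```python
# Simplified wickelfeature extraction: each word is represented by the
# set of feature triples over adjacent phonemes. The FEATURES table is
# an illustrative assumption, not the model's real coding.
FEATURES = {
    "#": "boundary", "b": "voiced-stop", "A": "long-vowel",
    "k": "unvoiced-stop", "t": "unvoiced-stop",
    "m": "nasal", "n": "nasal",
}

def wickelfeatures(phonemes):
    """Feature triples for a word, padded with word-boundary markers."""
    padded = ["#"] + list(phonemes) + ["#"]
    return {(FEATURES[a], FEATURES[b], FEATURES[c])
            for a, b, c in zip(padded, padded[1:], padded[2:])}

# 'baked' ~ /bAkt/: note the triple ('unvoiced-stop', 'unvoiced-stop',
# 'boundary'), i.e. ends with an unvoiced sound preceded by another
# unvoiced sound, as on the slide.
for triple in sorted(wickelfeatures("bAkt")):
    print(triple)
```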

  11. A Pattern Associator Network
  (Figure: a pattern representing the sound of the verb root, over the input units, feeds through a matrix of connections into the output units; each output unit's summed input sets its probability of becoming active, p(a=1), and the resulting pattern represents the sound of the verb's past tense.)

  12. Learning rule for the Pattern Associator network
  • For each output unit:
    • Determine the activity of the unit based on its input.
    • If the unit is active when the target is not: reduce each weight coming into the unit from each active input unit.
    • If the unit is inactive when the target is active: increase the weight coming into the unit from each active input unit.
  • Each connection weight adjustment is very small
  • Learning is gradual and cumulative
  (A runnable sketch of this rule follows.)
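Here is a small NumPy sketch of a pattern associator trained with the error-correcting rule just described. A deterministic threshold stands in for the model's probabilistic activation p(a=1), and the learning rate and weight initialization are arbitrary choices.

```python
# A pattern associator over binary feature vectors, trained with the
# perceptron-style error-correcting rule from the slide.
import numpy as np

class PatternAssociator:
    def __init__(self, n_in, n_out, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.01, size=(n_out, n_in))  # connection matrix
        self.lr = lr                 # each weight adjustment is very small

    def activate(self, x):
        # A unit is active when its summed input exceeds zero
        # (a hard-threshold simplification of p(a=1)).
        return (self.W @ x > 0.0).astype(float)

    def train(self, x, target):
        out = self.activate(x)
        # (target - out) is -1 where a unit is active but should not be
        # (reduce weights) and +1 where it is inactive but should be
        # active (increase weights); multiplying by x restricts changes
        # to weights from active input units.
        self.W += self.lr * np.outer(target - out, x)
```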

  13. Some Learning Experiments
  • Learn a single item
  • Test for generalization
  • Learn from a set of regular items
  • Test for generalization
  (A toy run of the first experiment, using the sketch above, follows.)
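A toy version of the single-item experiment, reusing the PatternAssociator sketch above. The 8-bit random codes stand in for wickelfeature patterns; nothing here depends on real phonology.

```python
# Learn one [root, past] pair, then probe generalization with a novel,
# similar root.
import numpy as np

rng = np.random.default_rng(1)
net = PatternAssociator(n_in=8, n_out=8)

root = (rng.random(8) > 0.5).astype(float)   # one verb root
past = (rng.random(8) > 0.5).astype(float)   # its past tense

for _ in range(200):                         # gradual, cumulative learning
    net.train(root, past)
print("trained item learned:", np.array_equal(net.activate(root), past))

novel = root.copy()
novel[0] = 1.0 - novel[0]                    # a similar but unseen root
match = np.mean(net.activate(novel) == past)
print(f"generalization to similar root: {match:.0%} of output features match")
```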

  14. Over-regularization errors in the RM network
  Most frequent past tenses in English:
  • Felt
  • Had
  • Made
  • Got
  • Gave
  • Took
  • Came
  • Went
  • Looked
  • Needed
  (Figure: error rate over training. The network was trained with the top ten words only at first; over-regularization errors appear at the point where 400 more words were introduced.)

  15. Over-regularization simulation in the 78 net
  • First learn one exception
  • Then continue training the exception along with all other forms
  • What happens to the exception?

  16. Questions?

  17. Some features of the model
  • Regulars co-exist with exceptions.
  • The model produces the regular past for most unfamiliar test items.
  • The model captures the different subtypes among the regulars: like-liked, love-loved, hate-hated
  • The model is sensitive to the no-change pattern among irregulars: hit-hit, cut-cut, hide-hid

  18. Additional characteristics
  • The model exploits gangs of related exceptions: dig-dug, cling-clung, swing-swung
  • The 'regular pattern' infuses exceptions as well as regulars: say-said, do-did, have-had, keep-kept, sleep-slept, burn-burnt, teach-taught

  19. Key Features of the Past Tense model
  • No lexical entries and no rules
  • No problem of rule induction or grammar selection

  20. Strengths and Weaknesses of the Models

  21. Elman’s Simple Recurrent Network • Task is to predict the next element of a sequence on the output, given the current element on the input units. • Each element is represented by a pattern of activation. • Each box represents a set of units. • Each dotted arrow represents all-to-all connections. • The solid arrow indicates that the previous pattern on the hidden units is copied back to provide context for the next prediction. • Learning occurs through connection weight adjustment using an extended version of the error correcting learning rule.

  22. Results for Elman net trained with letter sequences

  23. Hidden Unit Patterns for Elman Net Trained on Word Sequences

  24. Key Features of Both Models
  • No lexical entries and no rules
  • No problem of rule induction or grammar selection
