
NTL – Converging Constraints

Basic concepts and words derive their meaning from embodied experience. Abstract and theoretical concepts derive their meaning from metaphorical maps to more basic embodied concepts. Structured Connectionist Models can capture both of these processes nicely.

Presentation Transcript


  1. NTL – Converging Constraints • Basic concepts and words derive their meaning from embodied experience. • Abstract and theoretical concepts derive their meaning from metaphorical maps to more basic embodied concepts. • Structured Connectionist Models can capture both of these processes nicely. • Grammar extends this by Constructions: pairings of form with embodied meaning.

  2. Simulation-based language understanding (the cafe example) • Example utterance: "Harry walked to the cafe." • [Diagram: the utterance is analyzed (Analysis Process) using Constructions and General Knowledge into a Simulation Specification (Schema: walk, Trajector: Harry, Goal: cafe), which drives a Simulation that updates the Belief State.]
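A minimal sketch may help make this pipeline concrete. The class names, the toy construction lexicon, and the string matching below are illustrative assumptions, not the actual NTL implementation: the utterance is matched against stored constructions to yield a simulation specification that names the schema to simulate and binds its roles.

```python
# Minimal sketch of the analysis step: utterance -> simulation specification.
# Names (SimSpec, CONSTRUCTIONS, analyze) are illustrative, not the NTL code.

from dataclasses import dataclass

@dataclass
class SimSpec:
    schema: str      # motor schema to simulate, e.g. "walk"
    trajector: str   # entity performing the action
    goal: str        # target location

# Toy "constructions": pairings of a surface form with a schema and role mapping.
CONSTRUCTIONS = {
    "walked to": {"schema": "walk", "roles": ("trajector", "goal")},
}

def analyze(utterance: str) -> SimSpec:
    """Match a construction in the utterance and fill its roles from the surrounding text."""
    for form, meaning in CONSTRUCTIONS.items():
        if form in utterance:
            before, after = utterance.split(form)
            return SimSpec(schema=meaning["schema"],
                           trajector=before.strip().rstrip('.'),
                           goal=after.strip().rstrip('.').removeprefix("the ").strip())
    raise ValueError("no construction matched")

print(analyze("Harry walked to the cafe."))
# SimSpec(schema='walk', trajector='Harry', goal='cafe')
```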

  3. Background: Primate Motor Control • Relevant requirements (Stromberg, Latash, Kandel, Arbib, Jeannerod, Rizzolatti) • Should model the coordinated, distributed, parameterized control programs required for motor action and perception. • Should be an active structure. • Should be able to model concurrent actions and interrupts. • Model • The NTL project has developed a computational model (x-schemas) that satisfies these requirements. • Details, papers, etc. can be obtained on the web at http://www.icsi.berkeley.edu/NTL

  4. Active representations • Many inferences about actions derive from what we know about executing them • Representation based on stochastic Petri nets captures the dynamic, parameterized nature of actions • Walking: • bound to a specific walker with a direction or goal (e.g., walker = Harry, goal = home) • consumes resources (e.g., energy) • may have a termination condition (e.g., walker at goal) • ongoing, iterative action
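The active, resource-consuming character of such a representation can be illustrated with a toy simulation; an ordinary loop stands in for the stochastic Petri net, and the function and parameter names are illustrative only.

```python
# Toy active representation of WALK: iterate steps, consume energy,
# stop when the termination condition (walker at goal) holds.
# An illustrative sketch, not the NTL stochastic Petri net machinery.

def walk(walker: str, start: int, goal: int, energy: float) -> str:
    position = start
    while position != goal and energy > 0:
        position += 1 if goal > position else -1   # one iteration of the ongoing action
        energy -= 1.0                              # each step consumes a resource
        print(f"{walker} at {position}, energy {energy}")
    return "at goal" if position == goal else "exhausted"

print(walk("Harry", start=0, goal=3, energy=10.0))
```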

  5. Somatotopy of Action Observation [Figure panels: Foot Action, Hand Action, Mouth Action] Buccino et al., Eur J Neurosci, 2001

  6. Active Motion Model: evolving responses of competing models over time (Nigel Goddard, 1989)

  7. Language Development in Children • 0-3 mo: prefers sounds in native language • 3-6 mo: imitation of vowel sounds only • 6-8 mo: babbling in consonant-vowel segments • 8-10 mo: word comprehension, starts to lose sensitivity to consonants outside native language • 12-13 mo: word production (naming) • 16-20 mo: word combinations, relational words (verbs, adj.) • 24-36 mo: grammaticization, inflectional morphology • 3 years – adulthood: vocab. growth, sentence-level grammar for discourse purposes

  8. Words learned by most 2-year-olds in a play school (Bloom 1993) [Chart categories: food, toys, misc., people, sound, emotion, action, prep., demon., social]

  9. Learning Spatial Relation Words (Terry Regier). A model of children learning spatial relations. Assumes the child hears one word label per scene. The program learns well enough to label novel scenes correctly. Extended to simple motion scenarios, like INTO. The system works across languages. Mechanisms are neurally plausible.

  10. Learning System (details next lecture): a structured connectionist network (based on the visual system) that learns dynamic relations (e.g., into).
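As a rough sketch of the learning task, the snippet below trains a plain perceptron over hand-coded scene features, which is far simpler than Regier's structured connectionist network but shows the training regime from the previous slide: one word label per scene, then labeling of novel scenes.

```python
# Minimal sketch: learn to label static spatial scenes with one word.
# The features and the perceptron learner are illustrative simplifications
# of Regier's structured connectionist model.

def features(tr, lm):
    """tr, lm: (x, y) centers of trajector and landmark."""
    dx, dy = tr[0] - lm[0], tr[1] - lm[1]
    return [1.0, dx, dy, abs(dx), abs(dy)]   # bias + relative position

def train(scenes, labels, words, epochs=200):
    w = {word: [0.0] * 5 for word in words}
    for _ in range(epochs):
        for scene, label in zip(scenes, labels):
            f = features(*scene)
            # pick the word whose current score is highest
            guess = max(words, key=lambda wd: sum(a * b for a, b in zip(w[wd], f)))
            if guess != label:                      # perceptron update on errors
                w[label] = [a + b for a, b in zip(w[label], f)]
                w[guess] = [a - b for a, b in zip(w[guess], f)]
    return w

# Each scene: (trajector center, landmark center); the label is the word the "child" hears.
scenes = [((0, 2), (0, 0)), ((0, 3), (0, 0)), ((0, -2), (0, 0)), ((0, -3), (0, 0))]
labels = ["above", "above", "below", "below"]
w = train(scenes, labels, words=["above", "below"])

novel = ((1, 4), (0, 0))
f = features(*novel)
print(max(w, key=lambda wd: sum(a * b for a, b in zip(w[wd], f))))   # -> "above"
```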

  11. Limitations • Scale • Uniqueness/Plausibility • Grammar • Abstract Concepts • Inference • Representation • Biological Realism

  12. Constrained Best Fit in Nature [figure contrasting inanimate and animate systems]

  13. Learning Verb Meanings (David Bailey). A model of children learning their first verbs. Assumes the parent labels the child's actions. The child knows the parameters of the action and associates them with the word. The program learns well enough to: 1) label novel actions correctly; 2) obey commands using new words (simulation). The system works across languages. Mechanisms are neurally plausible.

  14. Motor Control (X-schema) for SLIDE

  15. Parameters for the SLIDE X-schema
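To illustrate what a parameterized action instance looks like, here is a hypothetical sketch; the parameter names below are stand-ins chosen for illustration and are not the actual parameter set shown on the slide.

```python
# Illustrative sketch only: a parameterized action instance of the kind the
# SLIDE x-schema would be bound to. The parameter names are hypothetical.

from dataclasses import dataclass

@dataclass
class SlideParameters:
    force: str        # e.g. "low", "medium", "high"
    direction: str    # e.g. "away", "toward", "left"
    duration: float   # seconds the motion lasts
    obj: str          # the object being slid

params = SlideParameters(force="medium", direction="away", duration=1.5, obj="cup")
print(params)
```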

  16. System Overview

  17. Learning Two Senses of PUSH: model merging based on Bayesian MDL
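The merging idea can be sketched as a toy MDL-style greedy merge over discrete action parameters; this illustrates the principle rather than Bailey's actual algorithm. Start with one candidate sense per training example and repeatedly merge the pair of senses whose merge most reduces the total description length (model cost plus data cost).

```python
# Toy sketch of model merging with an MDL-style score: one "sense" per example
# at the start, then greedy merging while a merge lowers the total description
# length. Illustrative only, not Bailey's implementation.

import math
from collections import Counter

def data_cost(examples):
    """Bits to encode the examples under a per-feature multinomial model."""
    cost, n = 0.0, len(examples)
    for i in range(len(examples[0])):                 # each feature dimension
        counts = Counter(ex[i] for ex in examples)
        for ex in examples:
            cost += -math.log2(counts[ex[i]] / n)
    return cost

def model_cost(senses, bits_per_sense=8.0):
    """Crude prior: a fixed cost per word sense."""
    return bits_per_sense * len(senses)

def total_cost(senses):
    return model_cost(senses) + sum(data_cost(s) for s in senses)

def merge(senses):
    """Greedily merge senses while some merge lowers the MDL score."""
    senses = [list(s) for s in senses]
    improved = True
    while improved and len(senses) > 1:
        improved, best = False, None
        for i in range(len(senses)):
            for j in range(i + 1, len(senses)):
                candidate = senses[:i] + senses[i+1:j] + senses[j+1:] + [senses[i] + senses[j]]
                gain = total_cost(senses) - total_cost(candidate)
                if gain > 0 and (best is None or gain > best[0]):
                    best = (gain, candidate)
        if best:
            senses, improved = best[1], True
    return senses

# Each example: (force, direction) parameters observed with the word "push".
examples = [("high", "away"), ("high", "away"), ("low", "down"), ("low", "down")]
senses = merge([[ex] for ex in examples])
print(len(senses), "senses:", senses)   # the two PUSH senses emerge
```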

  18. Training Results (David Bailey) English: • 165 training examples, 18 verbs • Learns the optimal number of word senses (21) • 32 test examples: 78% recognition, 81% action • All mistakes were close (lift ~ yank, etc.) • Learned some particle constructions (CXN), e.g., pull up Farsi: • With identical settings, learned senses not present in English

  19. Constrained Best Fit in Nature [figure contrasting inanimate and animate systems]
