
Simulation-based language understanding





Presentation Transcript


1. Simulation-based language understanding
(Architecture diagram.) The utterance "Harry walked to the cafe." is analyzed, using constructions, general knowledge, and the current belief state, into a simulation specification (Schema: walk, Trajector: Harry, Goal: cafe), which drives a simulation.

2. Simulation specification
The analysis process produces a simulation specification that
• includes image-schematic, motor-control, and conceptual structures
• provides parameters for a mental simulation

3. NTL Manifesto
• Basic concepts are grounded in experience: sensory, motor, emotional, social
• Abstract and technical concepts map by metaphor to more basic concepts
• Neural computation models all levels

4. Simulation-based language understanding (architecture)
An utterance, together with the constructions and the discourse & situational context, is processed by the analyzer (incremental, competition-based, psycholinguistically plausible) into a semantic specification built from image schemas, frames, and action schemas, which in turn parameterizes simulation.

5. Embodied Construction Grammar
• Embodied representations
  • active perceptual and motor schemas (image schemas, x-schemas, frames, etc.)
  • situational and discourse context
• Construction Grammar
  • Linguistic units relate form and meaning/function.
  • Both constituency and (lexical) dependencies are allowed.
• Constraint-based
  • based on feature unification (as in LFG, HPSG)
  • Diverse factors can flexibly interact, as in the sketch below.
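Since ECG is constraint-based and built on feature unification, a toy illustration may help. The following Python sketch (our illustration only, not the actual ECG implementation) unifies two feature structures represented as nested dicts: atoms unify iff they are equal, and dicts unify feature by feature.

def unify(fs1, fs2):
    """Return the unified feature structure, or None on failure."""
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        out = dict(fs1)
        for key, val in fs2.items():
            if key in out:
                sub = unify(out[key], val)
                if sub is None:
                    return None         # clash on a shared feature
                out[key] = sub
            else:
                out[key] = val          # feature present only in fs2
        return out
    return fs1 if fs1 == fs2 else None  # atoms must match exactly

# A determiner contributes specificity; a noun contributes a category:
det  = {"rd": {"specificity": "known"}}
noun = {"rd": {"category": "HouseSchema"}}
print(unify(det, noun))
# -> {'rd': {'specificity': 'known', 'category': 'HouseSchema'}}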

6. Embodied Construction Grammar (ECG): Formalizing Cognitive Linguistics
• Linguistic Analysis
• Computational Implementation
  • Test Grammars
  • Applied Projects: Question Answering
• Map to Connectionist Models, Brain
• Models of Grammar Acquisition

7. ECG Structures
• Schemas: image schemas, force-dynamic schemas, executing schemas, frames…
• Constructions: lexical, grammatical, morphological, gestural…
• Maps: metaphor, metonymy, mental space maps…
• Situations (Mental Spaces): discourse, hypothetical, counterfactual…

8. Embodied schemas
These are abstractions over sensorimotor experiences.
schema Source-Path-Goal
  roles: source, path, goal, trajector
schema Container
  roles: interior, exterior, portal, boundary

9. ECG Schemas
The general form:
schema <name>
  subcase of <schema>
  evokes <schema> as <local name>
  roles
    <local role>: <role restriction>
  constraints
    <role> ↔ <role>
    <role> ← <value>
    <predicate>
Example:
schema Hypotenuse
  subcase of Line-Segment
  evokes Right-Tri as rt
  roles
    {lower-left: Point}
    {upper-right: Point}
  constraints
    self ↔ rt.long-side
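As an illustration of what these declarations amount to, the Hypotenuse schema might be transcribed into Python roughly as follows (a hypothetical rendering, not the grammar's real machinery; class and attribute names are ours):

class LineSegment:
    pass

class RightTri:
    def __init__(self, long_side):
        self.long_side = long_side       # role filled by a line segment

class Hypotenuse(LineSegment):           # "subcase of Line-Segment"
    def __init__(self, lower_left, upper_right):
        self.lower_left = lower_left     # roles restricted to points
        self.upper_right = upper_right
        self.rt = RightTri(long_side=self)   # "evokes Right-Tri as rt";
                                             # constraint: self <-> rt.long-side

h = Hypotenuse(lower_left=(0, 0), upper_right=(3, 4))
assert h.rt.long_side is h               # the identity constraint holds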

10. Source-Path-Goal; Container
schema SPG
  subcase of TrajLandmark
  roles
    source: Place
    path: Directed-Curve
    goal: Place
    {trajector: Entity}
    {landmark: Bounded-Region}
schema Container
  roles
    interior: Bounded-Region
    boundary: Curve
    portal: Bounded-Region

11. Referent Descriptor Schemas
schema RD
  roles
    category
    gender
    count
    specificity
    resolved-ref
    modifications
Example instance:
schema RD5 // Eve
  roles
    category: HumanSchema
    gender: Female
    count: one
    specificity: Known
    resolved-ref: Eve Sweetser
    modifications: none

12. ECG Constructions
The general form:
construction <name>
  subcase of <construction>
  constituents
    <name>: <construction>
  form
    constraints
      <name> before/meets <name>
  meaning
    constraints // same as for schemas
Example:
construction SpatialPP
  constituents
    prep: SpatialPreposition
    lm: NP
  form
    constraints
      prep meets lm
  meaning: TrajectorLandmark
    constraints
      self.m ↔ prep.m
      landmark ↔ lm.category
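The form constraints before and meets are simple relations over word-index spans, as used in the chart on slide 18. A minimal sketch (our illustration; the span representation is assumed):

def meets(a, b):
    return a[1] == b[0]          # a ends exactly where b starts

def before(a, b):
    return a[1] <= b[0]          # a ends anywhere before b starts

prep, lm = (2, 3), (3, 5)        # "into" and "the house"
assert meets(prep, lm)           # satisfies SpatialPP's form constraint
assert before(prep, lm)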

13. Into and The CxNs
construction Into
  subcase of SpatialPreposition
  form: WordForm
    constraints
      orth ← "into"
  meaning: SPG
    evokes Container as c
    constraints
      landmark ↔ c
      goal ↔ c.interior
construction The
  subcase of Determiner
  form: WordForm
    constraints
      orth ← "the"
  meaning
    evokes RD as rd
    constraints
      rd.specificity ← "known"

14. Two Grammatical CxNs
construction DetNoun
  subcase of NP
  constituents
    d: Determiner
    n: Noun
  form
    constraints
      d before n
  meaning
    constraints
      self.m ↔ d.rd
      category ↔ n
construction NPVP
  subcase of S
  constituents
    subj: NP
    vp: VP
  form
    constraints
      subj before vp
  meaning
    constraints
      profiled-participant ↔ subj

15. Simulation specification
The analysis process produces a simulation specification that
• includes image-schematic, motor-control, and conceptual structures
• provides parameters for a mental simulation

16. Competition-based analyzer (Johno Bryant)
An analysis is made up of:
• a constructional tree
• a semantic specification
• a set of resolutions
(Diagram: for "Bill gave Mary the book", the A-GIVE-B-X construction spans constituents subj, v, obj1, obj2 (Ref-Exp, Give, Ref-Exp, Ref-Exp); the Give-Action schema's giver, recipient, and theme roles are bound to @Man (Bill), @Woman (Mary), and @Book (book01).)

17. Combined score determines best fit
• Syntactic fit: constituency relations, combined with preferences on non-local elements, conditioned on syntactic context
• Antecedent fit: ability to find referents in the context, conditioned on syntax match and feature agreement
• Semantic fit: semantic bindings for frame roles; the frame roles' fillers are scored
A sketch of how such scores might combine follows.
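The slides name the three fit scores without giving the combination formula. One common choice is a product of factors (equivalently, a sum of log-scores); the sketch below assumes that form, and all numbers in it are invented for illustration:

import math

def best_fit(analyses):
    """Pick the analysis with the highest combined log-score."""
    def score(a):
        return (math.log(a["syntactic_fit"])
                + math.log(a["antecedent_fit"])
                + math.log(a["semantic_fit"]))
    return max(analyses, key=score)

candidates = [
    {"name": "auntie = recipient", "syntactic_fit": 0.08,
     "antecedent_fit": 0.9, "semantic_fit": 0.8},
    {"name": "auntie = theme",     "syntactic_fit": 0.03,
     "antecedent_fit": 0.9, "semantic_fit": 0.2},
]
print(best_fit(candidates)["name"])   # -> auntie = recipient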

18. Example analysis: 0 Eve 1 walked 2 into 3 the 4 house 5
Constructs:
  NPVP[0] (0,5)
    Eve[3] (0,1)
    ActiveSelfMotionPath[2] (1,5)
      WalkedVerb[57] (1,2)
      SpatialPP[56] (2,5)
        Into[174] (2,3)
        DetNoun[173] (3,5)
          The[204] (3,4)
          House[205] (4,5)
Schema instances:
  SelfMotionPathEvent[1]
  HouseSchema[66]
  WalkAction[60]
  Person[4]
  SPG[58]
  RD[177] ~ house
  RD[5] ~ Eve

19. Unification chains and their fillers
Chain: SelfMotionPathEvent[1].mover, SPG[58].trajector, WalkAction[60].walker, RD[5].resolved-ref, RD[5].category → Filler: Person[4]
Chain: SpatialPP[56].m, Into[174].m, SelfMotionPathEvent[1].spg → Filler: SPG[58]
Chain: SelfMotionPathEvent[1].landmark, House[205].m, RD[177].category, SPG[58].landmark → Filler: HouseSchema[66]
Chain: WalkedVerb[57].m, WalkAction[60].routine, WalkAction[60].gait, SelfMotionPathEvent[1].motion → Filler: WalkAction[60]
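A unification chain is an equivalence class of role slots that share one filler, and a standard way to maintain such classes is union-find. A minimal sketch (our illustration, not the analyzer's actual code):

parent, filler = {}, {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]    # path halving
        x = parent[x]
    return x

def unify_slots(a, b):
    parent[find(a)] = find(b)            # merge the two chains

# A few of the bindings from this slide:
unify_slots("SelfMotionPathEvent[1].mover", "SPG[58].trajector")
unify_slots("SPG[58].trajector", "WalkAction[60].walker")
filler[find("WalkAction[60].walker")] = "Person[4]"
print(filler[find("SelfMotionPathEvent[1].mover")])   # -> Person[4]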

20. Summary: ECG
• Linguistic constructions are tied to a model of simulated action and perception
• Embedded in a theory of language processing
  • constrains the theory to be usable
  • basis for models of grammar learning
• Precise, computationally usable formalism
  • practical computational applications, like MT and NLU
  • testing of functionality, e.g. language learning
• A shared theory and formalism for different cognitive mechanisms
  • constructions, metaphor, mental spaces, etc.
• Reduction to connectionist and neural levels

21. Productive Argument Omission (Mandarin), Johno Bryant & Eva Mok
1. Mother (I) give you this (a toy).
2. You give auntie [the peach].
3. Oh (go on)! You give [auntie] [that].
4. [I] give [you] [some peach].
CHILDES Beijing Corpus (Tardif, 1993; Tardif, 1996)

22. Arguments are omitted with different probabilities
(Chart of omission-pattern frequencies.) All arguments omitted: 30.6%; no arguments omitted: 6.1%.

23. Analyzing ni3 gei3 yi2 (You give auntie)
Two of the competing analyses, scored on syntactic fit:
• P(Theme omitted | ditransitive cxn) = 0.65
• P(Recipient omitted | ditransitive cxn) = 0.42
Theme omitted (auntie = recipient): (1 - 0.78) * (1 - 0.42) * 0.65 = 0.08
Recipient omitted (auntie = theme): (1 - 0.78) * (1 - 0.65) * 0.42 = 0.03
The first analysis, with auntie as the recipient, scores higher.
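The two scores can be reproduced directly. Assuming, from slide 27, that 0.78 is the omission probability of the giver role (the labels for 0.42 and 0.65 are given on this slide):

p_omit = {"giver": 0.78, "recipient": 0.42, "theme": 0.65}

def syntactic_fit(expressed, omitted):
    score = 1.0
    for role in expressed:
        score *= 1 - p_omit[role]    # role overtly realized
    for role in omitted:
        score *= p_omit[role]        # role dropped
    return score

# ni3 gei3 yi2: "auntie" as recipient vs. "auntie" as theme
print(round(syntactic_fit(["giver", "recipient"], ["theme"]), 2))   # 0.08
print(round(syntactic_fit(["giver", "theme"], ["recipient"]), 2))   # 0.03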

  24. Using frame and lexical information to restrict type of reference

25. Antecedent fit
Discourse & situational context: child, mother, peach, auntie, table.
Can the omitted argument be recovered from context?

26. Semantic fit
How good a theme is a peach? How about an aunt?

27. The argument omission patterns shown earlier can be covered with just ONE construction
• Each construction is annotated with probabilities of omission
• A language-specific default probability can be set
P(omitted | cxn): giver 0.78, recipient 0.42, theme 0.65

28. Leverage process to simplify representation
• The processing model is complementary to the theory of grammar
• By using a competition-based analysis process, we can:
  • find the best-fit analysis with respect to constituency structure, context, and semantics
  • eliminate the need to enumerate allowable patterns of argument omission in the grammar
• This is currently being applied in models of language understanding and grammar learning.

29. Modeling context for language understanding and learning
• Linguistic structure reflects experiential structure
  • discourse participants and entities
  • embodied schemas: action, perception, emotion, attention, perspective
  • semantic and pragmatic relations: spatial, social, ontological, causal
• 'Contextual bootstrapping' for grammar learning

30. Discourse & Situational Context
The context model tracks accessible entities, events, and utterances:
Discourse01
  participants: Eve, Mother
  objects: Hands, ...
  discourse-history: DS01
  situational-history: Wash-Action

31. Each item in the context model has rich internal structure
Participants:
  Eve: category: child, gender: female, name: Eve, age: 2
  Mother: category: parent, gender: female, name: Eve, age: 33
Objects:
  Hands: category: BodyPart, part-of: Eve, number: plural, accessibility: accessible
Situational history:
  Wash-Action: washer: Eve, washee: Hands
Discourse history:
  DS01: speaker: Mother, addressee: Eve, attentional-focus: Hands, content: {"are they clean yet?"}, speech-act: question
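A hypothetical transcription of this context model into Python (field names follow the slide; the class design is our assumption):

from dataclasses import dataclass, field

@dataclass
class Entity:
    category: str
    attributes: dict = field(default_factory=dict)

@dataclass
class DiscourseSegment:
    speaker: Entity
    addressee: Entity
    attentional_focus: Entity
    content: str
    speech_act: str

eve    = Entity("child",  {"gender": "female", "name": "Eve", "age": 2})
mother = Entity("parent", {"gender": "female", "age": 33})
hands  = Entity("BodyPart", {"part_of": eve, "number": "plural",
                             "accessibility": "accessible"})

ds01 = DiscourseSegment(speaker=mother, addressee=eve,
                        attentional_focus=hands,
                        content="are they clean yet?",
                        speech_act="question")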

32. Analysis produces a semantic specification
The utterance "You washed them", analyzed with linguistic knowledge, world knowledge, and the discourse & situational context, yields the semantic specification:
WASH-ACTION
  washer: Eve
  washee: Hands

  33. Gold’s Theorem: No superfinite class of language is identifiable in the limit from positive data only Principles & Parameters Babies are born as blank slates but acquire language quickly (with noisy input and little correction) → Language must be innate: Universal Grammar + parameter setting But babies aren’t born as blank slates! And they do not learn language in a vacuum! How Can Children Be So Good At Learning Language?

34. Key ideas for a neural theory of language acquisition (Nancy Chang and Eva Mok)
• Opulence of the substrate: prelinguistic children already have rich sensorimotor representations and sophisticated social knowledge
• Basic scenes: simple clause constructions are associated directly with scenes basic to human experience (Goldberg 1995, Slobin 1985)
• Verb Island Hypothesis: children learn their earliest constructions (arguments, syntactic marking) on a verb-specific basis (Tomasello 1992)

35. Embodiment and Grammar Learning
A paradigm problem for Nature vs. Nurture:
• the poverty of the stimulus
• the opulence of the substrate
Intricate interplay of genetic and environmental (including social) factors.

36. Two perspectives on grammar learning
Computational models:
• grammatical induction: language identification; context-free grammars, unification grammars; statistical NLP (parsing, etc.)
• word learning models: semantic representations (logical forms, discrete representations, continuous representations); statistical models
Developmental evidence:
• prior knowledge: primitive concepts, event-based knowledge, social cognition, lexical items
• data-driven learning: basic scenes, lexically specific patterns, usage-based learning

37. Key assumptions for language acquisition
• Significant prior conceptual/embodied knowledge: rich sensorimotor/social substrate
• Incremental learning based on experience: lexically specific constructions are learned first
• Language learning tied to language use: acquisition interacts with comprehension and production; reflects communication and experience in the world
• Statistical properties of data affect learning

38. Analysis draws on constructions and context
(Diagram: on the form side, you before washed before them; on the meaning side, a Wash-Action schema with washer and washee roles; in context, a Discourse Segment whose addressee is Eve and whose attentional focus is Hands. "You" maps to the addressee, "washed" to Wash-Action, "them" to a ContextElement.)

39. Learning updates linguistic knowledge based on input utterances
(Diagram: utterance + linguistic knowledge + world knowledge + discourse & situational context → analysis → partial SemSpec → learning, which feeds back into linguistic knowledge.)

40. Context aids understanding: incomplete grammars yield a partial SemSpec
(Diagram: as in slide 38, but the grammar recovers only part of the mapping between the form you / washed / them and the Wash-Action schema; the context elements, the addressee Eve and the attentional focus Hands, remain available.)

41. Context bootstraps learning: a new construction maps form to meaning
(Diagram: the form sequence you before washed before them is mapped as a whole to the contextual Wash-Action, with washer bound to the addressee and washee to a ContextElement.)

42. Context bootstraps learning: a new construction maps form to meaning
The hypothesized construction:
construction YOU-WASHED-THEM
  constituents: YOU, WASHED, THEM
  form:
    YOU before WASHED
    WASHED before THEM
  meaning: WASH-ACTION
    washer: addressee
    washee: ContextElement
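A schematic version of this hypothesize step pairs the analyzed word sequence with the role bindings recovered from context and emits a lexically specific construction. The following sketch is illustrative only; the actual learning model is not shown in these slides:

def hypothesize(words, meaning_schema, bindings):
    """Build a lexically specific construction from form + context."""
    form = [f"{a.upper()} before {b.upper()}"
            for a, b in zip(words, words[1:])]   # word-order constraints
    return {"name": "-".join(w.upper() for w in words),
            "constituents": [w.upper() for w in words],
            "form": form,
            "meaning": meaning_schema,
            "bindings": bindings}

cxn = hypothesize(["you", "washed", "them"], "WASH-ACTION",
                  {"washer": "addressee", "washee": "ContextElement"})
print(cxn["name"])   # -> YOU-WASHED-THEM
print(cxn["form"])   # -> ['YOU before WASHED', 'WASHED before THEM']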

43. Grammar learning: suggesting new CxNs and reorganizing existing ones
(Diagram: utterance + linguistic knowledge + world knowledge + discourse & situational context → analysis → partial SemSpec. Learning operations: hypothesize (map form to meaning, learn contextual constraints), reorganize (merge, join, split), and reinforcement.)

44. Challenge: how far up to generalize?
Type hierarchy:
Inanimate Object
  Manipulable Objects
    Food
      Fruit: apple, watermelon
      Savory: rice
  Unmovable Objects
    Furniture: Chair, Sofa
Observed: Eat rice, Eat apple, Eat watermelon; Want rice, Want apple, Want chair
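One natural formalization of "how far up" is the least common ancestor of the observed argument types; whether this is the model's actual criterion is not stated on the slide, so the sketch below is only an illustration over the slide's hierarchy:

# Child -> parent links for the hierarchy on this slide.
parent = {
    "apple": "Fruit", "watermelon": "Fruit", "rice": "Savory",
    "Fruit": "Food", "Savory": "Food",
    "Chair": "Furniture", "Sofa": "Furniture",
    "Food": "Manipulable Objects", "Furniture": "Unmovable Objects",
    "Manipulable Objects": "Inanimate Object",
    "Unmovable Objects": "Inanimate Object",
}

def ancestors(t):
    chain = [t]
    while t in parent:
        t = parent[t]
        chain.append(t)
    return chain

def least_common_ancestor(types):
    common = set(ancestors(types[0]))
    for t in types[1:]:
        common &= set(ancestors(t))
    # the first shared node walking up from any observed type
    return next(a for a in ancestors(types[0]) if a in common)

print(least_common_ancestor(["rice", "apple", "watermelon"]))
# -> Food: "Eat X" generalizes to food
print(least_common_ancestor(["rice", "apple", "Chair"]))
# -> Inanimate Object: "Want X" must generalize further up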

45. Challenge: omissible constituents
In Mandarin, almost anything available in context can be omitted, and often is in child-directed speech.
Intuition: same context, two expressions that differ by one constituent → a general construction with that constituent omissible.
May require verbatim memory traces of utterances + "relevant" context.

46. When does the learning stop? Bayesian Learning Framework
(Loop diagram: hypothesize → schemas + constructions → analysis + resolution → SemSpec → context fitting → reinforcement; reorganize feeds back into schemas + constructions.)
Goal: the most likely grammar given the utterances and context. The grammar prior includes a preference for the "kind" of grammar. In practice, take the log and minimize cost → Minimum Description Length (MDL).
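In symbols, "take the log and minimize cost" is the standard move from maximum a posteriori grammar selection to description length (our rendering of the slide's prose):

G^* = \arg\max_G P(G \mid D)
    = \arg\max_G P(D \mid G)\, P(G)
    = \arg\min_G \bigl[ -\log P(D \mid G) - \log P(G) \bigr]

where D is the observed utterances with their contexts, -log P(G) is the description length of the grammar, and -log P(D | G) is the length of the data encoded with that grammar.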

47. Intuition for MDL
Suppose the prior is inversely proportional to the size of the grammar (e.g. the number of rules). Then it's not worthwhile to make this generalization:
Before (3 rules):
  S -> Give me NP
  NP -> the book
  NP -> a book
After (4 rules):
  S -> Give me NP
  NP -> DET book
  DET -> the
  DET -> a

48. Intuition for MDL (continued)
With more data, the same generalization pays off:
Before (9 rules):
  S -> Give me NP
  NP -> the book
  NP -> a book
  NP -> the pen
  NP -> a pen
  NP -> the pencil
  NP -> a pencil
  NP -> the marker
  NP -> a marker
After (8 rules):
  S -> Give me NP
  NP -> DET N
  DET -> the
  DET -> a
  N -> book
  N -> pen
  N -> pencil
  N -> marker
The arithmetic is made explicit in the sketch below.
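Counting rules as a stand-in for description length (a simplification; a real MDL cost also charges for rule lengths), the trade-off from slides 47 and 48 can be computed directly:

# Slide 47: with one noun, generalizing costs a rule.
flat_small = ["S -> Give me NP", "NP -> the book", "NP -> a book"]
gen_small  = ["S -> Give me NP", "NP -> DET book",
              "DET -> the", "DET -> a"]

# Slide 48: with four nouns, generalizing saves rules.
nouns = ["book", "pen", "pencil", "marker"]
flat_big = ["S -> Give me NP"] + [f"NP -> {d} {n}"
                                  for n in nouns for d in ["the", "a"]]
gen_big  = (["S -> Give me NP", "NP -> DET N", "DET -> the", "DET -> a"]
            + [f"N -> {n}" for n in nouns])

print(len(flat_small), len(gen_small))   # 3 4 -> don't generalize
print(len(flat_big), len(gen_big))       # 9 8 -> generalize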

49. Usage-based learning: comprehension and production
(Diagram: an utterance is analyzed & resolved against the constructicon, world knowledge, and the discourse & situational context, producing an analysis and a simulation; from a communicative intent, a response utterance is generated. Constructions are hypothesized and reorganized from usage; reinforcement comes from usage and from correction, on both the comprehension and production paths.)
