
A Best-Fit Approach for Productive Analysis of Omitted Arguments

Presentation Transcript


  1. A Best-Fit Approach for Productive Analysis of Omitted Arguments Eva Mok & John Bryant University of California, Berkeley International Computer Science Institute

  2. Simplify grammar by exploiting the language understanding process • Omission of arguments in Mandarin Chinese • Construction grammar framework • Model of language understanding • Our best-fit approach

  3. Productive Argument Omission (in Mandarin) • 1. Mother (I) give you this (a toy). • 2. You give auntie [the peach]. • 3. Oh (go on)! You give [auntie] [that]. • 4. [I] give [you] [some peach]. • CHILDES Beijing Corpus (Tardif, 1993; Tardif, 1996)

  4. Arguments are omitted with different probabilities • All arguments omitted: 30.6% • No arguments omitted: 6.1%

  5. Construction grammar approach (Kay & Fillmore 1999; Goldberg 1995) • Grammaticality: form and function • Basic unit of analysis: the construction, i.e. a pairing of form and meaning constraints • Not purely lexically compositional • Implies early use of semantics in processing • Embodied Construction Grammar (ECG) (Bergen & Chang, 2005)

  6. Problem: Proliferation of constructions

  7. If the analysis process is smart, then... • The grammar need only state one construction • Omission of constituents is flexibly allowed • The analysis process figures out what was omitted

  8. Best-fit analysis process takes burden off the grammar representation • (Diagram: the utterance, the constructions, and the discourse & situational context feed the analyzer, which is incremental, competition-based, and psycholinguistically plausible; its output is a semantic specification built from image schemas, frames, and action schemas, which drives simulation)

  9. Competition-based analyzer finds the best analysis • An analysis is made up of: a constructional tree, a set of resolutions, and a semantic specification • The best fit has the highest combined score

  10. Combined score that determines best-fit • Syntactic Fit: constituency relations; combined with preferences on non-local elements; conditioned on syntactic context • Antecedent Fit: ability to find referents in the context; conditioned on syntactic information and feature agreement • Semantic Fit: semantic bindings for frame roles; frame roles’ fillers are scored

  11. Analyzing ni3 gei3 yi2 (You give auntie) • Syntactic Fit: P(Theme omitted | ditransitive cxn) = 0.65; P(Recipient omitted | ditransitive cxn) = 0.42 • Two of the competing analyses: (1) yi2 fills the recipient role and the theme is omitted: (1-0.78)*(1-0.42)*0.65 = 0.08; (2) yi2 fills the theme role and the recipient is omitted: (1-0.78)*(1-0.65)*0.42 = 0.03
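  A minimal Python sketch of this syntactic-fit computation (not the authors' implementation; the role names and the per-role independence assumption are ours, and we read 0.78 as the giver's omission probability, cf. slide 15):

      P_OMITTED = {"giver": 0.78, "recipient": 0.42, "theme": 0.65}

      def syntactic_fit(expressed_roles):
          """Multiply, per ditransitive role, P(expressed) or P(omitted)."""
          score = 1.0
          for role, p_omit in P_OMITTED.items():
              score *= (1.0 - p_omit) if role in expressed_roles else p_omit
          return score

      # Analysis 1: yi2 fills the recipient role, theme omitted.
      print(round(syntactic_fit({"giver", "recipient"}), 2))  # 0.08
      # Analysis 2: yi2 fills the theme role, recipient omitted.
      print(round(syntactic_fit({"giver", "theme"}), 2))      # 0.03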

  12. Using frame and lexical information to restrict type of reference

  13. Can the omitted argument be recovered from context? • Antecedent Fit: (Diagram: candidate referents in the discourse & situational context — the child, the mother, the auntie, the peach, and the table (?))

  14. Semantic Fit: how good a theme is a peach? How about an aunt?
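  A small Python sketch of how semantic fit might score candidate fillers (our illustration, not the authors' code; the toy ontology and the numeric scores are assumptions, while the role restrictions come from the Give schema on slide 23):

      ONTOLOGY_PARENT = {                      # toy ontology fragment (assumed)
          "@Peach": "@Manipulable_Inanimate_Object",
          "@Aunt": "@Animate",
          "@Manipulable_Inanimate_Object": "@Entity",
          "@Animate": "@Entity",
          "@Entity": None,
      }

      GIVE_ROLE_TYPES = {"recipient": "@Animate",
                         "theme": "@Manipulable_Inanimate_Object"}

      def is_a(category, ancestor):
          while category is not None:
              if category == ancestor:
                  return True
              category = ONTOLOGY_PARENT.get(category)
          return False

      def semantic_fit(role, filler_category):
          # High score for a type-compatible filler, low otherwise (assumed values).
          return 1.0 if is_a(filler_category, GIVE_ROLE_TYPES[role]) else 0.1

      print(semantic_fit("theme", "@Peach"))  # 1.0 -- a peach is a good theme
      print(semantic_fit("theme", "@Aunt"))   # 0.1 -- an aunt is a poor theme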

  15. The argument omission patterns shown earlier can be covered with just ONE construction • Each cxn is annotated with probabilities of omission • A language-specific default probability can be set • P(omitted | ditransitive cxn): giver 0.78, recipient 0.42, theme 0.65
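  One plausible way to store these annotations (an assumed data layout, not the authors'; the default value below is illustrative and not from the slides):

      LANGUAGE_DEFAULT_P_OMITTED = 0.5   # illustrative language-specific default

      CXN_OMISSION = {
          "ditransitive": {"giver": 0.78, "recipient": 0.42, "theme": 0.65},
      }

      def p_omitted(cxn, role):
          """Per-cxn, per-role omission probability, falling back to the default."""
          return CXN_OMISSION.get(cxn, {}).get(role, LANGUAGE_DEFAULT_P_OMITTED)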

  16. Research goal • A computationally precise modeling framework for learning early constructions • (Diagram: language data plus existing linguistic knowledge feed the learning process, which produces new constructions)

  17. Frequent argument omission in pro-drop languages • Mandarin example: ni3 gei3 yi2 (“you give auntie”) • Even in English, there are often no spoken antecedents to pronouns in conversations • The learner must integrate cues from intentions, gestures, prior discourse, etc.

  18. A short dialogue
      • bie2 mo3 wai4+tou2 a: #1_3 ! (別抹外頭啊)
        NEG-IMP apply forehead
        “Don’t apply [lotion to your] forehead.”
      • mo3 wai4+tou2 ke3 jiu4 bu4 hao3+kan4 le a: . (抹外頭可就不好看了啊)
        apply forehead LINKER LINKER NEG good-looking CRS SFP
        “[If you] apply [lotion to your] forehead then [you will] not be pretty …”
      • ze ya a: # bie2 gei3 ma1+ma wang3 lian3 shang4 mo:3 e: ! (嘖呀啊 # 別給媽媽往臉上抹呃)
        INTERJ # NEG-IMP BEN mother CV-DIR face on apply
        “INTERJ # Don’t apply [the lotion] on [your mom’s] face (for mom).”
      • [- low pitch motherese] ma1+ma bu4 mo:3 you:2 . (媽媽不抹油)
        mother NEG apply lotion
        “Mom doesn’t apply (use) lotion.”

  19. Goals, refined • Demonstrate learning given • embodied meaning representation • structured representation of context • Based on • Usage-based learning • Domain-general statistical learning mechanism • Generalization / linguistic category formation

  20. Towards a precise computational model • Modeling early grammar learning • Context model & Simulation • Data annotation • Finding the best analysis for learning • Hypothesizing and reorganizing constructions • Pilot results

  21. Embodied Construction Grammar

      construction yi2-N
        subcase of Morpheme
        form
          constraints
            self.f.orth <-- "yi2"
        meaning : @Aunt
          evokes RD as rd
          constraints
            self.m <--> rd.referent
            self.m <--> rd.ontological_category

  22. “you” specifies discourse role

      construction ni3-N
        subcase of Morpheme
        form
          constraints
            self.f.orth <-- "ni3"
        meaning : @Human
          evokes RD as rd
          constraints
            self.m <--> rd.referent
            self.m <--> rd.ontological_category
            rd.discourse_participant_role <-- @Addressee
            rd.set_size <-- @Singleton

  23. The meaning of “give” is a schema with roles

      schema Transfer
        subcase of Action
        roles
          giver : @Entity
          recipient : @Entity
          theme : @Entity
        constraints
          giver <--> protagonist

      construction gei3-V2
        subcase of Morpheme
        form
          constraints
            self.f.orth <-- "gei3"
        meaning : Give

      schema Give
        subcase of Transfer
        constraints
          inherent_aspect <-- @Inherent_Achievement
          giver <-- @Animate
          recipient <-- @Animate
          theme <-- @Manipulable_Inanimate_Object

  24. Finally, you-give-aunt links up the roles

      construction ni3-gei3-yi2
        subcase of Finite_Clause
        constructional
          constituents
            n : ni3-N
            g : gei3-V2
            y : yi2-N
        form
          constraints
            n.f meets g.f
            g.f meets y.f
        meaning : Give
          constraints
            self.m <--> g.m
            self.m.giver <--> n.m
            self.m.recipient <--> y.m

  25. The learning loop: Hypothesize & Reorganize • (Diagram: the utterance, linguistic knowledge, world knowledge, and the discourse & situational context feed analysis and context fitting, yielding a partial SemSpec; learning acts on the result through hypothesize, reorganize, and reinforcement operations)

  26. If the learner has a ditransitive cxn • (Diagram: form — ni3 meets gei3 meets yi2; meaning — Addressee, Give, Aunt, with the giver and recipient bound and the theme omitted; context — a discourse segment with MOT as speaker, XIXI as addressee, INV present, and a peach)

  27. Context fitting recovers more relations • (Diagram: the same analysis after context fitting — the Give schema’s giver, recipient, and theme are now linked to entities in the context, with the peach in attentional focus filling the omitted theme)

  28. But the learner does not yet have phrasal cxns • (Diagram: the same form, meaning, and context elements, but with only the lexical constructions ni3, gei3, and yi2 and no phrasal construction tying them together)

  29. Context bootstraps learning • (Diagram: the form and meaning of the lexical cxns, lined up with the fitted context, support hypothesizing the phrasal cxn below)

      construction ni3-gei3-yi2
        subcase of Finite_Clause
        constructional
          constituents
            n : ni3
            g : gei3
            y : yi2
        form
          constraints
            n.f meets g.f
            g.f meets y.f
        meaning : Give
          constraints
            self.m <--> g.m
            self.m.giver <--> n.m
            self.m.recipient <--> y.m

  30. A model of context is key to learning • The context model makes it possible for the learning model to: • learn new constructions using contextually available information • learn argument-structure constructions in pro-drop languages

  31. Understanding an utterance in context • (Diagram: transcripts supply events + utterances; schemas + constructions and the context model (with its recency model) feed analysis + resolution and context fitting; the resulting semantic specification drives simulation, which updates the context model)

  32. Context model: Events + Utterances • (Diagram: a setting with participants, entities, & relations; from the start, a timeline of events, sub-events, and discourse segments)

  33. Entities and Relations are instantiated • Setting: CHI, MOT (incl. body parts); livingroom (incl. ground, ceiling, chair, etc.); lotion • ds04: discourse segment • admonishing05: speaker = MOT, addressee = CHI, forcefulness = normal • caused_motion01 (forceful_motion, motion) • apply02: applier = CHI, substance = lotion, surface = face(CHI) • translational_motion03: mover = lotion, spg = SPG
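  A sketch of how these instances might be represented in code (our assumed layout, not the authors'; the names mirror the slide):

      from dataclasses import dataclass, field

      @dataclass
      class ContextItem:
          kind: str                           # e.g. "entity", "discourse_segment", "process"
          roles: dict = field(default_factory=dict)

      context_model = {
          "MOT": ContextItem("entity"),
          "CHI": ContextItem("entity"),
          "lotion": ContextItem("entity"),
          "ds04": ContextItem("discourse_segment"),
          "admonishing05": ContextItem("process",
              {"speaker": "MOT", "addressee": "CHI", "forcefulness": "normal"}),
          "apply02": ContextItem("process",
              {"applier": "CHI", "substance": "lotion", "surface": "face(CHI)"}),
          "translational_motion03": ContextItem("process", {"mover": "lotion"}),
      }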

  34. The context model is updated dynamically • Extended transcript annotation: speech acts & events • The simulator inserts events into the context model & updates it with their effects • Some relations persist over time; some don’t.
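  A sketch of the update step (our assumption about the bookkeeping, continuing the layout above; the recency dictionary stands in for the recency model):

      recency = {}   # item name -> time of last occurrence or mention

      def simulate_event(context_model, name, item, effects, transient, time):
          """Insert an event and its effects into the context model, timestamp
          them for the recency model, and drop relations that do not persist."""
          context_model[name] = item
          recency[name] = time
          for eff_name, eff_item in effects.items():
              context_model[eff_name] = eff_item
              recency[eff_name] = time
          for t_name in transient:
              context_model.pop(t_name, None)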

  35. Competition-based analyzer finds the best analysis • An analysis is made up of: a constructional tree, a semantic specification, and a set of resolutions • (Diagram: analysis of “Bill gave Mary the book” — the A-GIVE-B-X cxn with subj, v, obj1, and obj2 constituents (Ref-Exp, Give, Ref-Exp, Ref-Exp), and a Give-Action schema whose giver, recipient, and theme resolve to Bill (@Man), Mary (@Woman), and book01 (@Book))

  36. Combined score that determines best-fit • Syntactic Fit: • Constituency relations • Combine with preferences on non-local elements • Conditioned on syntactic context • Antecedent Fit: • Ability to find referents in the context • Conditioned on syntactic information, feature agreement • Semantic Fit: • Semantic bindings for frame roles • Frame roles’ fillers are scored

  37. Context Fitting goes beyond resolution • (Diagram: as on slide 27 — the Give schema’s giver, recipient, and theme are bound to entities in the context, with the omitted theme resolved to the peach in attentional focus)

  38. Context Fitting, a.k.a. intention reading • Context Fitting takes resolution a step further • considers entire context model, ranked by recency • considers relations amongst entities • heuristically fits from top down, e.g. • discourse-related entities • complex processes • simple processes • other structured and unstructured entities • more heuristics for future events (e.g. in cases of commands or suggestions)
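  A sketch of the top-down fitting order just described (our encoding; the category labels and the recency tie-break within a category are assumptions):

      FIT_PRIORITY = ["discourse_entity", "complex_process", "simple_process", "other"]

      def fit_order(items, recency):
          """items: (name, kind) pairs from the context model. Returns them in
          the order context fitting should try them: by heuristic category,
          then most recent first within a category."""
          def key(item):
              name, kind = item
              rank = FIT_PRIORITY.index(kind) if kind in FIT_PRIORITY else len(FIT_PRIORITY)
              return (rank, -recency.get(name, 0))
          return sorted(items, key=key)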

  39. Adult grammar size • ~615 constructions total • ~100 abstract cxns (26 to capture lexical variants) • ~70 phrasal/clausal cxns • ~440 lexical cxns (~260 open class) • ~195 schemas (~120 open class, ~75 closed class)

  40. Starter learner grammar size • No grammatical categories (except interjections) • Lexical items only • ~440 lexical constructions • ~260 open class: schema / ontology meanings • ~40 closed class: pronouns, negation markers, etc • ~60 function words: no meanings • ~195 schemas (~120 open class, ~75 closed class)

  41. The process hierarchy defined in schemas (1 of 3) • (Diagram: the subhierarchy under Process, including Proto_Transitive, State_Change, State, Action, Complex_Process, Intransitive_State, Two_Participant_State, Mental_State, Serial_Processes, Concurrent_Processes, Cause_Effect, Joint_Motion, and Caused_Motion)

  42. The process hierarchy defined in schemas (2 of 3) • (Diagram: the subhierarchy under Action, including Intransitive_Action, Motion, Translational_Motion, Expression, Self_Motion, Translational_Self_Motion, Force_Application, Forceful_Motion, Continuous_Force_Application, and Agentive_Impact)

  43. The process hierarchy defined in schemas (3 of 3) • (Diagram: further subtypes of Action — Cause_Change, Communication, Obtainment, Transfer, Ingestion, Perception, and Other_Transitive_Action)

  44. Understanding an utterance in context • (Diagram: the same pipeline as on slide 31, now with reorganize, reinforcement, and hypothesize arrows feeding the analysis results back into the schemas + constructions)

  45. Hypothesize & Reorganize • Hypothesize: • utterance-driven; • relies on the analysis (SemSpec & context) • operations: compose • Reorganize: • grammar-driven; • can be triggered by usage (to be determined) • operations: generalize

  46. Composing new constructions • Compose operation: if roles from different constructions point to the same context element, propose a new construction and set up a meaning binding. • (Diagram: the roles of ni3 (Addressee) and gei3 (Give) point to the same context elements — MOT, XIXI, INV, and the peach filling the giver, recipient, and theme roles — licensing a composed cxn)
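  A sketch of the compose operation in Python (our encoding of the rule stated on the slide; the data layout is assumed):

      def propose_compositions(analyzed_cxns):
          """analyzed_cxns: list of (cxn_name, {role: context_element}) pairs
          from one analyzed utterance. Whenever roles of two different
          constructions resolve to the same context element, propose a composed
          construction with a meaning binding between those roles."""
          proposals = []
          for i, (cxn_a, roles_a) in enumerate(analyzed_cxns):
              for cxn_b, roles_b in analyzed_cxns[i + 1:]:
                  for role_a, elem_a in roles_a.items():
                      for role_b, elem_b in roles_b.items():
                          if elem_a == elem_b:
                              proposals.append({
                                  "constituents": [cxn_a, cxn_b],
                                  "binding": (f"{cxn_a}.{role_a}", f"{cxn_b}.{role_b}"),
                              })
          return proposals

      # e.g. ni3's referent and gei3's giver both resolve to XIXI in the context,
      # so a composed ni3-gei3 cxn with a giver binding would be proposed.
      print(propose_compositions([("ni3", {"referent": "XIXI"}),
                                  ("gei3", {"giver": "XIXI", "theme": "peach"})]))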

  47. Creating pivot constructions • Pivot generalization: given a phrasal cxn, look for another cxn that shares one or more constituents. Line up roles and bindings. Create a new cxn category for the slot. • (Diagram: ni3-gei3-yi2 and ni3-gei3-wo3 share the ni3 and gei3 constituents and their giver/recipient bindings; the differing slot, yi2 (@Aunt) vs. wo3 (@Human), becomes a new category)
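  A sketch of pivot generalization (our encoding of the operation described on the slide; the category name is illustrative):

      def pivot_generalize(cxn_a, cxn_b, new_category="cat01"):
          """cxn_a, cxn_b: constituent lists, e.g. ["ni3", "gei3", "yi2"].
          If the two cxns share all but one constituent, replace the differing
          slot with a new category whose subcases are the original fillers."""
          if len(cxn_a) != len(cxn_b):
              return None
          diffs = [i for i, (a, b) in enumerate(zip(cxn_a, cxn_b)) if a != b]
          if len(diffs) != 1:              # require exactly one differing slot
              return None
          i = diffs[0]
          return {"constituents": cxn_a[:i] + [new_category] + cxn_a[i + 1:],
                  "category": new_category,
                  "subcases": sorted({cxn_a[i], cxn_b[i]})}

      print(pivot_generalize(["ni3", "gei3", "yi2"], ["ni3", "gei3", "wo3"]))
      # constituents ['ni3', 'gei3', 'cat01'], with cat01 covering wo3 and yi2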

  48. Resulting constructions

      construction ni3-gei3-cat01
        constituents
          ni3, gei3, cat01
        meaning : Give
          constraints
            self.m.recipient <--> g.m

      general construction cat01
        subcase of Morpheme
        meaning: @Human

      construction wo3
        subcase of cat01
        meaning: @Human

      construction yi2
        subcase of cat01
        meaning: @Aunt

  49. Pilot Results: Sample constructions learned • Composed cxns and pivot cxns (examples shown on the slide)

  50. Challenge #1: Non-compositional meaning • Search for additional meaning schemas (in context or in general) that relate the meanings of the individual constructions • (Diagram: an utterance containing “you”, “bake”, and “a cake” (Addressee, Bake, @Cake); the context contains both a Bake-Event and a Give-Event involving MOT, CHI, and the cake, so part of the intended meaning is not contributed by any single word)
