
Putting Meaning Into Your Trees


Presentation Transcript


  1. Putting Meaning Into Your Trees Martha Palmer University of Pennsylvania Princeton Cognitive Science Laboratory November 6, 2003

  2. Outline • Introduction • Background: WordNet, Levin classes, VerbNet • Proposition Bank – capturing shallow semantics • Mapping PropBank to VerbNet • Mapping PropBank to WordNet

3. Word sense in Machine Translation • Different syntactic frames • John left the room Juan saiu do quarto. (Portuguese) • John left the book on the table. Juan deixou o livro na mesa. • Same syntactic frame? • John left a fortune. Juan deixou uma fortuna.

  4. Ask Jeeves – A Q/A, IR ex. What do you call a successful movie? • Tips on Being a Successful Movie Vampire ... I shall call the police. • Successful Casting Call & Shoot for ``Clash of Empires'' ... thank everyone for their participation in the making of yesterday's movie. • Demme's casting is also highly entertaining, although I wouldn't go so far as to call it successful. This movie's resemblance to its predecessor is pretty vague... • VHS Movies: Successful Cold Call Selling: Over 100 New Ideas, Scripts, and Examples from the Nation's Foremost Sales Trainer. Blockbuster

  5. Ask Jeeves – filtering w/ POS tag What do you call a successful movie? • Tips on Being a Successful Movie Vampire ... I shall call the police. • Successful Casting Call & Shoot for ``Clash of Empires'' ... thank everyone for their participation in the making of yesterday's movie. • Demme's casting is also highly entertaining, although I wouldn't go so far as to call it successful. This movie's resemblance to its predecessor is pretty vague... • VHS Movies: Successful Cold Call Selling: Over 100 New Ideas, Scripts, and Examples from the Nation's Foremost Sales Trainer.

6. Filtering out “call the police” with syntax: the question yields call(you, movie, what); the distractor snippet yields call(you, police)
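
A minimal sketch of that filtering idea, assuming predicate-argument tuples have already been extracted for the question and for each retrieved snippet. The extraction step, the helper name, and the tuples themselves are hypothetical illustrations, not the system described in the talk.

```python
# Hypothetical sketch of predicate-argument filtering for Q/A.
# The extraction step is assumed to exist elsewhere; the tuples below
# are written by hand to mirror the slide.

def matches(question_prop, candidate_prop):
    """A candidate matches if the predicate and every bound argument agree;
    question variables (marked with '?') match anything."""
    q_pred, q_args = question_prop
    c_pred, c_args = candidate_prop
    if q_pred != c_pred or len(q_args) != len(c_args):
        return False
    return all(q == c or q.startswith("?") for q, c in zip(q_args, c_args))

# What do you call a successful movie?  ->  call(you, movie, ?what)
question = ("call", ["you", "a successful movie", "?what"])

candidates = [
    ("call", ["I", "the police"]),                              # "I shall call the police."
    ("call", ["you", "a successful movie", "a blockbuster"]),   # illustrative answer candidate
]

print([c for c in candidates if matches(question, c)])
# only the proposition about the successful movie survives the filter
```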

  7. English lexical resource is required • That provides sets of possible syntactic frames for verbs. • And provides clear, replicable sense distinctions. AskJeeves: Who do you call for a good electronic lexical database for English?

8. WordNet – call, 28 senses
1. name, call -- (assign a specified, proper name to; "They named their son David"; …) -> LABEL
2. call, telephone, call up, phone, ring -- (get or try to get into communication (with someone) by telephone; "I tried to call you all night"; …) -> TELECOMMUNICATE
3. call -- (ascribe a quality to or give a name of a common noun that reflects a quality; "He called me a bastard"; …) -> LABEL
4. call, send for -- (order, request, or command to come; "She was called into the director's office"; "Call the police!") -> ORDER

  9. WordNet – Princeton (Miller 1985, Fellbaum 1998) • On-line lexical reference (dictionary) • Nouns, verbs, adjectives, and adverbs grouped into synonym sets • Other relations include hypernyms (ISA), antonyms, meronyms • Limitations as a computational lexicon • Contains little syntactic information • No explicit predicate argument structures • No systematic extension of basic senses • Sense distinctions are very fine-grained, ITA 73% • No hierarchical entries
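
WordNet can be queried programmatically; a minimal sketch using NLTK's WordNet interface, assuming nltk and its WordNet data are installed (the installed version's sense inventory may differ from the 28 senses cited on slide 8).

```python
# Minimal sketch: browsing WordNet's verb senses of "call" with NLTK.
# Requires: pip install nltk, then nltk.download('wordnet').
from nltk.corpus import wordnet as wn

for synset in wn.synsets('call', pos=wn.VERB)[:4]:
    print(synset.name(), '-', synset.definition())
    print('   lemmas:   ', [l.name() for l in synset.lemmas()])
    print('   hypernyms:', [h.name() for h in synset.hypernyms()])
```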

  10. Levin classes (Levin, 1993) • 3100 verbs, 47 top level classes, 193 second and third level • Each class has a syntactic signature based on alternations. John broke the jar. / The jar broke. / Jars break easily. John cut the bread. / *The bread cut. / Bread cuts easily. John hit the wall. / *The wall hit. / *Walls hit easily.

  11. Levin classes (Levin, 1993) • Verb class hierarchy: 3100 verbs, 47 top level classes, 193 • Each class has a syntactic signature based on alternations. John broke the jar. / The jar broke. / Jars break easily. change-of-state John cut the bread. / *The bread cut. / Bread cuts easily. change-of-state, recognizable action, sharp instrument John hit the wall. / *The wall hit. / *Walls hit easily. contact, exertion of force

12. Confusions in Levin classes? • Not semantically homogeneous • {braid, clip, file, powder, pluck, etc...} • Multiple class listings • homonymy or polysemy? • Conflicting alternations? • Carry verbs disallow the Conative (*she carried at the ball), yet include {push, pull, shove, kick, draw, yank, tug} • these verbs also appear in the Push/Pull class, which does take the Conative (she kicked at the ball)

13. Intersective Levin Classes (diagram mapping frames to semantic components: “apart” → CH-STATE; “across the room” → CH-LOC; “at” → ¬CH-LOC) Dang, Kipper & Palmer, ACL98

  14. Intersective Levin Classes • More syntactically and semantically coherent • sets of syntactic patterns • explicit semantic components • relations between senses • VERBNET www.cis.upenn.edu/verbnet Dang, Kipper & Palmer, IJCAI00, Coling00

  15. VerbNet – Karin Kipper • Class entries: • Capture generalizations about verb behavior • Organized hierarchically • Members have common semantic elements, semantic roles and syntactic frames • Verb entries: • Refer to a set of classes (different senses) • each class member linked to WN synset(s) (not all WN senses are covered)

  16. Semantic role labels: Christiane broke the LCD projector. break (agent(Christiane), patient(LCD-projector)) cause(agent(Christiane), broken(LCD-projector)) agent(A) -> intentional(A), sentient(A), causer(A), affector(A) patient(P) -> affected(P), change(P),…
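
As a concrete illustration, the decomposition on this slide written out as plain Python data; this mirrors the slide only, not VerbNet's actual representation format.

```python
# Sketch: the slide's decomposition as plain Python data
# (an illustration of the idea, not VerbNet's file format).
broke_event = {
    "predicate": "break",
    "roles": {"Agent": "Christiane", "Patient": "LCD-projector"},
    # semantic predicate from the slide: cause(agent, broken(patient))
    "semantics": [("cause", "Christiane", ("broken", "LCD-projector"))],
    # entailments carried by the role labels themselves
    "role_entailments": {
        "agent":   ["intentional", "sentient", "causer", "affector"],
        "patient": ["affected", "change"],
    },
}
```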

17. PropBank: hand-built resources vs. real data • VerbNet is based on linguistic theory – how useful is it? • How well does it correspond to syntactic variations found in naturally occurring text?

18. Proposition Bank: From Sentences to Propositions
meet(Somebody1, Somebody2)
• Powell met Zhu Rongji
• Powell and Zhu Rongji met
• Powell met with Zhu Rongji
• Powell and Zhu Rongji had a meeting
→ Proposition: meet(Powell, Zhu Rongji)
(related verbs in the original diagram: battle, wrestle, join, debate, consult)
. . .
When Powell met Zhu Rongji on Thursday they discussed the return of the spy plane.
→ meet(Powell, Zhu)  discuss([Powell, Zhu], return(X, plane))

19. Capturing semantic roles* • George broke [ARG1 the laser pointer]. • [ARG1 The windows] were broken by the hurricane. • [ARG1 The vase] broke into pieces when it toppled over. (In the original figure, SUBJ marks the grammatical subject of each sentence: the same ARG1 surfaces as the object of the first sentence but as the subject of the other two.) *See also FrameNet, http://www.icsi.berkeley.edu/~framenet/

  20. English lexical resource is required • That provides sets of possible syntactic frames for verbs with semantic role labels. • And provides clear, replicable sense distinctions.

21. A TreeBanked Sentence
Analysts have been expecting a GM-Jaguar pact that would give the U.S. car maker an eventual 30% stake in the British company.
(S (NP-SBJ Analysts)
   (VP have
       (VP been
           (VP expecting
               (NP (NP a GM-Jaguar pact)
                   (SBAR (WHNP-1 that)
                         (S (NP-SBJ *T*-1)
                            (VP would
                                (VP give
                                    (NP the U.S. car maker)
                                    (NP (NP an eventual (ADJP 30 %) stake)
                                        (PP-LOC in (NP the British company))))))))))))
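
The bracketed parse above can be loaded and explored directly; a small sketch with NLTK's Tree class (assuming nltk is installed).

```python
# Sketch: loading the Penn Treebank bracketing above with NLTK.
from nltk import Tree

ptb = """
(S (NP-SBJ Analysts)
   (VP have
       (VP been
           (VP expecting
               (NP (NP a GM-Jaguar pact)
                   (SBAR (WHNP-1 that)
                         (S (NP-SBJ *T*-1)
                            (VP would
                                (VP give
                                    (NP the U.S. car maker)
                                    (NP (NP an eventual (ADJP 30 %) stake)
                                        (PP-LOC in (NP the British company))))))))))))
"""

tree = Tree.fromstring(ptb)
tree.pretty_print()                     # draws the tree as ASCII art
for subtree in tree.subtrees(lambda t: t.label() == 'NP-SBJ'):
    print(' '.join(subtree.leaves()))   # prints: Analysts / *T*-1
```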

22. The same sentence, PropBanked
(S Arg0 (NP-SBJ Analysts)
   (VP have
       (VP been
           (VP expecting
               Arg1 (NP (NP a GM-Jaguar pact)
                    (SBAR (WHNP-1 that)
                          (S Arg0 (NP-SBJ *T*-1)
                             (VP would
                                 (VP give
                                     Arg2 (NP the U.S. car maker)
                                     Arg1 (NP (NP an eventual (ADJP 30 %) stake)
                                          (PP-LOC in (NP the British company))))))))))))
expect(Analysts, GM-J pact)
give(GM-J pact, US car maker, 30% stake)
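
The PropBank layer is just labeled argument spans over the same tree. A sketch of the two propositions as plain dictionaries; the released corpus stores them as offsets into the Treebank, not as strings.

```python
# Sketch only: the two PropBank propositions from the slide as plain data.
propositions = [
    {
        "rel": "expect",
        "args": {
            "Arg0": "Analysts",
            "Arg1": "a GM-Jaguar pact that would give the U.S. car maker "
                    "an eventual 30% stake in the British company",
        },
    },
    {
        "rel": "give",
        "args": {
            "Arg0": "*T*-1",             # trace, coindexed with "a GM-Jaguar pact"
            "Arg2": "the U.S. car maker",
            "Arg1": "an eventual 30% stake in the British company",
        },
    },
]
```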

23. Frames File Example: expect Roles: Arg0: expecter Arg1: thing expected Example: Transitive, active: Portfolio managers expect further declines in interest rates. Arg0: Portfolio managers REL: expect Arg1: further declines in interest rates

24. Frames File example: give Roles: Arg0: giver Arg1: thing given Arg2: entity given to Example: double object The executives gave the chefs a standing ovation. Arg0: The executives REL: gave Arg2: the chefs Arg1: a standing ovation
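
The two frames files above can be pictured as small role inventories plus tagged examples. A sketch as plain Python data; the released frames files are XML, and the .01 frameset ids are shown here for illustration.

```python
# Illustrative sketch of the two frames files above (the real files are XML).
frames = {
    "expect.01": {
        "roles": {"Arg0": "expecter", "Arg1": "thing expected"},
        "example": {
            "text": "Portfolio managers expect further declines in interest rates.",
            "Arg0": "Portfolio managers",
            "REL":  "expect",
            "Arg1": "further declines in interest rates",
        },
    },
    "give.01": {
        "roles": {"Arg0": "giver", "Arg1": "thing given", "Arg2": "entity given to"},
        "example": {
            "text": "The executives gave the chefs a standing ovation.",
            "Arg0": "The executives",
            "REL":  "gave",
            "Arg2": "the chefs",
            "Arg1": "a standing ovation",
        },
    },
}
```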

  25. Word Senses in PropBank • Orders to ignore word sense not feasible for 700+ verbs • Mary left the room • Mary left her daughter-in-law her pearls in her will Frameset leave.01 "move away from": Arg0: entity leaving Arg1: place left Frameset leave.02 "give": Arg0: giver Arg1: thing given Arg2: beneficiary How do these relate to traditional word senses in VerbNet and WordNet?

  26. Annotation procedure • PTB II - Extraction of all sentences with given verb • Create Frame File for that verb Paul Kingsbury • (3100+ lemmas, 4400 framesets,118K predicates) • Over 300 created automatically via VerbNet • First pass: Automatic tagging (Joseph Rosenzweig) • http://www.cis.upenn.edu/~josephr/TIDES/index.html#lexicon • Second pass: Double blind hand correction Paul Kingsbury • Tagging tool highlights discrepancies Scott Cotton • Third pass: Solomonization (adjudication) • Betsy Klipple, Olga Babko-Malaya

  27. Trends in Argument Numbering • Arg0 = agent • Arg1 = direct object / theme / patient • Arg2 = indirect object / benefactive / instrument / attribute / end state • Arg3 = start point / benefactive / instrument / attribute • Arg4 = end point • Per word vs frame level – more general?

28. Additional tags (arguments or adjuncts?) • Variety of ArgM's (Arg# > 4): • TMP - when? • LOC - where at? • DIR - where to? • MNR - how? • PRP - why? • REC - himself, themselves, each other • PRD - this argument refers to or modifies another • ADV - others

29. Function tags for Chinese (arguments or adjuncts?) • Variety of ArgM's (Arg# > 4): • TMP - when? • LOC - where at? • DIR - where to? • MNR - how? • PRP - why? • TPC - topic • PRD - this argument refers to or modifies another • ADV - others • CND - conditional • DGR - degree • FRQ - frequency
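
For quick reference, the modifier tags from the two slides above gathered into one lookup table; a sketch whose glosses follow the slides and whose spellings may differ slightly from the released corpora.

```python
# Sketch: the ArgM function tags listed on the two slides above.
ARGM_TAGS = {
    "TMP": "when?",
    "LOC": "where at?",
    "DIR": "where to?",
    "MNR": "how?",
    "PRP": "why?",
    "REC": "reciprocals/reflexives (himself, themselves, each other)",
    "PRD": "secondary predication: refers to or modifies another argument",
    "ADV": "other adverbials",
    # additional tags used for Chinese
    "TPC": "topic",
    "CND": "conditional",
    "DGR": "degree",
    "FRQ": "frequency",
}
```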

30. Additional function tags for Chinese – for phrasal verbs • Correspond to groups of “prepositions” • AS • AT • INTO • ONTO • TO • TOWARDS

  31. Inflection • Verbs also marked for tense/aspect • Passive/Active • Perfect/Progressive • Third singular (is has does was) • Present/Past/Future • Infinitives/Participles/Gerunds/Finites • Modals and negations marked as ArgMs

32. Frames: Multiple Framesets • Framesets are not necessarily consistent between different senses of the same verb • Framesets are consistent between different verbs that share similar argument structures (like FrameNet) • Out of the 787 most frequent verbs: • 1 frameset – 521 • 2 framesets – 169 • 3+ framesets – 97 (includes light verbs)

33. Ergative/Unaccusative Verbs • Roles (no Arg0 for unaccusative verbs): Arg1 = logical subject, patient, thing rising; Arg2 = EXT, amount risen; Arg3* = start point; Arg4 = end point • Sales rose 4% to $3.28 billion from $3.16 billion. • The Nasdaq composite index added 1.01 to 456.6 on paltry volume.
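
Applied to the first example sentence, the role set above yields an annotation like the following sketch (an illustration of the labeling, not the released file format).

```python
# Sketch: "Sales rose 4% ..." annotated with the roles given on this slide.
rise_instance = {
    "text": "Sales rose 4% to $3.28 billion from $3.16 billion.",
    "rel":  "rose",
    "Arg1": "Sales",               # logical subject / thing rising (no Arg0: unaccusative)
    "Arg2": "4%",                  # EXT, amount risen
    "Arg4": "to $3.28 billion",    # end point
    "Arg3": "from $3.16 billion",  # start point
}
```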

34. Actual data for leave • http://www.cs.rochester.edu/~gildea/PropBank/Sort/
Leave.01 “move away from”: Arg0 rel Arg1 Arg3
Leave.02 “give”: Arg0 rel Arg1 Arg2
Observed frames and counts:
  sub-ARG0 obj-ARG1                    44
  sub-ARG0                             20
  sub-ARG0 NP-ARG1-with obj-ARG2       17
  sub-ARG0 sub-ARG2 ADJP-ARG3-PRD      10
  sub-ARG0 sub-ARG1 ADJP-ARG3-PRD       6
  sub-ARG0 sub-ARG1 VP-ARG3-PRD         5
  NP-ARG1-with obj-ARG2                 4
  obj-ARG1                              3
  sub-ARG0 sub-ARG2 VP-ARG3-PRD         3

35. PropBank/FrameNet
Buy:  Arg0: buyer   Arg1: goods   Arg2: seller   Arg3: rate   Arg4: payment
Sell: Arg0: seller  Arg1: goods   Arg2: buyer    Arg3: rate   Arg4: payment
More generic, more neutral – maps readily to VN, TR
Rambow et al., PMLB03

  36. Annotator accuracy – ITA 84%

  37. English lexical resource is required • That provides sets of possible syntactic frames for verbs with semantic role labels? • And provides clear, replicable sense distinctions.

  38. English lexical resource is required • That provides sets of possible syntactic frames for verbs with semantic role labels that can be automatically assigned accurately to new text? • And provides clear, replicable sense distinctions.

  39. Automatic Labelling of Semantic Relations • Stochastic Model • Features: • Predicate • Phrase Type • Parse Tree Path • Position (Before/after predicate) • Voice (active/passive) • Head Word Gildea & Jurafsky, CL02, Gildea & Palmer, ACL02
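
A minimal sketch of how these features can be read off a parse with NLTK, using the "John broke the jar" example from slide 10. The path notation (^ for steps up, ! for steps down), the feature-dictionary layout, the crude voice test, and the last-leaf head-word heuristic are simplifications of my own, not Gildea & Jurafsky's implementation.

```python
# Sketch of the feature set above, computed over a toy parse with NLTK.
from nltk import Tree

def path_feature(tree, const_pos, pred_pos):
    """Category path from constituent node to predicate node, e.g. NP-SBJ^S!VP!VBD."""
    i = 0
    while i < min(len(const_pos), len(pred_pos)) and const_pos[i] == pred_pos[i]:
        i += 1                                   # length of common-ancestor prefix
    up = [tree[const_pos[:k]].label() for k in range(len(const_pos), i - 1, -1)]
    down = [tree[pred_pos[:k]].label() for k in range(i + 1, len(pred_pos) + 1)]
    return '^'.join(up) + '!' + '!'.join(down)

def features(tree, const_pos, pred_pos):
    const, pred = tree[const_pos], tree[pred_pos]
    before = const_pos < pred_pos                # tree positions order left-to-right
    passive = pred.label() == 'VBN' and 'was' in tree.leaves()  # very crude voice check
    return {
        'predicate':   pred.leaves()[0],
        'phrase type': const.label(),
        'path':        path_feature(tree, const_pos, pred_pos),
        'position':    'before' if before else 'after',
        'voice':       'passive' if passive else 'active',
        'head word':   const.leaves()[-1],       # crude head approximation
    }

tree = Tree.fromstring("(S (NP-SBJ John) (VP (VBD broke) (NP the jar)))")
print(features(tree, (0,), (1, 0)))
# {'predicate': 'broke', 'phrase type': 'NP-SBJ', 'path': 'NP-SBJ^S!VP!VBD', ...}
```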

40. Semantic Role Labelling Accuracy – Known Boundaries
                     FrameNet ≥10 inst   PropBank   PropBank ≥10 inst
  Gold St. parses           –               77.0         83.1
  Automatic parses         82.0             73.6         79.6
• Accuracy of semantic role prediction for known boundaries – the system is given the constituents to classify.
• FrameNet examples (training/test) are handpicked to be unambiguous.
• Lower performance with unknown boundaries.
• Higher performance with traces.
• Almost evens out.

41. Additional Automatic Role Labelers • Performance improved from 77% to 88% Colorado • (Gold Standard parses, < 10 instances) • Same features plus • Named Entity tags • Head word POS • For unseen verbs – backoff to automatic verb clusters • SVMs • Role or not role • For each likely role, for each Arg#, Arg# or not • No overlapping role labels allowed Pradhan et al., ICDM03; Surdeanu et al., ACL03; Chen & Rambow, EMNLP03; Gildea & Hockenmaier, EMNLP03
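
The "role or not role" step above is binary classification over candidate constituents. A minimal sketch with scikit-learn, assuming feature dictionaries like the ones on slide 39; the training examples and labels here are toy placeholders, and LinearSVC stands in for whichever SVM package the cited systems used.

```python
# Sketch: a binary "role vs. not-role" classifier over constituents.
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

train_feats = [
    {'phrase type': 'NP-SBJ', 'path': 'NP-SBJ^S!VP!VBD', 'position': 'before', 'voice': 'active'},
    {'phrase type': 'NP',     'path': 'NP^VP!VBD',       'position': 'after',  'voice': 'active'},
    {'phrase type': 'DT',     'path': 'DT^NP^VP!VBD',    'position': 'after',  'voice': 'active'},
]
train_labels = ['role', 'role', 'not-role']

clf = make_pipeline(DictVectorizer(sparse=True), LinearSVC())
clf.fit(train_feats, train_labels)
print(clf.predict([{'phrase type': 'NP-SBJ', 'path': 'NP-SBJ^S!VP!VBD',
                    'position': 'before', 'voice': 'active'}]))
```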

  42. Word Senses in PropBank • Orders to ignore word sense not feasible for 700+ verbs • Mary left the room • Mary left her daughter-in-law her pearls in her will Frameset leave.01 "move away from": Arg0: entity leaving Arg1: place left Frameset leave.02 "give": Arg0: giver Arg1: thing given Arg2: beneficiary How do these relate to traditional word senses in VerbNet and WordNet?

  43. Mapping from PropBank to VerbNet

  44. Mapping from PB to VerbNet

  45. Mapping from PropBank to VerbNet • Overlap with PropBank framesets • 50,000 PropBank instances • < 50% VN entries, > 85% VN classes • Results • MATCH - 78.63%. (80.90% relaxed) • (VerbNet isn’t just linguistic theory!) • Benefits • Thematic role labels and semantic predicates • Can extend PropBank coverage with VerbNet classes • WordNet sense tags Kingsbury & Kipper, NAACL03, Text Meaning Workshop http://www.cs.rochester.edu/~gildea/VerbNet/
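
Conceptually, each mapping record pairs a PropBank frameset with a VerbNet class and aligns the numbered arguments with thematic roles, plus WordNet sense links. A hypothetical sketch of one such record, reusing the break roles from slide 16; the field names and the exact class, role, and sense values are illustrative, not quoted from the released mapping.

```python
# Hypothetical sketch of one PropBank-to-VerbNet mapping record.
mapping_entry = {
    "lemma": "break",
    "propbank_frameset": "break.01",
    "verbnet_class": "break-45.1",        # illustrative class id
    "arg_to_thematic_role": {
        "Arg0": "Agent",                  # breaker
        "Arg1": "Patient",                # thing broken
    },
    "wordnet_synsets": ["break.v.01"],    # linked WN senses (illustrative)
}
```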

  46. WordNet as a WSD sense inventory • Senses unnecessarily fine-grained? • Word Sense Disambiguation bakeoffs • Senseval1 – Hector, ITA = 95.5% • Senseval2 – WordNet 1.7, ITA verbs = 71% • Groupings of Senseval2 verbs, ITA =82% • Used syntactic and semantic criteria

47. Groupings Methodology (w/ Dang and Fellbaum) • Double blind groupings, adjudication • Syntactic Criteria (VerbNet was useful) • Distinct subcategorization frames • call him a bastard • call him a taxi • Recognizable alternations – regular sense extensions: • play an instrument • play a song • play a melody on an instrument SIGLEX01, SIGLEX02, JNLE04

  48. Groupings Methodology (cont.) • Semantic Criteria • Differences in semantic classes of arguments • Abstract/concrete, human/animal, animate/inanimate, different instrument types,… • Differences in the number and type of arguments • Often reflected in subcategorization frames • John left the room. • I left my pearls to my daughter-in-law in my will. • Differences in entailments • Change of prior entity or creation of a new entity? • Differences in types of events • Abstract/concrete/mental/emotional/…. • Specialized subject domains

49. Results – averaged over 28 verbs • MX – Maximum Entropy WSD, p(sense|context) • Features: topic (+2.5%), syntactic constituents (+1.5 to +5%), semantic classes (+6%) • Dang and Palmer, SIGLEX02; Dang et al., COLING02 (accuracy table not reproduced)
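
The MX system models p(sense | context) directly. Below is a minimal maximum-entropy (logistic-regression) WSD sketch over the kinds of features listed above; the feature names, training instances, and sense labels (reusing the leave framesets from slide 25) are made up for illustration.

```python
# Sketch: maximum-entropy WSD, p(sense | context), with toy features and data.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train = [
    ({'topic': 'estate', 'subj_class': 'person', 'has_obj': True,  'has_obj2': True},  'leave.02'),
    ({'topic': 'travel', 'subj_class': 'person', 'has_obj': True,  'has_obj2': False}, 'leave.01'),
    ({'topic': 'travel', 'subj_class': 'person', 'has_obj': False, 'has_obj2': False}, 'leave.01'),
]
X, y = zip(*train)

model = make_pipeline(DictVectorizer(), LogisticRegression(max_iter=1000))
model.fit(list(X), list(y))

test = {'topic': 'estate', 'subj_class': 'person', 'has_obj': True, 'has_obj2': True}
for sense, p in zip(model.classes_, model.predict_proba([test])[0]):
    print(sense, round(p, 2))           # estimated p(sense | context)
```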
