Learning Narrative Schemas


Presentation Transcript


  1. Learning Narrative Schemas Nate Chambers, Dan Jurafsky Stanford University IBM Watson Research Center Visit

  2. Two Joint Tasks • Events in a Narrative • Semantic Roles [Figure: two example role clusters — suspect, criminal, client, immigrant, journalist, government, …; and police, agent, officer, authorities, troops, official, investigator, …]

  3. Scripts • Background knowledge for language understanding (e.g., the Restaurant Script) • Hand-coded • Domain dependent. Schank and Abelson. 1977. Scripts, Plans, Goals and Understanding. Lawrence Erlbaum. Mooney and DeJong. 1985. Learning Schemata for NLP. IJCAI-85.

  4. Applications • Coreference • Resolve pronouns (he, she, it, etc.) • Summarization • Inform sentence selection with event confidence scores • Aberration Detection • Detect surprising/unexpected events in text • Story Generation • (McIntyre and Lapata, ACL-2009) • Textual Inference • Does a document entail other events? • Selectional Preferences • Use chains to inform argument types

  5. The Protagonist protagonist: (noun) • the principal character in a drama or other literary work • a leading actor, character, or participant in a literary work or real event

  6. Inducing Narrative Relations Chambers and Jurafsky. Unsupervised Learning of Narrative Event Chains. ACL-08 • Dependency parse a document. • Run coreference to cluster entity mentions. • Count pairs of verbs with coreferring arguments. • Use pointwise mutual information to measure relatedness. Narrative Coherence Assumption Verbs sharing coreferring arguments are semantically connected by virtue of narrative discourse structure.
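The counting and scoring steps of this pipeline are easy to sketch. Below is a minimal, illustrative Python version, assuming parsing and coreference have already produced, for each document, a map from entity ids to the verbs whose arguments mention that entity. The function name `chain_pmi` and the simplified PMI estimate are ours for illustration; the ACL-08 system scores typed events (verb plus grammatical slot) rather than bare verbs.

```python
from collections import Counter
from itertools import combinations
from math import log

def chain_pmi(docs):
    """Score verb-pair relatedness by pointwise mutual information.

    `docs` is assumed pre-processed: each document is a dict mapping
    an entity id (from coreference) to the list of verbs whose
    arguments mention that entity.
    """
    pair_counts = Counter()   # count of verb pairs sharing a coreferring argument
    verb_counts = Counter()   # count of each verb's appearances in any pair

    for entity_verbs in docs:
        for verbs in entity_verbs.values():
            for v1, v2 in combinations(sorted(set(verbs)), 2):
                pair_counts[(v1, v2)] += 1

    for (v1, v2), n in pair_counts.items():
        verb_counts[v1] += n
        verb_counts[v2] += n

    total = sum(pair_counts.values())
    pmi = {}
    for (v1, v2), n in pair_counts.items():
        p_pair = n / total
        p_v1 = verb_counts[v1] / (2 * total)  # each pair contributes two verb tokens
        p_v2 = verb_counts[v2] / (2 * total)
        pmi[(v1, v2)] = log(p_pair / (p_v1 * p_v2))
    return pmi

# One toy document: entity e1 participates in three events.
docs = [{"e1": ["arrest", "charge", "convict"], "e2": ["say"]}]
print(chain_pmi(docs))  # PMI for (arrest, charge), (arrest, convict), (charge, convict)
```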

  7. Chain Example (ACL-08)

  8. Schema Example (new) [Figure: an induced criminal-justice schema with role clusters — Police, Agent, Authorities; Judge, Official; Prosecutor, Attorney; Suspect, Criminal, Terrorist, …; Plea, Guilty, Innocent]

  9. Narrative Schemas

  10. Integrating Argument Types • Use verb relations to learn argument types. • Record head nouns of coreferring arguments. Example: "The typhoon was downgraded Sunday as it moved inland from the coast, where it killed two people." yields the typed pairs (downgrade-o, move-s, typhoon), (move-s, kill-s, typhoon), (downgrade-o, kill-s, typhoon). • Use argument types to learn verb relations. • Include argument counts in relation scores. (A sketch of this pair extraction follows below.)
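A minimal sketch of the pair extraction illustrated above, assuming coreference has already grouped the three mentions of the typhoon into one chain; the mention format and the helper name `typed_pairs` are our own.

```python
from itertools import combinations

def typed_pairs(chain):
    """Emit (event, event, shared head noun) triples for one coreference chain.

    `chain` is a list of mentions, each (verb, dep, head), where dep is
    's' (subject) or 'o' (object) and head is the mention's head noun.
    """
    for (v1, d1, h1), (v2, d2, h2) in combinations(chain, 2):
        # Events are typed by their grammatical slot, e.g. downgrade-o.
        yield (f"{v1}-{d1}", f"{v2}-{d2}", h1)

chain = [("downgrade", "o", "typhoon"),
         ("move", "s", "typhoon"),
         ("kill", "s", "typhoon")]
for pair in typed_pairs(chain):
    print(pair)  # matches the three tuples on the slide
```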

  11. Learning Schemas

  12. Argument Induction • Induce semantic roles by scoring argument head words. [Figure: candidate head words such as suspect, government, journalist, monday, member, citizen, client, …]
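One way to picture the head-word scoring, as a hedged sketch: reward words frequently observed coreferring across both events of a pair, and damp globally frequent fillers such as monday. The weight `beta` and the exact formula here are illustrative assumptions, not the scoring function from the paper.

```python
from math import log

def score_argument(word, pair_arg_counts, corpus_counts, beta=0.2):
    """Illustrative score for `word` as the shared role filler of a verb pair.

    pair_arg_counts: counts of head words seen coreferring across the
                     pair (e.g. arrest-o / convict-o).
    corpus_counts:   corpus-wide head-word counts, used to damp words
                     like 'monday' that co-occur with everything.
    """
    joint = pair_arg_counts.get(word, 0)
    if joint == 0:
        return float("-inf")
    # Reward shared-argument frequency, penalize ubiquitous words.
    return log(joint) - beta * log(corpus_counts.get(word, 0) + 1)
```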

  13. Training Data • 1.2 million New York Times articles • NYT portion of the Gigaword Corpus • David Graff. 2002. English Gigaword. Linguistic Data Consortium. • Stanford Parser • http://nlp.stanford.edu/software/lex-parser.shtml • OpenNLP coreference • http://opennlp.sourceforge.net • Lemmatize verbs and noun arguments.
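The parser and coreference system listed above are Java tools; purely as an illustration of the final preprocessing step, here is how verb and noun lemmatization might look with NLTK (our substitution, not the toolchain the talk used):

```python
from nltk.stem import WordNetLemmatizer  # requires nltk.download('wordnet')

lemmatizer = WordNetLemmatizer()

def lemma(token, is_verb):
    """Collapse inflected forms so 'arrested'/'arrests' count as one event."""
    return lemmatizer.lemmatize(token.lower(), pos="v" if is_verb else "n")

assert lemma("arrested", True) == "arrest"
assert lemma("suspects", False) == "suspect"
```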

  14. Learned Examples [Figure: two induced role clusters — court, judge, justice, panel, Osteen, circuit, nicolau, sporkin, majority; and law, ban, rule, constitutionality, conviction, ruling, lawmaker, …]

  15. Learned Examples [Figure: two induced role clusters — company, inc, corp, microsoft, iraq, co, unit, maker, …; and drug, product, system, test, software, funds, movie, …]

  16. Database of Schemas • ~500 unique schemas, 10 events each • Temporal ordering data • Available online soon.

  17. Evaluations • Compared to FrameNet • High precision when overlapping • New type of knowledge not included • Cloze Evaluation • Predict missing events • Far better performance than vanilla distributional approaches

  18. Future Work • Improved information extraction • Extract information across multiple predicates. • Knowledge Organization • Link news articles describing subsequent events. • Core AI Reasoning • Automatic approach to learning causation? • NLP specific tasks • Coreference, summarization, etc.

  19. Thanks! • Nathanael Chambers and Dan Jurafsky. Unsupervised Learning of Narrative Schemas and their Participants. ACL-09, Singapore, 2009. • Nathanael Chambers and Dan Jurafsky. Unsupervised Learning of Narrative Event Chains. ACL-08, Ohio, USA, 2008. • Nathanael Chambers and Dan Jurafsky. Jointly Combining Implicit Constraints Improves Temporal Ordering. EMNLP-08, Waikiki, Hawaii, USA, 2008. • Nathanael Chambers, Shan Wang, and Dan Jurafsky. Classifying Temporal Relations Between Events. ACL-07, Prague, 2007.

  20. Cloze Evaluation • Choose a news article at random. • Identify the protagonist. • Extract the narrative event chain. • Randomly remove one event from the chain. • Predict which event was removed.
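The cloze protocol above is simple enough to state as code. A minimal sketch, assuming a model exposes `ranked_guesses(context)` returning all candidate events ranked best-first (both names are ours):

```python
import random

def narrative_cloze(chain, ranked_guesses):
    """One cloze trial: hide an event, rank candidates, and return the
    position of the hidden event (lower is better). Results are
    reported as the average rank over many trials.
    """
    held_out = random.choice(chain)
    context = [e for e in chain if e != held_out]
    ranking = ranked_guesses(context)
    return ranking.index(held_out)  # assumes the model ranks every candidate
```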

  21. Cloze Results • Outperform the baseline distributional learning approach by 36% • Including participants improves further by 10%

  22. Comparison to FrameNet • Narrative Schemas • Focuses on events that occur together in a narrative. • FrameNet (Baker et al., 1998) • Focuses on events that share core roles.

  23. Comparison to FrameNet • Narrative Schemas • Focuses on events that occur together in a narrative. • Schemas represent larger situations. • FrameNet (Baker et al., 1998) • Focuses on events that share core roles. • Frames typically represent single events.

  24. Comparison to FrameNet • How similar are schemas to frames? • Find “best” FrameNet frame by event overlap • How similar are schema roles to frame elements? • Evaluate argument types as FrameNet frame elements.
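The frame-mapping step can be sketched directly: pick the frame whose lexical units overlap most with the schema's verbs. The data layout below is an assumption for illustration; the evaluation's actual tie-breaking and thresholds are not shown.

```python
def best_frame(schema_verbs, frames):
    """Map a schema to the FrameNet frame with maximal event overlap.

    `frames` maps frame names to the set of verbs (lexical units)
    they contain; `schema_verbs` is the schema's verb set.
    """
    return max(frames, key=lambda name: len(schema_verbs & frames[name]))

frames = {"Exchange": {"trade", "buy", "sell"},
          "Arrest": {"arrest", "apprehend", "book"}}
print(best_frame({"arrest", "charge", "convict"}, frames))  # -> Arrest
```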

  25. FrameNet Schema Similarity • How many schemas map to frames? • 13 of 20 schemas mapped to a frame • 26 of 78 (33%) verbs are not in FrameNet • Verbs present in FrameNet • 35 of 52 (67%) matched frame • 17 of 52 (33%) did not match

  26. FrameNet Schema Similarity • Why were 33% unaligned? • FrameNet represents subevents as separate frames • Schemas model sequences of events. [Figure: one schema (trade, rise, fall) maps to two FrameNet frames (Exchange; Change Position on a Scale)]

  27. FrameNet Argument Similarity • Argument role mapping to frame elements • 72% of arguments were appropriate as frame elements [Example: FrameNet frame Enforcing, frame element Rule — induced fillers: law, ban, rule, constitutionality, conviction, ruling, lawmaker, tax; 'tax' is marked INCORRECT]

  28. XX Event Scoring

  29. XX Argument Induction • Induce semantic roles by scoring argument head words (e.g., is the shared argument a = criminal?) • How often do events share any coreferring arguments? • How often do they share the specific argument a?

  30. Results [Chart: cloze performance for Chains, Schemas, Typed Chains, and Typed Schemas; typed schemas yield the 10.1% improvement cited on the next slide]

  31. Results • We learned rich narrative structure. • 10.1% improvement over previous work • Induced semantic roles characterizing the participants in a narrative. • Verb relations and their semantic roles can be jointly learned and improve each other’s results. • Selectional preferences improve verb relation learning.

  32. XX Semantic Role Induction • Supervised Learning • PropBank (Palmer et al., 2005), FrameNet (Baker et al., 1998), VerbNet (Kipper et al., 2000) • Bootstrapping from a seed corpus • (Swier and Stevenson, 2004), (He and Gildea, 2006) • Unsupervised, pre-defined roles • (Grenager and Manning, 2006) • WordNet inspired • (Green and Dorr, 2005), (Alishahi and Stevenson, 2007)
