
Extracting Simplified Statements for Factual Question Generation


Presentation Transcript


  1. Extracting Simplified Statements for Factual Question Generation Michael Heilman and Noah A. Smith

  2. Automatic Factual Question Generation (QG) Input: text. Output: questions for reading assessment (e.g., for a closed-book quiz). We focus on sentence-level factual questions.

  3. The Problem In complex sentences, facts can be presented with varied and complex linguistic constructions. …Prime Minister Vladimir V. Putin, the country's paramount leader, cut short a trip to Siberia, returning to Moscow to oversee the federal response. Mr. Putin built his reputation in part on his success at suppressing terrorism, so the attacks could be considered a challenge to his stature….

  4.–7. The Problem (slide builds) The same example, annotated with the constructions that carry the facts: the main clause (Prime Minister Vladimir V. Putin … cut short a trip to Siberia), an appositive (the country's paramount leader), a participial phrase (returning to Moscow to oversee the federal response), and a conjunction of clauses (…, so the attacks could be considered a challenge to his stature).

  8. The Problem In complex sentences, facts can be presented with varied and complex linguistic constructions. Output: • Prime Minister Vladimir V. Putin cut short a trip to Siberia. • Prime Minister Vladimir V. Putin was the country's paramount leader. • Prime Minister Vladimir V. Putin returned to Moscow to oversee the federal response. • Mr. Putin built his reputation in part on his success at suppressing terrorism. • The attacks could be considered a challenge to his stature.

  9. The Rest of the Talk Input: complex sentence. Output: set of simple declarative sentences (easier to convert into questions). Our method: • Uses rules to extract and simplify sentences • Is motivated by linguistic knowledge • Outperformed a sentence compression baseline

  10. Outline • Introduction and motivation • Our Approach • Simplification and extraction operations • Evaluation • Conclusions

  11. Alternative: Sentence Compression Input: complex sentence. Output: a simpler sentence that conveys the main point. Suitable for QG? • Only one output per input • Most methods only delete words (Knight & Marcu 2000; Dorr et al. 2003; McDonald 2006; Clarke 2008; Martins & Smith 2009; inter alia)

  12. Our Approach • We extract and simplify multiple statements from complex sentences. • We include operations for various syntactic constructions, encoded as pattern-matching rules over trees. (Similar work: Klebanov et al. 2004)

  13. Example: Extracting from Appositives Input: Putin, the Russian Prime Minister, visited Moscow. Desired Output: Putin was the Russian Prime Minister.

  14. Example: Extracting from Appositives Parse tree: (ROOT (S (NP (NP Putin) (, ,) (NP the Russian Prime Minister) (, ,)) (VP (VBD visited) (NP Siberia)))). The first inner NP is the noun, the second NP is the appositive, and the VBD is the main verb.

  15. Example: Extracting from Appositives Tregex rule: NP < (NP=noun !$-- NP $+ (/,/ $++ NP|PP=appositive !$CC|CONJP)) >> (ROOT << /^VB.*/=mainverb) The rule is matched against the parse tree above, binding the noun, the appositive, and the main verb nodes.
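A minimal sketch, not from the talk, of how a rule like this could be run with the Stanford Tregex library. The class name and the hand-written parse (which adds POS tags not shown on the slide) are illustrative; the pattern string is the one on the slide above.

```java
import edu.stanford.nlp.trees.Tree;
import edu.stanford.nlp.trees.tregex.TregexMatcher;
import edu.stanford.nlp.trees.tregex.TregexPattern;

public class AppositiveRuleSketch {
    public static void main(String[] args) {
        // Hand-written parse of the example; in the real pipeline this comes from the Stanford Parser.
        Tree tree = Tree.valueOf(
            "(ROOT (S (NP (NP (NNP Putin)) (, ,)"
            + " (NP (DT the) (NNP Russian) (NNP Prime) (NNP Minister)) (, ,))"
            + " (VP (VBD visited) (NP (NNP Siberia))) (. .)))");

        // The appositive rule from the slide.
        TregexPattern rule = TregexPattern.compile(
            "NP < (NP=noun !$-- NP $+ (/,/ $++ NP|PP=appositive !$CC|CONJP))"
            + " >> (ROOT << /^VB.*/=mainverb)");

        TregexMatcher m = rule.matcher(tree);
        while (m.find()) {
            System.out.println("noun:       " + m.getNode("noun"));
            System.out.println("appositive: " + m.getNode("appositive"));
            System.out.println("main verb:  " + m.getNode("mainverb"));
        }
    }
}
```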

  16. Example: Extracting from Appositives The rule extracts the noun NP (Putin), the appositive NP (the Russian Prime Minister), and the main verb (VBD visited).

  17. Example: Extracting from Appositives The extracted verb is replaced with was, the singular past tense form of be.

  18. Example: Extracting from Appositives Output tree: (ROOT (S (NP Putin) (VP (VBD was) (NP the Russian Prime Minister))))
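A minimal sketch, not the authors' code, of the assembly step for this example. The helper names are hypothetical, and the copula is hard-coded to was; a full system would select the form of be by number and tense.

```java
import edu.stanford.nlp.trees.Tree;

public class CopulaAssemblySketch {
    /** Join the leaves of a subtree into a plain string. */
    static String yield(Tree t) {
        StringBuilder sb = new StringBuilder();
        for (Tree leaf : t.getLeaves()) {
            if (sb.length() > 0) sb.append(' ');
            sb.append(leaf.value());
        }
        return sb.toString();
    }

    /** Build "NOUN was APPOSITIVE." from the two extracted subtrees. */
    static String makeCopularStatement(Tree noun, Tree appositive) {
        String copula = "was";  // hard-coded singular past form of "be" for this sketch
        return yield(noun) + " " + copula + " " + yield(appositive) + ".";
    }

    public static void main(String[] args) {
        Tree noun = Tree.valueOf("(NP (NNP Putin))");
        Tree appositive = Tree.valueOf("(NP (DT the) (NNP Russian) (NNP Prime) (NNP Minister))");
        System.out.println(makeCopularStatement(noun, appositive));
        // prints: Putin was the Russian Prime Minister.
    }
}
```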

  19. Implementation • Representation: phrase structure trees from the Stanford Parser (Klein & Manning 2003) • Syntactic rules are written in the Tregex tree-searching language (Levy & Andrew 2006) • Tregex operators encode tree relations such as dominance, sisterhood, etc.
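For context, a minimal sketch of producing the input trees with the Stanford Parser, following the standard parser demo; not the authors' code, and the example sentence is just the one from the earlier slides.

```java
import java.io.StringReader;
import java.util.List;

import edu.stanford.nlp.ling.HasWord;
import edu.stanford.nlp.parser.lexparser.LexicalizedParser;
import edu.stanford.nlp.process.DocumentPreprocessor;
import edu.stanford.nlp.trees.Tree;

public class ParseSketch {
    public static void main(String[] args) {
        // Standard English PCFG model shipped with the Stanford Parser.
        LexicalizedParser parser = LexicalizedParser.loadModel(
            "edu/stanford/nlp/models/lexparser/englishPCFG.ser.gz");

        String text = "Putin, the Russian Prime Minister, visited Siberia.";
        // DocumentPreprocessor splits the text into tokenized sentences.
        for (List<HasWord> sentence : new DocumentPreprocessor(new StringReader(text))) {
            Tree tree = parser.apply(sentence);   // phrase-structure parse
            System.out.println(tree);             // e.g. (ROOT (S (NP ...) (VP ...) (. .)))
        }
    }
}
```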

  20. Outline • Introduction and motivation • Our Approach • Simplification and extraction operations • Evaluation • Conclusions

  21. Encoding Linguistic Knowledge Given an input sentence A that is assumed true, we aim to extract sentences B that are also true. Our operations are informed by two phenomena: • semantic entailment • presupposition

  22. Semantic Entailment A entails B: B is true whenever A is true. Levinson 1983

  23. Simplification by Removing Modifiers A: However, Jefferson did not believe the Embargo Act, which restricted trade with Europe, would hurt the American economy. Entailment holds when removing certain types of modifiers.

  24.–25. Simplification by Removing Modifiers (slide builds) The discourse marker (However) and the non-restrictive relative clause (which restricted trade with Europe) are removed. B: Jefferson did not believe the Embargo Act would hurt the American economy. Entailment holds when removing certain types of modifiers.
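A minimal sketch, not the authors' rules, of removing a non-restrictive relative clause with Tregex and its companion Tsurgeon. The shortened example sentence, its hand-written parse, and the specific pattern are illustrative simplifications (leading discourse markers and other modifier types are not handled here).

```java
import edu.stanford.nlp.trees.Tree;
import edu.stanford.nlp.trees.tregex.TregexPattern;
import edu.stanford.nlp.trees.tregex.tsurgeon.Tsurgeon;
import edu.stanford.nlp.trees.tregex.tsurgeon.TsurgeonPattern;

public class RemoveRelativeClauseSketch {
    public static void main(String[] args) {
        // Hand-written parse of a shortened illustrative sentence.
        Tree tree = Tree.valueOf(
            "(ROOT (S (NP (NP (DT The) (NNP Embargo) (NNP Act)) (, ,)"
            + " (SBAR (WHNP (WDT which)) (S (VP (VBD restricted) (NP (NN trade))))) (, ,))"
            + " (VP (VBD failed)) (. .)))");

        // Match a comma-delimited relative clause inside an NP ...
        TregexPattern match = TregexPattern.compile(
            "NP < (/,/=lead $+ (SBAR=relcl $+ /,/=trail))");
        // ... and prune the clause together with its surrounding commas.
        TsurgeonPattern surgery = Tsurgeon.parseOperation("prune lead relcl trail");

        Tree simplified = Tsurgeon.processPattern(match, surgery, tree);
        System.out.println(simplified);
        // roughly: (ROOT (S (NP (NP (DT The) (NNP Embargo) (NNP Act))) (VP (VBD failed)) (. .)))
    }
}
```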

  26. Extracting from Conjunctions A: Mr. Putin built his reputation in part on his success at suppressing terrorism, so the attacks could be considered a challenge to his stature. B1: Mr. Putin built his reputation in part on his success at suppressing terrorism. B2: The attacks could be considered a challenge to his stature. In most clausal and verbal conjunctions, the individual conjuncts are entailed.
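A minimal sketch, not the authors' rules, of pulling out clausal conjuncts with Tregex. The shortened hand-written parse and the pattern are illustrative, and recasing and punctuation of the extracted clauses are ignored here.

```java
import edu.stanford.nlp.trees.Tree;
import edu.stanford.nlp.trees.tregex.TregexMatcher;
import edu.stanford.nlp.trees.tregex.TregexPattern;

public class ConjunctExtractionSketch {
    public static void main(String[] args) {
        // Shortened, hand-written parse of "Putin built his reputation, so the attacks challenged his stature."
        Tree tree = Tree.valueOf(
            "(ROOT (S (S (NP (NNP Putin)) (VP (VBD built) (NP (PRP$ his) (NN reputation)))) (, ,) (CC so)"
            + " (S (NP (DT the) (NNS attacks)) (VP (VBD challenged) (NP (PRP$ his) (NN stature)))) (. .)))");

        // Each clausal conjunct: an S whose parent S also has a coordinating conjunction child.
        TregexPattern conjunct = TregexPattern.compile("S=conj > (S < CC)");

        TregexMatcher m = conjunct.matcher(tree);
        while (m.find()) {
            StringBuilder clause = new StringBuilder();
            for (Tree leaf : m.getNode("conj").getLeaves()) {
                if (clause.length() > 0) clause.append(' ');
                clause.append(leaf.value());
            }
            System.out.println(clause + " .");
        }
        // Putin built his reputation .
        // the attacks challenged his stature .
    }
}
```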

  27. Extracting from Presuppositions In some constructions, B is true regardless of whether the main clause of sentence A is true, i.e., B is presupposed to be true. Note that the main clause of A below is negated. A: Hamilton did not like Jefferson, the third U.S. President. B: Jefferson was the third U.S. President. (Levinson 1983)

  28. Presupposition Triggers Many presuppositions have clear syntactic or lexical associations; for example, the appositive in "Jefferson, the third U.S. President" triggers the presupposition that Jefferson was the third U.S. President.

  29. (Over)simplified Pseudocode
  Take as input a tree t.
  Extract a set of declarative sentence trees T_extracted from constructions in t.   (primarily by presupposition)
  For each t' in T_extracted:
      Simplify t' by removing modifiers.   (by entailment)
      Extract trees T_conjuncts from conjunctions in t'.   (by entailment)
      For each t_conjunct in T_conjuncts:
          T_result = T_result ∪ {t_conjunct}
  Return T_result
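A minimal Java skeleton, not the authors' code, of the control flow in this pseudocode. The three helper methods are hypothetical placeholders that a real implementation would fill in with Tregex/Tsurgeon rules like those sketched earlier.

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

import edu.stanford.nlp.trees.Tree;

public class ExtractionPipelineSketch {

    /** Extract declarative statements from constructions such as appositives and
     *  participial phrases (presupposition-licensed). Placeholder. */
    static List<Tree> extractFromConstructions(Tree t) {
        List<Tree> out = new ArrayList<>();
        out.add(t.deepCopy());  // at minimum, keep a copy of the original sentence
        return out;
    }

    /** Remove discourse markers and other removable modifiers (entailment-licensed). Placeholder. */
    static Tree removeModifiers(Tree t) {
        return t;  // no-op in this sketch
    }

    /** Split clausal/verbal conjunctions into separate sentences (entailment-licensed). Placeholder. */
    static List<Tree> splitConjunctions(Tree t) {
        List<Tree> out = new ArrayList<>();
        out.add(t);
        return out;
    }

    /** The control flow from the pseudocode on the slide. */
    static Set<Tree> extractSimplifiedStatements(Tree input) {
        Set<Tree> result = new LinkedHashSet<>();
        for (Tree extracted : extractFromConstructions(input)) {
            Tree simplified = removeModifiers(extracted);
            result.addAll(splitConjunctions(simplified));
        }
        return result;
    }

    public static void main(String[] args) {
        Tree t = Tree.valueOf("(ROOT (S (NP (NNP Putin)) (VP (VBD visited) (NP (NNP Siberia))) (. .)))");
        for (Tree statement : extractSimplifiedStatements(t)) {
            System.out.println(statement);
        }
    }
}
```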

  30. Outline • Introduction and motivation • Our Approach • Simplification and extraction operations • Evaluation • Conclusions

  31. Baselines • HedgeTrimmer (Dorr et al. 2003): a rule-based sentence compression algorithm that iteratively performs simplifying operations until the input is shorter than a specified length (15 words here). • "Main clause only": only the simplified main clause extracted by the full system. • Both baselines produce one output per input.

  32. Research Questions • How long are the simplified outputs and how many are there? • Extracted statements from 25 previously unseen Encyclopedia Britannica articles about cities. • How well do the extracted statements cover the information in the input texts? • % of input words in at least one output. Barzilay & Elhadad 2003
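A small sketch of the coverage measure described above, i.e., the percentage of input words appearing in at least one output. Whitespace tokenization and case-folding are assumptions here; the paper's exact tokenization may differ.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class CoverageSketch {
    /** % of input word tokens whose (lower-cased) form appears in at least one output sentence. */
    static double coverage(String inputText, List<String> outputs) {
        Set<String> outputWords = new HashSet<>();
        for (String o : outputs) {
            outputWords.addAll(Arrays.asList(o.toLowerCase().split("\\s+")));
        }
        String[] inputWords = inputText.toLowerCase().split("\\s+");
        int covered = 0;
        for (String w : inputWords) {
            if (outputWords.contains(w)) {
                covered++;
            }
        }
        return 100.0 * covered / inputWords.length;
    }

    public static void main(String[] args) {
        String input = "Putin , the Russian Prime Minister , visited Siberia";
        List<String> outputs = Arrays.asList(
            "Putin visited Siberia",
            "Putin was the Russian Prime Minister");
        System.out.printf("coverage = %.1f%%%n", coverage(input, outputs));
    }
}
```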

  33. Results: Length & Coverage

  34. Research Questions • How well does our system preserve fluency and correctness? • Two raters judged simplified outputs for fluency and correctness using 1-5 scales. • We averaged the raters’ scores. • Inter-rater agreement: • r = .92 for fluency • r = .82 for correctness

  35. Results: Fluency & Correctness Differences between HedgeTrimmer and Full are statistically significant (p < .05).

  36. Outline • Introduction and motivation • Our Approach • Simplification and extraction operations • Evaluation • Conclusions

  37. Conclusions • Method for extracting simplified declarative statements from complex sentences. • Outperformed a text compression baseline. • More outputs and better coverage • Higher % of fluent and correct outputs • Future work: evaluation of this as a component in a QG system (Heilman & Smith 2010).

  38. Questions? Demo & code release available on my website. http://www.cs.cmu.edu/~mheilman

  39. A Whale of a Sentence A 133-word sentence from Moby Dick (Melville 1851): “As they narrated to each other their unholy adventures, their tales of terror told in words of mirth; as their uncivilized laughter forked upwards out of them, like the flames from the furnace; as to and fro, in their front, the harpooneers wildly gesticulated with their huge pronged forks and dippers; as the wind howled on, and the sea leaped, and the ship groaned and dived, and yet steadfastly shot her red hell further and further into the blackness of the sea and the night, and scornfully champed the white bone in her mouth, and viciously spat round her on all sides; then the rushing Pequod, freighted with savages, and laden with fire, and burning a corpse, and plunging into that blackness of darkness, seemed the material counterpart of her monomaniac commander's soul.” Gold-standard parse: [parse tree shown on slide, not reproduced here]

  40. A Whale of a Sentence System output: The rushing Pequod seemed the material counterpart of her monomaniac commander's soul. They narrated to each other their unholy adventures. Their uncivilized laughter forked upwards out of them. The harpooneers wildly gesticulated with their huge pronged forks and dippers in their front. The wind howled on. The sea leaped. The ship groaned. The ship dived. The ship steadfastly shot her red hell further and further into the blackness of the sea and the night. The ship scornfully champed the white bone in her mouth. The ship viciously spat round her on all sides. The rushing Pequod was freighted with savages. The rushing Pequod was laden with fire. The rushing Pequod was burning a corpse. The rushing Pequod was plunging into that blackness of darkness. Their unholy adventures were their tales of terror told in words of mirth.
