Large-Scale Extraction and Use of Knowledge from Text

Presentation Transcript


  1. Large-Scale Extraction and Use of Knowledge from Text Peter Clark and Phil Harrison Boeing Research and Technology Seattle, WA

  2. Motivation and Goals • World knowledge is required for many AI tasks • especially natural language processing: “The man ate spaghetti with a fork” • BUT: Not many knowledge resources are available • This work: Create a large knowledge resource • called DART (Discovery & Aggregation of Relations in Text) • based on Schubert’s conjecture • publicly available

  3. Outline • Extracting Knowledge from Text • Using the Knowledge • Parsing • Recognizing Textual Entailment • Human Evaluation • Future Work

  4. Extracting Knowledge from Text Schubert’s Conjecture: There is a largely untapped source of general knowledge in texts, lying at a level beneath the explicit assertional content, which can be harnessed. “The camouflaged helicopter landed near the embassy.” → helicopters can land → helicopters can be camouflaged

  5. Extracting Knowledge from Text • Newswire article: HUTCHINSON SEES HIGHER PAYOUT. HONG KONG, Mar 2. Li said Hong Kong’s property market remains strong while its economy is performing better than forecast. Hong Kong Electric reorganized and will spin off its non-electricity related activities. Hongkong Electric shareholders will receive one share in the new subsidiary for every owned share in the sold company. Li said the decision to spin off … • Implicit, tacit knowledge: Shareholders may receive shares. Shares may be owned. Companies may be sold.

  6. Extracting Knowledge from Text Approach: • Parse each sentence • Extract “tuples” from selected nodes in the parse tree • Tuple = parse fragment retaining head words only. Example: "The lazy men from the city walked to the fancy store"
    (S "man" "walk" (PP "to" "store"))   “Men can walk to stores”
    (NPN "man" "from" "city")            “Men can be from cities”
    (AN "lazy" "man")                    “Men can be lazy”
    (VPN "walk" "to" "store")            “Stores can be walked to”
    (AN "fancy" "store")                 “Stores can be fancy”

  7. How Tuples are Extracted from the Parse Tree (Extracting Knowledge from Text) "The lazy men from the city walked to the fancy store"
    Parse tree:
    (S (NP (NP (DET "the") (N (A "lazy") (N "men")))
           (PP (P "from") (NP (DET "the") (N "city"))))
       (VP (VP (V "walked"))
           (PP (P "to") (NP (DET "the") (N (A "fancy") (N "store"))))))
    Tuples extracted from selected nodes:
    (S "man" "walk" (PP "to" "store"))
    (NPN "man" "from" "city")
    (VPN "walk" "to" "store")
    (AN "lazy" "man")
    (AN "fancy" "store")
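
A minimal sketch of this extraction step, assuming the head-word relations have already been read off the parse; the function name and argument layout are illustrative, not the authors' implementation:

  # Emit DART-style tuples from the head-word relations of one clause
  # (a real system would obtain these relations from the parser output).
  def extract_tuples(subj, verb, adj_mods, np_pps, vp_pps):
      tuples = []
      if vp_pps:
          for prep, obj in vp_pps:
              tuples.append(("S", subj, verb, ("PP", prep, obj)))   # "men can walk to stores"
              tuples.append(("VPN", verb, prep, obj))               # "stores can be walked to"
      else:
          tuples.append(("S", subj, verb))
      for prep, obj in np_pps:
          tuples.append(("NPN", subj, prep, obj))                   # "men can be from cities"
      for noun, adj in adj_mods:
          tuples.append(("AN", adj, noun))                          # "men can be lazy", etc.
      return tuples

  # "The lazy men from the city walked to the fancy store"
  for t in extract_tuples(subj="man", verb="walk",
                          adj_mods=[("man", "lazy"), ("store", "fancy")],
                          np_pps=[("from", "city")], vp_pps=[("to", "store")]):
      print(t)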

  8. Extracting Knowledge from Text Some details…
    • Coordinate structures expanded: "The men and women ate beans and rice" →
      (S "man" "eat" "bean") (S "woman" "eat" "bean") (S "man" "eat" "rice") (S "woman" "eat" "rice")
    • Questions transformed: "Which book did John bring?" →
      (S (PN "John") "bring" "book")
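
A small sketch of the coordination expansion, assuming the coordinated heads are already lemmatized; the function name is an illustrative stand-in:

  from itertools import product

  def expand_coordination(subjects, verb, objects):
      # "The men and women ate beans and rice" -> one (S subj verb obj)
      # tuple per subject/object combination.
      return [("S", s, verb, o) for s, o in product(subjects, objects)]

  print(expand_coordination(["man", "woman"], "eat", ["bean", "rice"]))
  # [('S', 'man', 'eat', 'bean'), ('S', 'man', 'eat', 'rice'),
  #  ('S', 'woman', 'eat', 'bean'), ('S', 'woman', 'eat', 'rice')]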

  9. Normalize the structures into 12 canonical forms:
    (AN "small" "hotel")                              "Hotels can be small."
    (ANPN "subject" "agreement" "to" "approval")      "Agreements can be subject to approvals."
    (NN "drug" "distributor")                         "There can be drug distributors."
    (NV "bus" "carry")                                "Buses can carry [something/someone]."
    (NPN "sentence" "for" "offence")                  "Sentences can be for offences."
    (NVN "critic" "claim" "thing")                    "Critics can claim things."
    (NVPN "person" "go" "into" "room")                "People can go into rooms."
    (NVNPN "democrat" "win" "seat" "in" "election")   "Democrats can win seats in elections."
    (QN "year" "contract")                            "Contracts can be measured in years."
    (VN "find" "spider")                              "Spiders can be found."
    (VPN "refer" "to" "business")                     "Referring can be to businesses."
    (VNPN "educate" "person" "at" "college")          "People can be educated at colleges."
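
A sketch of how such canonical forms can be verbalized with fixed templates; the naive pluralization and the template table are assumptions for illustration, not the paper's generator:

  def plural(noun):
      # Naive pluralization, good enough for this sketch only.
      return noun if noun.endswith("s") else noun + "s"

  TEMPLATES = {
      "AN":  lambda a, n: f"{plural(n).capitalize()} can be {a}.",
      "NV":  lambda n, v: f"{plural(n).capitalize()} can {v}.",
      "NVN": lambda n1, v, n2: f"{plural(n1).capitalize()} can {v} {plural(n2)}.",
      "NPN": lambda n1, p, n2: f"{plural(n1).capitalize()} can be {p} {plural(n2)}.",
  }

  def verbalize(kind, *args):
      return TEMPLATES[kind](*args)

  print(verbalize("AN", "small", "hotel"))             # Hotels can be small.
  print(verbalize("NVN", "critic", "claim", "thing"))  # Critics can claim things.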

  10. Generating the DART Database • Run over Reuters + British National Corpus • 2 months CPU • Result: the DART database (Discovery and Aggregation of Relations in Text) • 23 million propositions (110 million with duplicates)

  11. Some Snapshots into the Database • NV propositions about “planes”
    294 (nv "plane" "carry")     ; "planes can carry."
    262 (nv "plane" "land")      ; "planes can land."
    234 (nv "plane" "crash")     ; "planes can crash."
    162 (nv "plane" "have")      ; "planes can have."
    159 (nv "plane" "take off")  ; "planes can take off."
    100 (nv "plane" "leave")     ; "planes can leave."
    85 (nv "plane" "bomb")       ; "planes can bomb."
    78 (nv "plane" "arrive")     ; "planes can arrive."
    …
    1 (nv "plane" "boast")       ; "planes can boast."
    1 (nv "plane" "battle")      ; "planes can battle."
    1 (nv "plane" "assist")      ; "planes can assist."
    1 (nv "plane" "age")         ; "planes can age."
    1 (nv "plane" "advise")      ; "planes can advise."

  12. Some Snapshots into the Database • VN propositions about “planes”
    151 (vn "hijack" "plane")   ; "planes can be hijacked."
    149 (vn "buy" "plane")      ; "planes can be bought."
    142 (vn "use" "plane")      ; "planes can be used."
    108 (vn "board" "plane")    ; "planes can be boarded."
    101 (vn "send" "plane")     ; "planes can be sent."
    89 (vn "fly" "plane")       ; "planes can be flown."
    87 (vn "have" "plane")      ; "planes can be had."
    78 (vn "sell" "plane")      ; "planes can be sold."
    74 (vn "operate" "plane")   ; "planes can be operated."
    70 (vn "build" "plane")     ; "planes can be built."
    …

  13. Some Snapshots into the Database • NVNPN tuples about “planes”
    9 (nvnpn "plane" "leave" "place" "for" "place")        ; "planes can leave places for places."
    8 (nvnpn "plane" "take" "person" "to" "place")         ; "planes can take people to places."
    7 (nvnpn "plane" "carry" "passenger" "on" "flight")    ; "planes can carry passengers on flights."
    7 (nvnpn "plane" "carry" "crew" "on" "flight")         ; "planes can carry crews on flights."
    6 (nvnpn "state" "sell" "plane" "to" "country")        ; "states can sell planes to countries."
    6 (nvnpn "plane" "have" "design" "with" "capability")  ; "planes can have designs with capabilities."
    6 (nvnpn "plane" "go" "thing" "to" "place")            ; "planes can go things to places."
    6 (nvnpn "person" "hijack" "plane" "to" "place")       ; "people can hijack planes to places."
    5 (nvnpn "plane" "police" "zone" "in" "place")         ; "planes can police zones in places."

  14. Some Snapshots into the Database • What can be seen through?
    22 (vpn "see" "through" "window")
    10 (vpn "see" "through" "glass")
    5 (vpn "see" "through" "binocular")
    3 (vpn "see" "through" "porthole")
    2 (vpn "see" "through" "sky")
    2 (vpn "see" "through" "telescope")
    2 (vpn "see" "through" "keyhole")
    …
    but also:
    37 (vpn "see" "through" "person")
    9 (vpn "see" "through" "end")
    4 (vpn "see" "through" "hype")
    1 (vpn "see" "through" "dress")
    … ?

  15. Some Snapshots into the Database • What color are sheep?
    76 (AN "black" "sheep")   Sheep can be black.
    8 (AN "white" "sheep")    Sheep can be white.
    4 (AN "brown" "sheep")    Sheep can be brown.
    2 (AN "green" "sheep")    Sheep can be green.
    2 (AN "blue" "sheep")     Sheep can be blue.


  17. Some Snapshots into the Database • What color are sheep? Green and blue sheep? An Irishman said, for no apparent reason, "I wear a green pullover because in Ireland we have green sheep."


  19. Sources of Error • Misparses • Bad part of speech assignment • “coal mine” → “coals can mine” • Textual statements that don’t reflect reality • jokes, metaphor, hypotheticals • Negation that gets lost • “I don’t believe in green sheep” → “sheep can be green” • Rhetorical vocabulary taken literally • “The cost of certain foods rose” → “foods can be certain” • Overgeneralization to something meaningless • “Things can be of things”

  20. Interpreting the frequencies • Can’t rely on raw frequency counts • common words → high frequency counts • Need to normalize for prior probabilities • Mutual Information: how much the observation goes beyond chance:
    MI = log ( observed frequency of w1 and w2 together / random chance that w1 and w2 will be together )
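
A minimal sketch of that score as pointwise mutual information over tuple counts; the counts below are invented purely for illustration:

  import math

  def mutual_information(count_pair, count_w1, count_w2, total):
      # log of (observed co-occurrence probability) over (probability
      # under independence), i.e. how far the pair goes beyond chance.
      p_pair = count_pair / total
      p_w1, p_w2 = count_w1 / total, count_w2 / total
      return math.log(p_pair / (p_w1 * p_w2))

  # e.g. "eat" co-occurring with "with fork" (all counts invented):
  print(mutual_information(count_pair=4, count_w1=2000, count_w2=150,
                           total=1_000_000))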

  21. Outline • Extracting Knowledge from Text • Using the Knowledge • Parsing • Recognizing Textual Entailment • Human Evaluation • Future Work

  22. Using the DART Database: 1. Improving Parsing “The man ate spaghetti with a fork” • Ambiguous! → “ate using a fork”? → “eating spaghetti with a fork inside it”? • DART propositions can help: (VPN “eat” “with” “fork”): 4 examples in DART; (NPN “spaghetti” “with” “fork”): 0 examples in DART • So: bias the parser to prefer “eat with fork”

  23. Using the DART Database: 1. Improving Parsing Two candidate attachments for “with a fork”: (VPN “eat” “with” “fork”)? Count = 4, MI = 0.74 (NPN “spaghetti” “with” “fork”)? Count = 0, MI = 0 [diagram: the PP “with a fork” attached to “eat” vs. attached to “spaghetti”]

  24. Using the DART Database: 1. Improving Parsing (VPN “eat” “with” “fork”): Count = 4, MI = 0.74 ✓ (NPN “spaghetti” “with” “fork”): Count = 0, MI = 0 [diagram: the verb attachment “eat … with a fork” is chosen]
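
A sketch of the attachment decision on slides 22-24, assuming DART counts can be looked up (the toy dictionary below stands in for the database); a real parser would also weigh MI and its own built-in attachment bias:

  DART_COUNTS = {                        # toy excerpt, not real database contents
      ("VPN", "eat", "with", "fork"): 4,
      ("NPN", "spaghetti", "with", "fork"): 0,
  }

  def dart_count(tup):
      return DART_COUNTS.get(tup, 0)

  def choose_pp_attachment(verb, obj_noun, prep, pp_noun):
      # Prefer the attachment whose DART tuple is better attested.
      verb_attach = ("VPN", verb, prep, pp_noun)
      noun_attach = ("NPN", obj_noun, prep, pp_noun)
      if dart_count(verb_attach) >= dart_count(noun_attach):
          return verb_attach
      return noun_attach

  # "The man ate spaghetti with a fork"
  print(choose_pp_attachment("eat", "spaghetti", "with", "fork"))
  # ('VPN', 'eat', 'with', 'fork')  -> bias the parser toward the verb attachment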

  25. Using the DART Database: 1. Improving Parsing Evaluation • Use the Brown corpus • 1 million words, with “gold” parses • Measure parser performance without/with DART
                      Mean Relative Precision (%)   Mean Relative Recall (%)
    Without tuples:           46.3                         76.9
    With tuples:              47.1                         77.6
  • Often cases with zero-frequency tuple counts • Tradeoff between tuple bias vs. built-in parser bias

  26. Outline • Extracting Knowledge from Text • Using the Knowledge • Parsing • Recognizing Textual Entailment • Human Evaluation • Future Work

  27. Uses: 2. Recognizing Textual Entailment (RTE) • The task: Given T, does H “reasonably” follow? • Annual competition • Common approach = use paraphrase rules, e.g., IF X visits Y THEN X travels to Y T: The president visited Iraq. H: The president traveled to Iraq. Answer: Yes

  28. Uses: 2. Recognizing Textual Entailment (RTE) • Example rule: IF X shoots Y THEN X injures Y • BUT: rules can be overgeneral: • “Joe shot Mary” → “Joe injured Mary” (plausible) • “Joe shot the gun” → “Joe injured the gun” (!!!) • Some work on type restrictions on X, Y (Y = animate#n1 or artifact#n1) • BUT: limited success • DART propositions can help!

  29. Uses: 2. Recognizing Textual Entailment (RTE) Rule: IF X shoots Y THEN X injures Y Instantiation: IF Joe shoots a gun THEN Joe injures a gun Is this instantiation plausible? DART has no “injured guns” → No!
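
A sketch of that plausibility filter, assuming VN tuple counts can be looked up (toy counts below); this is one way the check could be realized, not necessarily the authors' exact scoring:

  DART_COUNTS = {                        # toy counts for illustration
      ("VN", "injure", "person"): 103,
      ("VN", "injure", "gun"): 0,
  }

  def plausible_consequent(verb, obj):
      # Has DART ever seen "<obj>s can be <verb>d"?
      return DART_COUNTS.get(("VN", verb, obj), 0) > 0

  print(plausible_consequent("injure", "person"))  # True  -> keep the entailment
  print(plausible_consequent("injure", "gun"))     # False -> block "Joe injured the gun"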

  30. Uses: 2. Recognizing Textual Entailment (RTE) • Average Precision of rule-based entailments:
    Confidence Assignment Method        RTE3    RTE4
    None                                0.592   0.650
    Original rule confidence            0.634   0.712
    Tuple (DART) derived confidence     0.641   0.728

  31. Outline • Extracting Knowledge from Text • Using the Knowledge • Parsing • Recognizing Textual Entailment • Human Evaluation • Future Work

  32. Assessing the DART Database: Human Evaluation How much “junk” does DART contain? • Subjects rated random propositions in DART • true / partially true / unsure / mainly false / false • Example tuples and their verbalized propositions:
    (nvn "person" "enter" "place")        "People can enter places."
    (an "french" "counterpart")           "Counterparts can be french."
    (npn "sanction" "against" "place")    "Sanctions can be against places."
    (npn "people" "love" "wall")          "People can love walls."
    (nv "coal" "mine")                    "Coals can mine."

  33. Assessing the DART Database: Human Evaluation • Divide propositions into 4 “buckets” by frequency: 1, 2-10, 11-100, >100 • 12 judges rated ~300 propositions each (~75 per bucket), ~3600 total • ~70% of propositions seen >10 times were judged true / partially true [chart: fraction judged true/partially true vs. frequency of proposition in DART]

  34. Outline • Extracting Knowledge from Text • Using the Knowledge • Parsing • Recognizing Textual Entailment • Human Evaluation • Discussion and Future Work

  35. Other Resources • KNEXT – inspiration for this work • we’ve expanded size, types of proposition, and evaluation • TextRunner • larger, less structured database of triples (only), e.g., • (“iodine” “will kill” “the lactic bacteria”) • (“my friends” “had” “a car”) • ConceptNet • by Web volunteers, ~20 relations, less structured, e.g., • (CapableOf “a car” “crash”) • (UsedFor “a car” “getting to work”)

  36. Future Directions: Generalizing DART propositions • Biggest limitation: • Still often near-zero tuples for “good” propositions • Can we generalize the propositions in DART? • thus cover the zero/near-zero cases

  37. Future Directions: Generalizing DART propositions • (NPN “giraffe” “in” “zoo”) frequency in DART = 0, but… • (NPN “lion” “in” “zoo”), (NPN “monkey” “in” “zoo”), (NPN “elephant” “in” “zoo”) frequency in DART > 0 • Can we generalize to (NPN “animal” “in” “zoo”)? • Or, with WordNet senses: (NPN lion#n#1 “in” “zoo”), (NPN monkey#n#1 “in” “zoo”), (NPN elephant#n#1 “in” “zoo”) → (NPN animal#n#1 “in” “zoo”)?
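
A sketch of one way such a back-off could be computed, using WordNet hypernyms via nltk (requires nltk with its WordNet data installed); this particular strategy is an assumption for illustration, not the paper's method:

  from nltk.corpus import wordnet as wn

  def common_hypernym(nouns):
      # Lowest common hypernym of the first noun sense of each word.
      synsets = [wn.synsets(n, pos=wn.NOUN)[0] for n in nouns]
      common = synsets[0]
      for s in synsets[1:]:
          common = common.lowest_common_hypernyms(s)[0]
      return common

  attested = ["lion", "monkey", "elephant"]   # (NPN noun "in" "zoo") seen in DART
  print(common_hypernym(attested).name())     # a shared class, e.g. placental.n.01
  # Counts for (NPN lion#n#1 "in" "zoo"), (NPN monkey#n#1 "in" "zoo"), ... could
  # then be pooled under that class to score the unseen (NPN "giraffe" "in" "zoo").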

  38. Future Directions: Finding inter-proposition relationships • Use part of DART to disambiguate other parts, e.g.: • (NN “car” “engine”) “There can be car engines” • what is the relation between “car” and “engine”? • other tuples can tell us what it is! e.g.: • (VNPN “car” “power” “by” “engine”) “Cars can be powered by engines” • (NPN “engine” “in” “car”) “Engines can be in cars”
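
A sketch of looking up other DART tuples that link the two nouns of a compound; the toy counts and helper name are illustrative stand-ins for a real database query:

  DART_COUNTS = {                        # toy counts for illustration
      ("VNPN", "car", "power", "by", "engine"): 3,
      ("NPN", "engine", "in", "car"): 7,
  }

  def relations_for_compound(noun1, noun2):
      # Return attested tuples that mention both nouns of the compound.
      return [(tup, n) for tup, n in DART_COUNTS.items()
              if noun1 in tup and noun2 in tup]

  print(relations_for_compound("car", "engine"))
  # Suggests: cars can be powered by engines / engines can be in cars.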

  39. Future Directions: Natural Language Understanding • Use DART to fill in implicit knowledge for NLP “John had studied hard, but he was still worried” • worried about what? DART: (ANPN “worried” “person” “about” ?X) → ?X = survival, selloff, plan, export, exam, delay, … • studied for what? DART: (VPN “study” “for” ?X) → ?X = …, exam, project, …

  40. Future Directions: Natural Language Understanding • Use DART to fill in implicit knowledge for NLP “John had studied hard, but he was still worried” • worried about what? DART: (ANPN “worried” “person” “about” ?X) → ?X = survival, selloff, plan, export, exam, delay, … • studied for what? DART: (VPN “study” “for” ?X) → ?X = …, exam, …
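
A sketch of the slot-filling query behind slides 39-40, with None marking the open ?X slot; the database excerpt and counts are invented for illustration:

  DART = [                               # (tuple, count) pairs, invented
      (("ANPN", "worried", "person", "about", "exam"), 12),
      (("ANPN", "worried", "person", "about", "delay"), 9),
      (("VPN", "study", "for", "exam"), 31),
      (("VPN", "study", "for", "project"), 4),
  ]

  def fill_slot(pattern):
      # pattern uses None for the open slot, e.g. ("VPN", "study", "for", None).
      hits = []
      for tup, count in DART:
          if len(tup) == len(pattern) and all(p is None or p == t
                                              for p, t in zip(pattern, tup)):
              filler = next(t for p, t in zip(pattern, tup) if p is None)
              hits.append((filler, count))
      return sorted(hits, key=lambda h: -h[1])

  print(fill_slot(("VPN", "study", "for", None)))              # [('exam', 31), ('project', 4)]
  print(fill_slot(("ANPN", "worried", "person", "about", None)))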

  41. Summary • DART (Discovery and Aggregation of Relations in Text) • 23 million unique, general propositions • acquired through text mining • Appears to be useful • Improves parsing • Improves recognizing textual entailment • Judges say ~50% to 70% is true / partially true • Many other ways it can be used! • Give it a try! (Google “DART database”) Thank you!
