
Natural Logic for Textual Inference

Explore the foundations and applications of natural logic in textual inference, including experiments with logical systems and frameworks.


Presentation Transcript


  1. Natural Logic for Textual Inference. Bill MacCartney and Christopher D. Manning, NLP Group, Stanford University. 29 June 2007.

  2. Introduction: inferences involving monotonicity
  Premise: Few states completely forbid casino gambling.
  OK: Few western states completely forbid casino gambling. / Few states completely forbid gambling. / Few or no states completely forbid casino gambling.
  No: Few states completely forbid casino gambling for kids. / Few states restrict gambling. / Few states or cities completely forbid casino gambling.
  What kind of textual inference system could predict this?
  Introduction • Foundations of Natural Logic • The NatLog System • Experiments with FraCaS • Experiments with RTE • Conclusion
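The heuristic behind these judgments can be sketched in a few lines of code. This is a toy illustration, not the authors' system: the tiny hypernym lexicon and all function names are invented for this sketch.

```python
# Toy sketch of the monotonicity heuristic: upward contexts license
# broadening edits, downward contexts license narrowing edits.
# The hypernym lexicon below is invented for illustration.

HYPERNYMS = {  # narrower term -> broader term
    "casino gambling": "gambling",
    "western states": "states",
}

def edit_direction(old, new):
    """Classify a substitution as 'broaden', 'narrow', or 'other'."""
    if HYPERNYMS.get(old) == new:
        return "broaden"
    if HYPERNYMS.get(new) == old:
        return "narrow"
    return "other"

def preserves_truth(old, new, context):
    """context is 'up', 'down', or 'non' (non-monotone)."""
    direction = edit_direction(old, new)
    if context == "up":
        return direction == "broaden"
    if context == "down":
        return direction == "narrow"
    return False  # non-monotone context: no edits are safe

# "states" sits in a downward context under "few", so narrowing is licensed:
print(preserves_truth("states", "western states", "down"))    # True
# "casino gambling" ends up in an upward context (the downward operators
# "few" and "forbid" compose), so broadening is licensed:
print(preserves_truth("casino gambling", "gambling", "up"))   # True
# The same broadening edit fails in a downward context:
print(preserves_truth("casino gambling", "gambling", "down")) # False
```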

  3. Textual inference: a spectrum of approaches
  Deep, but brittle: FOL & theorem proving (Bos & Markert 2006)
  Pred-arg structure matching (Hickl et al. 2006)
  Patterned relation extraction (Romano et al. 2006)
  Robust, but shallow: lexical/semantic overlap (Jijkoun & de Rijke 2005)
  Natural logic sits between these extremes on the spectrum.

  4. What is natural logic?
  • A logic whose vehicle of inference is natural language
  • No formal notation, just words & phrases: All men are mortal…
  • Focus on a ubiquitous category of inference: monotonicity
  • I.e., reasoning about the consequences of broadening or narrowing the concepts or constraints in a proposition
  • Precise, yet sidesteps difficulties of translating to FOL: idioms, intensionality and propositional attitudes, modalities, indexicals, reciprocals, scope ambiguities, quantifiers such as most, anaphoric adjectives, temporal and causal relations, aspect, unselective quantifiers, adverbs of quantification, donkey sentences, generic determiners, …
  • Aristotle, Lakoff, van Benthem, Sánchez Valencia 1991

  5. Outline
  • Introduction
  • Foundations of Natural Logic
  • The NatLog System
  • Experiments with FraCaS
  • Experiments with RTE
  • Conclusion

  6. The entailment relation: ⊑
  In natural logic, entailment is defined as an ordering relation ⊑ over expressions of all semantic types (not just sentences).

  7. Monotonicity of semantic functions
  In compositional semantics, meanings are seen as functions, and can have various monotonicity properties:
  • Upward-monotone (↑M). The default: "bigger" inputs yield "bigger" outputs. Example: broken. Since chair ⊑ furniture, broken chair ⊑ broken furniture. Heuristic: in a ↑M context, broadening edits preserve truth.
  • Downward-monotone (↓M). Negatives, restrictives, etc.: "bigger" inputs yield "smaller" outputs. Example: doesn't. While hover ⊑ fly, doesn't fly ⊑ doesn't hover. Heuristic: in a ↓M context, narrowing edits preserve truth.
  • Non-monotone (#M). Superlatives, some quantifiers (most, exactly n): neither ↑M nor ↓M. Example: most. While penguin ⊑ bird, most penguins # most birds. Heuristic: in a #M context, no edits preserve truth.
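These definitions can be checked extensionally by treating denotations as sets, with ⊑ as the subset relation. A minimal sketch, with all the tiny denotations invented for illustration:

```python
# Extensional check of the monotonicity definitions, using sets as
# denotations and subset as the entailment ordering. All data is invented.

entities = {"bee", "plane", "rock", "chair1", "table1"}

# --- Downward monotonicity: "doesn't" ---
fly   = {"bee", "plane"}
hover = {"bee"}                        # hover ⊑ fly

def doesnt(pred):
    """'doesn't P' denotes the complement of P."""
    return entities - pred

assert hover <= fly                    # a "smaller" input ...
assert doesnt(fly) <= doesnt(hover)    # ... yields doesn't fly ⊑ doesn't hover

# --- Upward monotonicity: "broken" ---
chair     = {"chair1"}
furniture = {"chair1", "table1"}       # chair ⊑ furniture
broken_things = {"chair1", "rock"}

def broken(pred):
    """Intersective reading: 'broken P' = broken things that are also P."""
    return broken_things & pred

assert chair <= furniture                  # a "bigger" input ...
assert broken(chair) <= broken(furniture)  # ... yields a "bigger" output
print("monotonicity checks pass")
```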

  8. Downward monotonicity
  Downward-monotone constructions are widespread!
  • explicit negation (no, n't): didn't dance ⊑ didn't tango
  • restrictive quantifiers (no, few, at most n): few athletes ⊑ few sprinters
  • negative & restrictive nouns (ban, absence [of], refusal): drug ban ⊑ heroin ban
  • negative & restrictive verbs (lack, fail, prohibit, deny): prohibit weapons ⊑ prohibit guns
  • prepositions & adverbs (without, except, only): without clothes ⊑ without pants
  • the antecedent of a conditional: If stocks rise, we'll get real paid ⊑ If stocks soar, we'll get real paid

  9. Monotonicity of binary functions
  • Some quantifiers are best viewed as binary functions
  • Different arguments can have different monotonicities:
  • some (↑ in both arguments): Some mammals fly ⊑ Some animals fly; Some mammals fly ⊑ Some mammals move
  • all (↓ in the first argument, ↑ in the second): All ducks fly ⊑ All mallards fly; All ducks fly ⊑ All ducks move
  • no (↓ in both arguments): No dogs fly ⊑ No poodles fly; No dogs fly ⊑ No dogs hover
  • not every (↑ in the first argument, ↓ in the second): Not every bird flies ⊑ Not every animal flies; Not every bird flies ⊑ Not every bird hovers
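Each of these quantifiers has a standard definition as a two-place function over sets, so the slide's example entailments can be verified mechanically. A sketch, with all denotations invented for illustration:

```python
# The slide's binary quantifiers as two-place set functions, with each
# example entailment checked on invented tiny denotations.
# Signatures: some: up/up, all: down/up, no: down/down, not every: up/down.

def some(A, B):      return bool(A & B)
def all_(A, B):      return A <= B
def no(A, B):        return not (A & B)
def not_every(A, B): return not (A <= B)

mammals, animals = {"bat", "cat"}, {"bat", "cat", "ant"}  # mammals ⊑ animals
fly, move        = {"bat"}, {"bat", "cat"}                # fly ⊑ move
assert some(mammals, fly) and some(animals, fly) and some(mammals, move)

mallards, ducks = {"m"}, {"m", "d"}                       # mallards ⊑ ducks
dfly, dmove     = {"m", "d"}, {"m", "d", "x"}             # fly ⊑ move
assert all_(ducks, dfly) and all_(mallards, dfly) and all_(ducks, dmove)

poodles, dogs = {"p"}, {"p", "g"}                         # poodles ⊑ dogs
hover, gfly   = {"bee"}, {"bee", "bird"}                  # hover ⊑ fly
assert no(dogs, gfly) and no(poodles, gfly) and no(dogs, hover)

birds, animals2 = {"robin", "ostrich"}, {"robin", "ostrich", "dog"}
bfly, bhover    = {"robin"}, set()                        # hover ⊑ fly
assert not_every(birds, bfly) and not_every(animals2, bfly) \
       and not_every(birds, bhover)
print("all quantifier entailments check out")
```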

  10. Composition of monotonicity
  • Composition of functions ⇒ composition of monotonicity
  • Sánchez Valencia: a precise monotonicity calculus for CG
  Example polarities for "Few states completely forbid casino gambling": Few+ states− completely− forbid− casino+ gambling+
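The composition rule itself is simple sign arithmetic: each downward-monotone operator above a word flips its polarity, and a non-monotone operator destroys it. A minimal sketch (function name invented):

```python
# Sketch of the monotonicity composition rule: the polarity of a word is
# determined by the monotonicities of the operators above it; "down" flips
# the sign, "non" makes the position non-monotone.

def compose(*monos):
    """Compose monotonicities from outermost to innermost operator."""
    result = "up"
    for m in monos:
        if m == "non":
            return "non"
        if m == "down":
            result = "down" if result == "up" else "up"
    return result

# In "Few states completely forbid casino gambling":
#   "states" sits under one downward operator ("few", first argument):
assert compose("down") == "down"           # states: −
#   "casino gambling" sits under two ("few" second argument, then "forbid"),
#   and the two downward operators cancel:
assert compose("down", "down") == "up"     # casino gambling: +
print("polarities match the slide's +/− annotations")
```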

  11. Introduction • Foundations of Natural Logic • The NatLog System • Experiments with FraCaS • Experiments with RTE • Conclusion The NatLog System textual inference problem linguistic pre-processing 1 alignment 2 entailment classification 3 prediction

  12. Few S completely states VP forbid NP ADVP NP Introduction • Foundations of Natural Logic • The NatLog System • Experiments with FraCaS • Experiments with RTE • Conclusion JJ NNS RB VBD NN NN casino gambling + – – – + + Step 1: Linguistic Pre-processing • Tokenize & parse input sentences (future: & NER & coref & …) • Identify & project monotonicity operators • Problem: PTB-style parse tree  semantic structure! few pattern: JJ < /^[Ff]ew$/ arg1: M on dominating NP __ >+(NP) (NP=proj !> NP) arg2: M on dominating S __ >+(/.*/) (S=proj !> S) Few states completely forbid casino gambling • Solution: specify projections in PTB trees using Tregex

  13. Step 2: Alignment
  • Alignment = a sequence of atomic edits [cf. Harmeling 07]
  • Atomic edits over token spans: DEL, INS, SUB, ADV
  • Example alignment from the slide: ADV, INS, ADV, SUB, DEL, ADV
  • Limitations: no easy way to represent movement; no alignments to non-contiguous sets of tokens
  • Benefits: well-defined sequence of intermediate forms; can use an adaptation of Levenshtein string-edit DP
  • We haven't (yet) invested much effort here
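The Levenshtein-style alignment mentioned above can be sketched as a standard edit-distance DP whose traceback yields the atomic edit sequence. This is a simplification (single tokens only, uniform costs), and the function name is invented:

```python
# Sketch of alignment as Levenshtein DP over tokens, recovering an edit
# script of ADV (advance over a match), SUB, DEL, and INS operations.
# Simplified: single-token edits and uniform costs only.

def align(premise, hypothesis):
    n, m = len(premise), len(hypothesis)
    # cost[i][j] = min edits to turn premise[:i] into hypothesis[:j]
    cost = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        cost[i][0] = i
    for j in range(m + 1):
        cost[0][j] = j
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = 0 if premise[i - 1] == hypothesis[j - 1] else 1
            cost[i][j] = min(cost[i - 1][j - 1] + sub,   # ADV / SUB
                             cost[i - 1][j] + 1,         # DEL
                             cost[i][j - 1] + 1)         # INS
    # Trace back to recover the edit sequence.
    edits, i, j = [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and premise[i - 1] == hypothesis[j - 1] \
                and cost[i][j] == cost[i - 1][j - 1]:
            edits.append(("ADV", premise[i - 1]))
            i, j = i - 1, j - 1
        elif i > 0 and j > 0 and cost[i][j] == cost[i - 1][j - 1] + 1:
            edits.append(("SUB", premise[i - 1], hypothesis[j - 1]))
            i, j = i - 1, j - 1
        elif i > 0 and cost[i][j] == cost[i - 1][j] + 1:
            edits.append(("DEL", premise[i - 1]))
            i -= 1
        else:
            edits.append(("INS", hypothesis[j - 1]))
            j -= 1
    return edits[::-1]

p = "few states completely forbid casino gambling".split()
h = "few states forbid gambling".split()
print(align(p, h))
```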

  14. Step 3: Entailment Classification
  • Atomic edits ⇒ atomic entailment problems
  • Feature representation. Basic features: edit type, monotonicity, "light edit" feature. Lexical features for SUB edits: lemma similarity, WordNet features
  • Decision tree classifier, trained on a small data set designed to exercise the feature space
  • Outputs an elementary entailment relation: ⊑, ⊒, =, #, or |
  • Composition of atomic entailment predictions is fairly intuitive: ⊑ ∘ ⊑ ⇒ ⊑, ⊑ ∘ ⊒ ⇒ #, ⊑ ∘ = ⇒ ⊑, etc.
  • Composition yields a global entailment prediction for the problem
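The composition step can be sketched as a fold over the atomic predictions with a join table. This is a simplified sketch, not the system's actual table: '<' stands for forward entailment, '>' for reverse, '=' for equivalence, '#' for independence, and exclusion is omitted for brevity.

```python
# Sketch of composing elementary entailment relations across atomic edits,
# using a simplified join table (exclusion omitted).

from functools import reduce

def join(r1, r2):
    if r1 == "=":
        return r2                   # = is the identity
    if r2 == "=":
        return r1
    if r1 == r2 == "<":
        return "<"                  # forward ∘ forward = forward
    if r1 == r2 == ">":
        return ">"                  # reverse ∘ reverse = reverse
    return "#"                      # mixed directions, or anything with #

def global_entailment(atomic_relations):
    """Fold the atomic predictions into one global relation."""
    return reduce(join, atomic_relations, "=")

# Two equivalences and a forward entailment compose to a forward
# entailment overall, as in the worked example on the next slide:
print(global_entailment(["=", "=", "<"]))   # <
```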

  15. Entailment model example
  INS edit: featurize (type=INS, mono=down, isLight=true) → predict: = (equivalent)
  SUB edit: featurize (type=SUB, mono=down, isLight=false, lemSim=0.375, wnSyn=1.0, wnAnto=0.0, wnHypo=0.0) → predict: = (equivalent)
  DEL edit: featurize (type=DEL, mono=up, isLight=false) → predict: ⊑ (forward)
  Compose: global prediction ⊑ (forward)

  16. The FraCaS test suite
  • FraCaS: mid-90s project in computational semantics
  • 346 "textbook" examples of textual inference problems
  • 9 sections: quantifiers, plurals, anaphora, ellipsis, …
  • 3 possible answers: yes, no, unknown (not balanced!)
  • 55% single-premise, 45% multi-premise (multi-premise problems excluded)

  17. Results on FraCaS
  (The slide shows per-section results and a gold-vs-guess confusion matrix; the tables are not preserved in this transcript.)

  18. The RTE3 test suite
  • RTE: more "natural" textual inference problems
  • Much longer premises: average 35 words (vs. 11 for FraCaS)
  • Binary classification: yes and no
  • RTE problems are not ideal for NatLog: many kinds of inference are not addressed by NatLog, and big edit distances ⇒ propagation of errors from the atomic model
  • Maybe we can achieve high precision on a subset?
  • Strategy: hybridize with a broad-coverage RTE system, as in Bos & Markert 2006

  19. A hybrid RTE system using NatLog
  Two parallel pipelines, each with pre-processing, alignment, and classification: the Stanford system, which outputs a real-valued score [−, +], and NatLog, which outputs {yes, no}. The combined output is thresholded (balanced or optimized threshold) to yield the final {yes, no} prediction.
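One plausible reading of this architecture, sketched below, is that the NatLog vote shifts the broad-coverage system's real-valued score before thresholding. The transcript shows the diagram but not the exact combination formula, so the function name, the bonus weight, and the threshold here are all invented assumptions.

```python
# Hypothetical sketch of the hybrid combination (the exact scheme is not
# given in the transcript): add a bonus to the Stanford real-valued score
# when NatLog also answers "yes", then apply a threshold. The bonus and
# threshold values below are invented.

def hybrid_predict(stanford_score, natlog_says_yes,
                   natlog_bonus=0.5, threshold=0.0):
    score = stanford_score + (natlog_bonus if natlog_says_yes else 0.0)
    return "yes" if score >= threshold else "no"

# A NatLog "yes" can flip a borderline negative score:
print(hybrid_predict(-0.2, natlog_says_yes=True))    # yes
print(hybrid_predict(-0.2, natlog_says_yes=False))   # no
```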

  20. Results on RTE3
  Hybridizing with NatLog yields 25 extra problems (significant, p < 0.01). (The slide's results table is not preserved in this transcript.)

  21. Conclusion
  Natural logic enables precise reasoning about monotonicity, while sidestepping the difficulties of translating to FOL. The NatLog system successfully handles a broad range of such inferences, as demonstrated on the FraCaS test suite.
  Future work:
  • Add proof search, to handle multiple-premise inference problems
  • Consider using CCG parses to facilitate monotonicity projection
  • Explore the use of more sophisticated alignment models
  • Bring factive & implicative inferences into the NatLog framework
  Thanks! Questions? :-)
