
Formal Semantics



  1. Formal Semantics Slides by Julia Hockenmaier, Laura McGarrity, Bill McCartney, Chris Manning, and Dan Klein

  2. Formal Semantics It comes in two flavors: • Lexical Semantics: The meaning of words • Compositional semantics: How the meaning of individual units combine to form the meaning of larger units

  3. What is meaning? • Meaning ≠ Dictionary entries • Dictionaries define words using words. Circularity!

  4. Reference • Referent: the thing/idea in the world that a word refers to • Reference: the relationship between a word and its referent

  5. Reference • “The president” and “Barack Obama” pick out the same referent: • The president is the commander-in-chief. = Barack Obama is the commander-in-chief.

  6. Reference • Even though “the president” and “Barack Obama” share a referent: • I want to be the president. ≠ I want to be Barack Obama.

  7. Reference • Tooth fairy? • Phoenix? • Winner of the 2016 presidential election?

  8. What is meaning? • Meaning ≠ Dictionary entries • Meaning ≠ Reference

  9. Sense • Sense: The mental representation of a word or phrase, independent of its referent.

  10. Sense ≠ Mental Image • A word may have different mental images for different people. • E.g., “mother” • A word may conjure a typical mental image (a prototype), but can signify atypical examples as well.

  11. Sense v. Reference • A word/phrase may have sense, but no reference: • King of the world • The camel in CIS 8538 • The greatest integer • The • A word may have reference, but no sense: • Proper names: Dan McCloy, Kristi Krein (who are they?!)

  12. Sense v. Reference • A word may have the same referent, but more than one sense: • The morning star / the evening star (Venus) • A word may have one sense, but multiple referents: • Dog, bird

  13. Some semantic relations between words • Hyponymy: subclass • Poodle < dog • Crimson < red • Red < color • Dance < move • Hypernymy: superclass • Synonymy: • Couch/sofa • Manatee / sea cow • Antonymy: • Dead/alive • Married/single
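The hyponymy relation above can be sketched in code: a minimal illustration, assuming a tiny hand-written word graph (not a real lexicon such as WordNet), where hyponymy is the transitive closure of direct "is-a" links.

```python
# Hedged sketch: hyponymy (subclass) as transitive closure over
# direct hypernym links. The word lists here are illustrative only.

HYPERNYM_OF = {          # child -> direct hypernym
    "poodle": "dog",
    "dog": "animal",
    "crimson": "red",
    "red": "color",
}

def is_hyponym(word, ancestor):
    """True iff `word` is a (transitive) hyponym of `ancestor`."""
    current = word
    while current in HYPERNYM_OF:
        current = HYPERNYM_OF[current]
        if current == ancestor:
            return True
    return False

print(is_hyponym("poodle", "animal"))  # True: poodle < dog < animal
print(is_hyponym("crimson", "dog"))    # False
```

Hypernymy is simply the same relation read in the other direction.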

  14. Lexical Decomposition • Word sense can be represented with semantic features:

  15. Compositional Semantics

  16. Compositional Semantics • The study of how meanings of small units combine to form the meaning of larger units • The dog chased the cat ≠ The cat chased the dog. i.e., the whole does not equal the sum of the parts. • The dog chased the cat = The cat was chased by the dog. i.e., syntax matters in determining meaning.

  17. Principle of Compositionality The meaning of a sentence is determined by the meaning of its words in conjunction with the way they are syntactically combined.

  18. Exceptions to Compositionality • Anomaly: when phrases are well-formed syntactically, but not semantically • Colorless green ideas sleep furiously. (Chomsky) • That bachelor is pregnant.

  19. Exceptions to Compositionality • Metaphor: the use of an expression to refer to something that it does not literally denote in order to suggest a similarity • Time is money. • The walls have ears.

  20. Exceptions to Compositionality • Idioms: Phrases with fixed meanings not composed of literal meanings of the words • Kick the bucket = die (*The bucket was kicked by John.) • When pigs fly = ‘it will never happen’ (*She suspected pigs might fly tomorrow.) • Bite off more than you can chew = ‘to take on too much’ (*He chewed just as much as he bit off.)

  21. Idioms in other languages

  22. Logical Foundations for Compositional Semantics • We need a language for expressing the meaning of words, phrases, and sentences • Many possible choices; we will focus on • First-order predicate logic (FOPL) with types • Lambda calculus

  23. Truth-conditional Semantics • Linguistic expressions • “Bob sings.” • Logical translations • sings(Bob) • but could be p_5789023(a_257890) • Denotation: • [[bob]] = some specific person (in some context) • [[sings(bob)]] = true, in situations where Bob is singing; false, otherwise • Types on translations: • bob: e(ntity) • sings(bob): t(rue or false, a boolean type)
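The denotations above can be made concrete with a toy model: a minimal sketch, assuming a hand-written domain where constants (type e) map to entities and predicates map to sets of entities, so that sings(bob) (type t) is just a membership check.

```python
# Hedged sketch of a denotation model. The domain and facts are
# illustrative assumptions, not part of any real system.

MODEL = {
    "bob": "bob",        # [[bob]] : e -- a specific entity
    "amy": "amy",
    "sings": {"bob"},    # [[sings]] -- the set of entities that sing
}

def denote_sings(name):
    """[[sings(x)]] : t -- true iff the denoted entity is in the sings-set."""
    return MODEL[name] in MODEL["sings"]

print(denote_sings("bob"))  # True in this situation
print(denote_sings("amy"))  # False
```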

  24. Truth-conditional Semantics Some more complicated logical descriptions of language: • “All girls like a video game.” • ∀x:e . ∃y:e . girl(x) → [video-game(y) ∧ likes(x,y)] • “Alice is a former teacher.” • (former(teacher))(Alice) • “Alice saw the cat before Bob did.” • ∃x:e, y:e, z:e, t1:e, t2:e . cat(x) ∧ see(y) ∧ see(z) ∧ agent(y, Alice) ∧ patient(y, x) ∧ agent(z, Bob) ∧ patient(z, x) ∧ time(y, t1) ∧ time(z, t2) ∧ <(t1, t2)

  25. FOPL Syntax Summary • A set of types T = {t1, … } • A set of constants C = {c1, …}, each associated with a type from T • A set of relations R = {r1, …}, where each ri is a subset of Cⁿ for some n • A set of variables X = {x1, …} • Logical symbols: ∀, ∃, ∧, ∨, ¬, →, plus the punctuation . and :
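The signature above can be encoded as plain data: a minimal sketch, assuming an illustrative finite structure where relations are stored as sets of tuples over the constants, so a ground atom is true exactly when its tuple is in the relation.

```python
# Hedged sketch of a finite FOPL structure. All names are illustrative.

TYPES = {"e", "t"}
CONSTANTS = {"bob": "e", "amy": "e"}            # constant -> type
RELATIONS = {
    "sings": {("bob",)},                        # unary relation over C
    "likes": {("bob", "amy")},                  # binary relation over C x C
}

def holds(rel, *args):
    """True iff the ground atom rel(args...) holds in this structure."""
    return tuple(args) in RELATIONS[rel]

print(holds("sings", "bob"))         # True
print(holds("likes", "amy", "bob"))  # False: only likes(bob, amy) holds
```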

  26. Truth-conditional semantics • Proper names: • Refer directly to some entity in the world • Bob: bob • Sentences: • Are either t or f • Bob sings: sings(bob) • So what about verbs and VPs? • sings must combine with bob to produce sings(bob) • The λ-calculus is a notation for functions whose arguments are not yet filled. • sings: λx.sings(x) • This is a predicate, a function that returns a truth value. In this case, it takes a single entity as an argument, so we can write its type as e → t • Adjectives?

  27. Lambda calculus • FOPL + λ (a new quantifier) will be our lambda calculus • Intuitively, λ is just a way of creating a function • E.g., girl() is a relation symbol, but λx. girl(x) is a function that takes one argument. • New inference rule: function application (λx. L1(x)) (L2) → L1(L2) E.g., (λx. x²) (3) → 3² E.g., (λx. sings(x)) (Bob) → sings(Bob) • Lambda calculus lets us describe the meaning of words individually. • Function application (and a few other rules) then lets us combine those meanings to come up with the meaning of larger phrases or sentences.
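Python's own lambdas mirror these examples directly; calling a function on an argument is exactly the function-application rule. A minimal sketch, with an assumed toy model for sings:

```python
# (λx. x²)(3) → 3²
square = lambda x: x ** 2
print(square(3))  # 9

# (λx. sings(x))(bob) → sings(bob), under an illustrative model
SINGERS = {"bob"}                 # assumed toy denotation of "sings"
sings = lambda x: x in SINGERS    # λx. sings(x), type e → t
print(sings("bob"))               # True
print(sings("amy"))               # False
```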

  28. Compositional Semantics with the λ-calculus • So now we have meanings for the words • How do we know how to combine the words? • Associate a combination rule with each grammar rule: • S : β(α) ← NP : α, VP : β (function application) • VP : λx. α(x) ∧ β(x) ← VP : α, and : ∅, VP : β (intersection)

  29. Composition: Some more examples • Transitive verbs: • likes : λx.λy.likes(y,x) • Two-place predicates, type e → (e → t) • VP “likes Amy” : λy.likes(y,Amy) is just a one-place predicate • Quantifiers: • What does “everyone” mean? • Everyone : λf.∀x.f(x) • Some problems: • Have to change our NP/VP rule • Won’t work for “Amy likes everyone” • What about “Everyone likes someone”? • Gets tricky quickly!
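The curried transitive verb and the quantifier can both be sketched as Python functions, assuming an illustrative finite domain and likes-relation:

```python
# Hedged sketch: a curried two-place predicate and a universal
# quantifier over an assumed finite domain. All facts are illustrative.

DOMAIN = {"amy", "bob"}
LIKES = {("bob", "amy"), ("amy", "amy")}         # (liker, liked) pairs

# likes : λx.λy.likes(y,x), type e → (e → t)
likes = lambda x: (lambda y: (y, x) in LIKES)

likes_amy = likes("amy")       # VP "likes Amy" : λy.likes(y, amy)
print(likes_amy("bob"))        # True: likes(bob, amy)

# Everyone : λf.∀x.f(x)
everyone = lambda f: all(f(x) for x in DOMAIN)
print(everyone(likes_amy))     # True: amy and bob both like Amy
```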

  30. Composition: Some more examples • Indefinites • The wrong way: • “Bob ate a waffle” : ate(bob, waffle) • “Amy ate a waffle” : ate(amy, waffle) • Better translation: • ∃x. waffle(x) ∧ ate(bob, x) • What does the translation of “a” have to be? • What about “the”? • What about “every”?
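One standard answer to "what does 'a' have to be" treats it as a function of two predicates, a restrictor and a scope, asserting a witness exists. A minimal sketch under an assumed toy domain:

```python
# Hedged sketch: the indefinite "a" as a generalized quantifier.
# Domain and facts are illustrative assumptions.

DOMAIN = {"w1", "w2", "bob"}
WAFFLE = {"w1", "w2"}
ATE = {("bob", "w1")}

# a : λP.λQ.∃x. P(x) ∧ Q(x)
a = lambda restr: lambda scope: any(restr(x) and scope(x) for x in DOMAIN)

waffle = lambda x: x in WAFFLE
ate_by_bob = lambda x: ("bob", x) in ATE

# "Bob ate a waffle" : ∃x. waffle(x) ∧ ate(bob, x)
print(a(waffle)(ate_by_bob))  # True: w1 is a witness
```

On this pattern, "every" swaps `any` for `all` with an implication in the body.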

  31. Denotation • What do we do with the logical form? • It has fewer (no?) ambiguities • Can check the truth-value against a database • More usefully: can add new facts, expressed in language, to an existing relational database • Question-answering: can check whether a statement in a corpus entails a question-answer pair: “Bob sings and dances” ⊨ Q: “Who sings?” has answer A: “Bob” • Can chain together facts for story comprehension

  32. Grounding • What does the translation likes : λx. λy. likes(y,x) have to do with actual liking? • Nothing! (unless the denotation model says it does) • Grounding: relating linguistic symbols to perceptual referents • Sometimes a connection to a database entry is enough • Other times, you might insist on connecting “blue” to the appropriate portion of the visual EM spectrum • Or connect “likes” to an emotional sensation • Alternative to grounding: meaning postulates • You could insist, e.g., that likes(y,x) => knows(y,x)

  33. More representation issues • Tense and events • In general, you don’t get far with verbs as predicates • Better to have event variables e • “Alice danced” : danced(Alice) vs. • “Alice danced” : ∃e. dance(e) ∧ agent(e, Alice) ∧ (time(e) < now) • Event variables let you talk about non-trivial tense/aspect structures: “Alice had been dancing when Bob sneezed”
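The event-variable translation can be sketched as a query over event records: a minimal illustration, assuming events are stored as dicts with thematic-role fields (the field names are assumptions for this sketch).

```python
# Hedged sketch of a (neo-)Davidsonian event store. Events carry a kind,
# thematic roles, and a time; tense is a constraint on the time.

NOW = 100
EVENTS = [
    {"kind": "dance",  "agent": "alice", "time": 42},
    {"kind": "sneeze", "agent": "bob",   "time": 57},
]

def alice_danced():
    """'Alice danced' : ∃e. dance(e) ∧ agent(e, alice) ∧ time(e) < now"""
    return any(e["kind"] == "dance" and e["agent"] == "alice"
               and e["time"] < NOW
               for e in EVENTS)

print(alice_danced())  # True: the event at time 42 is a witness
```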

  34. More representation issues • Propositional attitudes (modal logic) • “Bob thinks that I am a gummi bear” • thinks(bob, gummi(me))? • thinks(bob, “He is a gummi bear”)? • Usually, the solution involves intensions (^p) which are, roughly, the set of possible worlds in which predicate p is true. • thinks(bob, ^gummi(me)) • Computationally challenging • Each agent has to model every other agent’s mental state • This comes up all the time in language – • E.g., if you want to talk about what your bill claims that you bought, vs. what you think you bought, vs. what you actually bought.

  35. More representation issues • Multiple quantifiers: “In this country, a woman gives birth every 15 minutes. Our job is to find her, and stop her.” -- Groucho Marx • Deciding between readings • “Bob bought a pumpkin every Halloween.” • “Bob put a warning in every window.”

  36. More representation issues • Other tricky stuff • Adverbs • Non-intersective adjectives • Generalized quantifiers • Generics • “Cats like naps.” • “The players scored a goal.” • Pronouns and anaphora • “If you have a dime, put it in the meter.” • … etc., etc.

  37. Mapping Sentences to Logical Forms

  38. CCG Parsing • Combinatory Categorial Grammar • Lexicalized PCFG • Categories encode argument sequences • A/B means a category that can combine with a B to the right to form an A • A \ B means a category that can combine with a B to the left to form an A • A syntactic parallel to the lambda calculus
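The slash categories can be sketched with a tiny tuple encoding: a minimal illustration, assuming categories are either strings (atomic) or `(slash, result, argument)` triples; this is not a full parser, just the two application directions.

```python
# Hedged sketch of CCG forward/backward application over an assumed
# tuple encoding: ("/", X, Y) is X/Y, ("\\", X, Y) is X\Y.

def forward_apply(x, y):
    """X/Y followed by Y yields X; otherwise None."""
    if isinstance(x, tuple) and x[0] == "/" and x[2] == y:
        return x[1]
    return None

def backward_apply(y, x):
    """Y followed by X\\Y yields X; otherwise None."""
    if isinstance(x, tuple) and x[0] == "\\" and x[2] == y:
        return x[1]
    return None

VP = ("\\", "S", "NP")        # intransitive verb: S\NP
TV = ("/", VP, "NP")          # transitive verb: (S\NP)/NP

vp = forward_apply(TV, "NP")  # "borders Kansas" : S\NP
print(vp)                     # ('\\', 'S', 'NP')
print(backward_apply("NP", vp))  # "Texas borders Kansas" : S
```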

  39. Learning to map sentences to logical form • Zettlemoyer and Collins (IJCAI 05, EMNLP 07)

  40. Some Training Examples

  41. CCG Lexicon

  42. Parsing Rules (Combinators) • Application • Right: X/Y : f  Y : a  ⇒  X : f(a) • Left: Y : a  X\Y : f  ⇒  X : f(a) • Additional rules: • Composition • Type-raising

  43. CCG Parsing Example

  44. Parsing a Question

  45. Lexical Generation Input Training Example Sentence: Texas borders Kansas. Logical form: borders(Texas, Kansas)

  46. GENLEX • Input: a training example (Si, Li) • Computation: • Create all substrings of consecutive words in Si • Create categories from Li • Create lexical entries that are the cross products of these two sets • Output: Lexicon Λ
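GENLEX's cross product can be sketched directly: a minimal illustration, assuming a fixed hand-written set of candidate categories (the real procedure derives these from the structure of the logical form).

```python
# Hedged sketch of the GENLEX cross product. The candidate categories
# below are assumed for illustration, not derived from the logical form.
from itertools import product

def substrings(words):
    """All substrings of consecutive words."""
    return [" ".join(words[i:j])
            for i in range(len(words))
            for j in range(i + 1, len(words) + 1)]

sentence = "Texas borders Kansas".split()
categories = [
    "NP : texas",
    "(S\\NP)/NP : λx.λy.borders(y,x)",
    "NP : kansas",
]

lexicon = set(product(substrings(sentence), categories))
print(len(substrings(sentence)))  # 6 substrings of a 3-word sentence
print(len(lexicon))               # 6 x 3 = 18 candidate lexical entries
```

Most of these entries are wrong; the learning procedure's job is to keep only the useful ones.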

  47. GENLEX Cross Product Input Training Example Sentence: Texas borders Kansas. Logical form: borders(Texas, Kansas) Output Lexicon

  48. GENLEX Output Lexicon

  49. Weighted CCG Given a log-linear model with a CCG lexicon Λ, a feature vector f, and weights w, the best parse is: y* = argmax_y w ∙ f(x,y), where we consider all possible parses y for the sentence x given the lexicon Λ.
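The argmax can be sketched over candidate parses with precomputed feature vectors: a minimal illustration with made-up feature names and weights, standing in for the real features over lexical entries and rule applications.

```python
# Hedged sketch of y* = argmax_y w · f(x,y) over an assumed candidate
# set. Feature names, weights, and counts are illustrative.

def score(w, f):
    """Linear score w · f for a sparse feature vector f."""
    return sum(w[k] * v for k, v in f.items())

w = {"lex:texas=NP": 1.5, "lex:borders=TV": 2.0, "rule:fa": 0.5}
parses = [
    {"lex:texas=NP": 1, "rule:fa": 2},                       # candidate y1
    {"lex:texas=NP": 1, "lex:borders=TV": 1, "rule:fa": 2},  # candidate y2
]

best = max(parses, key=lambda f: score(w, f))
print(score(w, best))  # 4.5: y2 wins because it uses the verb entry
```

In the full model the candidate set is every parse the lexicon Λ licenses for x, enumerated by the CKY-style parser rather than listed by hand.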
