
Semantics

  1. Semantics CS 224n / Lx 237 Tuesday, May 11, 2004 With slides borrowed from Jason Eisner

  2. Objects • Three kinds: • Booleans – the semantic values of sentences • Entities – objects, NPs, maybe space / time specifications • Functions • Predicates – functions returning a boolean • Functions might return other functions • Functions might take other functions as arguments.
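These three kinds map naturally onto types. Below is a minimal Python sketch; the type aliases and the toy predicate are assumptions for illustration, not from the slides:

```python
# Semantic objects as Python types (illustrative names, toy extension).
from typing import Callable

Entity = str                        # entities: "George", "Laura", ...
Pred = Callable[[Entity], bool]     # predicates: functions returning a boolean

# Functions may return other functions (a curried two-place predicate) ...
TransVerb = Callable[[Entity], Pred]
# ... and may take other functions as arguments (a modifier over predicates).
PredModifier = Callable[[Pred], Pred]

expert: Pred = lambda g: g in {"Alice", "Bob"}   # a toy one-place predicate
```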

  3. Nouns and their modifiers • expert • λg expert(g) • big fat expert • λg big(g) ∧ fat(g) ∧ expert(g) • But: bogus expert • Wrong: λg bogus(g) ∧ expert(g) • Right: λg (bogus(expert))(g) … bogus maps to a new concept • Baltimore expert (white-collar expert, TV expert …) • λg Related(Baltimore, g) ∧ expert(g) • Or with different intonation: λg (Modified-by(Baltimore, expert))(g) • Can’t use Related for that case: law expert and dog catcher = λg Related(law,g) ∧ expert(g) ∧ Related(dog, g) ∧ catcher(g) = dog expert and law catcher
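A small Python sketch of the contrast (the extensions and the toy semantics for bogus are made up for illustration): intersective modifiers like big and fat simply conjoin with the noun predicate, while bogus is a function from the noun's predicate to a new predicate.

```python
# Toy one-place predicates; the extensions here are invented for illustration.
expert = lambda g: g in {"Alice", "Bob"}
big    = lambda g: g in {"Bob", "Carol"}
fat    = lambda g: g in {"Bob"}

# Intersective modification: "big fat expert" = λg big(g) ∧ fat(g) ∧ expert(g)
big_fat_expert = lambda g: big(g) and fat(g) and expert(g)

# Non-intersective modification: "bogus" maps the predicate expert to a *new*
# predicate; a bogus expert need not be an expert, so conjunction is wrong.
def bogus(p):
    return lambda g: g in {"Dave"} and not p(g)   # assumed toy semantics

bogus_expert = bogus(expert)

print(big_fat_expert("Bob"))   # True: Bob is big, fat, and an expert
print(bogus_expert("Dave"))    # True, even though expert("Dave") is False
```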

  4. Compositional Semantics • We’ve discussed what semantic representations should look like. • But how do we get them from sentences??? • First - parse to get a syntax tree. • Second - look up the semantics for each word. • Third - build the semantics for each constituent • Work from the bottom up • The syntax tree is a “recipe” for how to do it
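The recipe can be sketched as a short recursive walk over the syntax tree (a hypothetical minimal interpreter, not code from the course): leaves are looked up in a lexicon, and each internal node builds its meaning from its children's meanings.

```python
# A minimal bottom-up interpreter sketch (data structures are assumed):
# a tree is either a word (str) or a (rule_label, [children]) pair.

def interpret(tree, lexicon, combine):
    """Return the semantics of a parse tree, built bottom-up."""
    if isinstance(tree, str):                 # step 2: look up each word
        return lexicon[tree]
    label, children = tree
    kid_sems = [interpret(c, lexicon, combine) for c in children]
    return combine(label, kid_sems)           # step 3: build the constituent
```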

  5. Compositional Semantics • [Tree figures: S[sem=loves(x,y)] over NP[sem=x] and VP, where VP → V loves NP[sem=y]; and S[sem=died(x)] over NP[sem=x] and VP, where VP → V kicked NP the bucket] • Add a “sem” feature to each context-free rule • S → NP loves NP • S[sem=loves(x,y)] → NP[sem=x] loves NP[sem=y] • Meaning of S depends on meaning of NPs

  6. Compositional Semantics • Instead of S → NP loves NP • S[sem=loves(x,y)] → NP[sem=x] loves NP[sem=y] • might want general rules like S → NP VP: • V[sem=loves] → loves • VP[sem=v(obj)] → V[sem=v] NP[sem=obj] • S[sem=vp(subj)] → NP[sem=subj] VP[sem=vp] • Now George loves Laura has sem=loves(Laura)(George) • In this manner we’ll sketch a version where • Still compute semantics bottom-up • Grammar is in Chomsky Normal Form • So each node has 2 children: 1 function & 1 argument • To get its semantics, apply function to argument!
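Under these rules the composition for George loves Laura is two function applications. Here is a sketch in Python (the term-string representation and names are illustrative) that makes the nesting visible:

```python
# Curried lexical entries that build term strings, so the composed
# semantics can be printed; the representations are illustrative only.
George = "George"
Laura  = "Laura"
loves  = lambda obj: (lambda subj: f"loves({obj})({subj})")

# VP[sem=v(obj)]  -> V[sem=v] NP[sem=obj]   : apply the verb to its object
vp_sem = loves(Laura)                  # λsubj. loves(Laura)(subj)

# S[sem=vp(subj)] -> NP[sem=subj] VP[sem=vp]: apply the VP to its subject
s_sem = vp_sem(George)

print(s_sem)                           # loves(Laura)(George)
```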

  7. [Parse tree for “Every nation wants George to love Laura.”: START → Sfin Punc(.); Sfin → NP VPfin; NP → Det(Every) N(nation); VPfin → T(-s) VPstem; VPstem → Vstem(want) Sinf; Sinf → NP(George) VPinf; VPinf → T(to) VPstem; VPstem → Vstem(love) NP(Laura)]

  8. [Same tree, now marking the target annotation Sinf = loves(G,L): the meaning that we want here. How can we arrange to get it?]

  9. [Same tree, adding NP George = G. What function should apply to G to yield the desired result loves(G,L)? (This is like division!)]

  10. [Same tree, adding the answer: VPinf = λx loves(x,L), which applied to G yields loves(G,L).]
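The “division” step can be spelled out as a small derivation (restating the slide's reasoning in symbols):

```latex
% We need a function f such that applying it to the subject G gives the
% target meaning of Sinf:
\[
  f(G) = \textit{loves}(G, L)
  \quad\Longrightarrow\quad
  f = \lambda x.\ \textit{loves}(x, L),
  \qquad
  f(G) \;\to_\beta\; \textit{loves}(G, L).
\]
```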

  11. Sfin NP x loves(x,L) G x loves(x,L) a a START Punc . VPfin N nation T -s VPstem Det Every loves(G,L) Vstem want Sinf NP George VPinf VPstem T to NP Laura Vstem love We’ll say that“to” is just a bit of syntax thatchanges a VPstem to a VPinf with the same meaning.

  12. Sfin NP x loves(x,L) G x loves(x,L) a a L y x loves(x,y) START Punc . VPfin N nation T -s VPstem Det Every loves(G,L) Vstem want Sinf NP George VPinf VPstem T to NP Laura Vstem love

  13. [By analogy: VPstem (want Sinf) = λx wants(x, loves(G,L)), built the same way as the lower VPstem.]

  14. [By analogy: Vstem (want) = λy λx wants(x,y); applying it to the Sinf meaning loves(G,L) gives λx wants(x, loves(G,L)).]

  15. [Adding T (-s) = λv λx present(v(x)); applying it to the VPstem meaning gives VPfin = λx present(wants(x, loves(G,L))).]

  16. [Adding NP (every nation) = every(nation); applying the VPfin meaning to it gives Sfin = present(wants(every(nation), loves(G,L))).]

  17. [Adding Det (Every) = λn every(n) and N (nation) = nation; applying the determiner to the noun gives NP = every(nation).]

  18. [Adding Punc (.) = λs assert(s), to be applied to the Sfin meaning present(wants(every(nation), loves(G,L))).]

  19. In Summary: From the Words • [Full tree, with the final result at START: assert(present(wants(every(nation), loves(G,L))))] • Lexical semantics: Every = λn every(n), nation = nation, -s = λv λx present(v(x)), want = λy λx wants(x,y), George = G, to = λa a, love = λy λx loves(x,y), Laura = L, . = λs assert(s)
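The whole derivation on slides 7–19 can be replayed end to end with term-building lambdas. This is a sketch with assumed string representations; the function/argument choice at each node follows the slides.

```python
# Lexical entries from the summary slide, building term strings bottom-up.
every     = lambda n: f"every({n})"                      # Det "Every"
nation    = "nation"                                     # N
want      = lambda y: (lambda x: f"wants({x}, {y})")     # Vstem
love      = lambda y: (lambda x: f"loves({x},{y})")      # Vstem
to        = lambda a: a                                  # T "to": identity
present_s = lambda v: (lambda x: f"present({v(x)})")     # T "-s"
period    = lambda s: f"assert({s})"                     # Punc "."
G, L = "G", "L"                                          # George, Laura

vp_love = love(L)                  # VPstem: λx loves(x,L)
vp_inf  = to(vp_love)              # VPinf:  same meaning
s_inf   = vp_inf(G)                # Sinf:   loves(G,L)
vp_want = want(s_inf)              # VPstem: λx wants(x, loves(G,L))
vp_fin  = present_s(vp_want)       # VPfin:  λx present(wants(x, ...))
np_subj = every(nation)            # NP:     every(nation)
s_fin   = vp_fin(np_subj)          # Sfin
start   = period(s_fin)            # START

print(start)   # assert(present(wants(every(nation), loves(G,L))))
```

Representing meanings as strings just makes the composed term visible; the same sequence of bottom-up applications would work with real functions as meanings.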

  20. So now what? • Now that we have the semantic meaning, what do we do with it? • There is a huge literature on logical reasoning and knowledge learning. • Reasoning versus inference: • “John ate a pizza.” • Q: What was eaten by John? A: a pizza • “John ordered a pizza, but it came with anchovies. John then yelled at the waiter and stormed out.” • Q: What was eaten by John? A: nothing

  21. Problem 1a • Write grammar rules, complete with semantic translations, that could be added to the grammar fragment so that it will parse the above sentence and generate a semantic representation using the “own” predicate.
