
COMP 4060 Natural Language Processing




Presentation Transcript


  1. COMP 4060 Natural Language Processing Semantics

  2. Semantics • Semantics I • General Introduction • Types of Semantics • From Syntax to Semantics • Semantics II • Desiderata for Representation • Logic-based Semantics

  3. Semantics I

  4. Semantics Distinguish between • surface structure (syntactic structure) and • deep structure (semantic structure) of sentences. Different forms of Semantic Representation • logic formalisms • ontology / semantic representation languages • Case Frame Structures (Fillmore) • Conceptual Dependency Theory (Schank) • DL and similar KR languages • Ontologies

  5. Semantic Representations Semantic Representation based on some kind of (formal) representation language: • Semantic Networks • Conceptual Dependency Graphs • Case Frames • Ontologies • Description Logics and similar Knowledge Representation languages

  6. Constructing a Semantic Representation General: • Start with surface structure derived from parser. • Map surface structure to semantic structure • Use phrases as sub-structures. • Find concepts and representations for central phrases (e.g. VP, NP, then PP) • Assign phrases to appropriate roles around central concepts (e.g. bind PP into VP representation).
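To make the mapping above concrete, here is a minimal Python sketch (the parse dictionary, the role tables and the function name are invented for illustration; they are not part of the course materials). It takes a toy surface structure for "John makes tools in Canada", finds the concept for the central VP, and binds the other phrases to roles around it:

PARSE = {                                   # toy parse of "John makes tools in Canada"
    "NP_subj": "John",
    "VP": {"verb": "make", "NP_obj": "tools", "PP": ("in", "Canada")},
}
VERB_CONCEPTS = {"make": "manufacturing-activity"}   # concepts for central phrases
PREP_ROLES = {"in": "location"}                      # how PPs bind into the VP frame

def build_frame(parse):
    vp = parse["VP"]
    frame = {"concept": VERB_CONCEPTS[vp["verb"]],   # concept for the central phrase
             "agent": parse["NP_subj"],              # subject NP fills the agent role
             "theme": vp["NP_obj"]}                  # object NP fills the theme role
    prep, np = vp["PP"]
    frame[PREP_ROLES[prep]] = np                     # bind the PP into the VP frame
    return frame

print(build_frame(PARSE))
# {'concept': 'manufacturing-activity', 'agent': 'John', 'theme': 'tools', 'location': 'Canada'}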

  7. Ontology (Interlingua) approach • Ontology: a language-independent classification of objects, events, relations. • A Semantic Lexicon which connects lexical items to nodes (concepts) in the ontology. • An Analyzer that constructs Interlingua representations and selects the appropriate one.

  8. Semantic Lexicon • Provides a syntactic context for the appearance of the lexical item. • Provides a mapping for the lexical item to a node in the ontology (or more complex associations). • Provides connections from the syntactic context to semantic roles and constraints on these roles.
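As a rough illustration of what such entries could look like as data structures (the Python-dict encoding below is invented, but the entries mirror the John-n1 / tool-n1 entries and the make specification on the following slides):

LEXICON = {
    "make-v1": {
        "syn-struc": {"root": "make", "cat": "v",
                      "subj": "$var1", "obj": "$var2"},          # syntactic context
        "sem-struc": {"concept": "manufacturing-activity",       # node in the ontology
                      "agent": {"fills": "$var1", "constraint": "human"},
                      "theme": {"fills": "$var2", "constraint": "artifact"}},
    },
    "john-n1": {
        "syn-struc": {"root": "john", "cat": "noun-proper"},
        "sem-struc": {"concept": "human", "name": "john", "gender": "male"},
    },
    "tool-n1": {
        "syn-struc": {"root": "tool", "cat": "n"},
        "sem-struc": {"concept": "tool"},
    },
}

print(LEXICON["make-v1"]["sem-struc"]["agent"]["constraint"])    # human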

  9. Deriving Basic Semantic Dependency (a toy example)
     Input: John makes tools
     Syntactic analysis:
       cat      verb
       tense    present
       subject  root john
                cat noun-proper
       object   root tool
                cat noun
                number plural

  10. Lexicon Entries for John and tool
      John-n1
        syn-struc  root john
                   cat noun-proper
        sem-struc  human
                     name john
                     gender male
      tool-n1
        syn-struc  root tool
                   cat n
        sem-struc  tool

  11. Meaning Representation - Example: make
      Relevant extract from the specification of the ontological concept used to describe the appropriate meaning of make:
      manufacturing-activity ...
        agent  human
        theme  artifact
        ...

  12. Relevant parts of the (appropriate senses of the) lexicon entries for John and tool
      John-n1
        syn-struc  root john
                   cat noun-proper
        sem-struc  human
                     name john
                     gender male
      tool-n1
        syn-struc  root tool
                   cat n
        sem-struc  tool

  13. Semantic Dependency Component
      The basic semantic dependency component of the TMR for John makes tools:
      manufacturing-activity-7
        agent  human-3
        theme  set-1
                 element tool
                 cardinality > 1
        ...

  14. try-v3
      syn-struc
        root try
        cat v
        subj   root $var1
               cat n
        xcomp  root $var2
               cat v
               form OR infinitive gerund
      sem-struc
        set-1
          element-type refsem-1
          cardinality >= 1
        refsem-1
          sem event
            agent ^$var1
            effect refsem-2
        modality
          modality-type epiteuctic
          modality-scope refsem-2
          modality-value < 1
        refsem-2
          value ^$var2
          sem event

  15. Constructing an Interlingua Representation For each syntactic analysis: • Access all semantic mappings and contexts for each lexical item. • Create all possible semantic representations. • Test them for coherency of structure and content.

  16. “Why is Iraq developing weapons of mass destruction?”

  17. Word sense disambiguation • Constraint checking • making sure the constraints imposed on context are met • Graph traversal • is-a links are inexpensive • other links are more expensive • The "cheapest" structure is the most coherent
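A hedged Python sketch of the cost-based idea above (the miniature ontology graph, the word senses and the link costs are all invented for illustration; a fuller version would also perform the constraint checking before traversal):

from collections import defaultdict
from heapq import heappush, heappop

RAW_EDGES = [                                 # invented mini-ontology: (concept, concept, link type)
    ("trout", "fish", "is-a"),
    ("fish", "geological-object", "located-near"),
    ("bank-river", "geological-object", "is-a"),
    ("geological-object", "physical-object", "is-a"),
    ("physical-object", "object", "is-a"),
    ("bank-institution", "organization", "is-a"),
    ("organization", "social-object", "is-a"),
    ("social-object", "object", "is-a"),
]
COST = {"is-a": 1, "located-near": 3}         # is-a links are inexpensive, other links cost more

GRAPH = defaultdict(list)                     # undirected adjacency for traversal
for a, b, link in RAW_EDGES:
    GRAPH[a].append((b, COST[link]))
    GRAPH[b].append((a, COST[link]))

def cheapest(src, dst):
    """Dijkstra: cost of the cheapest link path between two concepts."""
    queue, seen = [(0, src)], set()
    while queue:
        cost, node = heappop(queue)
        if node == dst:
            return cost
        if node in seen:
            continue
        seen.add(node)
        for nxt, c in GRAPH[node]:
            heappush(queue, (cost + c, nxt))
    return float("inf")

# Disambiguate "bank" in the context of "trout": the sense that connects to the
# context concept most cheaply counts as the more coherent reading.
scores = {sense: cheapest("trout", sense) for sense in ("bank-river", "bank-institution")}
print(min(scores, key=scores.get), scores)    # bank-river {'bank-river': 5, 'bank-institution': 9}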

  18. Semantics II

  19. Desiderata for a Semantic Representation • Verifiability – the semantic representation must be compatible with the knowledge (base) of the system. • Canonical Form – assign the same representation to different surface expressions which have essentially the same meaning. • Ambiguity and Vagueness – the representation should (in relation to knowledge base or information system access etc.) be unambiguous and precise.

  20. Semantics - Connecting Words and Worlds [Diagram relating NL Input, Semantic Representation, Knowledge Representation / World State (KB: T-Box, A-Box), and NL Output]

  21. Representation of Meaning Representation of meaning for natural language sentences: • Semantic Representation Language (in most cases) = some kind of formal language + semantic primitives • For example: First Order Predicate Logic with a specific set of predicates and functions

  22. Semantic Representations Semantic Representation based on some form of (formal) Representation Language: • Semantic Networks • Conceptual Dependency Graphs • Case Frames • Ontologies • DL and similar KR languages • First-Order Predicate Logic

  23. Example - NL Database Access Imagine a database access using natural language, i.e. questions to the DB posed in natural language. Example: a DB of courses in the CS department. Pose questions like: • Who is teaching Advanced AI in Fall 2008? • Is John Anderson teaching this term? • What is Jacky Baltes teaching this term? • Who is teaching AI at the University of Winnipeg? • Who is teaching an AI related course this term?
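Purely as an illustration (the table rows and role names below are invented): once such a question has been analysed into a small semantic form, e.g. teach(teacher = ?, course = Advanced AI, term = Fall 2008), answering it amounts to a lookup over the course relation:

COURSES = [                                   # toy DB relation: (teacher, course, term); rows invented
    ("John Anderson", "Advanced AI", "Fall 2008"),
    ("Jacky Baltes", "Robotics", "Fall 2008"),
]
ROLES = ("teacher", "course", "term")

def answer(query):
    """query maps roles to required values; the role with value '?' is returned."""
    wanted = next(r for r, v in query.items() if v == "?")
    results = []
    for row in COURSES:
        record = dict(zip(ROLES, row))
        if all(v == "?" or record[r] == v for r, v in query.items()):
            results.append(record[wanted])
    return results

# Semantic form of "Who is teaching Advanced AI in Fall 2008?"
print(answer({"teacher": "?", "course": "Advanced AI", "term": "Fall 2008"}))   # ['John Anderson']
# Semantic form of "What is Jacky Baltes teaching this term?" (term resolved to Fall 2008)
print(answer({"teacher": "Jacky Baltes", "course": "?", "term": "Fall 2008"}))  # ['Robotics']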

  24. Example Story: My car was stolen two weeks ago. They found it last week. • direct representation of meaning • knowledge • inference

  25. Example Primitives in the logic language FOPL:
      my car as individual constant: my_car, car_1
      can make a statement about ownership of the car: owns(car_1, I), i.e. owns(car_1, Speaker)
      2-place predicate owns, with one place for the object / car and one place for the owner, filled with a variable or constant: owns(car_1, Speaker)
      Someone owns car_1: ∃x: owns(car_1, x)
      I own all cars: ∀x: car(x) → owns(x, Speaker)

  26. Example Primitives in the logic language FOPL: stolen as a predicate applied to the car: stolen(car_1)
      as an event, specified with a variable for the event and a constant for the specific event:
      stolen-event: ∃e, x: event(e) ∧ stolen(e, x) ∧ x = car_1, or ∃e: event(e) ∧ stolen(e, car_1)
      can make additional specifications, e.g. tense, time, location:
      ∃e: event(e) ∧ stolen(e, car_1) ∧ past(e) ∧ time(e) = UT - 2 weeks / time(e, UT - 2 weeks) ∧ loc(e) = street#1
      past(e): event time before utterance time; time: utterance time - 2 weeks; loc: refers to an identified street

  27. Example Primitives in the logic language FOPL: They found it last week. found(car_1, t) ∧ time(t) ∧ t = (UT - 1 week)

  28. NL and Logic Levels of Representation and Transformation • direct representation of meaning • translation into logic expression • knowledge • stored information about relations etc., e.g. as rules • ontology; terminology; proper axioms • inference • gain additional information, conclusions • combine semantic representation and knowledge

  29. Example
      Concrete world description:
      car(my_car), stolen(my_car, t1), owns(speaker, my_car), found(police, my_car, t2), t1 < t2
      General world knowledge:
      stolen(x, t1) ∧ owns(y, x) ∧ found(police, x, t2) implies has(y, x, t3), for some time points t1, t2, t3 with t1 < t2 < t3
      What can you infer if you instantiate x with my_car?

  30. Reichenbach's Approach to English Tenses U = Time of Utterance, R = Reference Time, E = Event Time. [Fig. 14.4 from Jurafsky and Martin, p. 530]

  31. Example
      car(my_car), stolen(my_car, t1), owns(speaker, my_car), found(police, my_car, t2), t1 < t2
      stolen(x, t1) ∧ owns(y, x) ∧ found(police, x, t2) implies has(y, x, t3), for some time points t1, t2, t3 with t1 < t2 < t3
      stolen(my_car, t1) ∧ owns(speaker, my_car) ∧ found(police, my_car, t2) → has(speaker, my_car, t3)
      Pattern matching with variable binding: unification; inference.
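A minimal Python sketch of this pattern-matching step (a toy matcher with invented names, not a full first-order inference engine): unify the rule's antecedent literals against the facts, accumulating variable bindings, then emit the instantiated conclusion:

FACTS = [("stolen", "my_car", "t1"),
         ("owns", "speaker", "my_car"),
         ("found", "police", "my_car", "t2")]

# rule: stolen(x, t1) ∧ owns(y, x) ∧ found(police, x, t2)  implies  has(y, x, t3)
ANTECEDENT = [("stolen", "?x", "t1"),
              ("owns", "?y", "?x"),
              ("found", "police", "?x", "t2")]
CONCLUSION = ("has", "?y", "?x", "t3")

def unify(pattern, fact, bindings):
    """Match one literal against one fact, extending the variable bindings."""
    if len(pattern) != len(fact):
        return None
    env = dict(bindings)
    for p, f in zip(pattern, fact):
        if p.startswith("?"):                 # variable: bind it, or check an existing binding
            if env.get(p, f) != f:
                return None
            env[p] = f
        elif p != f:                          # constants must match exactly
            return None
    return env

def derive(antecedent, bindings):
    """Try all facts for each remaining literal; return instantiated conclusions."""
    if not antecedent:
        return [tuple(bindings.get(t, t) for t in CONCLUSION)]
    results = []
    for fact in FACTS:
        env = unify(antecedent[0], fact, bindings)
        if env is not None:
            results += derive(antecedent[1:], env)
    return results

print(derive(ANTECEDENT, {}))                 # [('has', 'speaker', 'my_car', 't3')]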

  32. Example stolen(x, t1) ∧ owns(y, x) → ... implication? Express that, if something is stolen, the owner does not have it anymore!

  33. Predicate-Argument Structure • Verb-centered approach • Thematic roles, case roles • Describe semantic structure based on verb and associated roles filled by other parts of the sentence (phrases). • Representation using e.g. logic: • Transform structured input sentence (syntax!) into expression in predicate logic. • Usually based on central predicate, the verb, or equivalent, like ‘be’+ adjective etc. • Other parts of the sentence directly related to the verb go into the central predicate.

  34. Verb Subcategorization Consider possible subcat frames of verbs. Example: 3 different kinds of want:
      1. NP want NP: I want money. want1(Speaker, money)
      2. NP want Inf-VP: He wants to go home. want2(he, to_go_home)
      3. NP want NP Inf-VP: I want him to go away. want3(I, him, to_go_away)
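One way to picture this (an invented encoding, purely illustrative): pair each subcat frame with a constructor that builds the corresponding predicate; a parser would select whichever frame matches the observed complements:

SUBCAT_WANT = {                               # invented encoding of the three frames
    ("NP", "want", "NP"):
        lambda subj, obj: ("want1", subj, obj),
    ("NP", "want", "Inf-VP"):
        lambda subj, vp: ("want2", subj, vp),
    ("NP", "want", "NP", "Inf-VP"):
        lambda subj, obj, vp: ("want3", subj, obj, vp),
}

print(SUBCAT_WANT[("NP", "want", "NP")]("Speaker", "money"))                  # want1(Speaker, money)
print(SUBCAT_WANT[("NP", "want", "Inf-VP")]("he", "to_go_home"))              # want2(he, to_go_home)
print(SUBCAT_WANT[("NP", "want", "NP", "Inf-VP")]("I", "him", "to_go_away"))  # want3(I, him, to_go_away)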

  35. Example - Restaurant 'Maharani' Example: Restaurant 'Maharani' • Maharani serves vegetarian food. • Maharani is a vegetarian restaurant. • Maharani is close to ICSI. Write down logical formulas representing the three different sentences.

  36. Logic Formalisms Lambda Calculus

  37. Semantics - Lambda Calculus 1 Logic representations often involve the Lambda Calculus: • represent central phrases (e.g. the verb) as λ-expressions • a λ-expression is like a function which can be applied to terms • insert the semantic representation of complement or modifier phrases etc. in place of the variables
      ∀x, y: loves(x, y)                FOPL sentence
      λx λy loves(x, y)                 λ-expression
      λx λy loves(x, y) (John) ⇒ λy loves(John, y)
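The same currying and application can be played through with Python lambdas (illustrative only; the predicate loves is represented here as a tuple constructor):

loves = lambda x: lambda y: ("loves", x, y)   # λx λy loves(x, y); the predicate as a tuple constructor
loves_john = loves("John")                    # apply to John: λy loves(John, y)
print(loves_john("Mary"))                     # ('loves', 'John', 'Mary')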

  38. Semantics - Lambda Calculus 2 Transform a sentence into a lambda expression: "Ay Caramba is close to ICSI."
      specific: close-to(Ay Caramba, ICSI)
      general: ∃x, y: close-to(x, y) ∧ x = Ay Caramba ∧ y = ICSI
      Lambda conversion, form the λ-expression: λx λy close-to(x, y) (Ay Caramba)
      Lambda reduction, apply the λ-expression: λy close-to(Ay Caramba, y), then close-to(Ay Caramba, ICSI)

  39. Semantics - Lambda Calculus 3 • Lambda expressions as basis for semantic representations • attached to words and syntactic categories in grammar rules • passed between nodes during parsing, according to the grammar
      Example: semantics of the verb 'serve'
      Verb → serve { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
      Reification denotes the use of predicates as constants. It allows the use of "predicates over predicates", e.g. IS-A(serving, event) or IS-A(restaurant, location). e: serving - event, action; reification of the verb.

  40. Semantics - Lambda Calculus 4 Lambda expressions are constructed from the central expression, inserting semantic representations for the subject and complement phrases:
      Verb → serve { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
      Fill in appropriate expressions for y and x, e.g. Ay Caramba for y and steak for x, with x derived from the direct NP = object NP of the sentence, and y from the subject NP of the verb.
      e: serving - S, event; y: restaurant - NP, subj.; x: food - NP, dir. obj.

  41. Semantics - Lambda Calculus 5 The complete semantic representation is produced by combining the semantic feature structures of the phrases in the sentence, according to extended grammar rules.
      Verb → serves { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
      Apply the lambda expression representing the verb semantics to the semantic representations of the NPs: successively apply the lambda expression (for serves in the example above), filling the x-position with the semantics of the object NP, and the y-position with the representation of the subject NP.

  42. Semantics - Lambda Calculus 6 Extend the grammar with semantic attachments, e.g.
      NP → ProperNoun { ProperNoun.sem }
      The "base" semantic attachment is determined through access to a lexicon or an ontology. It corresponds to the concept associated with the lexical word, or in the simplest form just the lexical word itself. Example: Ay Caramba as an individual constant, or meat as a (reified) concept.

  43. Semantics - Lambda Calculus 7 Constructive Semantics During parsing, these semantic attachments are combined, according to the grammar rules, to form more complete representations, finally covering the whole sentence. Combine and pass upwards semantic attachments, e.g.
      S → NP VP { VP.sem(NP.sem) }
      VP → Verb NP { Verb.sem(NP.sem) }
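A hedged sketch of how these attachments compose for "AyCaramba serves meat" (representing the FOPL conjuncts as a set of tuples and generating the event variable from a counter are implementation choices of this sketch, not the book's notation):

import itertools
new_event = (f"e{i}" for i in itertools.count(1))      # fresh event variables e1, e2, ...

# Verb -> serves { λx λy ∃e IS-A(e, Serving) ∧ Server(e, y) ∧ Served(e, x) }
def serves_sem(x):                                     # x: semantics of the object NP
    def with_subject(y):                               # y: semantics of the subject NP
        e = next(new_event)
        return {("IS-A", e, "Serving"), ("Server", e, y), ("Served", e, x)}
    return with_subject

# NP -> ProperNoun { ProperNoun.sem }: the base attachment is just the constant / concept
ayc_sem, meat_sem = "AyCaramba", "Meat"

vp_sem = serves_sem(meat_sem)                          # VP -> Verb NP  { Verb.sem(NP.sem) }
s_sem = vp_sem(ayc_sem)                                # S  -> NP VP    { VP.sem(NP.sem) }
print(sorted(s_sem))
# [('IS-A', 'e1', 'Serving'), ('Served', 'e1', 'Meat'), ('Server', 'e1', 'AyCaramba')]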

  44. Semantic Representation in BeRP Parse tree with semantic attachments for the sentence "AyCaramba serves meat." [Figure 15.3 from Jurafsky and Martin, p. 554]

  45. Semantics - Lambda Calculus 8 Modifiers can be added into the semantic description as part of the grammar rules, by intersection of concepts:
      Nominal → Adj Nominal { λx. Nominal.sem(x) ∧ IS-A(x, Adj.sem) }
      Example: a "cheap restaurant": λx. IS-A(x, restaurant) ∧ IS-A(x, cheap)
      Problem if the intersection of concepts is misleading, e.g. "former friend". Use a "modification" rule instead:
      Nominal → Adj Nominal { λx. Nominal.sem(x) ∧ AdjMod(x, Adj.sem) }
      Using this rule for "cheap restaurant": λx. IS-A(x, restaurant) ∧ AdjMod(x, cheap), where "cheap" modifies the "restaurant" in a specific way.
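A small sketch of the two composition rules above (nominal meanings are modelled as functions from a variable to a set of conjuncts; the encoding is illustrative, not from the slides):

def restaurant_sem(x):                        # Nominal.sem for "restaurant"
    return {("IS-A", x, "restaurant")}

def friend_sem(x):                            # Nominal.sem for "friend"
    return {("IS-A", x, "friend")}

def intersective(adj, nominal_sem):           # Nominal -> Adj Nominal, intersection of concepts
    return lambda x: nominal_sem(x) | {("IS-A", x, adj)}

def adj_mod(adj, nominal_sem):                # Nominal -> Adj Nominal, "modification" rule
    return lambda x: nominal_sem(x) | {("AdjMod", x, adj)}

print(sorted(intersective("cheap", restaurant_sem)("x")))
# "cheap restaurant": [('IS-A', 'x', 'cheap'), ('IS-A', 'x', 'restaurant')]
print(sorted(adj_mod("former", friend_sem)("x")))
# "former friend":    [('AdjMod', 'x', 'former'), ('IS-A', 'x', 'friend')]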

  46. Semantics - Problems Problems with modal verbs: • apply to a predicate structure (another verb) • referential opaqueness • standard implications do not hold Example: I think Joe's flight leaves at 7pm. (think (Speaker, leaves (Joe's flight, 7 pm))) Add: Joe's flight is BA727. BA727 is delayed. Add: I think I go home. Problem: in FOPL a predicate cannot be applied to a formula. What does the Speaker think now? Should I stay or should I go?

  47. Parsing with Semantic Features Modified Earley Algorithm. Figure 15.5, Jurafsky and Martin, p. 570.

  48. References Jurafsky, D. & J. H. Martin, Speech and Language Processing, Prentice-Hall, 2000. (Chapters 9 and 10) Helmreich, S., From Syntax to Semantics, Presentation in the 74.419 Course, November 2003.
