
Representing Meaning



Presentation Transcript


  1. Representing Meaning Lecture 18 12 Sep 2007

  2. Transition • First we did words (morphology) • Then simple sequences of words • Then we looked at true syntax • Now we’re moving on to meaning. Where some would say we should have started to begin with.

  3. Meaning • Language is useful and amazing because it allows us to encode/decode… • Descriptions of the world • What we’re thinking • What we think about what other people think • Don’t be fooled by how natural and easy it is… In particular, you never really… • Utter word strings that match the world • Say what you’re thinking • Say what you think about what other people think

  4. Meaning • You’re simply uttering linear sequences of words such that when other people read/hear and understand them they come to know what you think of the world.

  5. Meaning Representations • We’re going to take the same basic approach to meaning that we took to syntax and morphology • We’re going to create representations of linguistic inputs that capture the meanings of those inputs. • But unlike parse trees and the like these representations aren’t primarily descriptions of the structure of the inputs… In most cases, meaning representations are simultaneously descriptions of the meanings of utterances and of some potential state of affairs in some world.

  6. Introduction • Meaning representation languages: capture the meaning of linguistic utterances in a formal notation that makes semantic processing possible • Example: deciding what to order at a restaurant by reading a menu, or giving advice about where to go for dinner • Requires knowledge about food, its preparation, what people like to eat, and what restaurants are like • Example: answering a question on an exam • Requires background knowledge about the topic of the question • Example: learning to use a piece of software by reading a manual • Requires knowledge about current computers, the specific software, similar software applications, and about users in general.

  7. Semantic Analysis • Semantic analysis: mapping between language and real life • I have a car: 1. First-Order Logic: ∃x,y: Having(x) ∧ Haver(Speaker, x) ∧ HadThing(y, x) ∧ Car(y) 2. Semantic Network: a Having node with links Haver → Speaker and Had-thing → Car 3. Conceptual Dependency Diagram: Car ←POSS-BY← Speaker 4. Frame-Based Representation: Having [Haver: Speaker, HadThing: Car]
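The frame-based option can be sketched as a plain Python dictionary; the slot names follow the slide, while the dict encoding and helper function are illustrative, not from any particular toolkit.

```python
# Frame-based representation of "I have a car", as a sketch.
# Slot names (Haver, HadThing) come from the slide; the encoding is invented.
frame = {
    "predicate": "Having",
    "Haver": "Speaker",
    "HadThing": "Car",
}

def filler(frame, role):
    """Return the filler of a semantic role, or None if the slot is absent."""
    return frame.get(role)

print(filler(frame, "Haver"))  # Speaker
```

The same information could equally be drawn as a semantic network; the frame just makes the role labels explicit as keys.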

  8. Semantic analysis • A meaning representation consists of structures composed from a set of symbols, or representational vocabulary.

  9. Why are meaning representations needed? • What should they do for us? • Example: giving advice about restaurants to tourists. A computer system that accepts spoken language queries from tourists and constructs appropriate responses by using a knowledge base of relevant domain knowledge. We need representations that • Permit us to reason about their truth (relationship to some world) • Permit us to answer questions based on their content • Permit us to perform inference (answer questions and determine the truth of things we don’t actually know)

  10. Semantic Processing • Touchstone application is often question answering • Can a machine answer questions involving the meaning of some text or discourse? • What kind of representations do we need to mechanize that process?

  11. Verifiability • Verifiability: the ability to compare the state of affairs described by a representation to the state of affairs in some world modeled in a knowledge base. • Example: Does Anarkali serve vegetarian food? • Knowledge base (KB) • Sample entry in KB: Serves(Anarkali, VegetarianFood) • Convert the question to logical form and verify its truth value against the knowledge base
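Verifiability can be sketched with a toy knowledge base of ground facts; the tuple encoding and the closed-world assumption are simplifications made up for this illustration.

```python
# Toy KB: ground atomic formulas stored as (predicate, arg1, arg2) tuples.
KB = {
    ("Serves", "Anarkali", "VegetarianFood"),
    ("Serves", "Anarkali", "Naan"),
}

def verify(predicate, *args):
    """Truth of a ground atomic formula against the KB.
    Closed-world assumption: anything not in the KB counts as false."""
    return (predicate, *args) in KB

print(verify("Serves", "Anarkali", "VegetarianFood"))  # True
```

Answering the tourist's question amounts to translating it into `("Serves", "Anarkali", "VegetarianFood")` and looking that tuple up.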

  12. Unambiguousness • Example: I want to eat someplace near Chowringhee. (multiple interpretations) • Interpretation is important • Preferred interpretations • Regardless of ambiguity in the input, it is critical that a meaning representation language support representations that have a single unambiguous interpretation.

  13. Vagueness • Vagueness: I want to eat Italian food. (Which particular food?) • A meaning representation language must support some vagueness

  14. Canonical form • Inputs that have the same meaning should have the same meaning representation. • Distinct sentences having the same meaning: • Does Anarkali have vegetarian dishes? • Do they have vegetarian food at Anarkali? • Are vegetarian dishes served at Anarkali? • Does Anarkali serve vegetarian fare? • Words have different senses, and multiple words may have the same sense • Having vs. serving • Food vs. fare vs. dishes (each is ambiguous but one sense of each matches the others) • Alternative syntactic analyses have related meanings (e.g. active vs. passive)
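One way to picture canonicalization: map word senses onto shared predicate and constant names, so that all four paraphrases produce one representation. The sense inventories below are invented for the example.

```python
# Hypothetical sense inventories: several surface words share one sense.
VERB_SENSE = {"serve": "Serves", "have": "Serves", "offer": "Serves"}
NOUN_SENSE = {"vegetarian food": "VegetarianFood",
              "vegetarian fare": "VegetarianFood",
              "vegetarian dishes": "VegetarianFood"}

def canonical(subject, verb, obj):
    """Map a (subject, verb, object) triple to its canonical logical form."""
    return (VERB_SENSE[verb], subject, NOUN_SENSE[obj])

# Distinct paraphrases, identical canonical form:
a = canonical("Anarkali", "serve", "vegetarian fare")
b = canonical("Anarkali", "have", "vegetarian dishes")
print(a == b)  # True
```

Real word-sense disambiguation is much harder than a table lookup, of course; the point is only that canonicalization happens at the level of senses, not surface words.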

  15. Inference and variables; expressiveness • Inference and variables: • Can vegetarians eat at Anarkali? • I’d like to find a restaurant that serves vegetarian food. • Serves (x,VegetarianFood) • System’s ability to draw valid conclusions based on the meaning representations of inputs and its store of background knowledge. • Expressiveness: • system must be able to handle a wide range of subject matter
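Variables turn verification into a search for bindings: instead of checking one ground fact, collect every x that satisfies Serves(x, VegetarianFood). The tuple encoding of the KB is invented for illustration.

```python
# Queries with variables: find all bindings for x in Serves(x, VegetarianFood).
KB = {
    ("Serves", "Anarkali", "VegetarianFood"),
    ("Serves", "Joes", "Burgers"),
}

def bindings(predicate, value):
    """All x such that predicate(x, value) is in the KB."""
    return sorted(x for (p, x, y) in KB if p == predicate and y == value)

print(bindings("Serves", "VegetarianFood"))  # ['Anarkali']
```

A yes/no question like "Can vegetarians eat at Anarkali?" then reduces to asking whether the set of bindings is non-empty.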

  16. Semantic Processing • We’re going to discuss 2 ways to attack this problem (just as we did with parsing) • There’s the theoretically motivated correct and complete approach… • Computational/Compositional Semantics • And there are practical approaches that have some hope of being useful and successful. • Information extraction

  17. Meaning Structure of Language • The various methods by which human languages convey meaning • Form-meaning associations • Word-order regularities • Tense systems • Conjunctions • Quantifiers • A fundamental predicate-argument structure • Asserts that specific relationships / dependencies hold among the concepts underlying the constituent words and phrases • The underlying structure permits the creation of a single composite meaning representation from the meanings of the various parts.

  18. Predicate-argument structure • Sentences and their syntactic argument frames: • I want Italian food. → NP want NP • I want to spend less than five dollars. → NP want Inf-VP • I want it to be close by here. → NP want NP Inf-VP • The syntactic frames specify the number, position and syntactic category of the arguments that are expected to accompany a verb. • Thematic roles: e.g. the entity doing the wanting vs. the entity that is wanted (linking surface arguments with the semantic/case roles) • Syntactic selection restrictions: *I found to fly to Dallas. (find does not license an infinitival complement) • Semantic selection restrictions: #The risotto wanted to spend less than ten dollars. • Make a reservation for this evening for a table for two persons at eight: Reservation(Hearer, Today, 8PM, 2)
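The syntactic frames can be sketched as a lookup table keyed by verb; the frame sets below are simplified assumptions covering only the slide's examples.

```python
# Simplified syntactic argument frames (only the slide's examples are covered).
FRAMES = {
    "want": {("NP",), ("Inf-VP",), ("NP", "Inf-VP")},
    "find": {("NP",)},  # no infinitival complement: *I found to fly to Dallas
}

def licensed(verb, complements):
    """Does some frame of the verb license this sequence of complements?"""
    return tuple(complements) in FRAMES.get(verb, set())

print(licensed("want", ["Inf-VP"]))  # True: I want to spend less than five dollars
print(licensed("find", ["Inf-VP"]))  # False: syntactic selection restriction violated
```

Semantic selection restrictions (the risotto example) would need a further check on the *meanings* of the fillers, not just their syntactic categories.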

  19. Any useful meaning representation language must be organized in a way that supports the specification of semantic predicate-argument structures. • Variable arity predicate-argument structures • The semantic labeling of arguments to predicates • The statement of semantic constraints on the fillers of argument roles

  20. Model-theoretic semantics • Basic notions shared by representation schemes Ability to represent • Objects • Properties of objects • Relations among objects • A model is a formal construct that stands for the particular state of affairs in the world that we are trying to represent. • Expressions in a meaning representation language will be mapped in a systematic way to the elements of the model.

  21. Vocabulary of a meaning representation language • Non-logical vocabulary: an open-ended set of names for the objects, properties and relations (may appear as predicates, nodes, labels on links, labels in slots in frames, etc.) • Logical vocabulary: a closed set of symbols, operators, quantifiers, links, etc. that provide the formal means for composing expressions • Each element of the non-logical vocabulary must have a denotation in the model. • Domain of a model: the set of objects that are part of the application • Properties of objects are captured by sets (of the domain elements having the property) • Relations denote sets of tuples of elements of the domain

  22. Interpretation: a mapping that maps from the non-logical vocabulary of our meaning representation to the corresponding denotations in the model.

  23. Representational Schemes • We’re going to make use of First Order Predicate Calculus (FOPC) as our representational framework • Not because we think it’s perfect • All the alternatives turn out to be either too limiting or notational variants of FOPC

  24. FOPC • Allows for… • The analysis of truth conditions • Allows us to answer yes/no questions • Supports the use of variables • Allows us to answer questions through the use of variable binding • Supports inference • Allows us to answer questions that go beyond what we know explicitly

  25. FOPC • This choice isn’t completely arbitrary or driven by the needs of practical applications • FOPC reflects the semantics of natural languages because it was designed that way by human beings • In particular…

  26. First-order predicate calculus (FOPC) • Formula → AtomicFormula | Formula Connective Formula | Quantifier Variable… Formula | ¬Formula | (Formula) • AtomicFormula → Predicate(Term…) • Term → Function(Term…) | Constant | Variable • Connective → ∧ | ∨ | ⇒ • Quantifier → ∀ | ∃ • Constant → A | VegetarianFood | Anarkali • Variable → x | y | … • Predicate → Serves | Near | … • Function → LocationOf | CuisineOf | …
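The grammar translates directly into a small abstract-syntax encoding; the class names below are ours, chosen to mirror the nonterminals, and cover only a fragment of the grammar.

```python
from dataclasses import dataclass

@dataclass
class Const:          # Constant -> VegetarianFood | Anarkali | ...
    name: str

@dataclass
class Pred:           # AtomicFormula -> Predicate(Term...)
    name: str
    args: tuple

@dataclass
class And:            # Formula -> Formula ∧ Formula  (the grammar is recursive)
    left: object
    right: object

# Serves(Anarkali, VegetarianFood) ∧ Near(Anarkali, RC)
f = And(Pred("Serves", (Const("Anarkali"), Const("VegetarianFood"))),
        Pred("Near", (Const("Anarkali"), Const("RC"))))
print(f.left.name)  # Serves
```

Because Formula appears on both sides of its own rule, formulas of arbitrary depth can be built by nesting `And` nodes.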

  27. Example • I only have five dollars and I don’t have a lot of time. • Have(Speaker, FiveDollars) ∧ ¬Have(Speaker, LotOfTime) • With variables: • Have(x, FiveDollars) ∧ ¬Have(x, LotOfTime) • Note: the grammar is recursive

  28. Semantics of FOPC • FOPC sentences can be assigned a value of true or false. • Anarkali is near RC. • Near (LocationOf (Anarkali), LocationOf (RC))
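Truth evaluation against a model can be sketched like this; the locations and the Near relation are invented denotations, chosen so the sentence comes out true.

```python
# Interpretation: LocationOf as a function (dict), Near as a set of pairs.
LOCATION_OF = {"Anarkali": "ParkStreet", "RC": "ParkStreet"}  # invented denotations
NEAR = {("ParkStreet", "ParkStreet")}

def near(a, b):
    """Truth of Near(LocationOf(a), LocationOf(b)) in this model."""
    return (LOCATION_OF[a], LOCATION_OF[b]) in NEAR

print(near("Anarkali", "RC"))  # True
```

Functions denote mappings over the domain and predicates denote sets of tuples, exactly as in the model-theoretic setup of slide 20.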

  29. Inference • Modus ponens: from α and α ⇒ β, infer β. • Example: from VegetarianRestaurant(Joe’s) and ∀x: VegetarianRestaurant(x) ⇒ Serves(x, VegetarianFood), infer Serves(Joe’s, VegetarianFood).
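The modus ponens step on the slide can be mechanized: scan the facts for anything matching the rule's antecedent and add the corresponding consequent. The tuple encoding is illustrative.

```python
# Facts: ground atoms as tuples; Joe's is a vegetarian restaurant.
facts = {("VegetarianRestaurant", "Joes")}

def apply_veg_rule(facts):
    """One modus ponens step for
    ∀x: VegetarianRestaurant(x) ⇒ Serves(x, VegetarianFood)."""
    derived = {("Serves", x, "VegetarianFood")
               for (pred, x) in facts if pred == "VegetarianRestaurant"}
    return facts | derived

facts = apply_veg_rule(facts)
print(("Serves", "Joes", "VegetarianFood") in facts)  # True
```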

  30. Uses of modus ponens • Forward chaining: as individual facts are added to the database, all derived inferences are generated • Backward chaining: starts from queries. Example: the Prolog programming language • father(X, Y) :- parent(X, Y), male(X). • parent(john, bill). • parent(jane, bill). • female(jane). • male(john). • ?- father(M, bill).
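A back-of-the-envelope Python rendering of the Prolog query ?- father(M, bill): work backwards from the goal, first solving parent(M, bill), then checking male(M). The data structures are ours, not Prolog's.

```python
# Facts from the slide's Prolog program.
PARENT = {("john", "bill"), ("jane", "bill")}
MALE = {"john"}

def father(child):
    """Backward chaining for father(X, child) :- parent(X, child), male(X)."""
    return sorted(x for (x, c) in PARENT if c == child and x in MALE)

print(father("bill"))  # ['john']
```

jane satisfies the parent subgoal but fails male(X), so she is pruned, just as Prolog would backtrack over her.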

  31. Variables and quantifiers • A restaurant that serves Mexican food near UM. • ∃x: Restaurant(x) ∧ Serves(x, MexicanFood) ∧ Near(LocationOf(x), LocationOf(UM)) • All vegetarian restaurants serve vegetarian food. • ∀x: VegetarianRestaurant(x) ⇒ Serves(x, VegetarianFood) • If this sentence is true, it remains true for any substitution of x. When the antecedent VegetarianRestaurant(x) is false for some x, the implication is vacuously true for that x.
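The universally quantified sentence can be checked in a finite model by encoding the implication as ¬antecedent ∨ consequent, which makes the vacuous-truth behavior explicit. The model contents are invented.

```python
# A tiny model (invented): domain, a property, and a relation.
DOMAIN = {"Anarkali", "Joes", "UM"}
VEG_RESTAURANT = {"Joes"}
SERVES = {("Joes", "VegetarianFood")}

def all_veg_serve_veg():
    """∀x: VegetarianRestaurant(x) ⇒ Serves(x, VegetarianFood).
    Encoded as ¬P(x) ∨ Q(x); a false antecedent makes it vacuously true."""
    return all(x not in VEG_RESTAURANT or (x, "VegetarianFood") in SERVES
               for x in DOMAIN)

print(all_veg_serve_veg())  # True
```

UM is not a vegetarian restaurant, so it can never falsify the sentence; only a vegetarian restaurant that fails to serve vegetarian food could.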

  32. Meaning Structure of Language • The semantics of human languages… • Display a basic predicate-argument structure • Make use of variables • Make use of quantifiers • Use a partially compositional semantics

  33. Predicate-Argument Structure • Events, actions and relationships can be captured with representations that consist of predicates and arguments to those predicates. • Languages display a division of labor where some words and constituents function as predicates and some as arguments.

  34. Predicate-Argument Structure • Predicates • Primarily Verbs, VPs, PPs, Sentences • Sometimes Nouns and NPs • Arguments • Primarily Nouns, Nominals, NPs, PPs • But also everything else; as we’ll see it depends on the context

  35. Example • Mary gave a list to John. • Giving(Mary, John, List) • More precisely • Gave conveys a three-argument predicate • The first arg is the subject • The second is the recipient, which is conveyed by the NP in the PP • The third argument is the thing given, conveyed by the direct object

  36. Not exactly • The statement • The first arg is the subject can’t be right. • Subjects can’t be givers. • We mean that the meaning underlying the subject phrase plays the role of the giver.

  37. Better • It turns out this representation isn’t quite as useful as it could be. • Giving(Mary, John, List) • Better would be a representation that names the roles explicitly, e.g. an event-based form: ∃e: Giving(e) ∧ Giver(e, Mary) ∧ Givee(e, John) ∧ Given(e, List)

  38. Predicates • The notion of a predicate just got more complicated… • In this example, think of the verb/VP providing a template like the following • The semantics of the NPs and the PPs in the sentence plug into the slots provided in the template

  39. Compositional Semantics • Compositional Semantics • Syntax-driven methods of assigning semantics to sentences

  40. Semantic Analysis • Semantic analysis is the process of taking in some linguistic input and assigning a meaning representation to it. • There are a lot of different ways to do this that make more or less (or no) use of syntax • We’re going to start with the idea that syntax does matter • The compositional rule-to-rule approach

  41. Semantic Processing • We’re going to discuss 2 ways to attack this problem (just as we did with parsing) • There’s the theoretically motivated correct and complete approach… • Computational/Compositional Semantics: create a FOL representation that accounts for all the entities, roles and relations present in a sentence. • And there are practical approaches that have some hope of being useful and successful. • Information extraction: do a superficial analysis that pulls out only the entities, relations and roles that are of interest to the consuming application.

  42. Compositional Analysis • Principle of Compositionality • The meaning of a whole is derived from the meanings of the parts • What parts? • The constituents of the syntactic parse of the input • What could it mean for a part to have a meaning?

  43. Example • AyCaramba serves meat

  44. Compositional Analysis

  45. Augmented Rules • We’ll accomplish this by attaching semantic attachments to our syntactic CFG rules • Abstractly: A → α1 … αn { f(α1.sem, …, αn.sem) } • This should be read as: the semantics we attach to A can be computed from some function applied to the semantics of A’s parts.

  46. Easy parts… • NP → PropNoun { PropNoun.sem } • NP → MassNoun { MassNoun.sem } • PropNoun → AyCaramba { AyCaramba } • MassNoun → meat { MEAT }

  47. Example • S → NP VP { VP.sem(NP.sem) } • VP → Verb NP { Verb.sem(NP.sem) } • Verb → serves { ??? }

  48. Lambda Forms • A simple addition to FOPC • Take a FOPC sentence with variables in it that are to be bound. • Allow those variables to be bound by treating the lambda form as a function with formal arguments
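Python's own lambdas can play the role of lambda forms and fill in the ??? from slide 47: give serves a curried meaning that consumes the object first and the subject second. The variable names and tuple encoding of the result are illustrative.

```python
# Verb.sem for "serves": λx. λy. Serves(y, x)  -- object first, subject second.
serves_sem = lambda obj: lambda subj: ("Serves", subj, obj)

# Lexical attachments from slide 46.
meat_sem = "Meat"            # {MEAT}
aycaramba_sem = "AyCaramba"  # {AyCaramba}

vp_sem = serves_sem(meat_sem)     # VP -> Verb NP : {Verb.sem(NP.sem)}
s_sem = vp_sem(aycaramba_sem)     # S  -> NP VP   : {VP.sem(NP.sem)}
print(s_sem)  # ('Serves', 'AyCaramba', 'Meat')
```

Each lambda application binds one formal argument, mirroring how the lambda form is reduced step by step as the parse tree is climbed.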

  49. Example

  50. Example
