

  1. Scripts: Common Sense Story Understanding

  2. Motivations • So far, you have used FOL, And/Or graphs, production rules, and semantic nets to represent knowledge. • Try to represent your behaviors, and the reasoning behind them, this morning from the time you got up to the time you left your place. • These AI technologies run into their limits when we want to represent common sense knowledge and reasoning. • Semantic nets, which you learned last time, can do some limited common sense reasoning. • In this lecture, you will also learn about scripts to strengthen your arsenal of common sense knowledge representation.

  3. Objectives • Common sense assumptions • Conceptual dependency theory • Restaurant script • Story understanders

  4. A really short story Sue went out to lunch. She sat at a table and called a waitress, who brought her a menu. She ordered a sandwich. • Why did the waitress bring a menu to Sue? • Who was the “she” who ordered a sandwich? • Who paid? • It is easy for us to answer these questions because we know many assumptions not explicitly mentioned in the story. • How do we get a computer to do the same thing? • How do we represent the everyday common sense assumptions that we know?

  5. Basic idea of common sense • Text: Vincent loves Mia. • Simple predicate: loves(vincent,mia) • Representation: • FOL: ∃x∃y (vincent(x) ∧ mia(y) ∧ loves(x,y)) • Common sense assumptions: ∀x (vincent(x) → man(x)), ∀x (mia(x) → woman(x)), ∀x (man(x) → ¬woman(x))
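A minimal sketch (not from the lecture) of how these common sense axioms could be applied mechanically: ground facts plus forward-chaining rules, with the man/woman exclusion checked at the end. The tuple encoding and rule format are illustrative.

```python
# Ground facts from the text "Vincent loves Mia".
facts = {("loves", "vincent", "mia"),
         ("vincent", "vincent"),   # vincent(vincent)
         ("mia", "mia")}           # mia(mia)

# Common sense rules: antecedent predicate implies consequent predicate.
rules = [("vincent", "man"),       # forall x (vincent(x) -> man(x))
         ("mia", "woman")]         # forall x (mia(x) -> woman(x))

# Forward chaining: apply rules until no new fact is derived.
changed = True
while changed:
    changed = False
    for ante, cons in rules:
        for fact in list(facts):
            if len(fact) == 2 and fact[0] == ante:
                new = (cons, fact[1])
                if new not in facts:
                    facts.add(new)
                    changed = True

# forall x (man(x) -> not woman(x)): flag anyone derived as both.
men = {f[1] for f in facts if f[0] == "man"}
women = {f[1] for f in facts if f[0] == "woman"}
inconsistent = men & women

print(("man", "vincent") in facts)  # True
print(inconsistent)                 # set(), so the text is consistent
```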

  6. Texts and Ambiguity • Usually, ambiguities cause many possible interpretations • Example: Butch walks into his modest kitchen. He opens the refrigerator. He takes out a milk and drinks it.


  10. Consistency checking • Inconsistent text: • Mia likes Vincent. • She does not like him. (“She” can only be Mia, which contradicts the first sentence.) • Two interpretations, only one consistent: • Mia likes Jody. • She does not like her. • Who does not like whom? • Jody does not like Mia.
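The pronoun resolution above can be sketched as a consistency filter: try each (she, her) binding and discard any reading that contradicts a stated fact. This is a hypothetical illustration, not the lecture's implementation.

```python
# Stated fact from "Mia likes Jody."
facts = {("likes", "mia", "jody")}

# Candidate (she, her) bindings for "She does not like her."
candidates = [("mia", "jody"), ("jody", "mia")]

def consistent(facts, she, her):
    # "She does not like her" asserts likes(she, her) is false;
    # it contradicts the facts if likes(she, her) is already stated.
    return ("likes", she, her) not in facts

readings = [(she, her) for she, her in candidates
            if consistent(facts, she, her)]
print(readings)  # [('jody', 'mia')]: Jody does not like Mia
```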

  11. Endow a computer with common sense • How do we get the computer to • disambiguate a sentence? • sort out inconsistencies? • know common sense? • One attempt is to standardize the semantic network for the English language. • The verb-oriented approach and conceptual dependency theory are such attempts. • Both parse a sentence by focusing on the verb.

  12. Verb-oriented approach • Single out the main verb (action) of the sentence. • This is the central node of the net. • Links from this node correspond to one of five cases: • agent • object • instrument • location • time (Figure: case frame representation of the sentence “Sarah fixed the chair with glue.”)
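The figure's case frame could be written down as a simple record, one slot per case; a sketch with illustrative field names:

```python
# Case frame for "Sarah fixed the chair with glue."
# One slot per case from the slide; unfilled cases stay None.
case_frame = {
    "verb": "fix",
    "agent": "Sarah",      # who performed the action
    "object": "chair",     # what the action was done to
    "instrument": "glue",  # what was used to do it
    "location": None,      # not stated in the sentence
    "time": "past",        # from the verb tense
}
print(case_frame["instrument"])  # glue
```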

  13. Conceptual dependency theory • → arrow indicates direction of dependency • ⇔ double arrow indicates the agent-verb relationship • p = past tense • o = object case relation • R = recipient case relation (Figure: conceptual dependency graph for “John throws the ball.”) This conceptual dependency graph is stored in the computer. It represents the canonical form for the meaning of “John throws the ball”; the original sentence could have been written in English, Chinese, etc.
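One way to store such a graph in the computer is as a record keyed by the relations in the legend above. A sketch; the field names are illustrative, and PROPEL is assumed as the primitive ACT for "throw" (apply physical force to an object):

```python
# Conceptual dependency record for "John throws the ball".
cd = {
    "act": "PROPEL",     # assumed primitive ACT for "throw"
    "agent": "John",     # double arrow: agent-verb relationship
    "object": "ball",    # o: object case relation
    "tense": "present",  # "throws"; p would mark past tense
}
print(cd["act"])  # PROPEL
```

Because the record is language-independent, a Chinese sentence with the same meaning would produce the same canonical record.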

  14. 4 basic syntactic units In conceptual dependency theory, there are 4 basic syntactic units, independent of the natural language. • ACT • action, verb • PP, picture producer • name, noun, pronoun • AA, action aider • modifiers of actions, adverbs • PA, picture aider • modifiers of objects, adjectives

  15. Some primitive ACTs Primitive ACTs represent all basic actions. • ATRANS: transfer an abstract relationship (e.g., give) • PTRANS: transfer the physical location of an object (go) • PROPEL: apply physical force to an object (push) • MOVE: move a body part by its owner (kick) • GRASP: grasp an object by an actor (grasp) • INGEST: take an object into an animal’s body (eat) • EXPEL: expel something from an animal’s body (cry) • MTRANS: transfer mental information (tell) • MBUILD: mentally build new information (decide) • CONC: conceptualize or think about an idea (think) • SPEAK: produce a sound (say) • ATTEND: focus a sense organ (listen)
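The ACT inventory above is small enough to keep as a lookup table, which a story understander could use to gloss a parsed CD record; a sketch:

```python
# Primitive ACTs from the slide, mapped to (meaning, example verb).
PRIMITIVE_ACTS = {
    "ATRANS": ("transfer an abstract relationship", "give"),
    "PTRANS": ("transfer the physical location of an object", "go"),
    "PROPEL": ("apply physical force to an object", "push"),
    "MOVE":   ("move a body part by its owner", "kick"),
    "GRASP":  ("grasp an object by an actor", "grasp"),
    "INGEST": ("take an object into an animal's body", "eat"),
    "EXPEL":  ("expel something from an animal's body", "cry"),
    "MTRANS": ("transfer mental information", "tell"),
    "MBUILD": ("mentally build new information", "decide"),
    "CONC":   ("conceptualize or think about an idea", "think"),
    "SPEAK":  ("produce a sound", "say"),
    "ATTEND": ("focus a sense organ", "listen"),
}

meaning, example = PRIMITIVE_ACTS["INGEST"]
print(example)  # eat
```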

  16. “John ate the egg” (Figure: conceptual dependency graph labeling the primitive act, the direction of dependency, the past tense marker, the object relation, the agent-verb relationship, and the direction of the object within the action.) This act consists of 2 sub-acts.

  17. Conceptual dependency graphs (Figure: the kinds of links that connect the syntactic units ACT, PP, and PA.)

  18. “John prevented Mary from giving a book to Bill” (Figure: conceptual dependency graph with labels for the past tense of “prevented,” John causing Mary’s act, the conditional/negation link, the direct object (the book), the recipient (Bill), and the past tense of “gave.”)

  19. Summary • Semantic networks can be used to represent meanings. • Conceptual dependency graphs can be used to standardize the meaning of sentences. • A set of these related graphs can be used to understand simple stories (screenplays). • Scripts technology is next. …

  20. Answer questions about a story John went to a restaurant. The hostess seated him. She gave him a menu. The waiter came to the table. John ordered a lobster. He was served quickly, left a large tip, and left the restaurant. Q: What did John eat? Lobster. Q: Who gave John the menu? The hostess. Q: Who gave John the lobster? The waiter. Q: Who paid the check? John. Q: What happened when John went to the table? The hostess gave him a menu and John sat down. Q: Why did John get a menu? So he could order. Q: Why did John give the waiter a large tip? Because he was served quickly. Q: How much time did John spend in the restaurant: 5 minutes? half an hour? an hour? 5 hours?

  21. Restaurant script Sue went out to lunch. She sat at a table and called a waitress, who brought her a menu. She ordered a sandwich. • Why did the waitress bring a menu to Sue? • Who was the “she” who ordered a sandwich? • Who paid? • People organize background knowledge into structures that correspond to typical situations (scripts) • Script: A typical scenario of what happens in… • a restaurant • a soccer game • a classroom • the morning: get up, eat breakfast, go to school

  22. Components of scripts • Entry conditions, pre-conditions • Facts that must be true to call the script • An open restaurant, a hungry customer that has some money • Results, post-conditions • Facts that will be true after the script has terminated • Customer is full and has less money; restaurant owner has more money

  23. Components of scripts (cont.) • Props • Typical things that support the content of the script • tables, menus, food • Roles • Actions that participants perform • Represented using conceptual dependency • Waiter takes orders, delivers food, presents bill • Scenes • The temporal decomposition of the script • Entering the restaurant, ordering, eating, …

  24. Scene 1: Enter customer • Script: restaurant • Roles: customer (S), waiter, chef, cashier • Reason: to get food (pleasure goes up, hunger goes down) • Scene 1: entering • S PTRANS S into restaurant • S ATTEND eyes to where empty tables are • S MBUILD mentally decides where to sit • S PTRANS S to table • S MOVE S to sit down

  25. Scene 2: Ordering (Figure: W brings the menu to S; S MTRANS “I want F” to W.)

  26. Last 2 scenes • Scene 3: eating • Cook ATRANS Food to Waiter • Waiter ATRANS F to S • S INGEST Food • Scene 4: exiting • W writes the restaurant bill • W PTRANS W to S • W ATRANS bill to S • S ATRANS tip to waiter • S PTRANS S to cashier • S ATRANS money to cashier • S PTRANS S out of restaurant
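The components from slides 22-23 and the four scenes could be collected into one plain data structure; a sketch, with scene steps abbreviated as (actor, ACT, detail) triples and names chosen for illustration:

```python
# Restaurant script as a data structure: entry conditions, results,
# props, roles, and scenes of conceptual-dependency steps.
restaurant_script = {
    "roles": ["S", "W", "Cook", "Cashier"],  # S = customer, W = waiter
    "props": ["table", "menu", "F", "bill", "money"],  # F = food
    "entry_conditions": ["S is hungry", "S has money",
                         "restaurant is open"],
    "results": ["S is full", "S has less money",
                "owner has more money"],
    "scenes": {
        "entering": [("S", "PTRANS", "S into restaurant"),
                     ("S", "MOVE", "S to sit down")],
        "ordering": [("S", "MTRANS", "'I want F' to W")],
        "eating":   [("W", "ATRANS", "F to S"),
                     ("S", "INGEST", "F")],
        "exiting":  [("S", "ATRANS", "money to Cashier"),
                     ("S", "PTRANS", "S out of restaurant")],
    },
}
print(list(restaurant_script["scenes"]))
```

Each scene is an ordered sequence, so a story understander can assume that earlier steps happened even when the story never mentions them.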

  27. Prolog implementation Sue went out to lunch. She sat at a table and called a waitress, who brought her a menu. She ordered a sandwich. • Invoke (call) the Restaurant script • Check entry conditions • Unify {S / Sue} • Assume that (typically) Sue is hungry and Sue has money • Unify people and things in the story with the roles and props in the script • {W / waitress, F / sandwich}
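The invocation steps above (though the lecture names Prolog) can be sketched in Python: bind story entities to script roles, then assume by default the entry conditions the story never states. The condition strings are illustrative.

```python
# Role/prop bindings unified from the Sue story.
bindings = {"S": "Sue", "W": "waitress", "F": "sandwich"}

# Entry conditions of the restaurant script, phrased over role names.
entry_conditions = ["S is hungry", "S has money"]

# Conditions unmentioned in the story are assumed true by default,
# with role names replaced by their story bindings.
assumed = [cond.replace("S", bindings["S"]) for cond in entry_conditions]
print(assumed)  # ['Sue is hungry', 'Sue has money']
```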

  28. Queries • Why did the waitress bring a menu to Sue? • Because S MTRANS “need menu” to W … • Sue tells “need menu” to waitress • Who was the “she” who ordered a sandwich? • S MTRANS “I want F” to W • Sue tells “I want a sandwich” to the waitress • Who paid? • S ATRANS money to M … • Sue gives money to the cashier
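Answering such queries amounts to finding the script step whose ACT and object match the question, then substituting the story's role bindings; a hypothetical sketch using the exiting scene:

```python
# Role bindings from the story (M = cashier, per the slide's notation).
bindings = {"S": "Sue", "W": "waitress", "F": "sandwich", "M": "cashier"}

# Exiting-scene steps as (actor, ACT, object, target) tuples.
exiting = [("W", "ATRANS", "bill", "S"),
           ("S", "ATRANS", "tip", "W"),
           ("S", "ATRANS", "money", "M")]

def who_did(act, obj):
    """Return the bound name of whoever performs `act` on `obj`."""
    for actor, a, o, target in exiting:
        if a == act and o == obj:
            return bindings[actor]
    return None  # no matching step in this scene

# Who paid? Find the step where money is ATRANSed.
print(who_did("ATRANS", "money"))  # Sue
```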

  29. SAM John went to a restaurant last night. He ordered steak. When he paid, he noticed he was running out of money. He hurried home since it had started to rain. • SAM (Script Applier Mechanism) reads in the above story. • Parses it into an internal conceptual dependency representation. • Binds the people and things in the story to roles and props in the script. • Uses defaults to fill in any missing info. • It can then answer questions such as: • Did John eat dinner last night? • How could John get a menu? • What did John buy? • Did John use cash or a credit card?

  30. Successful applications • SAM has progressed from reading simple made-up stories to newspaper reports about vehicle accidents, visiting dignitaries and several other knowledge domains. • SAM demonstrates its comprehension of a story by summarizing or paraphrasing it, and by answering questions about it. • Database queries • Chat within special domains: football, stock market, etc.

  31. Scripts are not so flexible Melissa was eating dinner at her favorite restaurant when a large piece of plaster fell from the ceiling and landed on her date. She then heard some more gunshots. • Was Melissa eating a date salad? • Was Melissa’s date plastered? • What did she do next? • Common sense reasoning is extremely difficult for computers.

  32. Problems with CDGs and scripts • Knowledge must be decomposed into fairly low-level primitives. • Primitive acts are not necessarily what humans do. • Impossible or difficult to find the correct set of primitives. • Can’t produce them automatically from natural language. • Scripts need to be built by hand. • Instability: minor changes in the input, such as a misspelling, cause a drastic downgrade in performance. • No learning: these systems do not improve automatically.

  33. Conclusion • Conceptual dependency graphs extend semantic nets by standardizing some verbs of the English language. • These primitive actions are used in the context of a scripted daily situation. • Common sense representation and reasoning remain extremely difficult for computers. • Some success has been achieved using conceptual dependency theory and scripts.
