

  1. Working with Discourse Representation Theory. Patrick Blackburn & Johan Bos. Lecture 5: Applying DRT

  2. Today • Given what we know about DRT, both from a theoretical and practical perspective, can we use it for practical applications?

  3. Outline • Spoken dialogue system with DRT • Using DRT and inference to control a mobile robot • Wide coverage parsing with DRT • Recognising Textual Entailment

  4. Human-Computer Dialogue • Focus on small domains • Grammatical coverage ensured • Background knowledge encoding • Spoken Dialogue system • Godot the robot • Speech recognition and synthesis • People could give Godot directions, ask it questions, tell it new information • Godot was a REAL robot

  5. Godot the Robot [photo: Godot with Tetsushi Oka]

  6. Typical conversation with Godot • Simple dialogues in beginning • Human: Robot? • Robot: Yes? • Human: Where are you? • Robot: I am in the hallway. • Human: OK. Go to the rest room!

  7. Adding DRT to the robot • The language model that the robot used for speech recognition returned DRSs • We used the model builder MACE and the theorem prover SPASS for inference • The model produced by MACE was used to find out what the robot should do • This was possible as MACE produces minimal models • Of course we also checked for consistency and informativity
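To make the consistency and informativity checks concrete, here is a minimal, self-contained sketch in Python. It brute-forces truth assignments over two toy atoms as a stand-in for the real pipeline, which translated DRSs to first-order logic and called MACE and SPASS; all names here are illustrative, not Godot's actual code.

    from itertools import product

    # Toy stand-ins for MACE/SPASS: formulas are Python predicates over a
    # truth assignment, and we brute-force all assignments.
    ATOMS = ["in_hallway", "in_office"]

    def models(formulas):
        """Yield every truth assignment satisfying all formulas."""
        for values in product([False, True], repeat=len(ATOMS)):
            v = dict(zip(ATOMS, values))
            if all(f(v) for f in formulas):
                yield v

    def consistent(formulas):               # the model builder finds a model?
        return next(models(formulas), None) is not None

    def entails(premises, conclusion):      # the theorem prover proves it?
        return all(conclusion(v) for v in models(premises))

    # Background: the robot is in exactly one place, namely the hallway.
    background = [lambda v: v["in_hallway"] != v["in_office"],
                  lambda v: v["in_hallway"]]

    office = lambda v: v["in_office"]         # "You are in my office."
    print(consistent(background + [office]))  # False -> "No, that is not true."

    hallway = lambda v: v["in_hallway"]       # "You are in the hallway."
    print(entails(background, hallway))       # True -> "Yes I know!"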

  8. Advanced conversation with Godot • Dealing with inconsistency and informativity • Human: Robot, where are you? • Robot: I am in the hallway. • Human: You are in my office. • Robot: No, that is not true. • Human: You are in the hallway. • Robot: Yes I know!

  9. Videos of Godot • Video 1: Godot in the basement of Buccleuch Place • Video 2: Screenshot of dialogue manager with DRSs and camera view of Godot

  10. Minimal Models • Model builders normally generate models by iteration over the domain size • As a side-effect, the output is a model with a minimal domain size • From a linguistic point of view, this is interesting, as there is no redundant information
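A toy illustration of this iteration strategy in Python: enumerate interpretations of unary predicates over domains of size 1, 2, 3, ... and return the first model found, which is therefore domain-minimal. A sketch of the idea, not MACE itself.

    from itertools import product

    def build_model(phi, predicates, max_size=4):
        """Return the first (hence domain-minimal) model satisfying phi."""
        for n in range(1, max_size + 1):
            domain = range(n)
            # every way of assigning each predicate a subset of the domain
            for ext in product(*(list(product([False, True], repeat=n))
                                 for _ in predicates)):
                model = {p: {d for d in domain if ext[i][d]}
                         for i, p in enumerate(predicates)}
                if phi(domain, model):
                    return n, model
        return None

    # "A light is on": exists x. light(x) & on(x)
    phi = lambda dom, m: any(x in m["light"] and x in m["on"] for x in dom)
    print(build_model(phi, ["light", "on"]))
    # -> (1, {'light': {0}, 'on': {0}}): one entity, no redundant information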

  11. Using models • Examples: Turn on a light. Turn on every light. Turn on everything except the radio. Turn off the red light or the blue light. Turn on another light.
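How such a minimal model can drive behaviour, as a sketch: compute a minimal model of the desired state and act on its difference from the current state. The set-based state and the action names are assumptions for illustration.

    lights = {"red", "blue", "kitchen"}
    on_now = {"blue"}                        # current state of the environment

    def act(goal_on):
        """Issue the actions that turn the current state into the goal model."""
        for d in sorted(goal_on - on_now):
            print("turn on", d)
        for d in sorted(on_now - goal_on):
            print("turn off", d)

    # "Turn on every light.": the minimal model has all lights in the on-set
    act(goal_on=lights)

    # "Turn on a light.": a minimal model adds just one light to the on-set
    act(goal_on=on_now | {sorted(lights - on_now)[0]})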

  12. Adding presupposition • Godot was connected to an automated home environment • One day, I asked Godot to switch on all the lights • However, Godot refused to do this, responding that it was unable to do so. • Why was that? • At first I thought that the theorem prover made a mistake. • But it turned out that one of the lights was already on.

  13. Intermediate Accommodation • Because I had coded "switch on X" with the precondition that X is not on, the theorem prover found a proof of inconsistency. • Coding this as a presupposition instead would not give an inconsistency, but a beautiful case of intermediate accommodation. • In other words: Switch on all the lights! [All lights are off; switch them on.] [= Switch on all the lights that are currently off.]
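The contrast, sketched with a toy state as before: treating "X is not on" as a hard precondition makes the universal command inconsistent as soon as one light is on, while intermediate accommodation restricts the quantifier to the lights that are off. Again an illustrative sketch, not the robot's actual code.

    lights = {"red", "blue", "green"}
    on_now = {"red"}                         # one light is already on

    # Precondition reading: every light must be off before we can comply,
    # so the theorem prover derives an inconsistency and Godot refuses.
    if any(x in on_now for x in lights):
        print("cannot comply: some light is already on")

    # Intermediate accommodation: the presupposition restricts "all the
    # lights" to those currently off, and those get switched on.
    for x in sorted(lights - on_now):
        print("turn on", x)                  # blue, green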

  14. Sketch of resolution

  15. Global Accommodation

  16. Intermediate Accommodation

  17. Local Accommodation

  18. Outline • Spoken dialogue system with DRT • Using DRT and inference to control a mobile robot • Wide coverage parsing with DRT • Recognising Textual Entailment

  19. Wide-coverage DRT • Nowadays we have robust wide-coverage parsers that use stochastic methods to produce a parse tree • Trained on the Penn Treebank • Examples are the parsers of Collins and Charniak

  20. Wide-coverage parsers • Say we wished to produce DRSs from the output of these parsers • We would need quite detailed syntax derivations • Closer inspection reveals that many of these parsers use many (several thousand) phrase structure rules • Often, long-distance dependencies are not recovered

  21. Combinatory Categorial Grammar • CCG is a lexicalised theory of grammar (Steedman 2001) • Deals with complex cases of coordination and long-distance dependencies • Lexicalised, hence easy to implement • English wide-coverage grammar • Fast robust parser available

  22. Categorial Grammar • Lexicalised theory of syntax • Many different lexical categories • Few grammar rules • Finite set of categories defined over a base of core categories • Core categories: s, np, n, pp • Combined categories: np/n, s\np, (s\np)/np

  23. CCG: type-driven lexicalised grammar

  24. CCG: combinatorial rules • Forward Application (FA) • Backward Application (BA) • Generalised Forward Composition (FC) • Backward Crossed Composition (BC) • Type Raising (TR) • Coordination
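A minimal sketch, in Python, of categories and the two application rules used in the derivations that follow; categories are either atomic strings or (slash, result, argument) triples. The encoding is an assumption for illustration.

    # Categories: atomic ("np") or (slash, result, argument), e.g.
    # ("/", "np", "n") is np/n and ("\\", "s", "np") is s\np.

    def fa(x, y):
        """Forward Application: X/Y  Y  =>  X"""
        if isinstance(x, tuple) and x[0] == "/" and x[2] == y:
            return x[1]

    def ba(y, x):
        """Backward Application: Y  X\\Y  =>  X"""
        if isinstance(x, tuple) and x[0] == "\\" and x[2] == y:
            return x[1]

    a, spokesman, lied = ("/", "np", "n"), "n", ("\\", "s", "np")
    np = fa(a, spokesman)     # "a spokesman" => np
    print(np, ba(np, lied))   # np s: "a spokesman lied" is a sentence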

  25. CCG derivation NP/N:a N:spokesman S\NP:lied

  26. CCG derivation NP/N:a N:spokesman S\NP:lied

  27. CCG derivation NP/N:a N:spokesman S\NP:lied ------------------------------- (FA)

  28. CCG derivation NP/N:a N:spokesman S\NP:lied ------------------------------- (FA) NP: a spokesman

  29. CCG derivation NP/N:a N:spokesman S\NP:lied ------------------------------- (FA) NP: a spokesman ---------------------------------------- (BA)

  30. CCG derivation NP/N:a N:spokesman S\NP:lied ------------------------------- (FA) NP: a spokesman ---------------------------------------- (BA) S: a spokesman lied

  31. CCG derivation NP/N:a N:spokesman S\NP:lied ------------------------------- (FA) NP: a spokesman ---------------------------------------- (BA) S: a spokesman lied

  32. Coordination in CCG
  np:Artie  (s\np)/np:likes  (x\x)/x:and  np:Tony  (s\np)/np:hates  np:beans
  ---------------- (TR)                   ---------------- (TR)
  s/(s\np):Artie                          s/(s\np):Tony
  ------------------------------ (FC)     ------------------------------ (FC)
  s/np:Artie likes                        s/np:Tony hates
                    --------------------------------------------- (FA)
                    (s/np)\(s/np):and Tony hates
  ------------------------------------------------------------------ (BA)
  s/np:Artie likes and Tony hates
  --------------------------------------------------------------------- (FA)
  s:Artie likes and Tony hates beans

  33. The Glue • Use the Lambda Calculus to combine CCG with DRT • Each lexical entry gets a DRS with lambda-bound variables, representing the “missing” information • Each combinatorial rule in CCG gets a semantic interpretation, again using the tools of the lambda calculus

  34. Interpreting Combinatorial Rules • Each combinatorial rule in CCG is expressed in terms of the lambda calculus: • Forward Application: FA(α,β) = α@β • Backward Application: BA(α,β) = β@α • Type Raising: TR(α) = λx.x@α • Function Composition: FC(α,β) = λx.α@(β@x)
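Written as Python higher-order functions ('@' on the slide is function application), the four rule interpretations are one-liners; a quick sketch, with a numeric check at the end.

    FA = lambda alpha, beta: alpha(beta)               # FA(α,β) = α@β
    BA = lambda alpha, beta: beta(alpha)               # BA(α,β) = β@α
    TR = lambda alpha: lambda x: x(alpha)              # TR(α)  = λx.x@α
    FC = lambda alpha, beta: lambda x: alpha(beta(x))  # FC(α,β) = λx.α@(β@x)

    inc, dbl = (lambda n: n + 1), (lambda n: 2 * n)
    print(FA(inc, 3), BA(3, inc), TR(3)(inc), FC(inc, dbl)(3))  # 4 4 4 7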

  35. CCG: lexical semantics

  36. CCG derivation (DRSs written in linear [referents|conditions] notation)
  NP/N:a λp.λq.([x|];p@x;q@x)   N:spokesman λz.[|spokesman(z)]   S\NP:lied λx.x@λy.[|lied(y)]

  37. CCG derivation
  NP/N:a λp.λq.([x|];p@x;q@x)   N:spokesman λz.[|spokesman(z)]   S\NP:lied λx.x@λy.[|lied(y)]
  -------------------------------------------------------- (FA)
  NP: a spokesman λp.λq.([x|];p@x;q@x) @ λz.[|spokesman(z)]

  38. CCG derivation
  NP/N:a λp.λq.([x|];p@x;q@x)   N:spokesman λz.[|spokesman(z)]   S\NP:lied λx.x@λy.[|lied(y)]
  -------------------------------------------------------- (FA)
  NP: a spokesman λq.([x|];[|spokesman(x)];q@x)

  39. CCG derivation
  NP/N:a λp.λq.([x|];p@x;q@x)   N:spokesman λz.[|spokesman(z)]   S\NP:lied λx.x@λy.[|lied(y)]
  -------------------------------------------------------- (FA)
  NP: a spokesman λq.([x|spokesman(x)];q@x)

  40. CCG derivation
  NP/N:a λp.λq.([x|];p@x;q@x)   N:spokesman λz.[|spokesman(z)]   S\NP:lied λx.x@λy.[|lied(y)]
  -------------------------------------------------------- (FA)
  NP: a spokesman λq.([x|spokesman(x)];q@x)
  -------------------------------------------------------------------------------- (BA)
  S: a spokesman lied λx.x@λy.[|lied(y)] @ λq.([x|spokesman(x)];q@x)

  41. CCG derivation
  NP/N:a λp.λq.([x|];p@x;q@x)   N:spokesman λz.[|spokesman(z)]   S\NP:lied λx.x@λy.[|lied(y)]
  -------------------------------------------------------- (FA)
  NP: a spokesman λq.([x|spokesman(x)];q@x)
  -------------------------------------------------------------------------------- (BA)
  S: a spokesman lied λq.([x|spokesman(x)];q@x) @ λy.[|lied(y)]

  42. CCG derivation
  NP/N:a λp.λq.([x|];p@x;q@x)   N:spokesman λz.[|spokesman(z)]   S\NP:lied λx.x@λy.[|lied(y)]
  -------------------------------------------------------- (FA)
  NP: a spokesman λq.([x|spokesman(x)];q@x)
  -------------------------------------------------------------------------------- (BA)
  S: a spokesman lied [x|spokesman(x)];[|lied(x)]

  43. CCG derivation
  NP/N:a λp.λq.([x|];p@x;q@x)   N:spokesman λz.[|spokesman(z)]   S\NP:lied λx.x@λy.[|lied(y)]
  -------------------------------------------------------- (FA)
  NP: a spokesman λq.([x|spokesman(x)];q@x)
  -------------------------------------------------------------------------------- (BA)
  S: a spokesman lied [x|spokesman(x),lied(x)]
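The whole derivation can be replayed in a few lines of Python, with a DRS represented as a (referents, conditions) pair, ';' as merge, and the lexical entries from slide 36; the fixed referent name "x" is a simplification (a real implementation would use fresh variables).

    FA = lambda alpha, beta: alpha(beta)     # rule semantics from slide 34
    BA = lambda alpha, beta: beta(alpha)

    def merge(d1, d2):                       # the ';' operator on DRSs
        return (d1[0] + d2[0], d1[1] + d2[1])

    # Lexical semantics from slide 36
    a         = lambda p: lambda q: merge(merge((["x"], []), p("x")), q("x"))
    spokesman = lambda z: ([], ["spokesman(" + z + ")"])
    lied      = lambda x: x(lambda y: ([], ["lied(" + y + ")"]))

    np = FA(a, spokesman)                    # λq.([x|spokesman(x)];q@x)
    print(BA(np, lied))                      # (['x'], ['spokesman(x)', 'lied(x)'])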

  44. The Clark & Curran Parser • Use standard statistical techniques • Robust wide-coverage parser • Clark & Curran (ACL 2004) • Grammar derived from CCGbank • 409 different categories • Hockenmaier & Steedman (ACL 2002) • Results: 96% coverage WSJ • Bos et al. (COLING 2004) • Example output: [figure: DRS produced by the parser]

  45. Applications • Has been used for different kinds of applications • Question Answering • Recognising Textual Entailment

  46. Recognising Textual Entailment • A task for NLP systems to recognise entailment between two (short) texts • Introduced in 2004/2005 as part of the PASCAL Network of Excellence • Proved to be a difficult, but popular, task • PASCAL provided a development set and a test set of several hundred examples

  47. RTE Example (entailment) RTE 1977 (TRUE) His family has steadfastly denied the charges. ----------------------------------------------------- The charges were denied by his family.

  48. RTE Example (no entailment) RTE 2030 (FALSE) Lyon is actually the gastronomical capital of France. ----------------------------------------------------- Lyon is the capital of France.

  49. Aristotle’s Syllogisms ARISTOTLE 1 (TRUE) All men are mortal. Socrates is a man. ------------------------------- Socrates is mortal.
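This kind of entailment can be checked with the same logical machinery as before: translate premises and conclusion to first-order logic and search for a countermodel. Below is a self-contained sketch that brute-forces models up to a small domain size as a stand-in for a theorem prover; finding no countermodel (within the bound) supports the entailment.

    from itertools import product

    def countermodel(premises, conclusion, max_size=3):
        """Search for a model of the premises that falsifies the conclusion."""
        for n in range(1, max_size + 1):
            dom = range(n)
            for man, mortal, soc in product(
                    product([False, True], repeat=n),   # extension of man
                    product([False, True], repeat=n),   # extension of mortal
                    dom):                               # referent of Socrates
                m = {"man": man, "mortal": mortal, "socrates": soc}
                if all(p(dom, m) for p in premises) and not conclusion(dom, m):
                    return m
        return None

    premises = [
        lambda d, m: all(not m["man"][x] or m["mortal"][x] for x in d),
        lambda d, m: m["man"][m["socrates"]],
    ]
    conclusion = lambda d, m: m["mortal"][m["socrates"]]
    print(countermodel(premises, conclusion))   # None: the entailment holds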

  50. How to deal with RTE • There are several methods • We will look at five of them to see how difficult RTE actually is
