
Understanding Naturally Conveyed Explanations of Device Behavior



  1. Understanding Naturally Conveyed Explanations of Device Behavior Michael Oltmans and Randall Davis MIT Artificial Intelligence Lab

  2. Roadmap • The problem • Our approach • Implementation • System architecture • How ASSISTANCE interprets descriptions • Demonstrating understanding • Evaluation and contributions • Related and future work

  3. Sketches → Models • We have a sketch of a device • A simulation model can be generated from the sketch • Life is good… or is it?

  4. The Problem • No representation of intended behavior • People talk and sketch but the computer doesn’t understand

  5. Task • Understand descriptions of device behavior: • Given: • A model of the device’s structure • A natural explanation of the behavior • Generate a causal model of behavior

  6. Roadmap • The problem • Our approach • Implementation • System architecture • How ASSISTANCE interprets descriptions • Demonstrating understanding • Evaluation and contributions • Related and future work

  7. Naturally Conveyed Explanations • Natural input modalities • Sketched devices • Sketched gestures • Speech • Natural content of descriptions • Causal • Behavioral

  8. Example: Describing the Behavior of a Spring

  9. Example: Describing the Behavior of a Spring

  10. Example: Describing the Behavior of a Spring

  11. Example: Describing the Behavior of a Spring

  12. Sources of power • Conventions in explanations aid interpretation • Description order suggests causal order • Constrained vocabulary • Overlapping descriptions provide constraints on interpretations

  13. Roadmap • The problem • Our approach • Implementation • System architecture • How ASSISTANCE interprets descriptions • Demonstrating understanding • Evaluation and contributions • Related and future work

  14. System Architecture • Inputs: sketch and speech • ASSIST: recognize the sketch • ViaVoice™: recognize and parse the speech • ASSISTANCE: interpret the explanation • LTRE: truth maintenance and rule system • Output: causal model and simulation

  15. Outputs • Consistent causal model • Tree • Nodes are events • Links indicate causal relationships • Demonstration of understanding • Natural language descriptions of causality • Parameter constraints
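A minimal sketch of such a causal tree and of reading English causality statements off it. This is illustrative only, not ASSISTANCE's internal representation; the EVENT structure and DESCRIBE-CAUSALITY function are hypothetical names.

    ;; Illustrative sketch only: a causal model as a tree whose nodes are
    ;; events and whose parent/child links mean "causes".
    (defstruct event
      action   ; e.g. "motion"
      body     ; e.g. "body 2"
      effects) ; events caused by this event

    (defun describe-causality (event)
      "Collect sentences such as \"the motion of body 3 causes the motion of body 2\"."
      (loop for effect in (event-effects event)
            collect (format nil "the ~a of ~a causes the ~a of ~a"
                            (event-action event) (event-body event)
                            (event-action effect) (event-body effect))
            append (describe-causality effect)))

    ;; (describe-causality
    ;;   (make-event :action "motion" :body "body 3"
    ;;               :effects (list (make-event :action "motion" :body "body 2"))))
    ;; => ("the motion of body 3 causes the motion of body 2")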

  16. The Representation of Utterances • Input comes from ViaVoice™ : • Grammar constructed based on observed explanations • Tagged with parts of speech and semantic categories

  17. Representing the parse tree of “body 1 pushes body 2”
      SENTENCE SIMPLE_SENTENCE               (… “body 1 pushes body 2” (S0) t1)
        SUBJECT NOUN NOUN-PHRASE             (… “body 1” (S0 t1) t2)
        VERB_PHRASE                          (… “pushes body 2” (S0 t1) t3)
          PROPELS VERB                       (… “pushes” (S0 t1 t3) t4)
          DIRECT_OBJECT NOUN NOUN-PHRASE     (… “body 2” (S0 t1 t3) t5)

  18. Steps In Interpreting Explanations: • Infer motions from annotations and build event representations • Find causal connections • Search for consistent causal structures • Pick best causal structure

  19. Step 1: Inferring Motions from Annotations • Inputs: • Arrows • Utterances • “moves,” “pushes,” “the spring releases” • Outputs: • (moves body-1 moves-body-1-394) • (describes arrow-2 moves-body-1-394)
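A hypothetical plain-Lisp rendering of the facts this step produces; the real system asserts them through LTRE rules, shown on the next two slides. NEW-EVENT-ID and INFER-MOTION-FROM-ARROW are made-up names, and the numeric suffix in the slide's id (394) is just a generated counter value.

    (defvar *event-counter* 0)

    (defun new-event-id (kind body)
      "Build a unique id such as MOVES-BODY-1-394 (the numeric suffix is arbitrary)."
      (intern (format nil "~a-~a-~d" kind body (incf *event-counter*))))

    (defun infer-motion-from-arrow (arrow body)
      "Facts recording that BODY moves and that ARROW describes the path of the motion."
      (let ((id (new-event-id 'moves body)))
        (list (list 'moves body id)
              (list 'describes arrow id))))

    ;; (infer-motion-from-arrow 'arrow-2 'body-1)
    ;; => ((MOVES BODY-1 MOVES-BODY-1-1) (DESCRIBES ARROW-2 MOVES-BODY-1-1))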

  20. Inferring Motion From Arrows • Rule triggers: • Arrow • Arrow referent (i.e. a body) • The body is mobile • Rule body records that: • The body moves • The arrow describes the path

  21. Inferring Motion From Arrows
      (rule ((:TRUE (arrow ?arrow) :VAR ?f1)
             (:TRUE (arrow-referent ?arrow ?body) :VAR ?f2)
             (:TRUE (can-move ?body) :VAR ?f3)
             (:TRUE (name ?name ?body)))
        (rlet ((?id (new-id "Moves" ?name)))
          (rassert! (:implies (:AND ?f1 ?f2 ?f3)
                              (:AND (moves ?body ?id)
                                    (describes ?arrow ?id)))
                    :ARROW-IS-MOTION)))

  22. Multi-Modal References • Match a sentence whose subject is “this” and a pointing gesture • Assert the referent as the subject of the sentence • Limitations: • User must point at referent before the utterance • Allow one “this” per utterance
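A minimal sketch of the deictic-reference idea, assuming the sentence is already a list and the most recent pointing gesture is known. RESOLVE-THIS and its arguments are hypothetical, not ASSISTANCE's code.

    ;; If a sentence's subject is "this" and the latest pointing gesture
    ;; preceded the utterance, substitute the pointed-at body for "this".
    (defun resolve-this (sentence last-pointed-body point-time utterance-time)
      "SENTENCE is a list such as (this pushes body-2)."
      (if (and (eq (first sentence) 'this)
               last-pointed-body
               (< point-time utterance-time))   ; gesture must come first
          (cons last-pointed-body (rest sentence))
          sentence))

    ;; (resolve-this '(this pushes body-2) 'body-1 10 12)
    ;; => (BODY-1 PUSHES BODY-2)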

  23. Redundant Events • Redundant explanations lead to multiple move statements for some events • Example: “Body 1” falls, leaving Event 1 with both (moves body-1 id-1) and (moves body-1 id-2) • Merge them into a unique event statement
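One simple way to collapse such duplicates, sketched here with plain list facts. MERGE-REDUNDANT-EVENTS is a hypothetical name; ASSISTANCE does this inside its rule system.

    ;; Facts that mention the same kind of event and the same body are
    ;; collapsed into a single canonical statement.
    (defun merge-redundant-events (facts)
      "FACTS are lists like (MOVES BODY-1 ID-1); keep one fact per (kind body)."
      (remove-duplicates facts
                         :test #'equal
                         :key (lambda (fact) (list (first fact) (second fact)))
                         :from-end t))

    ;; (merge-redundant-events '((moves body-1 id-1) (moves body-1 id-2)))
    ;; => ((MOVES BODY-1 ID-1))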

  24. Step 2: Find Causal Connections • Plausible causes • Arrow indicating motion near another object • Exogenous forces • Definite causes • “When … then …” utterances • “Body 1 pushes body 2”
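A rough sketch of gathering candidate causes for one event, tagged as definite or plausible. CANDIDATE-CAUSES and its arguments are invented for illustration.

    ;; Definite causes come from "when ... then" and "pushes" utterances;
    ;; plausible causes come from nearby arrow motions or exogenous forces.
    (defun candidate-causes (event utterance-causes nearby-motions)
      (append (mapcar (lambda (c) (list :definite c event)) utterance-causes)
              (mapcar (lambda (c) (list :plausible c event)) nearby-motions)
              (list (list :plausible :exogenous event))))

    ;; (candidate-causes 'moves-body-2 '(moves-body-1) '(moves-body-3))
    ;; => ((:DEFINITE MOVES-BODY-1 MOVES-BODY-2)
    ;;     (:PLAUSIBLE MOVES-BODY-3 MOVES-BODY-2)
    ;;     (:PLAUSIBLE :EXOGENOUS MOVES-BODY-2))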

  25. Step 3: Search for Consistent Causal Structures • Some events have several possible causes • Find consistent causal chains • Search • Forward-looking depth-first search • Avoids repeating bad choices by recording bad combinations of assumptions
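A bare-bones sketch of the search, assuming one cause must be chosen per event; the nogood list stands in for the LTRE's record of bad assumption combinations, and all names here are hypothetical.

    (defvar *nogoods* '()
      "Combinations of causal assumptions already found inconsistent.")

    (defun search-causal-chains (events causes-of consistent-p &optional chosen)
      "Depth-first search assigning one cause to each event in EVENTS.
    CAUSES-OF maps an event to its candidate causes; CONSISTENT-P tests a
    partial assignment.  Returns the first consistent complete assignment."
      (if (null events)
          chosen
          (loop for cause in (funcall causes-of (first events))
                for attempt = (acons (first events) cause chosen)
                unless (member attempt *nogoods* :test #'equal)
                  do (if (funcall consistent-p attempt)
                         (let ((result (search-causal-chains
                                        (rest events) causes-of consistent-p attempt)))
                           (when result (return result)))
                         (push attempt *nogoods*)))))

    ;; (search-causal-chains '(e1 e2)
    ;;                       (lambda (e) (if (eq e 'e1) '(c1 c2) '(c3)))
    ;;                       (lambda (a) (declare (ignore a)) t))
    ;; => ((E2 . C3) (E1 . C1))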

  26. Step 4: Find the Best Interpretation • Filter out interpretations that have unnecessary exogenous causes • Pick the interpretation that most closely matches the explanation order • While there are multiple valid interpretations • Choose one event with multiple possible causes • Assume the causal relation whose cause has the earliest description time
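Two illustrative helpers for this step, assuming an interpretation is an alist of (event . cause) pairs, an exogenous cause is the keyword :EXOGENOUS, and each candidate cause has a recorded description time. Both functions are hypothetical, not the system's code.

    (defun drop-unneeded-exogenous (interpretations)
      "Keep only the interpretations with the fewest exogenous causes."
      (let ((fewest (loop for i in interpretations
                          minimize (count :exogenous i :key #'cdr))))
        (remove-if-not (lambda (i) (= fewest (count :exogenous i :key #'cdr)))
                       interpretations)))

    (defun earliest-described-cause (candidate-causes description-times)
      "Prefer the candidate cause the designer described earliest.
    DESCRIPTION-TIMES is an alist from cause to the time it was mentioned."
      (first (sort (copy-list candidate-causes) #'<
                   :key (lambda (cause) (cdr (assoc cause description-times))))))

    ;; (earliest-described-cause '(moves-body-3 moves-body-4)
    ;;                           '((moves-body-4 . 7) (moves-body-3 . 2)))
    ;; => MOVES-BODY-3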

  27. Answer Queries and Adjust Parameters • Queries: • Designer: What is body 2 involved in? • ASSISTANCE: The motion of body 3 causes the motion of body 2 which causes the motion of body 5 • Parameter Adjustment • Set spring length
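A toy sketch of answering such a query from a list of cause/effect pairs; INVOLVING and DESCRIBE-INVOLVEMENT are invented names.

    (defun involving (body causal-links)
      "CAUSAL-LINKS is a list of (CAUSE-BODY . EFFECT-BODY) pairs."
      (remove-if-not (lambda (link)
                       (or (equal body (car link)) (equal body (cdr link))))
                     causal-links))

    (defun describe-involvement (body causal-links)
      "Answer \"what is BODY involved in?\" in the style of the slide above."
      (let ((links (involving body causal-links)))
        (when links
          (format nil "the motion of ~a causes the motion of ~a~{ which causes the motion of ~a~}"
                  (car (first links))
                  (cdr (first links))
                  (mapcar #'cdr (rest links))))))

    ;; (describe-involvement "body 2" '(("body 3" . "body 2") ("body 2" . "body 5")))
    ;; => "the motion of body 3 causes the motion of body 2 which causes the motion of body 5"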

  28. Roadmap • The problem • Our approach • Implementation • System architecture • How ASSISTANCE interprets descriptions • Demonstrating understanding • Evaluation and contributions • Related and future work

  29. Limitations of the Implementation • Scope of applicability restricted • State transitions are one step deep • Cannot handle conjunctions of causes • Limited knowledge about common device patterns • Latches, linkages, etc. • Supports and prevents • Natural language limitations • Use a full-featured NL system like START • Formally determine the grammar

  30. Evaluation of the Approach • Advantages • Focus on behavior in accordance with survey results • Move away from rigidity of WIMP interfaces • Similar to person-to-person interaction • Alternatives • More dialog and feedback • Natural vs. efficient • Open claim that the domain is adequately constrained

  31. Contributions • Understanding naturally conveyed descriptions of behavior • Generating representations of device behavior • Match the designer’s explanation • Generate simple explanations of causality • Allow the calculation of simulation parameters

  32. Related Work • Understanding device sketches • Alvarado 2000 • Multimodal interfaces • Oviatt and Cohen • Causality • C. Rieger and M. Grinberg 1977

  33. Future Work • Direct manipulation • Dialog • Expand natural language capabilities • Smart design tools
