Concurrent Reasoning with Inference Graphs

Daniel R. Schlegel and Stuart C. Shapiro
Department of Computer Science and Engineering

Problem Summary
• Multi-core computers have become commonplace, but
• there is a lack of concurrent natural deduction systems.

Inference Capabilities
• Forward, backward, bi-directional, and focused inference.
• Retains all derived formulas for later re-use.
• Propagates disbelief.
• Inference graphs are the only concurrent inference system with these capabilities.

Propositional Graphs
• A directed acyclic graph in which every well-formed expression is a node:
  • individual constants,
  • functional terms,
  • atomic formulas,
  • non-atomic formulas ("rules").
• Each node has an identifier, either a symbol or wfti[!].
• No two nodes have the same identifier.

Inference Graphs
• Extend propositional graphs by adding channels for information flow:
  • i-channels report the truth of an antecedent to a rule node;
  • u-channels report the truth of a consequent from a rule node.
• In the graph diagrams, channels drawn with dashed lines are i-channels (from antecedents to rule nodes), and channels drawn with dotted lines are u-channels (from rule nodes to consequents).
• Channels contain valves, which hold messages back or allow them through.
• Channels relay messages:
  • I-INFER ("I've been inferred")
  • U-INFER ("You've been inferred")
  • BACKWARD-INFER ("Open valves so messages that might infer me can arrive")
  • CANCEL-INFER ("Stop inferring me (close valves)")
  • UNASSERT ("I'm no longer believed")
• Different message types have different relative priorities, which is important for scheduling (see the sketches following the Example below).

Rule Node Inference
• A message arrives at a rule node.
• The message is translated to a RUI, containing the positive and negative instances of antecedents carried by the message.
• The new RUI is combined with the existing ones.
• The output is a set of new RUIs, which are used to decide whether the rule can fire.
• When a rule fires, new messages are sent out.

Concurrency and Scheduling
• The area between two valves is called an inference segment.
• When a message passes through a valve:
  • a task is created with the same priority as the message; the task is the application of the inference segment's function to the message;
  • the task is added to a queue that keeps higher-priority tasks toward its head;
  • a task operates only within its inference segment.
• Scheduling heuristics:
  • tasks relaying newly derived information using segments to the right are executed before those to the left, and
  • once a node is known to be true or false, all tasks attempting to derive it (to its left in the graph) are canceled, as long as their results are not needed elsewhere.
• There is minimal shared state between tasks, allowing many tasks to operate concurrently.

Evaluation
• Concurrency: near-linear performance improvement with the number of processors; performance is resilient to changes in graph depth and branching factor.
• Scheduling heuristics: backward inference with or-entailment shows a 10x improvement over LIFO queues, and a 20-40x improvement over FIFO queues.
• See the GKR paper (referenced below) for more details.

Example
Propositional graph for the assertions that if a, b, and c are true, then d is true, and if d or e is true, then f is true.

We assume backward inference has been initiated, opening all the valves in the graph. First, in (a), messages about the truth of a, b, and c flow through i-channels to wft1. Since wft1 is an and-entailment rule, each of its antecedents must be true for it to fire. Since they are, in (b) the message that d is true flows through wft1's u-channel. d becomes asserted and reports its new status through its i-channel (c). In (d), wft2 receives this information, and since it is an or-entailment rule and requires only a single antecedent to be true to fire, it reports to its consequents that they are now true and cancels inference in e. Finally, in (e), f is asserted, and inference is complete.
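The channel, valve, message, and scheduling machinery described in the Inference Graphs and Concurrency and Scheduling sections can be summarized in a few data structures. The Python sketch below is illustrative only: the message names mirror the poster's terminology, but the numeric priorities, the valve behavior, and the single-threaded queue are simplifying assumptions, not the authors' implementation (which executes tasks concurrently with minimal shared state).

```python
import heapq
import itertools
from dataclasses import dataclass, field
from enum import IntEnum

class MsgType(IntEnum):
    """Message types relayed over channels; a larger value is treated as higher
    priority. The relative ordering here is an assumption for illustration only."""
    BACKWARD_INFER = 1   # "Open valves so messages that might infer me can arrive"
    I_INFER = 2          # "I've been inferred"
    U_INFER = 3          # "You've been inferred"
    CANCEL_INFER = 4     # "Stop inferring me (close valves)"
    UNASSERT = 5         # "I'm no longer believed"

@dataclass
class Message:
    mtype: MsgType
    origin: str            # node the report is about
    positive: bool = True  # whether a positive or negative instance is reported

@dataclass
class Channel:
    """An i-channel or u-channel; the valve sits between its two inference segments."""
    source: str
    destination: str
    valve_open: bool = False
    held: list = field(default_factory=list)   # messages waiting at a closed valve

    def send(self, msg: Message, scheduler: "Scheduler") -> None:
        if self.valve_open:
            scheduler.submit(msg, self)   # passing the valve creates a task
        else:
            self.held.append(msg)         # held back until the valve is opened

    def open_valve(self, scheduler: "Scheduler") -> None:
        self.valve_open = True
        for msg in self.held:
            scheduler.submit(msg, self)
        self.held.clear()

class Scheduler:
    """Queue that keeps higher-priority tasks toward its head.
    (The real system drains such a queue with a pool of worker threads.)"""
    def __init__(self) -> None:
        self._heap: list = []
        self._tie = itertools.count()     # tie-breaker so tasks are never compared

    def submit(self, msg: Message, channel: Channel) -> None:
        # The task is the application of the inference segment's function to the
        # message; here the segment function is just a placeholder print.
        def task() -> None:
            print(f"{channel.destination}: {msg.mtype.name} about {msg.origin} ({msg.positive})")
        heapq.heappush(self._heap, (-int(msg.mtype), next(self._tie), task))

    def run(self) -> None:
        while self._heap:
            _, _, task = heapq.heappop(self._heap)
            task()

# Toy usage: a closed valve holds an I-INFER message until backward inference opens it.
sched = Scheduler()
chan = Channel(source="a", destination="wft1")
chan.send(Message(MsgType.I_INFER, origin="a"), sched)   # held: valve is closed
chan.open_valve(sched)                                   # effect of BACKWARD-INFER
sched.run()
```

A fuller model would also close valves on CANCEL-INFER and retract derivations on UNASSERT; those behaviors are omitted here to keep the sketch short.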
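To make the Rule Node Inference steps and the example walkthrough concrete, the following self-contained Python sketch forward-propagates truth through the example graph once the valves have been opened. The rule-firing test is a bare-bones stand-in for RUI combination: it only counts which antecedents are known true against a threshold, and it omits negative instances, CANCEL-INFER, and concurrency. The node names wft1 and wft2 come from the example; everything else is assumed.

```python
from dataclasses import dataclass, field

@dataclass
class RuleNode:
    """A rule node that fires once enough antecedents are reported true.

    threshold == len(antecedents) models and-entailment (wft1);
    threshold == 1 models or-entailment (wft2).
    """
    name: str
    antecedents: list
    consequents: list
    threshold: int
    true_antecedents: set = field(default_factory=set)
    fired: bool = False

    def receive_i_infer(self, antecedent: str, asserted: set) -> None:
        # Simplified stand-in for combining a new RUI with existing ones:
        # just remember which antecedents are known true so far.
        self.true_antecedents.add(antecedent)
        if not self.fired and len(self.true_antecedents) >= self.threshold:
            self.fired = True
            for c in self.consequents:
                asserted.add(c)   # U-INFER: the consequent is now believed
                print(f"{self.name} fires: {c} asserted")
                # The newly asserted node reports its status on its i-channels.
                for rule in watching.get(c, []):
                    rule.receive_i_infer(c, asserted)

# The example graph: a, b, c => d (and-entailment) and d v e => f (or-entailment).
wft1 = RuleNode("wft1", ["a", "b", "c"], ["d"], threshold=3)
wft2 = RuleNode("wft2", ["d", "e"], ["f"], threshold=1)
watching = {"a": [wft1], "b": [wft1], "c": [wft1], "d": [wft2], "e": [wft2]}

asserted = {"a", "b", "c"}             # the initially believed propositions
for known in ("a", "b", "c"):          # I-INFER messages flow to wft1 ...
    for rule in watching[known]:
        rule.receive_i_infer(known, asserted)

print("Asserted:", sorted(asserted))   # a, b, c, d, f -- matching the walkthrough
```

In the full system this propagation happens as prioritized tasks flowing over channels, and the pending derivation of e would be canceled by a CANCEL-INFER message once f is known.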
References
• Daniel R. Schlegel and Stuart C. Shapiro, Concurrent Reasoning with Inference Graphs. In Proceedings of the Third International IJCAI Workshop on Graph Structures for Knowledge Representation and Reasoning (GKR 2013), 2013, in press.

Acknowledgments
This work has been supported by a Multidisciplinary University Research Initiative (MURI) grant (Number W911NF-09-1-0392) for Unified Research on Network-based Hard/Soft Information Fusion, issued by the US Army Research Office (ARO) under the program management of Dr. John Lavery.
