
Local and Distributed Defeasible Reasoning in Multi-Context Systems

This paper presents the concept of context and contextual reasoning in artificial intelligence, specifically in the domain of multi-context systems. It explores the formalizations of context, the properties of multi-context systems, and proposes a methodology for reasoning in such systems.


Presentation Transcript


  1. Local and Distributed Defeasible Reasoning in Multi-Context Systems Antonis Bikakis, Grigoris Antoniou Institute of Computer Science, FO.R.T.H., Greece October, 2008

  2. Overview • Motivation - Background • Context and Contextual Reasoning • Multi-Context Systems • Reasoning in MCS • Methodology • Representation Model • P2P_DRdl Algorithm • Properties • Conclusion - Discussion

  3. Context in Artificial Intelligence • Context and contextual reasoning • were first introduced in AI by McCarthy as an approach to the problem of generality • have already been used in various distributed reasoning applications • e.g. CYC, Semantic Web (contextualized ontologies), multi-agent systems, commonsense reasoning, reasoning with viewpoints,… • are expected to play a significant role in the development of next-generation AI applications • e.g. Ambient Intelligence

  4. Formalizations of context • Dimensions of contextual reasoning • Partiality • Approximation • Perspective • Main Formalizations • Propositional Logic of Context (PLC) • Multi-Context Systems (MCS) • associated with the Local Model Semantics • MCS has been argued to be • the most adequate with respect to the three dimensions of contextual reasoning • technically more general than PLC

  5. Multi-Context Systems: The magic box example • None of the observers can make out the depth of the box [Figure: the box as seen from the different viewpoints of Mr. 1 and Mr. 2]

  6. Multi-Context Systems: The magic box example • None of the observers can make out the depth of the box • Mr. 1’s beliefs may regard concepts that are meaningless for Mr. 2 and vice versa [Figure: one observer’s view includes a central section, a concept that is meaningless for the other observer]

  7. Multi-Context Systems: The magic box example • None of the observers can make out the depth of the box • Mr. 1’s beliefs may regard concepts that are meaningless for Mr. 2 and vice versa • Mr. 1 and Mr. 2 may use common concepts but interpret them in different ways [Figure: both observers refer to a “right section”, but each means a different part of the box]

  8. Multi-Context Systems: The magic box example • None of the observers can make out the depth of the box • Mr. 1’s beliefs may regard concepts that are meaningless for Mr. 2 and vice versa • Mr. 1 and Mr. 2 may use common concepts but interpret them in different ways • The observers may have partial access to each other’s beliefs about the box. [Figure: one observer believes the ball is in the left section, the other that it is in the right section]

  9. Multi-Context Systems: Intuitions • A context is a partial and approximate theory of the world from some individual’s perspective. • Reasoning with multiple contexts is a combination of • local reasoning (reasoning with the local context only) • distributed reasoning (reasoning that takes into account the relations between local contexts) • No context can be fully translated into another – the relationship between different contexts can be described only to a partial extent

  10. Multi-Context Systems: Model • A context can be thought of as a set of axioms and inference rules that model local context knowledge. • Relations between local contexts are modeled as inference rules (known as mapping or bridge rules) with premises and consequences in different contexts, which enable information flow between related contexts. • Different contexts are expected to use • different languages • different inference systems • Challenges • Heterogeneity • Inconsistencies that arise from the interaction of contexts through mappings
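
To make the model concrete, the following minimal Python sketch (not part of the original presentation; all names and the literal encoding are illustrative assumptions) represents a context as a local rule theory plus mapping rules whose bodies may refer to other contexts.

```python
from dataclasses import dataclass, field

# Illustrative encoding: a literal is a (context_id, atom, sign) triple,
# e.g. ("P2", "a2", True) for a2 and ("P2", "a2", False) for ¬a2.
Literal = tuple

@dataclass
class Rule:
    name: str
    body: list                # list of Literal (premises)
    head: Literal             # single conclusion literal
    defeasible: bool = True   # strict local rules set this to False

@dataclass
class Context:
    cid: str
    rules: list = field(default_factory=list)     # local rules: only local literals
    mappings: list = field(default_factory=list)  # mapping (bridge) rules: foreign literals allowed

    def foreign_literals(self):
        """Literals imported from other contexts through the mapping rules."""
        return [lit for r in self.mappings for lit in r.body if lit[0] != self.cid]
```

The later sketches in this transcript reuse these two classes.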

  11. Global Inconsistency in MCS • Even if we assume that contexts A, B and C are locally consistent, their unification through the mappings of A, which import context knowledge from B and C, may contain inconsistencies. [Figure: context A imports the complementary literals k and ¬k from contexts B and C]

  12. Non-Monotonic MCS • Non-monotonic rule-based MCS framework [Roelofsen & Serafini, IJCAI 2005] • supports default negation in the mapping rules • handles missing context • Multi-context variant of Default Logic (Contextual Default Logic) [Brewka et al., IJCAI 2007] • models bridge relations as default rules • handles missing context and ambiguous context • can be easily implemented due to the well-studied relation between Default Logic and Logic Programming • But, Contextual Default Logic • does not include the notion of preference or priority between contexts • does not actually resolve the conflicts • may impose too heavy a computational overhead

  13. Overview of our approach • Extend the MCS model with • defeasible local theories • defeasible mapping rules • Resolve • local conflicts using the rule priority relation of Defeasible Logic • global conflicts using • the rule priority relation • an additional preference relation on the set of contexts

  14. Representation Model • A MCS P is a collection of local rule theories Pi: P = {Pi}, i = 1, 2, …, n • Each context has a proper distinct vocabulary Vi and a unique identifier i. • Each local theory is a set of rules that contain only local literals • Strict local rules express sound, definite knowledge: ril : ai1, ai2, …, ain-1 → ain • Local rules with empty body express local factual knowledge • Defeasible local rules express uncertain local knowledge: rid : bi1, bi2, …, bin-1 ⇒ bin
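
Continuing the illustrative sketch above (the encoding is an assumption, not the paper's notation), the two kinds of local rules map onto the Rule type as follows; → corresponds to defeasible=False and ⇒ to defeasible=True.

```python
# Strict local rule  r_il : a_i1, ..., a_in-1 -> a_in   (sound, definite knowledge)
r_il = Rule(name="r_il",
            body=[("Pi", "a1", True), ("Pi", "a2", True)],
            head=("Pi", "a3", True),
            defeasible=False)

# Defeasible local rule  r_id : b_i1, ..., b_in-1 => b_in   (uncertain local knowledge)
r_id = Rule(name="r_id",
            body=[("Pi", "b1", True), ("Pi", "b2", True)],
            head=("Pi", "b3", True),
            defeasible=True)

# A local fact is a strict rule with an empty body.
r_if = Rule(name="r_if", body=[], head=("Pi", "c1", True), defeasible=False)
```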

  15. Representation Model • Each context also defines mappings that associate literals from its own vocabulary (local literals) with literals from the vocabulary of other contexts (foreign literals) • Mappings are also modeled as defeasible rules: rim : ai1, aj2, …, akn-1 ⇒ aln • Each context defines an acyclic priority relation on the set of its local and mapping rules: ri > rj • Each context Pi defines a preference relation Ti on the set of contexts, which expresses the trust that Pi has in the other contexts: Ti = [Pk, Pl, …, Pn] • Pk is preferred to Pl by Pi if Pk precedes Pl in Ti • Contexts that are not included in Ti are equally preferred by Pi, but less preferred than those that are part of the list
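
One possible (again illustrative) encoding of the two relations: the acyclic rule priority as a set of (stronger, weaker) rule-name pairs, and the trust ordering Ti as a list of context identifiers, most trusted first. Contexts missing from Ti rank below every listed one, and equally to each other.

```python
# Rule priority inside a context: (ri, rj) means ri > rj, i.e. ri wins a conflict with rj.
priority = {("r23", "r24")}

# Preference (trust) relation of context Pi over the other contexts.
T_i = ["Pk", "Pl", "Pn"]

def rank(cid, trust):
    """Position of a context in the trust list; unlisted contexts are least trusted."""
    return trust.index(cid) if cid in trust else len(trust)

def preferred(c1, c2, trust):
    """True if context c1 is strictly preferred to c2 from Pi's viewpoint."""
    return rank(c1, trust) < rank(c2, trust)
```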

  16. Distributed Query Evaluation Problem • Problem Description • Given a MCS P and a query about a literal xi issued to Pi, find the truth value of xi considering Pi’s local theory, its mappings and the other context theories.

  17. Local & Global Inconsistency • Local conflicts arise as a result of local competing rules (r23, r24) • For conflict resolution we use the rule priority relation • Global conflicts are caused by mappings (r12, r13) • For conflict resolution we use both the rule priority and the context preference relations • Example theories: P1: r11: a1 -> x1, r12: a2 => a1, r13: a3, a4 => ¬a1 • P2: r21: -> b2, r22: -> c2, r23: b2 => a2, r24: c2 => ¬a2, with r23 > r24 • P3: r31: -> a3 • P4: r41: -> a4
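
The example theories above, encoded with the sketch structures introduced earlier (the encoding itself is an assumption; the rules are those shown on the slide):

```python
P1 = Context("P1")
P1.rules.append(Rule("r11", [("P1", "a1", True)], ("P1", "x1", True), defeasible=False))
P1.mappings.append(Rule("r12", [("P2", "a2", True)], ("P1", "a1", True)))
P1.mappings.append(Rule("r13", [("P3", "a3", True), ("P4", "a4", True)], ("P1", "a1", False)))

P2 = Context("P2")
P2.rules += [
    Rule("r21", [], ("P2", "b2", True), defeasible=False),   # fact: b2
    Rule("r22", [], ("P2", "c2", True), defeasible=False),   # fact: c2
    Rule("r23", [("P2", "b2", True)], ("P2", "a2", True)),   # b2 => a2
    Rule("r24", [("P2", "c2", True)], ("P2", "a2", False)),  # c2 => ¬a2
]
P2_priority = {("r23", "r24")}   # r23 > r24 resolves the local conflict on a2

P3 = Context("P3"); P3.rules.append(Rule("r31", [], ("P3", "a3", True), defeasible=False))
P4 = Context("P4"); P4.rules.append(Rule("r41", [], ("P4", "a4", True), defeasible=False))

T1 = ["P4", "P2", "P3"]          # preference relation of P1 (used in the demonstration)
```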

  18. P2P_DRdl Algorithm • Step 1 • Determine if xi or its negation ¬xi is a consequence of Pi's local strict rules • Step 2 • Collect Pi’s local and mapping rules that support xi • Check which of these rules are applicable. For each literal in the body of a supportive rule, issue a similar query (a recursive call of the algorithm) to compute its truth value • To avoid cycles, before each new call, check whether the same query has already been issued during the same algorithm call • For each applicable rule ri, build its supportive set SSri: the union of the foreign literals contained in the body of ri with the supportive sets of the local literals in the body of the same rule • Build the supportive set of xi, SSxi, which is equal to the strongest SSri (a simplified sketch of this step follows)
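
A much simplified sketch of Step 2 (support collection with cycle detection), building on the structures above. The query callback stands in for the recursive, distributed call of the real algorithm, and conflict handling (Steps 3–4) is ignored here.

```python
def supporting_rules(literal, ctx):
    """Local and mapping rules of context ctx whose head is the queried literal."""
    return [r for r in ctx.rules + ctx.mappings if r.head == literal]

def supportive_set(rule, ctx, query, history):
    """
    Supportive set of an applicable rule: the union of the foreign literals in
    its body with the supportive sets of its local body literals, computed by
    recursive queries. Returns None if the rule is not applicable, i.e. some
    body literal cannot be established (or a cycle is detected).
    """
    ss = set()
    for b in rule.body:
        if b in history:                         # cycle detection: query already pending
            return None
        answer, ss_b = query(b, history + (b,))  # recursive call of the algorithm
        if not answer:
            return None
        ss |= ({b} if b[0] != ctx.cid else ss_b)
    return ss
```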

  19. P2P_DRdl Algorithm • Step 3 • Collect Pi’s local and mapping rules that contradict xi • Build the conflicting set of xi, CSxi, which is equal to the supportive set of ¬xi • Step 4 • If for each applicable rule that contradicts the queried literal there is at least one applicable rule that supports the same literal and is either superior (according to the rule priority relation) or non-inferior but stronger (according to the context preference relation) -> return a positive truth value • In any other case return a negative one • Comparing the strength of two supportive sets • A literal ak is considered stronger than bl from Pi's viewpoint if Pk precedes Pl in Ti. The strength of a set of literals is determined by the weakest literal in this set.
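
The strength comparison and the final decision of Step 4 could be sketched as follows (a simplification that reuses rank from the earlier sketch and omits the rule priority relation for brevity).

```python
def weakest(ss, trust):
    """The strength of a supportive set is that of its weakest (least trusted) literal."""
    return max((rank(lit[0], trust) for lit in ss), default=-1)

def stronger(ss_a, ss_b, trust):
    """ss_a is stronger than ss_b if its weakest literal comes from a more trusted context."""
    return weakest(ss_a, trust) < weakest(ss_b, trust)

def decide(supportive_sets, conflicting_sets, trust):
    """
    Step 4, simplified: answer positively iff the literal has some applicable
    supporting rule and every applicable conflicting rule is counter-balanced
    by a supporting rule whose supportive set is at least as strong.
    """
    if not supportive_sets:
        return False
    return all(any(not stronger(cs, ss, trust) for ss in supportive_sets)
               for cs in conflicting_sets)
```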

  20. P2P_DRdl - Demonstration • A query about x1 is issued to P1 (using the example theories of slide 17) • Neither x1 nor ¬x1 derives from the local theory of P1

  21. P2P_DRdl - Demonstration • The algorithm successively calls rules r11 and r12, and issues a query about a2 to P2

  22. P2P_DRdl - Demonstration • The algorithm computes a positive truth value for a2, based on the local defeasible theory of P2 (r23 > r24), and returns it to P1 (Ansa2 = Yes)

  23. P2P_DRdl - Demonstration • The algorithm computes SSr12 = {a2}

  24. P2P_DRdl - Demonstration • In a similar way, after querying P3 and P4 (Ansa3 = Yes, Ansa4 = Yes), it computes SSr13 = {a3, a4}

  25. P2P_DRdl - Demonstration • Using T1 = [P4, P2, P3], the algorithm determines that SSr12 is stronger than SSr13 and returns a positive answer for a1, and eventually for x1 (Ansa1 = Yes, Ansx1 = Yes)
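
Plugging the demonstration's values into the comparison helpers sketched earlier reproduces the slide's outcome:

```python
T1 = ["P4", "P2", "P3"]
SS_r12 = {("P2", "a2", True)}                       # weakest literal: a2 from P2 (rank 1 in T1)
SS_r13 = {("P3", "a3", True), ("P4", "a4", True)}   # weakest literal: a3 from P3 (rank 2 in T1)

assert stronger(SS_r12, SS_r13, T1)   # P2 precedes P3 in T1, so SS_r12 wins
# Hence the supporting rule r12 defeats the conflicting rule r13: the algorithm
# answers Yes for a1 and, through the strict rule r11, Yes for x1.
```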

  26. P2P_DRdl Properties • Termination • The algorithm is guaranteed to terminate, returning either a positive or a negative answer for the queried literal • This holds due to the cycle detection • Number of Messages • The total number of messages exchanged between contexts for the computation of a single query is in the worst case O(nl × n²) • Worst Case • all contexts have defined mappings that involve literals from all other contexts, and the evaluation of the query requires checking the mapping rules of all contexts • The bound is a consequence of the two states that we retain for each context, which keep track of its incoming and outgoing queries

  27. P2P_DRdl Properties • Computational Complexity • O(n² × nl² × nr + n × nl × nr²) • Worst Case: all contexts have defined mappings that involve literals from all other contexts, and the evaluation of the query requires checking the mapping rules of all contexts • Equivalent Defeasible Theory • There is a standard process for unifying the distributed theories into an equivalent global defeasible theory, which produces the same results as the distributed theories under the proof theory of Defeasible Logic • Assumption: acyclic MCS • Steps (sketched below) • The local strict rules of each theory are added as strict rules to the unified theory • The local defeasible rules and the mapping rules of each theory are added as defeasible rules to the unified theory • For pairs of rules for which there is priority information in the context theories, this information is added to the unified theory • For each pair of conflicting rules for which there is no priority information, a priority relation is added taking into account the context preference relations of the distributed contexts
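
A sketch of the four unification steps over the illustrative structures used so far; how a missing priority is derived from the context preference relations is only indicated by a placeholder, since the slide does not spell it out.

```python
def conflicting(r, s):
    """Two rules conflict when their heads are complementary literals."""
    return r.head[:2] == s.head[:2] and r.head[2] != s.head[2]

def derive_priority(r, s, trusts):
    """Placeholder for step 4: in the full method this is derived from the
    context preference relations of the distributed contexts."""
    return set()

def unify(contexts, priorities, trusts):
    """Merge the distributed theories of an acyclic MCS into one global defeasible theory."""
    strict, defeasible, priority = [], [], set()
    for ctx in contexts.values():
        for r in ctx.rules:                                    # 1-2: local rules keep their kind
            (defeasible if r.defeasible else strict).append(r)
        defeasible.extend(ctx.mappings)                        # 2: mappings become defeasible rules
        priority |= priorities.get(ctx.cid, set())             # 3: copy known priority pairs
    for r in defeasible:                                       # 4: fill in missing priorities
        for s in defeasible:
            if (conflicting(r, s)
                    and (r.name, s.name) not in priority
                    and (s.name, r.name) not in priority):
                priority |= derive_priority(r, s, trusts)
    return strict, defeasible, priority
```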

  28. Synopsis • A representation model that extends Multi-Context Systems with defeasible mapping rules and a preference relation • Challenges of reasoning in MCS • the problem of global inconsistency • Algorithm for conflict resolution using context and preference information

  29. Next Steps • Study the equivalence between non-acyclic MCS and a global defeasible theory using variants of Defeasible Logic with loops • Extend the algorithm to support overlapping vocabularies, which will enable the different context theories to use elements of common vocabularies (e.g. URIs) • Implement the algorithms in Logic Programming (LP) • Study alternative methods for using context and preference information to resolve global conflicts • Study applications in the Ambient Intelligence and Semantic Web domains

  30. Local and Distributed Defeasible Reasoning in Multi-Context Systems Thank You for your Attention! Any Questions? October, 2008
