Knowledge Representation and Reasoning (Representação do Conhecimento e Raciocínio)

Presentation Transcript


  1. Knowledge Representation and Reasoning (Representação do Conhecimento e Raciocínio) José Júlio Alferes

  2. Part 1: Introduction

  3. What is it ? • What data does an intelligent “agent” deal with? - Not just facts or tuples. • How does an “agent” know what surrounds it? What are the rules of the game? • One must represent that “knowledge”. • And what to do afterwards with that knowledge? How to draw conclusions from it? How to reason? • Knowledge Representation and Reasoning is for AI what Algorithms and Data Structures are for Computation

  4. What is it good for ? • Fundamental topic in Artificial Intelligence • Planning • Legal Knowledge • Model-Based Diagnosis • Expert Systems • Semantic Web (http://www.w3.org) • Reasoning on the Web (http://www.rewerse.com) • Ontologies and data-modeling

  5. What is this course about? • Logic approaches to knowledge representation • Issues in knowledge representation • semantics, expressivity, complexity • Representation formalisms • Forms of reasoning • Methodologies • Applications

  6. Bibliography • Will be pointed out as we go along (articles, surveys) in the summaries at the web page • For the first part of the syllabus: • Reasoning with Logic Programming J. J. Alferes and L. M. Pereira Springer LNAI, 1996 • Nonmonotonic Reasoning G. Antoniou MIT Press, 1996.

  7. What prior knowledge? • Computational Logic • Introduction to Artificial Intelligence • Logic Programming

  8. Logic for KRR • Logic is a language conceived for representing knowledge • It was developed for representing mathematical knowledge • What is appropriate for mathematical knowledge might not be so for representing common sense • What is appropriate for mathematical knowledge might be too complex for modeling data.

  9. Mathematical knowledge vs common sense • Complete vs incomplete knowledge • ∀x: x ∈ N → x ∈ R • go_Work → use_car • Solid inferences vs default ones • In the face of incomplete knowledge • In emergency situations • In taxonomies • In legal reasoning • ...

  10. Monotonicity of Logic • Classical Logic is monotonic: if T ⊨ F then T ∪ T’ ⊨ F • This is a basic property which makes sense for mathematical knowledge • But it is not desirable for knowledge representation in general!

  11. Non-monotonic logics • Do not obey that property • Appropriate for Common Sense Knowledge • Default Logic • Introduces default rules • Autoepistemic Logic • Introduces (modal) operators which speak about knowledge and beliefs • Logic Programming

  12. Logics for Modeling • Mathematical 1st order logics can be used for modeling data and concepts. E.g. • Define ontologies • Define (ER) models for databases • Here monotonicity is not a problem • Knowledge is (assumed) complete • But undecidability, complexity, and even notation might be a problem

  13. Description Logics • Can be seen as subsets of 1st order logics • Less expressive • Enough (and tailored for) describing concepts/ontologies • Decidable inference procedures • (arguably) more convenient notation • Quite useful in data modeling • New applications to Semantic Web • Languages for the Semantic Web are in fact Description Logics!
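
For illustration (this concrete example is not from the slides), a typical DL concept definition and its reading as a 1st order formula: Mother ≡ Woman ⊓ ∃hasChild.Person corresponds to ∀x. Mother(x) ↔ Woman(x) ∧ ∃y.(hasChild(x,y) ∧ Person(y)), which shows both the restricted, variable-free syntax and the fragment of 1st order logic being used.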

  14. In this course (revisited) • Non-Monotonic Logics • Languages • Tools • Methodologies • Applications • Description Logics • Idem…

  15. Part 2: Default and Autoepistemic Logics

  16. Default Logic • Proposed by Ray Reiter (1980) • The material implication go_Work → use_car does not admit exceptions! • Default rules do: go_Work : use_car / use_car (if go_Work holds and use_car is consistent, conclude use_car)

  17. More examples • anniversary(X) ∧ friend(X) : give_gift(X) / give_gift(X) • friend(X,Y) ∧ friend(Y,Z) : friend(X,Z) / friend(X,Z) • accused(X) : innocent(X) / innocent(X)

  18. Default Logic Syntax • A theory is a pair (W,D), where: • W is a set of 1st order formulas • D is a set of default rules of the form: φ : ψ1, …, ψn / γ • φ (pre-requisite), ψi (justifications) and γ (conclusion) are 1st order formulas

  19. The issue of semantics • If φ is true (where?) and all ψi are consistent (with what?) then γ becomes true (becomes? Wasn’t it before?) • Conclusions must: • be a closed set • contain W • apply the rules of D maximally, without becoming unsupported

  20. Default extensions • Γ(S) is the smallest set such that: • W ⊆ Γ(S) • Th(Γ(S)) = Γ(S) • A : Bi / C ∈ D, A ∈ Γ(S) and ¬Bi ∉ S (for all i) → C ∈ Γ(S) • E is an extension of (W,D) iff E = Γ(E)

  21. Quasi-inductive definition • E is an extension iff E = ∪i Ei for: • E0 = W • Ei+1 = Th(Ei) ∪ {C : A:Bj/C ∈ D, A ∈ Ei, ¬Bj ∉ E}
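
A small worked example (ours, for illustration; not on the slide): let W = {bird(tweety)} and D = {bird(X) : flies(X) / flies(X)}, and guess E = Th({bird(tweety), flies(tweety)}). Then E0 = W; the instantiated default is applicable, since bird(tweety) ∈ E0 and ¬flies(tweety) ∉ E, so E1 = Th(E0) ∪ {flies(tweety)}; no further defaults apply, ∪i Ei = E, and so E is an extension (in fact the only one).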

  22. Some properties • (W,D) has an inconsistent extension iff W is inconsistent • If an inconsistent extension exists, it is unique • If W ∪ Just(D) ∪ Conc(D) is consistent, then there is only a single extension • If E is an extension of (W,D), then it is also an extension of (W ∪ E’, D) for any E’ ⊆ E

  23. Operational semantics • The computation of an extension can be reduced to finding an order of rule application (without repetitions) • P = (d1,d2,...) and P[k] is the initial segment of P with k elements • In(P) = Th(W ∪ {conc(d) | d ∈ P}) • The conclusions after the rules in P are applied • Out(P) = {¬ψ | ψ ∈ just(d) and d ∈ P} • The formulas which may not become true after application of the rules in P

  24. Operational semantics (cont’d) • d is applicable in P iff pre(d) ∈ In(P) and ¬ψ ∉ In(P), for every justification ψ of d • P is a process iff every dk ∈ P is applicable in P[k-1] • A process P is: • successful iff In(P) ∩ Out(P) = {}. Otherwise it is failed. • closed iff every d ∈ D applicable in P already belongs to P • Theorem: E is an extension iff there exists a successful and closed process P such that In(P) = E

  25. Computing extensions (Antoniou page 39)

:- op(900, fy, ~).  % ~F denotes the negation of formula F

% Defaults are terms default(Pre, Just, Conc); prove/2 (a prover for the
% underlying logic) is assumed to be defined elsewhere.
extension(W,D,E) :- process(D,[],W,[],_,E,_).

% apply one more new, applicable default ...
process(D,Pcur,InCur,OutCur,P,In,Out) :-
    getNewDefault(default(A,B,C),D,Pcur),
    prove(InCur,[A]),
    not(prove(InCur,[~B])),
    process(D,[default(A,B,C)|Pcur],[C|InCur],[~B|OutCur],P,In,Out).
% ... or stop when the process is closed and successful
process(D,P,In,Out,P,In,Out) :-
    closed(D,P,In),
    successful(In,Out).

closed(D,P,In) :-        % no new default is applicable w.r.t. In
    not(( getNewDefault(default(A,B,C),D,P),
          prove(In,[A]),
          not(prove(In,[~B])) )).

successful(In,Out) :-    % In and Out do not intersect
    not(( member(B,Out), member(B,In) )).

getNewDefault(Def,D,P) :- % a default of D not yet used in process P
    member(Def,D),
    not(member(Def,P)).
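
A possible call (illustrative only; prove/2, proving a formula from the current In-set, is assumed as background and is not given on the slide, and this encoding of the theory is ours): ?- extension([bird(tweety)], [default(bird(tweety), flies(tweety), flies(tweety))], E). With such a prover available, E would contain flies(tweety) together with bird(tweety), matching the extension Th({bird(tweety), flies(tweety)}) of the corresponding default theory.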

  26. Normal theories • Every rule has its justification identical to its conclusion • Normal theories always have extensions • If D grows, then the extensions grow (semi-monotonicity) • They are not good for everything: • John is a recent graduate • Normally recent graduates are adults • Normally adults who are not recent graduates have a job (this cannot be coded with a normal rule!)

  27. Problems • No guarantee of extension existence • Deficiencies in reasoning by cases • D = {italian : wine / wine, french : wine / wine} • W = {italian ∨ french} • No guarantee of consistency among justifications • D = {: usable(X), ¬broken(X) / usable(X)} • W = {broken(right) ∨ broken(left)} • Non-cumulativity • D = {: p / p, p∨q : ¬p / ¬p} • derives p (and so p ∨ q), but after adding p ∨ q it no longer derives p

  28. Auto-Epistemic Logic • Proposed by Moore (1985) • Contemplates reflection on self knowledge (auto-epistemic) • Allows for representing knowledge not just about the external world, but also about the knowledge I have of it

  29. Syntax of AEL • 1st Order Logic, plus the operator L (applied to formulas) • Lφ means “I know φ” • Examples: MScOnSW → L MScSW (or ¬L MScOnSW → ¬MScOnSW) young(X) ∧ ¬L ¬studies(X) → studies(X)

  30. Meaning of AEL • What do I know? • What I can derive (in all models) • And what do I not know? • What I cannot derive • But what can be derived depends on what I know • Add knowledge, then test

  31. Semantics of AEL • T* is an expansion of theory T iff T* = Th(T ∪ {Lφ : T* ⊨ φ} ∪ {¬Lφ : T* ⊭ φ}) • Assuming the inference rule φ / Lφ: T* = CnAEL(T ∪ {¬Lφ : T* ⊭ φ}) • An AEL theory is always two-valued in L, that is, for every expansion and every formula φ: either Lφ ∈ T* or ¬Lφ ∈ T*
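
A small worked example (ours, not on the slide): T = {¬L p → q}. Since p is not derivable, every expansion must contain ¬L p, hence also q and L q; the resulting T* is indeed the unique expansion of T. Assuming L p instead would require p ∈ T*, which cannot be, so no expansion contains L p.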

  32. Knowledge vs. Belief • Belief is a weaker concept • For every formula, either I know it or I know it not • There may be formulas which I neither believe nor believe their contrary • The Auto-Epistemic Logic of knowledge and belief (AELB) also introduces the operator B: Bφ – I believe in φ

  33. AELB Example • I rent a film if I believe I’m neither going to baseball nor football games: ¬B baseball ∧ ¬B football → rent_film • I don’t buy tickets if I don’t know I’m going to baseball nor know I’m going to football: ¬L baseball ∧ ¬L football → ¬buy_tickets • I’m going to football or baseball: baseball ∨ football • I should not conclude that I rent a film, but do conclude that I should not buy tickets

  34. Axioms about beliefs • Consistency Axiom: ¬B⊥ • Normality Axiom: B(F → G) → (B F → B G) • Necessitation rule: F / B F

  35. Minimal models • In what do I believe? • In that which belongs to all preferred models • Which are the preferred models? • Those that, for one same set of beliefs, have a minimal number of true things • A model M is minimal iff there is no smaller model N coincident with M on the Bφ and Lφ atoms • When φ is true in all minimal models of T, we write T ⊨min φ

  36. AELB expansions • T* is a static expansion of T iff T* = CnAELB(T ∪ {¬Lφ : T* ⊭ φ} ∪ {Bφ : T* ⊨min φ}) where CnAELB denotes closure using the axioms of AELB plus necessitation for L

  37. The special case of AEB • Because of its properties, the case of theories without the knowledge operator is especially interesting • Then, the definition of expansion becomes: T* = ΨT(T*) where ΨT(T*) = CnAEB(T ∪ {Bφ : T* ⊨min φ}) and CnAEB denotes closure using the axioms of AEB

  38. Least expansion • Theorem: the operator ΨT is monotonic, i.e. if T ⊆ T1 ⊆ T2 then ΨT(T1) ⊆ ΨT(T2) • Hence, there always exists a minimal expansion of T, obtainable by transfinite induction: • T0 = CnAEB(T) • Ti+1 = ΨT(Ti) • Tβ = ∪α<β Tα (for limit ordinals β)

  39. Consequences • Every AEB theory has at least one expansion • If a theory is affirmative (i.e. all clauses have at least one positive literal) then it has at least one consistent expansion • There is a procedure to compute the semantics

  40. Part 3: Logic Programming for Knowledge representation 3.1 Semantics of Normal Logic Programs

  41. LP for Knowledge Representation • Due to its declarative nature, LP has become a prime candidate for Knowledge Representation and Reasoning • This has been more noticeable since its relations to other NMR formalisms were established • For this usage of LP, a precise declarative semantics was in order

  42. Language • A Normal Logic Program P is a set of rules: H ← A1, …, An, not B1, …, not Bm (n, m ≥ 0) where H, Ai and Bj are atoms • Literals not Bj are called default literals • When no rule in P has default literals, P is called definite • The Herbrand base HP is the set of all instantiated atoms from program P. • We will consider programs as possibly infinite sets of instantiated rules.

  43. Declarative Programming • A logic program can be an executable specification of a problem: member(X,[X|Y]). member(X,[Y|L]) ← member(X,L). • Easier to program, compact code • Adequate for building prototypes • Given efficient implementations, why not use it to “program” directly?
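
Illustrative queries (ours, not from the slide), running the program in a Prolog system where ← is written ':-':
?- member(2, [1,2,3]).
true.
?- member(X, [1,2]).
X = 1 ;
X = 2.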

  44. LP and Deductive Databases • In a database, tables are viewed as sets of facts. E.g. the rows of a table flight(from, to) become facts: Lisbon–Adam ⇒ flight(lisbon, adam). Lisbon–London ⇒ flight(lisbon, london). … • Other relations are represented with rules: connection(A,B) ← flight(A,B). connection(A,B) ← flight(A,C), connection(C,B). chooseAnother(A,B) ← not connection(A,B).

  45. LP and Deductive DBs (cont) • LP allows storing, besides relations, rules for deducing other relations • Note that default negation cannot be classical negation in: connection(A,B) ← flight(A,B). connection(A,B) ← flight(A,C), connection(C,B). chooseAnother(A,B) ← not connection(A,B). • A form of Closed World Assumption (CWA) is needed for inferring non-availability of connections

  46. Default Rules • The representation of default rules, such as “All birds fly”, can be done via the non-monotonic operator not: flies(A) ← bird(A), not abnormal(A). bird(P) ← penguin(P). abnormal(P) ← penguin(P). bird(a). penguin(p).
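
Illustrative queries (ours; in standard Prolog the operator not is written \+): the bird a is concluded to fly because abnormal(a) cannot be proved, while the penguin p is not:
?- flies(a).
true.
?- flies(p).
false.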

  47. The need for a semantics • In all the previous examples, classical logic is not an appropriate semantics • In the 1st, it does not derive not member(3,[1,2]) • In the 2nd, it never concludes choosing another company • In the 3rd, all abnormalities must be expressed • The precise definition of a declarative semantics for LPs is recognized as an important issue for its use in KRR.

  48. 2-valued Interpretations • A 2-valued interpretation I of P is a subset of HP • A is true in I (i.e. I(A) = 1) iff A ∈ I • Otherwise, A is false in I (i.e. I(A) = 0) • Interpretations can be viewed as representing possible states of knowledge. • If knowledge is incomplete, there might be states in which some atoms are neither true nor false

  49. 3-valued Interpretations • A 3-valued interpretation I of P is a set I = T ∪ not F where T and F are disjoint subsets of HP • A is true in I iff A ∈ T • A is false in I iff A ∈ F • Otherwise, A is undefined (I(A) = 1/2) • 2-valued interpretations are a special case, where: HP = T ∪ F

  50. Models • Models can be defined via an evaluation function Î: • For an atom A, Î(A) = I(A) • For a formula F, Î(not F) = 1 - Î(F) • For formulas F and G: • Î((F,G)) = min(Î(F), Î(G)) • Î(F ← G) = 1 if Î(F) ≥ Î(G), and = 0 otherwise • I is a model of P iff, for every rule H ← B of P: Î(H ← B) = 1
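
A small worked example (ours, not on the slide): let P = {a ← not b, b ← c} and I = {a, not b, not c} (i.e. T = {a}, F = {b,c}). Then Î(not b) = 1 and Î(a) = 1, so Î(a ← not b) = 1; and Î(b) = 0 ≥ Î(c) = 0, so Î(b ← c) = 1. Hence I is a model of P.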
