
LOGIC PROGRAMMING (WEEK 2)



  1. LOGIC PROGRAMMING (WEEK 2)

  2. Lecture 4

  3. Objectives
   • To look in more detail at the Resolution Algorithm.
   • To understand and appreciate the simplicity and elegance of this rule of inference.
   • To appreciate why the simplicity of the inference rule is important.
   • To illustrate the operation of the resolution algorithm through two worked examples.

  4. Resolution I Having converted a predicate logic wff into clausal form, the resolution procedure can exploit this stylised representation to automate the inference process. Resolution consists of a simple iterative process. In its most general form it can be represented by the pseudo-code contained within Figure 7.1, assuming that S is a set of axioms known to be true and P is an axiom whose validity is to be proven.

  5. Resolution Algorithm

   PROCEDURE Resolution (S, P);
   BEGIN
     Convert_to_clausal_form (S);
     Negate (P);
     augment (S, ¬P);
     REPEAT
       Select (parent_1);
       Select (parent_2);
       Resolve (parent_1, parent_2, resolvent);
       augment (S, resolvent)
     UNTIL (resolvent = empty clause)
        OR (no_resolvable_pair_of_clauses)
        OR (a predetermined amount of effort has been expended);
     IF (resolvent = empty clause) THEN P is TRUE;
     IF (no_resolvable_pair_of_clauses) THEN P is FALSE
   END; (* Resolution *)
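The loop above can be sketched in Python for the purely propositional case. Clauses are frozensets of string literals with a leading "~" marking negation, and the two Select steps are replaced by the crudest possible policy, trying every pair of clauses; the names `refute` and `resolvents` are illustrative, not part of the lecture's pseudo-code.

```python
def negate(lit):
    """Return the complementary literal: X <-> ~X."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolvents(c1, c2):
    """All resolvents of two clauses, one per complementary literal pair."""
    return [(c1 - {l}) | (c2 - {negate(l)}) for l in c1 if negate(l) in c2]

def refute(axioms, negated_goal):
    """True if axioms plus the negated goal are unsatisfiable (goal proven)."""
    clauses = set(axioms) | set(negated_goal)   # augment(S, ~P) already done
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                for r in resolvents(c1, c2):
                    if not r:                   # empty clause: contradiction
                        return True
                    new.add(r)
        if new <= clauses:                      # no new resolvent derivable
            return False
        clauses |= new

# The simple example of the next slide: S = {~B v A, C v B, ~C}, prove A.
S = {frozenset({"~B", "A"}), frozenset({"C", "B"}), frozenset({"~C"})}
print(refute(S, {frozenset({"~A"})}))
```

Because the set of literals is finite, the saturation loop must eventually either produce the empty clause or stop generating new resolvents, mirroring the UNTIL conditions of the pseudo-code.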

  6. A Simple Example Consider the following wff, already converted to CNF:

   ¬B v A
   C v B
   ¬C

Assume we wish to prove A. Consequently we negate A and add it to our set of clauses:

   ¬B v A (1)
   C v B (2)
   ¬C (3)
   ¬A (4)

We now select two clauses to resolve.

  7. Simple Example II Let's select clauses (1) and (4), an arbitrary decision at this juncture. Given that there is an implied AND, all four clauses must be true in order for the entire wff to be true. Hence if ¬A (i.e. (4)) is true then A is false. Considering clause (1), ¬B v A: with A false, ¬B must be true in order for the clause to be true. The two clauses are said to have been resolved, the resolvent being ¬B. Hence we add ¬B to the set of known axioms:

   ¬B (1)
   C v B (2)
   ¬C (3)

  8. Simple Example III A second iteration resolves (1) and (2), producing the resolvent C. Adding C to the set of known axioms S gives:

   C (1)
   ¬C (2)

Finally, resolving the two remaining clauses produces a null (empty) clause: we have achieved a null resolvent. If a null clause is produced we conclude our resolution, concluding that the original axiom is true, i.e. A is true.
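The three iterations above can be written out literally, again using frozensets of "~"-prefixed string literals; `resolve_on` is an illustrative helper, not part of the lecture material.

```python
def neg(lit):
    """Complementary literal: X <-> ~X."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve_on(c1, c2, lit):
    """Resolve c1 and c2 on the complementary pair (lit, ~lit)."""
    assert lit in c1 and neg(lit) in c2
    return (c1 - {lit}) | (c2 - {neg(lit)})

c1 = frozenset({"~B", "A"})    # (1)
c2 = frozenset({"C", "B"})     # (2)
c3 = frozenset({"~C"})         # (3)
c4 = frozenset({"~A"})         # (4) negated goal

r1 = resolve_on(c1, c4, "A")   # iteration 1: (1) with (4) yields ~B
r2 = resolve_on(c2, r1, "B")   # iteration 2: (2) with ~B yields C
r3 = resolve_on(c3, r2, "~C")  # iteration 3: ~C with C yields the empty clause
print(r1, r2, r3 == frozenset())
```

The empty frozenset at the final step is the null resolvent, establishing that A is true.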

  9. The Resolution Principle Robinson's resolution rule in its most general form states that given ¬X v Y and X v Z we can derive Y v Z. [1]
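The rule is small enough to state as a single function. A minimal sketch, assuming the same frozenset-of-literals representation with "~" marking negation:

```python
def negate(lit):
    """Complementary literal: X <-> ~X."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Yield every resolvent of two clauses, one per complementary pair."""
    for lit in c1:
        if negate(lit) in c2:
            yield (c1 - {lit}) | (c2 - {negate(lit)})

# Given ~X v Y and X v Z we derive Y v Z:
for resolvent in resolve(frozenset({"~X", "Y"}), frozenset({"X", "Z"})):
    print(sorted(resolvent))
```

The complementary pair (¬X, X) is cancelled and the remaining literals of both parents are unioned, exactly the statement of the rule.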

  10. A Predicate Calculus Example Now a more realistic example. This example involves the manipulation of predicate calculus clauses rather than the propositions we were considering in the previous example:

   ¬hates (U,V) v dislikes (U,V) (1)
   ¬dislikes (X,Y) v ¬dislikes (Y,Z) v dislikes (X,Z) (2)
   hates (mary, john) (3)
   hates (john, joe) (4)
   ¬dislikes (mary, joe) (5) (negated axiom)

If we were to resolve (2) and (5) we would produce the resolvent

   ¬dislikes (mary,Y) v ¬dislikes (Y,joe) (6)

  11. A Predicate Example II This is achieved by the substitution X = mary and Z = joe. If we in turn resolve (1) and (6), replacing U with Y and V with joe, we have

   ¬hates (Y,joe) v dislikes (Y,joe) (1)
   ¬dislikes (mary,Y) v ¬dislikes (Y,joe) (6)

and the resolvent becomes

   ¬hates (Y,joe) v ¬dislikes (mary,Y) (7)

  12. Predicate Calculus Example III Resolution of (1) and (7), assuming U = mary and V = Y,

   ¬hates (mary,Y) v dislikes (mary,Y) (1)
   ¬hates (Y,joe) v ¬dislikes (mary,Y) (7)

would produce

   ¬hates (mary,Y) v ¬hates (Y,joe) (8)

  13. Predicate Calculus Example IV Resolving (3) and (8), with Y = john,

   hates (mary,john) (3)
   ¬hates (mary,john) v ¬hates (john,joe) (8)

produces

   ¬hates (john,joe) (9)

Finally, resolving (4) and (9) produces a null clause and hence proves our original axiom to be true, namely dislikes (mary,joe).

  14. Resolution I Resolution exhibits two important attributes, namely soundness and completeness. It is logically sound since the resolvent of two clauses logically follows from them. For example, consider the following clauses:

   p v q (1)
   ¬q v s (2)

Resolving (1) and (2) will produce the resolvent p v s. (1) and (2) are axioms and as such are known to be true. Examine the potential values of q.

  15. Resolution II

   q = TRUE:  p v q is TRUE;  ¬q v s is TRUE only if s is TRUE
   q = FALSE: ¬q v s is TRUE; p v q is TRUE only if p is TRUE

To be sure that both expressions are true irrespective of the value of q, either s must be true or p must be true. Thus if q is eliminated we can say p v s must be true.
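The case analysis above can be checked exhaustively: over all eight assignments to p, q and s, whenever both premises p v q and ¬q v s hold, the resolvent p v s holds too.

```python
from itertools import product

# Every assignment under which both premises are true:
cases = [(p, q, s) for p, q, s in product([True, False], repeat=3)
         if (p or q) and ((not q) or s)]

# In each such case the resolvent p v s is also true:
assert all(p or s for p, q, s in cases)
print(f"resolvent holds in all {len(cases)} satisfying assignments")
```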

  16. Resolution III Robinson's resolution algorithm is complete because, if the original set of axioms and the negated wff are unsatisfiable, then the empty clause will eventually be produced.

  17. Lecture 5

  18. Control within Resolution The question which must be addressed is: how do we decide which of the clauses within the set of axioms S are to be used as parents in a resolution? Resolution algorithms invariably use some systematic selection mechanism, and consequently they will find a contradiction if one exists. Some strategies for parent selection result in an improvement in the time taken to locate such contradictions. It should be stated that the order in which clauses are resolved together has no effect on the final outcome, that is satisfiable or unsatisfiable; however it will most definitely affect the length of the proof used in obtaining that outcome. The term Control is often used to describe the process of picking your way through a search space. Some of these control strategies are very simple.

  19. Set of Support Strategy One of the parents should be either i) one of the clauses which is part of the statement we are trying to refute, or ii) a resolvent produced by resolving one of the component clauses of the negated clause with another clause. This strategy is loosely based on the intuition that the contradiction we are looking for must involve the negated clause.

  20. Unit Preference Strategy This strategy chooses as its parents clauses which contain a single literal or, in their absence, the clauses with the fewest literals. Obviously such a strategy will produce resolvents that are generally shorter (contain fewer literals) than the longer of the parents. It therefore strives quickly for the goal: a resolvent with no literals.

  21. Breadth First Strategy Takes all the axioms in S and identifies all the possible resolvents. It then identifies all the possible resolvents of the original set S together with the new resolvents. This process continues until a nil resolvent is produced.

  22. Linear Resolution Strategy Linear resolution involves initially selecting two parents for resolution; this produces a resolvent which we use as one of the parents in the next resolution, which in turn yields a resolvent which acts as a parent in the subsequent resolution. Varying criteria can be used to select the other parent clause; we have already considered two possible contenders. Prolog employs a more restricted form of linear resolution, the details of which we will address later.

  23. SLD Resolution Most logic programming systems impose restrictions on the form of resolution they use. Typically:

   (i) One parent must be a definite clause, namely an assertion or implication.
   (ii) The other parent must be the most recent resolvent, and must be a goal.
   (iii) There must exist some rule which facilitates the unique selection of the former of these parents, and of the clause within the most recent resolvent which will form the other parent.

(ii) and (iii) are the attributes characteristic of SL-resolution: Linear resolution with a Selector function. The addition of (i) results in SLD-resolution, which was formerly known as LUSH resolution. [2] [3]

  24. References
   [1] Robinson, J.A., "A Machine-Oriented Logic Based on the Resolution Principle", Journal of the ACM, Vol. 12, No. 1, Jan. 1965.
   [2] Kowalski, R.A., "Predicate Logic as a Programming Language", in Proc. of IFIP 74, North-Holland, 1974.
   [3] Kowalski, R.A. and Kuehner, D.G., "Linear Resolution with Selection Function", Artificial Intelligence 2, 1971.

  25. Linear Input Resolution I Prolog uses a particular form of resolution: linear input resolution. It restricts the choice of parents as follows. Initially we resolve the original goal or query with one of the original set of assertions/implications S. The resulting resolvent is subsequently resolved with a clause in S, and this process continues until the clause to be satisfied is an empty clause. A resolvent is never resolved with another resolvent, only with clauses from the original set S; nor are two clauses within S ever resolved together.

  26. Linear Input Resolution II Many possible strategies or hybrid strategies exist. All those mentioned to date are complete, since they are guaranteed to find a proof if one exists. They are however also SEMI-DECIDABLE, in that they can tell us when a nil resolvent is derivable but cannot always tell us when it is not. Not all strategies are complete: for instance, always using an original input clause as one of your parents is an INCOMPLETE strategy. The unit resolution strategy, choosing a unit clause as one parent, is likewise incomplete.

  27. Linear Input Resolution III Consider the following set S:

   wears (williams, white_coat).
   has (williams, phd).
   tall (jones).
   tall (williams).
   doctor(X) :- wears(X, white_coat), has(X, phd).
   doctor (andrews).

with the following query or goal:

   doctor (X), tall (X).

The first clause in the query would be resolved with the doctor implication, the new resolvent becoming

   wears (X, white_coat), has (X, phd)

This is added to the remainder of the previous resolvent.

  28. Linear Input Resolution IV If we conceive of the query as a list of clauses which need to be solved in order for the original query to be considered true, then we should consider very carefully the implications of the scope of the variables.

  29. The Evolving Query List

  30. Linear Input Resolution V This process of resolving one of the clauses on this list continues, with the list contracting when the other parent is an assertion. The process terminates when the list is nil or no further resolutions are possible. This specific form of linear input resolution results in Prolog adopting a depth-first strategy. This strategy can however result in cycles, where alternative routes in the tree never get considered. It is nevertheless simpler and less computationally expensive than the alternative of a breadth-first strategy, which would always produce a solution if one exists.

  31. Prolog Resolution Linear input resolution as described is more general than that employed by Prolog. Prolog differs in two important respects:

   (i) The resolvent produced from the most recent resolution is always added to the front of the query list.
   (ii) The clause chosen from the query list for resolution is always retrieved from the beginning of the list.

This specific form of linear input resolution results in Prolog adopting a depth-first, left-to-right search strategy.
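The depth-first, left-to-right behaviour on the doctor example can be sketched in Python under drastic simplifications: one rule, one variable, and "unification" reduced to trying each known constant for X. Every name here (`facts`, `holds`, `ground`) is illustrative, and this is not how any real Prolog engine is built.

```python
facts = {("wears", "williams", "white_coat"), ("has", "williams", "phd"),
         ("tall", "jones"), ("tall", "williams"), ("doctor", "andrews")}
# doctor(X) :- wears(X, white_coat), has(X, phd).
rule_body = [("wears", None, "white_coat"), ("has", None, "phd")]

def ground(goal, x):
    """Substitute the candidate constant x for the variable slot (None)."""
    return tuple(x if part is None else part for part in goal)

def holds(goal, x):
    g = ground(goal, x)
    if g in facts:
        return True
    if g[0] == "doctor":
        # Replace the goal by the rule body at the front of the list and
        # solve it left to right before moving on (depth first).
        return all(holds(b, x) for b in rule_body)
    return False

query = [("doctor", None), ("tall", None)]        # ?- doctor(X), tall(X).
solutions = [c for c in ["andrews", "jones", "williams"]
             if all(holds(g, c) for g in query)]
print(solutions)
```

Note that andrews satisfies doctor(X) directly as an assertion but fails the later goal tall(X), while williams satisfies doctor(X) only via the rule body, mirroring the contracting and expanding query list of the slides.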

  32. Refutation Graphs I Resolution proofs can often be depicted using a graph-like notation known as refutation graphs. Every node in the graph represents a clause. In particular, every leaf node represents a clause taken from the augmentation of the original set of axioms S with the negated clause(s). Intermediate nodes represent resolvents of their two parents. The root of a refutation graph will be nil if the set of axioms is unsatisfiable. With reference to the following set of axioms:

   P(X) v Q(X) (1)
   ¬Q(X) v Z(X) (2)
   ¬Z(Y) (3)
   ¬P(V) v R(Y) (4)
   ¬R(U) (5)

one particular refutation graph is depicted in Figure 9.1.

  33. Refutation Graphs II This particular refutation graph has four resolutions. Normally arcs are labelled in some consistent manner, indicating the substitutions or unifications that permit the particular resolution to take place.

  34. Refutation Graphs III

  35. Refutation Graphs IV If the aforementioned set of axioms were augmented such that clause (2) became

   ¬Q(X) v Z(X) v T(M) (2)

and we added two further axioms,

   ¬T(O) v W(C) v ¬R(U) (6)
   ¬W(Q) v Z(S) (7)

then the refutation graph could be considered a true graph, as can be seen within Figure 9.2.

  36. Refutation Graphs V

  37. Things To Do List
   • Ensure you understand how to draw different refutation graphs depicting differing control strategies.
   • Attempt Q5, 1989 past paper.
   • Attempt Q3, 1993 past paper.
   • Understand the particular form of linear input resolution used by Prolog and be able to explain it.

  38. Unification Algorithm - Data Structures

   TYPE
     atom_kind = (const_id, number, string);
     atom = RECORD
       CASE kind : atom_kind OF
         const_id : (name : identifier_name);
         number   : (n : natural);
         string   : (s : string)
     END;
     termkind = (atom_term, var_term, comp_term);
     term = RECORD
       CASE kind : termkind OF
         atom_term : (an_atom : atom);
         var_term  : (a_var : identifier_name);
         comp_term : (a_comp : composite)
     END;

  39. Legal Unifications ?

  40. Lecture 6

  41. The Occurs Check I Such a feature prevents recursive bindings from occurring. A variable and a composite may indeed be unified; however, care must be taken to ensure that the composite is free of any occurrence of the unifying variable. For example, consider the following clauses:

   equal (x).
   equal (f(x)).

Unifying with the substitution x := f(x) produces a recursive binding:

   equal (f(f(f(f(f(x))))))  ...
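A sketch of the check itself: before binding a variable to a composite, walk the composite's term tree looking for that variable. The representation is assumed for illustration only: variables are strings, composites are (functor, arg, ...) tuples.

```python
def occurs(var, term):
    """True if var appears anywhere inside term."""
    if term == var:
        return True
    if isinstance(term, tuple):                  # composite: check each argument
        return any(occurs(var, arg) for arg in term[1:])
    return False

# Binding x := f(x) must be refused, or the binding recurses forever:
print(occurs("x", ("f", "x")))     # True  -> binding refused
print(occurs("x", ("f", "y")))     # False -> binding x := f(y) is safe
```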

  42. The Occurs Check II While the original unification algorithm included an occurs check, most Prolog systems omit the feature. Given that composites are composed of elements which may themselves be composites, there can often be a high degree of nesting. Consequently, checking for the existence of a particular variable every time a unification takes place proves computationally very expensive, often yielding an algorithm with significantly increased time complexity. This, combined with the fact that recursive bindings occur infrequently, explains why most Prolog systems fail to make such a provision.

  43. The Occurs Check III If such a recursive binding arises then you can think of this conceptually as an infinite tree, rather than the usual case where the refutation graph is a directed acyclic graph (DAG), more specifically a tree. One Prolog system (Prolog II) addresses such bindings directly and indeed makes use of such cyclic structures, allowing you to store things as INFINITE TREES.

  44. Unification Data Structures II

   termlist = ^termnode;
   termnode = RECORD
     listhead : term;
     listtail : termlist
   END;
   composite = RECORD
     functor : identifier_name;
     terms   : termlist
   END;

  45. Unification Data Structures III

   bind_pair = RECORD
     first_item  : term;
     second_item : term
   END;
   bind_list = ^binding;
   binding = RECORD
     bind_head : bind_pair;
     binds     : bind_list
   END;

  46. Unification - The Algorithm I

   FUNCTION unify (t1 : term; t2 : term) : bind_list;
   BEGIN
     CASE t1.kind OF
       atom_term : unify := atom_unify (t1, t2);
       var_term  : unify := var_unify (t1, t2);
       comp_term : unify := comp_unify (t1, t2)
     END
   END; (* unify *)

  47. Unification Algorithm II

   FUNCTION var_unify (t1 : term; t2 : term) : bind_list;
   BEGIN
     var_unify := bind_terms (t1, t2)
   END; (* var_unify *)

   FUNCTION atom_unify (t1 : term; t2 : term) : bind_list;
   BEGIN
     WITH t1.an_atom DO
       CASE kind OF
         const_id : atom_unify := const_unify (t1, t2);
         number   : atom_unify := numb_unify (t1, t2);
         string   : atom_unify := string_unify (t1, t2)
       END
   END; (* atom_unify *)

  48. Unification Algorithm III

   FUNCTION comp_unify (t1 : term; t2 : term) : bind_list;
   BEGIN
     CASE t2.kind OF
       atom_term : comp_unify := nil;
       var_term  : comp_unify := bind_terms (t1, t2);
       comp_term :
         BEGIN
           same_functor := (t1.a_comp.functor = t2.a_comp.functor);
           t1_arity := arity (t1);
           t2_arity := arity (t2);
           same_arity := (t1_arity = t2_arity);
           IF (same_functor AND same_arity) THEN
             comp_unify := match_arguments
           ELSE
             comp_unify := nil
         END
     END
   END; (* comp_unify *)

  49. Unification Algorithm IV Matching each of the arguments could be achieved by:

   FUNCTION match_arguments : bind_list;
   BEGIN
     ...
     mismatch := false;
     first_tlist := t1.a_comp.terms;
     sec_tlist := t2.a_comp.terms;
     WHILE (NOT (mismatch) AND (first_tlist <> nil)) DO
     BEGIN
       result := unify (first_tlist^.listhead, sec_tlist^.listhead);
       mismatch := (result = nil);
       first_tlist := first_tlist^.listtail;
       sec_tlist := sec_tlist^.listtail
     END;
     ...
     IF mismatch THEN
       match_arguments := nil
     ELSE
       match_arguments := result
     ...
   END; (* match_arguments *)
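The Pascal routines above can be condensed into a single Python function for experimentation. The representation is assumed, not the lecture's: atoms are lowercase strings or numbers, variables are strings starting with an uppercase letter, composites are (functor, arg, ...) tuples, and bindings come back as a dict rather than Pascal's bind_list. Like the Pascal, and like most Prologs, it performs no occurs check.

```python
def is_var(t):
    """Variables are strings beginning with an uppercase letter."""
    return isinstance(t, str) and t[:1].isupper()

def unify(t1, t2, binds=None):
    """Return a binding dict, or None on mismatch (nil in the Pascal)."""
    binds = dict(binds or {})
    if is_var(t1):
        if t1 in binds:                         # follow an existing binding
            return unify(binds[t1], t2, binds)
        binds[t1] = t2                          # var_unify: bind (no occurs check)
        return binds
    if is_var(t2):
        return unify(t2, t1, binds)
    if isinstance(t1, tuple) and isinstance(t2, tuple):     # comp_unify
        if t1[0] != t2[0] or len(t1) != len(t2):            # functor/arity test
            return None
        for a1, a2 in zip(t1[1:], t2[1:]):                  # match_arguments loop
            binds = unify(a1, a2, binds)
            if binds is None:
                return None
        return binds
    return binds if t1 == t2 else None          # atom_unify

print(unify(("likes", "X", "joe"), ("likes", "mary", "Y")))
```

The three-way dispatch of the Pascal `unify` survives as the chain of `if` tests, and the argument loop mirrors `match_arguments`, aborting on the first nil result.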

  50. Resolution and Unification

   FUNCTION resolution : bind_list;
   BEGIN
     ...
     unify (t1, t2);
     ...
   END;
