Presentation Transcript

  1. Agents that reason logically
Tuomas Sandholm
Carnegie Mellon University, Computer Science Department

  2. Agents that reason logically
Logic:
- a formal language in which knowledge can be expressed
- a means of carrying out reasoning in such a language
Knowledge base (KB) consisting of sentences:
- background knowledge
- TELL'ed sentences

function KB-AGENT(percept) returns an action
  static: KB, a knowledge base
          t, a counter, initially 0, indicating time
  TELL(KB, MAKE-PERCEPT-SENTENCE(percept, t))
  action ← ASK(KB, MAKE-ACTION-QUERY(t))
  TELL(KB, MAKE-ACTION-SENTENCE(action, t))
  t ← t + 1
  return action
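The agent loop above can be sketched in Python. The class and method names (`KBAgent`, `tell`, `ask`, `step`) and the stubbed ASK that always returns "no-op" are illustrative assumptions, not part of the slides; a real agent would implement ASK with a sound inference procedure.

```python
# Minimal sketch of the generic knowledge-based agent loop from the slide.
# The KB is just a set of sentence strings; TELL adds a sentence, ASK is a
# stub -- both bodies are illustrative assumptions.

class KBAgent:
    def __init__(self):
        self.kb = set()  # knowledge base: background knowledge + TELL'ed sentences
        self.t = 0       # time counter

    def tell(self, sentence):
        self.kb.add(sentence)

    def ask(self, query):
        # Stub: a real agent would run a sound inference procedure over self.kb.
        return "no-op"

    def step(self, percept):
        self.tell(f"Percept({percept}, {self.t})")  # MAKE-PERCEPT-SENTENCE
        action = self.ask(f"Action?({self.t})")     # MAKE-ACTION-QUERY
        self.tell(f"Did({action}, {self.t})")       # MAKE-ACTION-SENTENCE
        self.t += 1
        return action
```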

  3. Sentences
[Figure: representation vs. world. Sentences entail sentences (representation); facts follow from facts (world); semantics maps sentences to the facts they represent.]
KB ⊢i α   "α is derived from KB by inference procedure i" (syntax)
KB ⊨ α   "KB entails α" (semantics)
An inference procedure that generates only entailed sentences is called sound (truth-preserving).
Proof = record of the operation of a sound inference procedure.
Proof theory specifies the sound inference steps for a logic.
An inference procedure is complete if it can find a proof for any entailed sentence.

  4. Inference
"The pope is in Denver" under the interpretation Pope = the microfilm, Denver = the pumpkin on the porch.
A sentence is true under a particular interpretation if the state of affairs it represents is the case.
A sentence is valid (a tautology, necessarily true) if it is true in all possible worlds, i.e. regardless of what it is supposed to mean and regardless of the state of affairs in the universe being described. E.g. A ∨ ¬A.
A sentence is satisfiable if there is some interpretation of some world for which it is true. E.g. A ∧ B (satisfiable by setting A = True, B = True).
Unsatisfiable: e.g. A ∧ ¬A.
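Validity and satisfiability can be checked directly by enumerating interpretations. This sketch encodes sentences as Python predicates over an assignment dict; that encoding, and the helper names, are choices made here for illustration.

```python
from itertools import product

def truth_table(symbols, sentence):
    """Yield (interpretation, truth value) for every assignment to symbols."""
    for values in product([True, False], repeat=len(symbols)):
        env = dict(zip(symbols, values))
        yield env, sentence(env)

def valid(symbols, sentence):
    # Valid = true under all possible interpretations.
    return all(value for _, value in truth_table(symbols, sentence))

def satisfiable(symbols, sentence):
    # Satisfiable = true under at least one interpretation.
    return any(value for _, value in truth_table(symbols, sentence))

print(valid(["A"], lambda e: e["A"] or not e["A"]))          # A ∨ ¬A → True (valid)
print(satisfiable(["A", "B"], lambda e: e["A"] and e["B"]))  # A ∧ B → True
print(satisfiable(["A"], lambda e: e["A"] and not e["A"]))   # A ∧ ¬A → False
```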

  5. Ontological commitment (what exists in the world) and epistemological commitment (what an agent believes about facts)

Language            | Ontological commitment           | Epistemological commitment
Propositional logic | Facts                            | True/false/unknown
First-order logic   | Facts, objects, relations        | True/false/unknown
Temporal logic      | Facts, objects, relations, times | True/false/unknown
Probability theory  | Facts                            | Degree of belief 0…1
Fuzzy logic         | Degree of truth                  | Degree of belief 0…1

  6. Propositional Logic (PL): Syntax
Sentence → AtomicSentence | ComplexSentence
AtomicSentence → True | False | P | Q | R | …
ComplexSentence → ( Sentence ) | Sentence Connective Sentence | ¬Sentence
Connective → ∧ | ∨ | ⇒ | ⇔
True, False: logic constants; P, Q, R, …: proposition symbols
∧: conjunction (and'ed together); ∨: disjunction (or'ed together)
Precedence (highest first): ¬, ∧, ∨, ⇒, ⇔
E.g. ¬P ∨ Q ∧ R ⇒ S is equivalent to ((¬P) ∨ (Q ∧ R)) ⇒ S

  7. Propositional Logic: Semantics
A truth table defines the semantics of each connective.

  8. Validity and inference
Truth tables can be used for inference: e.g. ((P ∨ H) ∧ ¬H) ⇒ P.
If the sentence is true in every row, then the sentence is valid. This can be used for machine inference by building a truth table for Premises ⇒ Conclusion and checking all rows. Slow, so we need more powerful inference rules…

  9. Inference rules in propositional logic
E.g. the resolution rule: from α ∨ β and ¬β ∨ γ, conclude α ∨ γ. To prove that P follows from (P ∨ H) and ¬H, we require only one application of the resolution rule, with α as P, β as H, and γ empty.
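A minimal sketch of the resolution rule, assuming clauses are represented as frozensets of (symbol, polarity) literals; that representation is a choice made here for illustration.

```python
# Propositional resolution on clauses represented as frozensets of literals,
# where a literal is a (symbol, polarity) pair.

def resolve(c1, c2):
    """Return all resolvents of the two clauses."""
    resolvents = []
    for (sym, pol) in c1:
        if (sym, not pol) in c2:
            # Cancel the complementary pair, union the rest.
            resolvent = (c1 - {(sym, pol)}) | (c2 - {(sym, not pol)})
            resolvents.append(frozenset(resolvent))
    return resolvents

# From (P ∨ H) and ¬H, one resolution step yields the unit clause P.
c1 = frozenset({("P", True), ("H", True)})
c2 = frozenset({("H", False)})
print(resolve(c1, c2))  # [frozenset({('P', True)})]
```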

  10. Proving soundness of inference rules for propositional logic
… The truth table demonstrating soundness of the resolution inference rule for propositional logic. An inference rule is sound if the conclusion is true in all cases where the premises are true.

  11. Complexity of propositional inference
• The truth table method needs to check 2^n rows for any proof involving n proposition symbols
• NP-complete [Cook 1971]
• 3SAT: is there an assignment s.t. (x1 ∨ x5 ∨ x6) ∧ (x2 ∨ x5 ∨ x6) ∧ … ?
• Most instances may be easy
• Monotonicity: when we add new sentences to a KB, all the sentences entailed by the original KB are still entailed
• Propositional logic (and first-order logic) are monotonic
• Monotonicity allows local inference rules
• Probability theory is not monotonic

  12. Complexity of propositional inference: a tractable special case
A class of sentences that allows polynomial-time inference in propositional logic:
Horn sentence: P1 ∧ P2 ∧ … ∧ Pn ⇒ Q, where the Pi's and Q are non-negated
Inference procedure: apply Modus Ponens whenever possible until no more inferences are possible.
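The Modus Ponens procedure for Horn sentences can be sketched as forward chaining to a fixed point. The (premises, conclusion) rule encoding and the function name are illustrative assumptions; each pass either adds a new fact or terminates, so the running time is polynomial.

```python
# Forward chaining with Modus Ponens over Horn sentences P1 ∧ … ∧ Pn ⇒ Q.
# Rules are (premises, conclusion) pairs; facts are symbols known to be true.

def forward_chain(rules, facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)  # one application of Modus Ponens
                changed = True
    return facts

rules = [({"A", "B"}, "C"), ({"C"}, "D")]
result = forward_chain(rules, {"A", "B"})  # result == {"A", "B", "C", "D"}
```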

  13. Models (= the dark regions in the Venn diagrams below) = those parts of the world where the sentence is true, i.e. those assignments of {True, False} to the proposition symbols. A sentence α is entailed by a KB if the models of the KB are all models of α.
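This definition of entailment can be checked by model enumeration: KB ⊨ α iff every model of the KB is a model of α. Encoding sentences as Python predicates over an assignment dict is an assumption made here for illustration.

```python
from itertools import product

def entails(symbols, kb, alpha):
    """KB ⊨ α: every assignment that makes kb true also makes alpha true."""
    for values in product([True, False], repeat=len(symbols)):
        env = dict(zip(symbols, values))
        if kb(env) and not alpha(env):
            return False  # found a model of KB that is not a model of α
    return True

# KB = (P ∨ H) ∧ ¬H entails P:
print(entails(["P", "H"],
              lambda e: (e["P"] or e["H"]) and not e["H"],
              lambda e: e["P"]))  # True
```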

  14. Another method for inference in propositional logic: model finding
Postulate ¬(Premises ⇒ Conclusion) and try to find a model; if there is none, the premises entail the conclusion.

  15. Applications of model finding • Logic, theorem proving (e.g. Robbins algebra) • Planning (e.g. SATPLAN) • Boolean circuits • Satisfiability checking • Constraint satisfaction • Vision interpretation [e.g. Reiter & Mackworth 89]

  16. Model finding algorithms

  17. Davis-Putnam procedure [1960]
[Figure: a backtracking search tree branching on p1, p2, p3, p4 with True/False branches.]
E.g. for 3SAT: is there an assignment s.t. (p1 ∨ p3 ∨ p4) ∧ (p1 ∨ p2 ∨ p3) ∧ … ?
Backtrack when some clause becomes empty.
Unit propagation (for variable & value ordering): if some clause has only one literal left, assign that variable the value that satisfies the clause (never need to check the other branch).
Complete.
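A sketch of a Davis-Putnam-style procedure with unit propagation and early backtracking on empty clauses, assuming the common DIMACS-like encoding (literals are nonzero integers, with -v meaning ¬v). This is a simplified illustration, not the 1960 algorithm verbatim.

```python
# Backtracking SAT search with unit propagation. A formula is a list of
# clauses; each clause is a list of integer literals (-v means ¬v).

def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    # Unit propagation: repeatedly assign the literal of any 1-literal clause.
    while True:
        simplified = []
        unit = None
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue                      # clause already satisfied
            lits = [l for l in clause if assignment.get(abs(l)) is None]
            if not lits:
                return None                   # empty clause: backtrack
            if len(lits) == 1:
                unit = lits[0]
            simplified.append(lits)
        clauses = simplified
        if unit is None:
            break
        assignment[abs(unit)] = unit > 0      # forced value, no branching needed
    if not clauses:
        return assignment                     # all clauses satisfied
    # Branch on the first unassigned variable.
    var = abs(clauses[0][0])
    for value in (True, False):
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None
```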

  18. A helpful observation for the Davis-Putnam procedure
P1 ∧ P2 ∧ … ∧ Pn ⇒ Q (Horn)
is equivalent to ¬(P1 ∧ P2 ∧ … ∧ Pn) ∨ Q (Horn)
is equivalent to ¬P1 ∨ ¬P2 ∨ … ∨ ¬Pn ∨ Q (Horn clause)
Thrm. If a propositional theory consists only of Horn clauses (i.e., clauses that have at most one non-negated variable) and unit propagation does not result in an explicit contradiction (i.e., Pi and ¬Pi for some Pi), then the theory is satisfiable.
Proof. On the next page.
…so the Davis-Putnam algorithm does not need to branch on variables that occur only in Horn clauses.

  19. Proof of the thrm
Assume the theory is Horn and that unit propagation has completed (without contradiction). We can remove all the clauses that were satisfied by the assignments that unit propagation made. From the unsatisfied clauses, we remove the variables that were assigned values by unit propagation. The remaining theory has the following two types of clauses, which contain unassigned variables only:
¬P1 ∨ ¬P2 ∨ … ∨ ¬Pn ∨ Q   and   ¬P1 ∨ ¬P2 ∨ … ∨ ¬Pn
Each remaining clause has at least two variables (otherwise unit propagation would have applied to the clause). Therefore, each remaining clause has at least one negated variable. Therefore, we can satisfy all remaining clauses by assigning each remaining variable to False.

  20. Variable ordering heuristic for the Davis-Putnam procedure [Crawford &amp; Auton AAAI-93]
Heuristic: Pick a non-negated variable that occurs in a non-Horn clause (more than 1 non-negated variable) with a minimal number of non-negated variables.
Motivation: This is effectively a "most constrained first" heuristic if we view each non-Horn clause as a "variable" that has to be satisfied by setting one of its non-negated variables to True. In that view, the branching factor is the number of non-negated variables the clause contains.
Q: Why is branching constrained to non-negated variables?
A: We can ignore any negated variables in the non-Horn clauses because
• whenever any one of the non-negated variables is set to True, the clause becomes redundant (satisfied), and
• whenever all but one of the non-negated variables are set to False, the clause becomes Horn.
Variable ordering heuristics can make several orders of magnitude difference in speed.

  21. "Order parameter" for 3SAT [Mitchell, Selman, Levesque AAAI-92]
• b = #clauses / #variables
• This predicts
  • satisfiability
  • hardness of finding a model

  22. Generality of the order parameter b • The results seem quite general across model finding algorithms • Other constraint satisfaction problems have order parameters as well

  23. …but the complexity peak does not occur under all ways of generating the 3SAT instances

  24. GSAT [Selman, Levesque, Mitchell AAAI-92] (= a local search algorithm for model finding)
[Figure: avg. total flips vs. max-climbs (100–200) for 50 variables, 215 3SAT clauses; y-axis 400–2000.]
Incomplete (unless restarted a lot).
Greediness is not essential as long as climbs and sideways moves are preferred over downward moves.
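A sketch of the GSAT idea under the same integer-literal clause encoding: start from a random assignment and repeatedly flip the variable that maximizes the number of satisfied clauses, restarting after a flip budget is exhausted. The parameter names (`max_restarts`, `max_climbs`) and the tie-breaking are illustrative choices.

```python
import random

def num_satisfied(clauses, assignment):
    return sum(any(assignment[abs(l)] == (l > 0) for l in clause)
               for clause in clauses)

def gsat(clauses, n_vars, max_restarts=10, max_climbs=100, seed=0):
    rng = random.Random(seed)
    for _ in range(max_restarts):
        # Random restart.
        assignment = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}
        for _ in range(max_climbs):
            if num_satisfied(clauses, assignment) == len(clauses):
                return assignment
            # Greedy move: flip the variable whose flip satisfies the most
            # clauses (this may be an upward, sideways, or downward move).
            best = max(range(1, n_vars + 1),
                       key=lambda v: num_satisfied(
                           clauses, {**assignment, v: not assignment[v]}))
            assignment[best] = not assignment[best]
    return None  # incomplete: failure does not prove unsatisfiability
```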

  25. Restarting vs. Escaping

  26. BREAKOUT algorithm [Morris AAAI-93]
Initialize all variables Pi randomly
UNTIL current state is a solution
  IF current state is not a local minimum
    THEN make any local change that reduces the total cost (i.e. flip one Pi)
    ELSE increase the weights of all unsatisfied clauses by one
Incomplete, but very efficient on large (easy) satisfiable problems.
Reason for incompleteness: the cost increase at the current local optimum spills over to other solutions because they share unsatisfied clauses.
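The BREAKOUT loop above can be sketched as follows, again with integer-literal clauses; the step limit and random tie-breaking are illustrative assumptions. Instead of restarting at a local minimum, the algorithm reweights the unsatisfied clauses so the minimum stops being a minimum.

```python
import random

def breakout(clauses, n_vars, max_steps=1000, seed=0):
    rng = random.Random(seed)
    weights = [1] * len(clauses)  # each clause starts with weight 1
    assignment = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}

    def unsat(a):
        return [i for i, c in enumerate(clauses)
                if not any(a[abs(l)] == (l > 0) for l in c)]

    def cost(a):
        # Cost = total weight of unsatisfied clauses.
        return sum(weights[i] for i in unsat(a))

    for _ in range(max_steps):
        if not unsat(assignment):
            return assignment
        current = cost(assignment)
        # Any single flip that strictly reduces the weighted cost.
        improving = [v for v in range(1, n_vars + 1)
                     if cost({**assignment, v: not assignment[v]}) < current]
        if improving:
            v = rng.choice(improving)
            assignment[v] = not assignment[v]
        else:
            for i in unsat(assignment):   # local minimum: reweight to escape
                weights[i] += 1
    return None  # incomplete: failure does not prove unsatisfiability
```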

  27. Summary of the algorithms we covered for inference in propositional logic
• Truth table method
• Inference rules
• Model finding algorithms
  • Davis-Putnam (systematic backtracking)
    • Early backtracking when a clause becomes empty
    • Unit propagation
    • Variable (& value?) ordering heuristics
  • GSAT
  • BREAKOUT

  28. Propositional logic is too weak a representational language
• Too many propositions to handle, and the truth table has 2^n rows. E.g. in the wumpus world, the simple rule "don't go forward if the wumpus is in front of you" requires 64 rules (16 squares × 4 orientations for the agent).
• Hard to deal with change. Propositions might be true at some times but not at others. We need a separate proposition P_i^t for each time step, because one should not simply forget what held in the past (e.g. where the agent came from):
  • we don't know the # of time steps
  • we need time-dependent versions of the rules
• Hard to identify "individuals", e.g. Mary, 3
• Cannot directly talk about properties of individuals or relations between individuals, e.g. Tall(Bill)
• Generalizations and patterns cannot easily be represented: "all triangles have 3 sides."