
CS 2710, ISSP 2610

Chapter 7: Propositional Logic Reasoning

Knowledge Based Agents
  • Central component: knowledge base, or KB.
  • A set of sentences in a knowledge representation language
  • Generic functions (a minimal sketch follows below)
    • TELL (add a fact to the knowledge base)
    • ASK (get the next action based on the information in the KB)
  • [Inference?]
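A minimal sketch of this generic agent loop in Python; the string-based KB and the stubbed ASK below are illustrative assumptions, not the textbook's implementation.

```python
# Minimal knowledge-based agent skeleton (illustrative only): the KB stores
# sentences as strings and ASK is a stub standing in for real inference.

class KnowledgeBase:
    def __init__(self):
        self.sentences = []            # the set of sentences in the KB

    def tell(self, sentence):
        """TELL: add a fact to the knowledge base."""
        self.sentences.append(sentence)

    def ask(self, query):
        """ASK: answer a query from the KB (placeholder for real inference)."""
        return query in self.sentences


class KBAgent:
    def __init__(self):
        self.kb = KnowledgeBase()
        self.t = 0                     # time step

    def program(self, percept):
        # TELL the KB what the agent perceives at time t
        self.kb.tell(f"percept {percept} at t={self.t}")
        # ASK the KB what action to take (stubbed: always move forward)
        action = "Forward"
        # TELL the KB which action was chosen
        self.kb.tell(f"action {action} at t={self.t}")
        self.t += 1
        return action
```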
Note
  • We’re all familiar with logic from mathematics
  • In AI, we need to think about things we usually take for granted
  • To use the notion of prime number, you don’t have to think about what prime means … or number, for that matter
  • In AI, we create knowledge representation schemes, which define the objects, predicates, and functions that exist
The Wumpus World
  • The game is played on an M×N grid
  • One player, one wumpus, one or more pits
  • The wumpus eats anyone who enters its room
  • A pit traps anyone who moves into its room
  • (The wumpus can be shot by an agent, but the agent has only one arrow; this and some other details won’t come up in these notes)
  • The player starts at (0,0)
  • Actions: move forward, turn left, turn right
  • If player tries to move forward and bumps into a wall, the player does not move
  • Player can grab the gold if they are in the same room
  • Goal: find gold while avoiding wumpus and pits
Sensors
  • 5 sensors, each of which gives one bit of information
    • Stench (if the wumpus and the player are in adjacent, not diagonal, squares)
    • Breeze (if a pit and the player are in adjacent squares)
    • Glitter (if the gold and the player are in the same room)
    • Bump (when the agent walks into a wall)
    • Scream (when the wumpus is killed; the sound can be perceived everywhere)
  • Percepts: 5-tuples (one possible encoding is sketched below)
    • [stench, breeze, glitter, bump, scream]
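One possible encoding of the percept 5-tuple in Python; the field names come from the list above, but the namedtuple choice itself is an illustrative assumption.

```python
from collections import namedtuple

# A percept is a 5-tuple of booleans: [stench, breeze, glitter, bump, scream]
Percept = namedtuple("Percept", ["stench", "breeze", "glitter", "bump", "scream"])

# Example: the agent senses a breeze and nothing else
p = Percept(stench=False, breeze=True, glitter=False, bump=False, scream=False)
print(p.breeze)   # True
```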
Environment?
  • Discrete, static, single-agent (wumpus doesn’t move)
  • Sequential (player’s decisions can affect future ones)
  • Partially observable (locations of pits and wumpus are unobservable)
[Figure: the Wumpus World grid; the player starts at (0,0)]

Wumpus
  • Main difficulty: player doesn’t know locations of pits, gold, and wumpus
  • Reason about configuration
  • Knowledge evolves as new percepts arrive and actions are taken.
Examples of reasoning [diagram in lecture]
  • Player starts in (0,0). No stench or breeze. So, no pit or wumpus in (1,0) or (0,1). Player still alive, so no pit or wumpus in (0,0). Player moves to (1,0)
  • Breeze in (1,0). So, there must be a pit in (0,0), (2,0) or (1,1). Player already ruled out (0,0)
  • Since there is danger in moving to (2,0) and (1,1), player moves back to (0,0)
  • Player moves to (0,1). No positive percepts. Now the player knows there is no pit in (1,1) (yay!) and, by elimination, that there is a pit in (2,0) (marked pit-def in the diagram)

Player combines knowledge gained at different times to make inferences

Fundamental Concepts of logical representation and reasoning
  • Information is represented in sentences, which must have correct syntax

( 1 + 2 ) * 7 = 21 vs. 2 ) + 7 = * ( 1 21

  • The semantics of a sentence defines its truth with respect to each possible world – an interpretation assigning T or F to all propositions
  • W is a model of S means that sentence S is true under interpretation W
  • [What do the following mean?]
    • X |= Y
    • X entails Y
    • Y logically follows from X
Which are true? Which are not true but useful?
  • {Man, Man ⇒ Mortal} |= Mortal
  • {Raining, Dog ⇒ Mammal} |= Mammal
  • {Raining, Raining ⇒ Wet} |= Wet
  • {Smoke, Fire ⇒ Smoke} |= Fire
  • {Tall ^ Silly} |= Tall
  • {Tall v Silly} |= Silly
  • {Tall, Silly} |= Tall ^ Silly

[Wumpus world EG illustrating possible worlds and entailment]

Entailment (reminder)
  • A |= B
  • Under all interpretations in which A is true, B is true as well
  • All models of A are models of B
  • Whenever A is true, B is true as well
  • A entails B
  • B logically follows from A
Inference

KB |-i A

Inference algorithm i can derive A from KB

A can be inferred from KB using i

Inference Algorithm Examples
  • {A,B} |- (^intro) A ^ B
  • {A, AB} |- (MP) B
  • {A, B  A} |- (abduction) B
  • Internet says P|- (gullibility) P
  • Professor says P |- (trust in authority) P
Inference Algorithms
  • Notes:
    • ⇒ : implication, a piece of syntax
    • |= : entailment, used to describe semantics
    • |- : can be derived from*
  • Inference algorithm, inference procedure, rule of inference, inference rule: a procedure that derives sentences from sentences

  • [Definition of soundness of inference algorithm i]
  • [Definition of completeness of inference algorithm i]
  • * Though it can mean different things in different math/logic contexts; see “turnstile” in Wikipedia
Monotonicity
  • [A logic is monotonic if…]
Propositional Logic Syntax
  • Sentence -> AtomicSent | ComplexSent
  • AtomicSent -> true | false | P, Q, R, …
  • ComplexSent -> ¬ Sentence |
                   ( Sentence ∧ Sentence ) |
                   ( Sentence ∨ Sentence ) |
                   ( Sentence ⇒ Sentence ) |
                   ( Sentence ⇔ Sentence ) |
                   ( Sentence )

[no predicate or function symbols]
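One possible way to mirror this grammar in code (an illustrative assumption, not the textbook's representation): atomic sentences are strings, and complex sentences are nested tuples whose first element names the connective.

```python
# A possible encoding of the grammar above (illustrative): atomic sentences
# are strings such as 'P' or 'Q'; complex sentences are nested tuples whose
# first element names the connective.

negation      = ('~',   'P')            # ¬P
conjunction   = ('and', 'P', 'Q')       # (P ∧ Q)
disjunction   = ('or',  'P', 'Q')       # (P ∨ Q)
implication   = ('=>',  'P', 'Q')       # (P ⇒ Q)
biconditional = ('<=>', 'P', 'Q')       # (P ⇔ Q)

# Sentences nest, e.g. ((P ∧ Q) ⇒ ¬R):
nested = ('=>', ('and', 'P', 'Q'), ('~', 'R'))
```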

Propositional Logic Sentences
  • If there is a pit at [1,1], there is a breeze at [1,0]

P11 ⇒ B10

  • There is a breeze at [2,2], if and only if there is a pit in the neighborhood

B22 ⇔ ( P21 ∨ P23 ∨ P12 ∨ P32 )

  • There is no breeze at [2,2]

¬B22
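Using the tuple encoding sketched under the syntax slide (still an illustrative assumption), the three sentences above could be written as:

```python
# If there is a pit at [1,1], there is a breeze at [1,0]:   P11 => B10
s1 = ('=>', 'P11', 'B10')

# Breeze at [2,2] iff a pit in a neighboring square: B22 <=> (P21 v P23 v P12 v P32)
# ('or' is kept binary here, so the long disjunction is nested)
s2 = ('<=>', 'B22', ('or', 'P21', ('or', 'P23', ('or', 'P12', 'P32'))))

# No breeze at [2,2]:   ~B22
s3 = ('~', 'B22')
```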

Semantics of Prop Logic
  • In model-theoretic semantics, an interpretation assigns elements of the world to sentences, and defines the truth values of sentences
  • Propositional logic: easy! Assign T or F to each proposition symbol, then assign truth values to complex sentences in the obvious way (a small evaluator is sketched below)
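The evaluator mentioned above, as a small sketch over the tuple sentence encoding used earlier; an interpretation is assumed to be a dict mapping proposition symbols to True/False.

```python
def pl_true(sentence, interpretation):
    """Truth value of a propositional sentence under an interpretation
    (a dict mapping symbols such as 'P11' to True/False)."""
    if isinstance(sentence, str):                 # atomic sentence
        return interpretation[sentence]
    op, *args = sentence
    if op == '~':
        return not pl_true(args[0], interpretation)
    if op == 'and':
        return all(pl_true(a, interpretation) for a in args)
    if op == 'or':
        return any(pl_true(a, interpretation) for a in args)
    if op == '=>':
        return (not pl_true(args[0], interpretation)) or pl_true(args[1], interpretation)
    if op == '<=>':
        return pl_true(args[0], interpretation) == pl_true(args[1], interpretation)
    raise ValueError(f"unknown connective: {op}")

# Example: B22 <=> (P21 v P23) is true when B22 is true and P23 is true
print(pl_true(('<=>', 'B22', ('or', 'P21', 'P23')),
              {'B22': True, 'P21': False, 'P23': True}))   # True
```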
Logical Equivalences
  • Sentences A and B are logically equivalent if:
  • they are true under exactly the same interpretations
  • A |= B and B |= A
Validity
  • A sentence (or set of sentences) is valid if
  • it is true under all interpretations
  • Example: P v ~P
Satisfiability
  • A sentence (or set of sentences) is satisfiable if there exists some interpretation that makes it true
  • An interpretation satisfies a set of sentences if it makes them true
Entailment
  • A |= B
  • All interpretations that satisfy A also satisfy B
Propositional Logic Inference
  • Question: Does KB entail S?
  • Method 1: Truth Table Entailment (a sketch follows below)
  • Method 2: Proof
    • Proof by deduction, induction, contradiction, etc.
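A sketch of Method 1 in the spirit of truth-table enumeration: list every interpretation of the symbols; KB entails S iff S is true in every interpretation that makes KB true. The tuple encoding and the evaluator are the same illustrative assumptions used above, repeated here so the sketch is self-contained.

```python
from itertools import product

def pl_true(s, model):
    """Truth value of a tuple-encoded sentence in a model (dict symbol -> bool)."""
    if isinstance(s, str):
        return model[s]
    op, *args = s
    if op == '~':   return not pl_true(args[0], model)
    if op == 'and': return all(pl_true(a, model) for a in args)
    if op == 'or':  return any(pl_true(a, model) for a in args)
    if op == '=>':  return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == '<=>': return pl_true(args[0], model) == pl_true(args[1], model)

def symbols_in(s, acc=None):
    """Collect the proposition symbols appearing in a sentence."""
    acc = set() if acc is None else acc
    if isinstance(s, str):
        acc.add(s)
    else:
        for a in s[1:]:
            symbols_in(a, acc)
    return acc

def tt_entails(kb_sentences, query):
    """Does the conjunction of kb_sentences entail query? (truth-table check)"""
    syms = sorted(set().union(*[symbols_in(s) for s in kb_sentences + [query]]))
    for values in product([True, False], repeat=len(syms)):
        model = dict(zip(syms, values))
        if all(pl_true(s, model) for s in kb_sentences):   # a model of the KB...
            if not pl_true(query, model):                  # ...that falsifies the query
                return False
    return True

# {Raining, Raining => Wet} |= Wet
print(tt_entails(['Raining', ('=>', 'Raining', 'Wet')], 'Wet'))   # True
# {Smoke, Fire => Smoke} |= Fire ?  (the "useful but not true" abduction example)
print(tt_entails(['Smoke', ('=>', 'Fire', 'Smoke')], 'Fire'))     # False
```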

{A ^ C, C} does not entail B ^ C

{A, B} entails A ^ B

Example Proof by Deduction
  • Knowledge

S1: B22 ⇔ ( P21 ∨ P23 ∨ P12 ∨ P32 )    [rule]

S2: ¬B22    [observation]

  • Inferences

S3: (B22 ⇒ (P21 ∨ P23 ∨ P12 ∨ P32)) ∧ ((P21 ∨ P23 ∨ P12 ∨ P32) ⇒ B22)    [S1, biconditional elimination]

S4: (P21 ∨ P23 ∨ P12 ∨ P32) ⇒ B22    [S3, and-elimination]

S5: ¬B22 ⇒ ¬(P21 ∨ P23 ∨ P12 ∨ P32)    [S4, contrapositive]

S6: ¬(P21 ∨ P23 ∨ P12 ∨ P32)    [S2, S5, modus ponens]

S7: ¬P21 ∧ ¬P23 ∧ ¬P12 ∧ ¬P32    [S6, De Morgan]

Proofs
  • A derivation
  • A sequence of applications of rules of inference
  • Reasoning by search
  • Successor function: all possible applications of inference rules
  • Monotonicity means search can be local, and more efficient
Resolution
  • Resolution allows a complete inference mechanism (search-based) using only one rule of inference
Resolution
  • winter v summer
  • ~winter v cold
  • Either winter or ~winter is true, so we know that summer or cold is true
  • Resolution rule:
    • Given: P1 ∨ P2 ∨ P3 ∨ … ∨ Pn, and ¬P1 ∨ Q1 ∨ … ∨ Qm
    • Conclude: P2 ∨ P3 ∨ … ∨ Pn ∨ Q1 ∨ … ∨ Qm

Complementary literals P1 and ¬P1 “cancel out” (a code sketch of the rule follows)
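The rule itself is small in code. Below is a sketch over clauses represented as frozensets of literal strings, where a '~' prefix marks a negative literal; this representation is an assumption for illustration.

```python
def complement(literal):
    """'P' -> '~P', '~P' -> 'P'."""
    return literal[1:] if literal.startswith('~') else '~' + literal

def resolve(clause1, clause2):
    """All resolvents of two clauses (each a frozenset of literal strings)."""
    resolvents = []
    for lit in clause1:
        if complement(lit) in clause2:
            resolvents.append(frozenset((clause1 - {lit}) | (clause2 - {complement(lit)})))
    return resolvents

# winter v summer and ~winter v cold resolve on winter/~winter to give summer v cold
print(resolve(frozenset({'winter', 'summer'}),
              frozenset({'~winter', 'cold'})))   # [frozenset({'summer', 'cold'})]
```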

Resolution in Wumpus World
  • There is a pit at 2,1 or 2,3 or 1,2 or 3,2
    • P21 ∨ P23 ∨ P12 ∨ P32
  • There is no pit at 2,1
    • ¬P21
  • Therefore (by resolution) the pit must be at 2,3 or 1,2 or 3,2
    • P23 ∨ P12 ∨ P32
Resolution
  • Any complete search algorithm, applying only the resolution rule, can derive any conclusion entailed by any KB in propositional logic.
Proof using Resolution
  • To try to prove P from KB:
    • Convert KB and ¬P into CNF
    • To prove P, show that KB ∧ ¬P is contradictory (yields the empty clause)
    • Specifically, apply resolution using a complete search algorithm until:
      • No new clauses can be added (KB does not entail P), or
      • The empty clause is derived (KB does entail P)
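A sketch of this refutation loop, reusing the frozenset clause representation from the resolution-rule sketch; the KB and the negated goal are assumed to be given already as CNF clauses.

```python
from itertools import combinations

# Helpers repeated from the resolution-rule sketch so this runs on its own.
def complement(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(c1, c2):
    """All resolvents of two clauses (frozensets of literal strings)."""
    out = []
    for lit in c1:
        if complement(lit) in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {complement(lit)})))
    return out

def resolution_entails(kb_clauses, negated_goal_clauses):
    """Refutation: the KB entails the goal iff KB plus the negated goal
    (both given as sets of CNF clauses) yields the empty clause."""
    clauses = set(kb_clauses) | set(negated_goal_clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for resolvent in resolve(c1, c2):
                if not resolvent:               # empty clause: contradiction found
                    return True
                new.add(resolvent)
        if new <= clauses:                      # no new clauses: no entailment
            return False
        clauses |= new

# KB: (P21 v P23 v P12 v P32) and ~P21;  query: (P23 v P12 v P32)
kb = [frozenset({'P21', 'P23', 'P12', 'P32'}), frozenset({'~P21'})]
negated_query = [frozenset({'~P23'}), frozenset({'~P12'}), frozenset({'~P32'})]
print(resolution_entails(kb, negated_query))    # True
```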
B22 ⇔ ( P21 ∨ P23 ∨ P12 ∨ P32 ): Conversion to CNF
  • Eliminate ⇔, replacing it with two implications

(B22 ⇒ ( P21 ∨ P23 ∨ P12 ∨ P32 )) ∧ ((P21 ∨ P23 ∨ P12 ∨ P32) ⇒ B22)

  • Replace each implication (A ⇒ B) by ¬A ∨ B

(¬B22 ∨ ( P21 ∨ P23 ∨ P12 ∨ P32 )) ∧ (¬(P21 ∨ P23 ∨ P12 ∨ P32) ∨ B22)

  • Move ¬ “inwards” (unnecessary parens removed)

(¬B22 ∨ P21 ∨ P23 ∨ P12 ∨ P32) ∧ ((¬P21 ∧ ¬P23 ∧ ¬P12 ∧ ¬P32) ∨ B22)

  • Distributive law: distribute ∨ over ∧

(¬B22 ∨ P21 ∨ P23 ∨ P12 ∨ P32) ∧ (¬P21 ∨ B22) ∧ (¬P23 ∨ B22) ∧ (¬P12 ∨ B22) ∧ (¬P32 ∨ B22)
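The four steps can be sketched in code as below, assuming the tuple sentence encoding used earlier with binary connectives (so a long disjunction is written as nested 'or's); this is an illustrative implementation, not the textbook's.

```python
def to_cnf(s):
    """Convert a tuple-encoded sentence (binary connectives) to CNF."""
    s = eliminate_biconditionals(s)   # step 1
    s = eliminate_implications(s)     # step 2
    s = move_not_inwards(s)           # step 3
    return distribute_or(s)           # step 4

def eliminate_biconditionals(s):
    # Step 1: (A <=> B) becomes (A => B) and (B => A)
    if isinstance(s, str):
        return s
    op, *args = s
    args = [eliminate_biconditionals(a) for a in args]
    if op == '<=>':
        a, b = args
        return ('and', ('=>', a, b), ('=>', b, a))
    return (op, *args)

def eliminate_implications(s):
    # Step 2: (A => B) becomes (~A or B)
    if isinstance(s, str):
        return s
    op, *args = s
    args = [eliminate_implications(a) for a in args]
    if op == '=>':
        a, b = args
        return ('or', ('~', a), b)
    return (op, *args)

def move_not_inwards(s):
    # Step 3: push ~ down to literals (De Morgan, double negation)
    if isinstance(s, str):
        return s
    if s[0] == '~':
        a = s[1]
        if isinstance(a, str):
            return s                    # already a literal
        if a[0] == '~':
            return move_not_inwards(a[1])                  # ~~A -> A
        if a[0] == 'and':
            return ('or', move_not_inwards(('~', a[1])), move_not_inwards(('~', a[2])))
        if a[0] == 'or':
            return ('and', move_not_inwards(('~', a[1])), move_not_inwards(('~', a[2])))
    return (s[0], *[move_not_inwards(a) for a in s[1:]])

def distribute_or(s):
    # Step 4: distribute 'or' over 'and' so only literals sit under 'or'
    if isinstance(s, str) or s[0] == '~':
        return s
    a, b = distribute_or(s[1]), distribute_or(s[2])
    if s[0] == 'or':
        if not isinstance(a, str) and a[0] == 'and':
            return ('and', distribute_or(('or', a[1], b)), distribute_or(('or', a[2], b)))
        if not isinstance(b, str) and b[0] == 'and':
            return ('and', distribute_or(('or', a, b[1])), distribute_or(('or', a, b[2])))
    return (s[0], a, b)

# B22 <=> (P21 v (P23 v (P12 v P32))) converts to a nested ('and', ...)
# structure equivalent to the CNF shown on this slide
print(to_cnf(('<=>', 'B22', ('or', 'P21', ('or', 'P23', ('or', 'P12', 'P32'))))))
```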

Previous Slide: Sentence is in CNF
  • Next step (with simpler example):
  • (P1 v P2 v ~P3) ^ P4 ^ ~P5 ^ (P2 v P3)
  • Create a separate clause corresponding to each conjunct
    • P1 v P2 v ~P3
    • P4
    • ~P5
    • P2 v P3
Finally…
  • Add the negation of the goal to the set of clauses, and perform resolution. If you reach the empty clause, you have proved the goal
Simple Resolution EG
  • When the agent is in 1,1, there is no breeze, so there can be no pits in neighboring squares
  • (B11 ⇔ (P12 v P21)); ~B11
  • Prove: ~P12. [lecture]
  • [What if P and ~P are both in the KB?]
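A worked version of this example with the frozenset clause representation used in the resolution sketches above: the CNF of B11 ⇔ (P12 v P21) gives three clauses; add ~B11 and the negated goal P12, then resolve down to the empty clause.

```python
# Clauses from the CNF of B11 <=> (P12 v P21), plus ~B11, plus the negated goal P12:
c1 = frozenset({'~B11', 'P12', 'P21'})   # B11 => (P12 v P21)
c2 = frozenset({'~P12', 'B11'})          # P12 => B11
c3 = frozenset({'~P21', 'B11'})          # P21 => B11
c4 = frozenset({'~B11'})                 # observation: no breeze at 1,1
c5 = frozenset({'P12'})                  # negation of the goal ~P12

# Resolve c5 with c2 on P12/~P12: gives {B11}
r1 = (c5 - {'P12'}) | (c2 - {'~P12'})
print(r1)                                # frozenset({'B11'})

# Resolve r1 with c4 on B11/~B11: gives the empty clause, so the goal is proved
r2 = (r1 - {'B11'}) | (c4 - {'~B11'})
print(r2, len(r2) == 0)                  # frozenset() True
```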
Horn Clauses
  • A Horn Clause is a CNF clause with at most one positive literal
  • Horn Clauses form the basis of forward and backward chaining
  • The Prolog language is based on Horn Clauses
  • Deciding entailment with Horn Clauses is linear in the size of the knowledge base
Reasoning with Horn Clauses
  • Forward Chaining (sketched in code after this list)
    • For each new piece of data, generate all new facts, until the desired fact is generated
    • Data-directed reasoning
  • Backward Chaining
    • To prove the goal, find a clause that contains the goal as its head, and prove the body recursively
    • Goal-directed reasoning
  • The state space is an AND-OR graph; see 7.5.4
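A sketch of forward chaining for definite (Horn) clauses along these lines; the (premises, conclusion) pair representation and the example rules are illustrative assumptions.

```python
from collections import deque

def forward_chaining(rules, facts, goal):
    """rules: list of (premise_list, conclusion) definite clauses;
    facts: list of symbols known to be true; goal: symbol to prove.
    Returns True iff the goal can be derived (data-directed reasoning)."""
    count = {i: len(prem) for i, (prem, _) in enumerate(rules)}   # unsatisfied premises per rule
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == goal:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, conclusion) in enumerate(rules):
            if p in premises:
                count[i] -= 1
                if count[i] == 0:                 # all premises proved: fire the rule
                    agenda.append(conclusion)
    return False

# Example: P => Q, (L and M) => P, (B and L) => M, (A and P) => L, (A and B) => L; facts A, B
rules = [(['P'], 'Q'), (['L', 'M'], 'P'), (['B', 'L'], 'M'),
         (['A', 'P'], 'L'), (['A', 'B'], 'L')]
print(forward_chaining(rules, ['A', 'B'], 'Q'))   # True
```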
Wrap-up
  • You are responsible for everything in Chapter 7 through 7.6-2
    • Note: we didn’t cover 7.5.4 or 7.6 in lecture
  • We will return to topics in 7.7 when we cover planning