
CPSC 433 : Artificial Intelligence Tutorials T01 & T02


Presentation Transcript


  1. CPSC 433 : Artificial Intelligence Tutorials T01 & T02 Andrew Kuipers amkuiper@cpsc.ucalgary.ca Please include [CPSC433] in the subject line of any emails regarding this course. CPSC 433 Artificial Intelligence

  2. Expert Systems • Designed to function similarly to a human expert operating within a specific problem domain • Used to: • Provide an answer to a certain problem, or • Clarify uncertainties where normally a human expert would be consulted • Often created to operate in conjunction with humans working within the given problem domain, rather than as a replacement for them. CPSC 433 Artificial Intelligence

  3. Components of an Expert System • Knowledge Base • Stores knowledge used by the system, usually represented in a formal logical manner CPSC 433 Artificial Intelligence

  4. Components of an Expert System • Knowledge Base • Stores knowledge used by the system, usually represented in a formal logical manner • Inference System • Defines how existing knowledge may be used to derive new knowledge CPSC 433 Artificial Intelligence

  5. Components of an Expert System • Knowledge Base • Stores knowledge used by the system, usually represented in a formal logical manner • Inference System • Defines how existing knowledge may be used to derive new knowledge • Search Control • Determines which inference to apply at a given stage of the deduction CPSC 433 Artificial Intelligence
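
As an illustration only (the class and field names below are hypothetical, not part of the course material), the three components can be pictured as one structure in Python:

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class ExpertSystem:
        knowledge_base: list             # stored rules and facts, in a formal representation
        inference_rules: list[Callable]  # e.g. modus ponens: how new knowledge is derived
        search_control: Callable         # picks which inference to apply at each step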

  7. Knowledge Representation • For now, we’ll use a simple If … then … consequence relation using English semantics • e.g.: If [it is raining] Then [I should wear a coat] • [it is raining] is the antecedent of the relation • [I should wear a coat] is the consequent of the relation • Facts can be understood as consequence relations with an empty antecedent • e.g.: “If [] Then [it is raining]” is equivalent to the fact that [it is raining] CPSC 433 Artificial Intelligence
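
As a concrete sketch (illustrative only, using hypothetical Python names rather than anything from the course), these consequence relations can be encoded as (antecedents, consequent) pairs, with a fact being a relation whose antecedent list is empty:

    # Each consequence relation is a pair (list of antecedents, consequent).
    rules = [
        (["it is raining"], "I should wear a coat"),  # If [it is raining] Then [I should wear a coat]
        ([], "it is raining"),                        # a fact: empty antecedent
    ]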

  8. Inferring New Knowledge • New knowledge can be constructed from existing knowledge using inference rules • For instance, the inference rule modus ponens can be used to derive the consequent of a consequence relation, given that the antecedent is true • e.g.: • k1: If [it is raining] Then [I should wear a coat] • k2: [it is raining] • result: [I should wear a coat] CPSC 433 Artificial Intelligence
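
Under that encoding, modus ponens can be sketched as a function that fires every relation whose antecedents are all already known and collects the consequents derivable in one step (the names apply_modus_ponens and known are illustrative assumptions, not course code):

    def apply_modus_ponens(rules, known):
        """Return every consequent whose antecedents are all already known."""
        derived = set()
        for antecedents, consequent in rules:
            if consequent not in known and all(a in known for a in antecedents):
                derived.add(consequent)
        return derived

    # k2: [it is raining] is known, so k1 lets us derive [I should wear a coat].
    known = {"it is raining"}
    rules = [(["it is raining"], "I should wear a coat")]
    print(apply_modus_ponens(rules, known))  # {'I should wear a coat'}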

  9. Goal Directed Reasoning • Inference rules are applied to the knowledge base in order to achieve a particular goal • The goal in an expert system is formulated as a question, or query, to which we want the answer • e.g.: [I should wear a coat]? • note: this would read more naturally in English as “should I wear a coat?”, but we want to use the same propositional symbol as appears in our knowledge base • The goal of the search is to determine an answer to the query, which may be boolean as above or more complex CPSC 433 Artificial Intelligence
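
For a boolean query like the one above, answering it amounts to checking whether the queried proposition is in (or can be derived into) the knowledge base; a minimal illustrative check, assuming modus ponens has already fired on the rain rule:

    knowledge = {"it is raining", "I should wear a coat"}  # knowledge after inference
    query = "I should wear a coat"
    print(query in knowledge)  # True: yes, I should wear a coat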

  10. Forward Chaining • Forward chaining is a data-driven method of deriving a particular goal from a given knowledge base and set of inference rules • Inference rules are applied by matching facts to the antecedents of consequence relations in the knowledge base • The application of inference rules results in new knowledge (from the consequents of the relations matched), which is then added to the knowledge base CPSC 433 Artificial Intelligence

  11. Forward Chaining • Inference rules are successively applied to elements of the knowledge base until the goal is reached • A search control method is needed to select which element(s) of the knowledge base to apply the inference rule to at any point in the deduction CPSC 433 Artificial Intelligence
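
A minimal forward-chaining loop, sketched with the (antecedents, consequent) encoding used earlier; the names are hypothetical, and the naive policy of firing every applicable rule stands in for a real search-control method:

    def forward_chain(rules, facts, goal):
        """Data-driven search: fire rules until the goal appears or nothing new is derived."""
        known = set(facts)
        changed = True
        while changed:
            if goal in known:
                return True
            changed = False
            for antecedents, consequent in rules:
                if consequent not in known and all(a in known for a in antecedents):
                    known.add(consequent)  # new knowledge goes back into the knowledge base
                    changed = True
        return goal in known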

  12. Forward Chaining Example • Knowledge Base: • If [X croaks and eats flies] Then [X is a frog] • If [X chirps and sings] Then [X is a canary] • If [X is a frog] Then [X is colored green] • If [X is a canary] Then [X is colored yellow] • [Fritz croaks and eats flies] • Goal: Finding the color of Fritz. • [Fritz is colored Y]? CPSC 433 Artificial Intelligence

  13. Forward Chaining Example • Knowledge Base • If [X croaks and eats flies] • Then [X is a frog] • If [X chirps and sings] • Then [X is a canary] • If [X is a frog] • Then [X is colored green] • If [X is a canary] • Then [X is colored yellow] • [Fritz croaks and eats flies] • Goal • [Fritz is colored Y]? CPSC 433 Artificial Intelligence

  15. Forward Chaining Example If [X croaks and eats flies] Then [X is a frog] • Knowledge Base • If [X croaks and eats flies] • Then [X is a frog] • If [X chirps and sings] • Then [X is a canary] • If [X is a frog] • Then [X is colored green] • If [X is a canary] • Then [X is colored yellow] • [Fritz croaks and eats flies] • Goal • [Fritz is colored Y]? [Fritz croaks and eats flies] [Fritz is a frog] CPSC 433 Artificial Intelligence

  16. Forward Chaining Example If [X croaks and eats flies] Then [X is a frog] • Knowledge Base • If [X croaks and eats flies] • Then [X is a frog] • If [X chirps and sings] • Then [X is a canary] • If [X is a frog] • Then [X is colored green] • If [X is a canary] • Then [X is colored yellow] • [Fritz croaks and eats flies] • [Fritz is a frog] • Goal • [Fritz is colored Y]? [Fritz croaks and eats flies] [Fritz is a frog] CPSC 433 Artificial Intelligence

  17. Forward Chaining Example If [X croaks and eats flies] Then [X is a frog] • Knowledge Base • If [X croaks and eats flies] • Then [X is a frog] • If [X chirps and sings] • Then [X is a canary] • If [X is a frog] • Then [X is colored green] • If [X is a canary] • Then [X is colored yellow] • [Fritz croaks and eats flies] • [Fritz is a frog] • Goal • [Fritz is colored Y]? [Fritz croaks and eats flies] [Fritz is a frog] ? CPSC 433 Artificial Intelligence

  18. Forward Chaining Example If [X croaks and eats flies] Then [X is a frog] • Knowledge Base • If [X croaks and eats flies] • Then [X is a frog] • If [X chirps and sings] • Then [X is a canary] • If [X is a frog] • Then [X is colored green] • If [X is a canary] • Then [X is colored yellow] • [Fritz croaks and eats flies] • [Fritz is a frog] • Goal • [Fritz is colored Y]? [Fritz croaks and eats flies] [Fritz is a frog] CPSC 433 Artificial Intelligence

  19. Forward Chaining Example If [X croaks and eats flies] Then [X is a frog] • Knowledge Base • If [X croaks and eats flies] • Then [X is a frog] • If [X chirps and sings] • Then [X is a canary] • If [X is a frog] • Then [X is colored green] • If [X is a canary] • Then [X is colored yellow] • [Fritz croaks and eats flies] • [Fritz is a frog] • Goal • [Fritz is colored Y]? [Fritz croaks and eats flies] If [X is a frog] Then [X is colored green] [Fritz is a frog] [Fritz is colored green] CPSC 433 Artificial Intelligence

  20. Forward Chaining Example If [X croaks and eats flies] Then [X is a frog] • Knowledge Base • If [X croaks and eats flies] • Then [X is a frog] • If [X chirps and sings] • Then [X is a canary] • If [X is a frog] • Then [X is colored green] • If [X is a canary] • Then [X is colored yellow] • [Fritz croaks and eats flies] • [Fritz is a frog] • [Fritz is colored green] • Goal • [Fritz is colored Y]? [Fritz croaks and eats flies] If [X is a frog] Then [X is colored green] [Fritz is a frog] [Fritz is colored green] CPSC 433 Artificial Intelligence

  21. Forward Chaining Example If [X croaks and eats flies] Then [X is a frog] • Knowledge Base • If [X croaks and eats flies] • Then [X is a frog] • If [X chirps and sings] • Then [X is a canary] • If [X is a frog] • Then [X is colored green] • If [X is a canary] • Then [X is colored yellow] • [Fritz croaks and eats flies] • [Fritz is a frog] • [Fritz is colored green] • Goal • [Fritz is colored Y]? [Fritz croaks and eats flies] If [X is a frog] Then [X is colored green] [Fritz is a frog] [Fritz is colored green] ? CPSC 433 Artificial Intelligence

  22. Forward Chaining Example If [X croaks and eats flies] Then [X is a frog] • Knowledge Base • If [X croaks and eats flies] • Then [X is a frog] • If [X chirps and sings] • Then [X is a canary] • If [X is a frog] • Then [X is colored green] • If [X is a canary] • Then [X is colored yellow] • [Fritz croaks and eats flies] • [Fritz is a frog] • [Fritz is colored green] • Goal • [Fritz is colored Y]? [Fritz croaks and eats flies] If [X is a frog] Then [X is colored green] [Fritz is a frog] [Fritz is colored green] CPSC 433 Artificial Intelligence

  23. Forward Chaining Example If [X croaks and eats flies] Then [X is a frog] • Knowledge Base • If [X croaks and eats flies] • Then [X is a frog] • If [X chirps and sings] • Then [X is a canary] • If [X is a frog] • Then [X is colored green] • If [X is a canary] • Then [X is colored yellow] • [Fritz croaks and eats flies] • [Fritz is a frog] • [Fritz is colored green] • Goal • [Fritz is colored Y]? [Fritz croaks and eats flies] If [X is a frog] Then [X is colored green] [Fritz is a frog] [Fritz is colored Y] ? [Fritz is colored green] Y = green CPSC 433 Artificial Intelligence
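
Run on the Fritz knowledge base, the forward_chain sketch above reproduces this deduction. For simplicity the rules below are written with X already instantiated to Fritz; a full implementation would pattern-match the variable, and answering the goal [Fritz is colored Y] would additionally bind Y = green.

    rules = [
        (["Fritz croaks and eats flies"], "Fritz is a frog"),
        (["Fritz chirps and sings"], "Fritz is a canary"),
        (["Fritz is a frog"], "Fritz is colored green"),
        (["Fritz is a canary"], "Fritz is colored yellow"),
    ]
    facts = ["Fritz croaks and eats flies"]

    print(forward_chain(rules, facts, "Fritz is colored green"))   # True
    print(forward_chain(rules, facts, "Fritz is colored yellow"))  # False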

  24. Backward Chaining • Backward chaining is a goal-driven method of deriving a particular goal from a given knowledge base and set of inference rules • Inference rules are applied by matching the goal of the search to the consequents of the relations stored in the knowledge base • When such a relation is found, the antecedent of the relation is added to the list of goals (and not to the knowledge base, as is done in forward chaining) CPSC 433 Artificial Intelligence

  25. Backward Chaining • Search proceeds in this manner until a goal can be matched against a fact in the knowledge base • Remember: facts are simply consequence relations with empty antecedents, so this is like adding the ‘empty goal’ to the list of goals • As with forward chaining, a search control method is needed to select which goals will be matched against which consequence relations from the knowledge base CPSC 433 Artificial Intelligence
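
A matching backward-chaining sketch over the same encoding (illustrative only): to prove a goal, find a relation whose consequent matches it and recursively prove its antecedents; a fact, having an empty antecedent list, proves its consequent immediately. Depth-first goal selection stands in for a real search-control method.

    def backward_chain(rules, goal, seen=frozenset()):
        """Goal-driven search: prove the goal via a relation that concludes it."""
        if goal in seen:  # guard against revisiting the same goal
            return False
        for antecedents, consequent in rules:
            if consequent == goal:
                if all(backward_chain(rules, a, seen | {goal}) for a in antecedents):
                    return True
        return False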

  26. Backward Chaining Example Knowledge Base If [X croaks and eats flies] Then [X is a frog] If [X chirps and sings] Then [X is a canary] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [Fritz croaks and eats flies] Goals [Fritz is colored Y]? CPSC 433 Artificial Intelligence

  28. Backward Chaining Example [Fritz is colored Y] Knowledge Base If [X croaks and eats flies] Then [X is a frog] If [X chirps and sings] Then [X is a canary] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [Fritz croaks and eats flies] Goals [Fritz is colored Y]? If [X is a frog] Then [X is colored green] [X is a frog] CPSC 433 Artificial Intelligence

  29. Backward Chaining Example [Fritz is colored Y] Knowledge Base If [X croaks and eats flies] Then [X is a frog] If [X chirps and sings] Then [X is a canary] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [Fritz croaks and eats flies] Goals [Fritz is colored Y]? [X is a frog] If [X is a frog] Then [X is colored green] [X is a frog] CPSC 433 Artificial Intelligence

  31. Backward Chaining Example [Fritz is colored Y] Knowledge Base If [X croaks and eats flies] Then [X is a frog] If [X chirps and sings] Then [X is a canary] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [Fritz croaks and eats flies] Goals [Fritz is colored Y]? [X is a frog] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [X is a frog] [X is a canary] CPSC 433 Artificial Intelligence

  32. Backward Chaining Example [Fritz is colored Y] Knowledge Base If [X croaks and eats flies] Then [X is a frog] If [X chirps and sings] Then [X is a canary] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [Fritz croaks and eats flies] Goals [Fritz is colored Y]? [X is a frog] [X is a canary] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [X is a frog] [X is a canary] CPSC 433 Artificial Intelligence

  34. Backward Chaining Example [Fritz is colored Y] Knowledge Base If [X croaks and eats flies] Then [X is a frog] If [X chirps and sings] Then [X is a canary] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [Fritz croaks and eats flies] Goals [Fritz is colored Y]? [X is a frog] [X is a canary] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [X is a frog] [X is a canary] If [X croaks and eats flies] Then [X is a frog] [X croaks and eats flies] CPSC 433 Artificial Intelligence

  35. Backward Chaining Example [Fritz is colored Y] Knowledge Base If [X croaks and eats flies] Then [X is a frog] If [X chirps and sings] Then [X is a canary] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [Fritz croaks and eats flies] Goals [Fritz is colored Y]? [X is a frog] [X is a canary] [X croaks and eats flies] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [X is a frog] [X is a canary] If [X croaks and eats flies] Then [X is a frog] [X croaks and eats flies] CPSC 433 Artificial Intelligence

  37. Backward Chaining Example [Fritz is colored Y] Knowledge Base If [X croaks and eats flies] Then [X is a frog] If [X chirps and sings] Then [X is a canary] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [Fritz croaks and eats flies] Goals [Fritz is colored Y]? [X is a frog] [X is a canary] [X croaks and eats flies] If [X is a frog] Then [X is colored green] If [X is a canary] Then [X is colored yellow] [X is a frog] [X is a canary] If [X croaks and eats flies] Then [X is a frog] [Fritz croaks and eats flies] [X croaks and eats flies] X = Fritz, Y = green CPSC 433 Artificial Intelligence

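Applied to the Fritz knowledge base (again with X pre-instantiated to Fritz for simplicity; a real system would unify X and Y during goal matching, giving X = Fritz, Y = green), the backward_chain sketch above confirms the deduction traced in these slides:

    rules = [
        (["Fritz croaks and eats flies"], "Fritz is a frog"),
        (["Fritz chirps and sings"], "Fritz is a canary"),
        (["Fritz is a frog"], "Fritz is colored green"),
        (["Fritz is a canary"], "Fritz is colored yellow"),
        ([], "Fritz croaks and eats flies"),  # the fact
    ]

    print(backward_chain(rules, "Fritz is colored green"))   # True
    print(backward_chain(rules, "Fritz is colored yellow"))  # False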