
Representing knowledge with rules





Presentation Transcript


  1. Representing knowledge with rules • Points • Definitions • Production systems • Conflict resolution strategies • Examples • A financial advisor • Bagger • XCON • Animal classification in Prolog • A forward-chaining engine • Animal classification revisited

  2. Rules... • A rule, in its simplest form, is a pair • conditions → actions • An action can be real (maybe software-simulated), or it can only add -- assert -- new conditions. • A rule is not the same as an implication • premises → conclusions • where false premises are allowed, but there are significant similarities. • In rule-based systems, the basic rule form is enhanced with control information and, especially, with probabilities or confidence measures.
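The condition-action pair can be sketched as a small data type. This is a hypothetical encoding, not from the slides: literals are plain strings, and a rule is applicable when all its conditions are in working memory.

```python
# A rule as a pair: conditions that must hold, actions to assert.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    conditions: frozenset   # literals that must all be in working memory
    actions: frozenset      # literals asserted when the rule fires

    def applicable(self, working_memory):
        return self.conditions <= set(working_memory)

r = Rule(frozenset({"saved(low)"}), frozenset({"investIn(savings)"}))
print(r.applicable({"saved(low)", "income(adequate)"}))   # True
```

Note that `applicable` only checks the conditions; whether firing the rule performs a real action or merely asserts `actions` into working memory is up to the interpreter.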

  3. ... and applications • One class of applications is rule-based expert systems. • Despite their apparent simplicity, rules are quite expressive. There are two types of uses. • Synthesis (to construct or configure): • XCON is an example of a deployed system; • our simple case study is Bagger. • Analysis (to recognize or diagnose): • MYCIN is the best-known (almost) success story; • our simple case study is animal classification.

  4. Forward-chaining production systems • Rule memory (rules are called productions). • Working memory (literals that represent the current state of the problem); actions change working memory. • Rule selection and application mechanisms (rules are indexed by situations). Rule selection relies on pattern matching. A conflict arises if more than one rule has been selected to match the current situation. Conflict resolution procedures select one rule. This rule is fired if its conditions hold. Actions are performed, and working memory is updated.
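The match-resolve-fire cycle above can be sketched in Python. This is a minimal propositional sketch, not taken from the slides; rules are assumed to be (conditions, actions) pairs of literal sets, and the default conflict resolution simply takes the first candidate.

```python
# Minimal recognize-act cycle: match, resolve the conflict, fire, repeat.
def forward_chain(rules, facts, resolve=lambda conflict_set: conflict_set[0]):
    working_memory = set(facts)
    while True:
        # Match: the conflict set holds every rule whose conditions are all
        # in working memory and whose actions would still add something new.
        conflict_set = [(conds, acts) for conds, acts in rules
                        if conds <= working_memory and not acts <= working_memory]
        if not conflict_set:
            return working_memory          # quiescence: nothing left to fire
        # Conflict resolution picks one rule; fire it by asserting its actions.
        _, acts = resolve(conflict_set)
        working_memory |= acts

rules = [({"a"}, {"b"}),
         ({"a", "b"}, {"c"})]
print(sorted(forward_chain(rules, {"a"})))   # ['a', 'b', 'c']
```

The `resolve` parameter is where the strategies of the next slide plug in.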

  5. Some conflict resolution strategies • Order rules (partially) by specificity -- the more conditions, the more specific the rule; prefer more specific rules. • Rank all rules; prefer rules with a higher rank. • Rank all facts; prefer rules with higher-ranking facts. • Record the time of changes to working memory; prefer rules that depend on recent additions.
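The first strategy, specificity, can be sketched as picking the applicable rule with the most conditions. The rule encoding and the example literals are assumptions for illustration:

```python
# Conflict resolution by specificity: among the rules in the conflict set,
# prefer the one with the most conditions. Rules are (conditions, actions).
def most_specific(conflict_set):
    return max(conflict_set, key=lambda rule: len(rule[0]))

rules = [({"bird"}, {"can-fly"}),                 # general rule
         ({"bird", "penguin"}, {"cannot-fly"})]   # more specific exception
working_memory = {"bird", "penguin"}
conflict_set = [r for r in rules if r[0] <= working_memory]
print(most_specific(conflict_set))   # the two-condition penguin rule wins
```

This is the classic way to let exceptions override general rules without any explicit priority numbers.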

  6. Implementing forward chaining • An agenda is a handy data structure for implementing forward chaining. It is "a list of things to do", here -- candidate rules. • Conditions are considered in some order; any rule that has the currently examined condition in its left-hand side becomes partially activated. Its copy, with the matched condition removed, goes on the agenda. • A rule is selected for firing when it has become fully activated, that is, when all its left-hand side conditions have been matched. • Otherwise the rule is kept for future use, just in case the remaining conditions become asserted later.
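The agenda scheme can be sketched as follows; the data layout and the fact queue are assumptions. Each asserted fact partially activates matching rules, a copy with the matched condition removed goes on the agenda, and a copy whose conditions are exhausted fires, queueing its actions as new facts.

```python
from collections import deque

def run_agenda(rules, facts):
    # The agenda starts with a copy of every rule; partially activated
    # copies (with matched conditions removed) are appended as facts arrive.
    agenda = deque((set(conds), acts) for conds, acts in rules)
    working_memory = set()
    queue = deque(facts)
    while queue:
        fact = queue.popleft()
        if fact in working_memory:
            continue
        working_memory.add(fact)
        for conds, acts in list(agenda):      # snapshot: we append while iterating
            if fact in conds:
                remaining = conds - {fact}
                if remaining:
                    agenda.append((remaining, acts))  # partially activated copy
                else:
                    queue.extend(acts)                # fully activated: fire
    return working_memory

print(sorted(run_agenda([({"a", "b"}, {"c"})], ["a", "b"])))   # ['a', 'b', 'c']
```

Note that the original rule copies stay on the agenda, matching the last bullet above: remaining conditions may still be matched by facts asserted later.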

  7. A financial “expert system” • saved(low) → investIn(savings) • saved(high) ∧ income(adequate) → investIn(stocks) • saved(high) ∧ income(inadequate) → investIn(mixed) • ∀x amountSaved(x) ∧ ∃y (nbrDependents(y) ∧ x > y*5000) → saved(high) • ∀x amountSaved(x) ∧ ∃y (nbrDependents(y) ∧ x ≤ y*5000) → saved(low) • ∀x earnings(x, steady) ∧ ∃y (nbrDependents(y) ∧ x > 15000 + y*4000) → income(adequate) • ∀x earnings(x, steady) ∧ ∃y (nbrDependents(y) ∧ x ≤ 15000 + y*4000) → income(inadequate) • ∀x earnings(x, unsteady) → income(inadequate)

  8. A financial “expert system” (2) • amountSaved(22000) • nbrDependents(3) • earnings(25000, steady) • Given facts 9-11 and rules 1-8, we perform forward chaining. • [9] and [10] match the conditions of [4] and [5], but only [4] passes the test. Add a new fact: saved(high) • [10] and [11] match the conditions of [6] and [7], but only [7] passes the test. Add a new fact: income(inadequate) • [12] and [13] match the conditions of [3]. Add a new fact: investIn(mixed) • Done.
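The same derivation can be checked with plain Python. This is an assumed encoding of rules 1-8, not the slides' notation: the quantified conditions become arithmetic tests over the given facts.

```python
# Facts 9-11 from the slide.
amount_saved, dependents, earnings, steady = 22000, 3, 25000, True

# Rules 4/5: 22000 > 3*5000 = 15000, so savings are high.
saved = "high" if amount_saved > dependents * 5000 else "low"

# Rules 6/7/8: steady earnings, but 25000 <= 15000 + 3*4000 = 27000.
if not steady:
    income = "inadequate"                                   # rule 8
else:
    income = ("adequate" if earnings > 15000 + dependents * 4000
              else "inadequate")                            # rules 6/7

# Rules 1-3: high savings + inadequate income -> mixed investment.
if saved == "low":
    invest = "savings"                                      # rule 1
elif income == "adequate":
    invest = "stocks"                                       # rule 2
else:
    invest = "mixed"                                        # rule 3

print(saved, income, invest)   # high inadequate mixed
```

This reproduces the trace above: saved(high), income(inadequate), investIn(mixed).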

  9. Bagger: a robot that bags groceries • The main idea is simple. • Bag the large items: • put large heavy items one per bag; • start a new bag when a large item does not fit in any partially filled bag. • Bag the mid-sized items: • meat and frozen goods go in freezer bags; • start a new bag when a mid-sized item does not fit in any partially filled bag. • Bag the small items: • try to avoid bagging them with heavy items; • start a new bag when a small item does not fit in any partially filled bag. • What could be a good conflict resolution strategy for the following rules?

  10. Bagger (2) • (1) the step is bag-large & there is a large unbagged item & there is a large unbagged heavy item & there is a bag with < 6 large items → bag the large heavy item • (2) the step is bag-large & there is a large unbagged item & there is a bag with < 6 large items → bag the large item • (3) the step is bag-large & there is a large unbagged item → start a new bag & bag the large item • (4) the step is bag-large → end the step bag-large & begin the step bag-medium

  11. Bagger (3) • (5) the step is bag-medium & there is a medium unbagged item & the medium item is a freezer-bag item & there is an unfilled bag → put the medium item in a freezer bag & bag the medium item • (6) the step is bag-medium & there is a medium unbagged item & there is an unfilled bag → bag the medium item • (7) the step is bag-medium & there is a medium unbagged item → start a new bag & bag the medium item • (8) the step is bag-medium → end the step bag-medium & begin the step bag-small

  12. Bagger (4) • (9) the step is bag-small & there is a small unbagged item & there is an unfilled bag without large heavy items → bag the small item • (10) the step is bag-small & there is a small unbagged item & there is an unfilled bag → bag the small item • (11) the step is bag-small & there is a small unbagged item → start a new bag & bag the small item • (12) the step is bag-small → end the step bag-small & stop
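One natural answer to the conflict-resolution question posed for these rules: within each step, fire the first applicable rule in listed order, since each group lists its most specific rule first and ends with a catch-all that advances to the next step. A minimal sketch; the rule names and condition literals are invented for illustration:

```python
# Rule-order conflict resolution: fire the first rule (in listed order)
# whose conditions all hold. Rules are (name, conditions, actions) triples.
def first_match(rules, working_memory):
    for name, conds, acts in rules:
        if conds <= working_memory:
            return name, acts
    return None

rules = [("R1", {"step:bag-large", "large-heavy-unbagged", "bag-has-room"},
          {"bag the heavy item"}),
         ("R2", {"step:bag-large", "large-unbagged", "bag-has-room"},
          {"bag the item"}),
         ("R4", {"step:bag-large"},
          {"step:bag-medium"})]

wm = {"step:bag-large", "large-unbagged", "bag-has-room"}
print(first_match(rules, wm))   # ('R2', {'bag the item'})
```

Because the "step" literal gates every rule in a group, the catch-all rule that changes the step only fires once all the more specific rules for that step have run out of matches.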

  13. XCON • XCON configures DEC computing equipment. It is one of rather few (but well known!) examples of commercially successful expert systems. • XCON is, in a sense, Bagger's big brother. Its general plan of action is quite simple: a sequence of steps. (There are, however, dozens of rules for each step...) • The configuring operation includes selecting components to match the order, laying out the spatial arrangement of cabinets, filling the cabinets, and so on.

  14. XCON (2) • Major steps • Check the order, look for missing or incompatible items. • Lay out the processor in cabinets. • Put boxes in the input/output cabinets, put components in those boxes. • Put panels in the input/output cabinets. • Lay out the floor plan. • Plan the cabling.

  15. A classification system in Prolog • http://www.site.uottawa.ca/~nat/Courses/csi4106_2008/Material/animals.pl

  16. Animal classification in a rule-based system • The general rules • http://www.site.uottawa.ca/~nat/Courses/csi4106_2008/Material/animals_rules

  17. Animal classification (2) • A session with the forward chainer http://www.site.uottawa.ca/~nat/Courses/csi4106_2008/Material/animals_session

  18. Animal classification (3) • The forward chainer in Prolog • http://www.site.uottawa.ca/~nat/Courses/csi4106_2008/Material/forward_chainer.pl

  19. A glimpse at expert systems • Your reading for independent study: sections 8.1-8.2. • Architecture of a typical expert system

  20. A glimpse at expert systems (2) • Guidelines to determine whether a problem is appropriate for an expert system solution: • The need for the solution justifies the cost and effort of building an expert system. • Human expertise is not available in all situations where it is needed. • The problem may be solved using symbolic reasoning. • The problem domain is well structured and does not require commonsense reasoning. • The problem cannot be solved using traditional computing methods. • Cooperative and articulate experts exist. • The problem is of proper size and scope.

  21. A glimpse at expert systems (3) Exploratory development cycle

  22. Later or much later • More topics appear in the very useful chapter 8: • planning (section 8.4) will be discussed later; • case-based reasoning is left for another course.
