
ITCS452: Rule-based Reasoning




  1. ITCS452: Rule-based Reasoning Dr. Jarernsri L. Mitrpanont

  2. Techniques supporting the inference process • A tree is a hierarchical data structure consisting of: • Nodes – store information • Branches – connect the nodes • The top node is the root, occupying the highest level of the hierarchy. • The leaves are at the bottom, occupying the lowest level. • A graph can have zero or more links between nodes – there is no distinction between parent and child. • Links may carry weights – a weighted graph – or arrows – a directed graph. • A graph may or may not have loops. • Acyclic graphs have no cycles. • Connected graphs have links to all the nodes. • Digraphs are graphs with directed links. • A lattice is a directed acyclic graph.

  3. Decision Trees • A decision tree is a tree in which each branch node represents a choice between a number of alternatives, and each leaf node represents a classification or decision. • For example, we might have a decision tree to help a financial institution decide whether a person should be offered a loan. • A decision tree is induced from a set of data: instances together with the decisions or classifications for those instances.

  4. Binary Decision Trees • Every question takes us down one level in the tree. • In a binary decision tree: • All leaves are answers. • All internal nodes are questions. • There will be a maximum of 2^N answers for N questions. • Decision trees can be self-learning. • Decision trees can be translated into production rules.
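The loan example from slide 3 can be sketched as a small binary decision tree. The attributes (employed, income) and the 30,000 threshold are illustrative assumptions, not taken from the slides:

```python
# A minimal binary decision tree: internal nodes hold yes/no questions,
# leaves hold answers. Attributes and thresholds are hypothetical.

def make_leaf(decision):
    return {"leaf": decision}

def make_node(question, yes, no):
    return {"question": question, "yes": yes, "no": no}

loan_tree = make_node(
    lambda a: a["employed"],
    make_node(lambda a: a["income"] > 30000,
              make_leaf("offer loan"),
              make_leaf("refer to manager")),
    make_leaf("reject"),
)

def classify(tree, applicant):
    # Each question moves us down one level; N questions give up to 2^N answers.
    while "leaf" not in tree:
        tree = tree["yes"] if tree["question"](applicant) else tree["no"]
    return tree["leaf"]

print(classify(loan_tree, {"employed": True, "income": 45000}))   # offer loan
print(classify(loan_tree, {"employed": False, "income": 45000}))  # reject
```

Each internal node is a question and each leaf a decision, matching the slide's claim that such trees translate directly into production rules (one rule per root-to-leaf path).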

  5. State and Problem Spaces • A state space can be used to define an object’s behavior. • Different states refer to characteristics that define the status of the object. • A state space shows the transitions an object can make in going from one state to another. Using FSMs to Solve Problems • An FSM (finite state machine) is a diagram describing the finite number of states of a machine. • At any one time, the machine is in one particular state. • The machine accepts input and progresses to the next state. • FSMs are often used in compilers and validity-checking programs.
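As a minimal sketch of an FSM used for validity checking, the machine below accepts strings of the form "a" followed by any number of "b"s. The state names and alphabet are assumptions for illustration:

```python
# A finite state machine as a transition table. The machine is in exactly
# one state at a time; each input symbol moves it to the next state.

TRANSITIONS = {
    ("start", "a"): "got_a",
    ("got_a", "b"): "got_a",
}
ACCEPTING = {"got_a"}

def accepts(s):
    state = "start"
    for ch in s:
        state = TRANSITIONS.get((state, ch))  # consume one input symbol
        if state is None:                     # no transition defined: reject
            return False
    return state in ACCEPTING

print(accepts("abbb"))  # True
print(accepts("ba"))    # False
```

This table-driven style is essentially how lexical analyzers in compilers implement token recognition.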

  6. Some Types of Inference Methods • Analogy – relating old situations (as a guide) to new ones. • Generate-and-Test – generate a likely solution, then test whether the proposal meets all requirements. • Abduction – the fallacy of the converse: from q and p → q, conclude p. • Nonmonotonic reasoning – the set of theorems may not grow as the number of axioms increases.

  7. Rule-based Systems (Production Systems) Rules are an important knowledge representation paradigm, expressed in IF-THEN format. The IF part is the condition/premise/antecedent (a test of the truth value of a set of facts). The THEN part is the action/conclusion/consequent (inferred as a new set of facts if the IF part is true).

  8. Architecture of a Production System Inference Engine (Interpreter) – Knowledge Base (Domain Declaration, Production Rules) – Working Area (Memory) – User Interface. Production System Model • Based on the cognitive psychology of problem solving (two different memory organizations): • Short-term memory contains a limited amount of knowledge about the environment, specific to the problem (= Facts). • Long-term memory holds general knowledge about the problem domain and problem-solving methods (= Rules); it seldom changes and is more permanent.

  9. Forward & Backward Deduction Systems The formulas representing knowledge about some universe are separated into: Facts – assertions that represent specific knowledge relevant to a particular case and are not expressed as implications. Rules – assertions given in implication form that express general knowledge about a particular subject area. Forward Deduction Systems • In forward systems, deduction rules are applied to FACTS + RULES to derive new knowledge. • The algorithm terminates when the goal formula is obtained.

  10. Knowledge Base Example 1. bird(x) → animal(x) 2. mammal(x) → animal(x) 3. robin(x) → bird(x) 4. owl(x) → bird(x) 5. lion(x) → mammal(x) 6. tiger(x) → mammal(x) 7. robin(tweety) 8. owl(a) 9. lion(b) 10. tiger(c) 11. goal: Is tweety an animal?

  11. Knowledge Base Example, forward deduction (the search space is small): From 7. robin(tweety) and 3. robin(x) → bird(x), derive new fact 12. bird(tweety). From 12 and 1. bird(x) → animal(x), derive new fact 13. animal(tweety), which matches goal 11: tweety is an animal.
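The derivation above can be sketched in a few lines of Python, with each rule represented as a (premise, conclusion) predicate pair over a single variable:

```python
# Forward deduction over the knowledge base of slide 10. Facts are
# (predicate, individual) pairs; rules say premise(x) -> conclusion(x).

RULES = [
    ("bird", "animal"), ("mammal", "animal"),
    ("robin", "bird"), ("owl", "bird"),
    ("lion", "mammal"), ("tiger", "mammal"),
]
FACTS = {("robin", "tweety"), ("owl", "a"), ("lion", "b"), ("tiger", "c")}

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:                 # repeat until no new fact can be derived
        changed = False
        for premise, conclusion in rules:
            for pred, ind in list(facts):
                if pred == premise and (conclusion, ind) not in facts:
                    facts.add((conclusion, ind))   # derive a new fact
                    changed = True
    return facts

derived = forward_chain(FACTS, RULES)
print(("animal", "tweety") in derived)  # True: tweety is an animal
```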

  12. Knowledge Base Example for backward chaining 1. robin(x) → bird(x) 2. robin(x) → small(x) 3. robin(x) → redbreast(x) 4. bird(x) → animal(x) 5. bird(x) → haswings(x) 6. redbreast(x) → colourful(x) 7. redbreast(x) → attractive(x) 8. small(x) → not large(x) 9. small(x) → not medium(x) 10. robin(tweety) 11. goal: attractive(tweety)

  13. Backward Deduction System Deduction rules are applied to GOAL + RULES. The algorithm terminates when all subgoals are solved, i.e., correspond to facts. Backward deduction theorem: Given logical expressions F1, ..., Fn and G, expression G is said to be a logical consequence of F1, ..., Fn if the logical expression (¬F1 ∨ ¬F2 ∨ ... ∨ ¬Fn ∨ G) — equivalently (F1 ∧ F2 ∧ ... ∧ Fn) → G — is valid.

  14. Using backward deduction: start with the goal and apply rules to generate subgoals. Goal 11. attractive(tweety). By 7. redbreast(x) → attractive(x), new subgoal 12. redbreast(tweety). By 3. robin(x) → redbreast(x), new subgoal 13. robin(tweety). Fact 10. robin(tweety) matches the subgoal, so the goal is proved.
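The subgoal generation above can be sketched as a recursive search over the rules of slide 12: a goal succeeds if it is a fact, or if some rule concludes it and that rule's premise can itself be proved.

```python
# Backward deduction: start from the goal and work back to the facts.
# Rules are (premise, conclusion) pairs meaning premise(x) -> conclusion(x).

RULES = [
    ("robin", "bird"), ("robin", "small"), ("robin", "redbreast"),
    ("bird", "animal"), ("bird", "haswings"),
    ("redbreast", "colourful"), ("redbreast", "attractive"),
    ("small", "notlarge"), ("small", "notmedium"),
]
FACTS = {("robin", "tweety")}

def prove(goal, ind):
    if (goal, ind) in FACTS:       # subgoal corresponds to a fact: solved
        return True
    # every rule whose conclusion matches the goal turns its premise
    # into a new subgoal
    return any(prove(premise, ind)
               for premise, conclusion in RULES if conclusion == goal)

print(prove("attractive", "tweety"))  # True, via redbreast <- robin <- fact 10
```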

  15. Is it better to reason forward or backward? 1. Which one gives the lower branching factor in the search tree? Example 1 – Forward, facts about tweety: F1: robin(tweety), F2: bird(tweety), F3: animal(tweety). Backward, goal animal(tweety): G1: bird(x), G2: mammal(x), G3: robin(x), G4: owl(x), G5: lion(x), G6: tiger(x), G7: robin(tweety). [Search-tree diagrams: the forward chain F1 → F2 → F3 via rules 3 and 1; the backward tree from the goal branching through rules 1–7.]

  16. Example 2 – Backward, goal G: attractive(tweety): G1: redbreast(x), G2: robin(x), G3: robin(tweety). Forward, facts about tweety: F1: robin(tweety), F2: bird(tweety), F3: small(tweety), F4: redbreast(tweety), F5: animal(tweety), F6: haswings(tweety), F7: not large(tweety), F8: not medium(tweety), F9: colourful(tweety), F10: attractive(tweety). [Search-tree diagrams: the backward chain G → G1 → G2 → G3 via rules 7, 3, and 10; the forward tree deriving F2–F10 from F1 via rules 1–9.]

  17. 2. What kind of event is going to trigger a problem-solving episode? If it is the arrival of a new fact, use forward chaining. If it is the need to respond to a query for a goal, use backward chaining.

  18. Some Characteristics of Forward and Backward Chaining

  19. Definition A production rule is a statement having the following form: • <production rule> ::= if <antecedent> then <consequent> • <antecedent> ::= <disjunct> {and <disjunct>} • <disjunct> ::= <condition> {or <condition>} • ...

  20. Definition A production rule is a statement having the following form:
<production rule> ::= if <antecedent> then <consequent>
<antecedent> ::= <disjunct> {and <disjunct>}
<disjunct> ::= <condition> {or <condition>}
<consequent> ::= <conclusion> {and <conclusion>}
<condition> ::= <predicate> (<object>, <attribute-value-list>)
<conclusion> ::= <action> (<object>, <attribute-value-list>)
<attribute-value-list> ::= <attribute-value> {, <attribute-value>}
<attribute-value> ::= (<attribute> <value>)
<predicate> ::= same | notsame | less | lesseq | great | greateq | ...
<action> ::= make | remove | modify | write | openfile | closefile | ...
<object> ::= atom
<attribute> ::= atom
<value> ::= <variable> | constant
<variable> ::= ?x | !x
In the production-rule formalism, it is assumed that the or operator has a higher precedence than the and operator.

  21. Advantages: 1. The most popular representation technique. 2. Excellent for representing heuristic knowledge. 3. Easy to implement and to understand. 4. A natural format used by experts to express knowledge when solving problems.

  22. Rule-based systems differ from logic: 1. They are nonmonotonic. 2. They accept uncertainty in the deductive process. Reasoning Process Two approaches to the reasoning process: 1. Forward chaining / data-driven • start with all known facts and progress to the solution. 2. Backward chaining / goal-driven • select a possible conclusion and find facts to validate it.

  23. Data-driven reasoning 1. A rule is selected to fire when its premises are satisfied. 2. Progress is made forward from the data toward the solution. 3. This approach is suitable for problems involving synthesis, such as design, configuration, planning, and scheduling. Goal-driven reasoning 1. A goal is selected and the system seeks to verify its validity by finding supporting facts. 2. This approach is suitable for problem domains such as diagnosis.

  24. Inference Chain Rule-based systems use the modus ponens rule of inference, via searching and pattern matching, to automate the progression from initial data to the desired solution. The system solves the problem by creating a series of inferences (a set of new facts) that forms a “path” from the initial state to the solution. This series of inferences is called an inference chain.

  25. Forward Chaining Facts are collected and checked against the premises of each rule. The satisfied rules are selected to be executed. The process of checking the rules is called interpretation. Interpretation is the repetition of these steps: 1. Matching – find matches between the known facts and the premises of the rules. 2. Conflict resolution – choose which of the matched rules to execute. 3. Execution (firing) – either new facts or new rules may be derived.

  26. Forward Reasoning Inference Process [Diagram: facts and knowledge rules feed Step 1 (Match), producing the applicable rules; Step 2 (Conflict resolution) picks the selected rule; Step 3 (Execution) derives new facts and new rules, which feed back into matching.]

  27. Fruit Classification Rule Base Rule 1: IF Shape = long and Color = green or yellow THEN Fruit = banana Rule 2: IF Shape = round or oblong and Diameter > 4 inches THEN Fruitclass = vine Rule 3: IF Shape = round and Diameter < 4 inches THEN Fruitclass = tree Rule 4: IF Seedcount = 1 THEN Seedclass = stonefruit Rule 5: IF Seedcount > 1 THEN Seedclass = multiple

  28. Fruit Classification Rule Base Rule 6: IF Fruitclass = vine and Color = green THEN Fruit = watermelon Rule 7: IF Fruitclass = vine and Surface = smooth and Color = yellow THEN Fruit = honeydew Rule 8: IF Fruitclass = vine and Surface = rough and Color = tan THEN Fruit = cantaloupe Rule 9: IF Fruitclass = tree and Color = orange and Seedclass = stonefruit THEN Fruit = apricot

  29. Fruit Classification Rule Base Rule 10: IF Fruitclass = tree and Color = orange and Seedclass = multiple THEN Fruit = orange Rule 11: IF Fruitclass = tree and Color = red and Seedclass = stonefruit THEN Fruit = cherry Rule 12: IF Fruitclass = tree and Color = orange and Seedclass = stonefruit THEN Fruit = peach Rule 13: IF Fruitclass = tree and Color = red or yellow or green and Seedclass = multiple THEN Fruit = apple Rule 14: IF Fruitclass = tree and Color = purple and Seedclass = stonefruit THEN Fruit = plum

  30. Graphical Representation of the Fruit Identification Knowledge Base [Decision network: Shape = long → Fruit = banana; Shape = round/oblong → Diameter test: > 4" → Fruitclass = vine, < 4" → Fruitclass = tree. Under vine: Color = green → watermelon; Surface = smooth → honeydew; Surface = rough, Color = tan → cantaloupe. Under tree: Color and Seedclass tests lead to apple, apricot, cherry, orange, peach, and plum. Seedcount > 1 → Seedclass = multiple; Seedcount = 1 → Seedclass = stonefruit.]

  31. Data Base for Fruit Classification System Diameter = 1 inch Shape = round Seedcount = 1 Color = red

  32. Trace of Rule-Based Execution
Cycle 1: applicable rules 3, 4 – selected rule 3 – derived fact Fruitclass = tree
Cycle 2: applicable rules 3, 4 – selected rule 4 – derived fact Seedclass = stonefruit
Cycle 3: applicable rules 3, 4, 11 – selected rule 11 – derived fact Fruit = cherry
Cycle 4: applicable rules 3, 4, 11 – no unfired rule – halt
A Sample Inference Network
R1: IF the ambient temperature > 90 °F THEN the weather is hot
R2: IF the relative humidity > 65% THEN the atmosphere is humid
R3: IF the weather is hot and the atmosphere is humid THEN thunderstorms are likely to develop
[Diagram: amb-temp → R1 → temp-cond; rel-hum → R2 → hum-cond; temp-cond and hum-cond → R3 → storm]
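The trace above can be reproduced with a short forward-chaining loop. Only the three rules the trace exercises (3, 4, and 11) are encoded, with each rule's premises written as a test function over the working memory; refraction plus lowest-rule-number conflict resolution is an assumption that happens to match the trace:

```python
# Match / conflict-resolution / execution cycle for the fruit rule base.
RULES = {
    3:  (lambda wm: wm["Shape"] == "round" and wm["Diameter"] < 4,
         ("Fruitclass", "tree")),
    4:  (lambda wm: wm["Seedcount"] == 1,
         ("Seedclass", "stonefruit")),
    11: (lambda wm: wm.get("Fruitclass") == "tree" and wm["Color"] == "red"
         and wm.get("Seedclass") == "stonefruit",
         ("Fruit", "cherry")),
}

wm = {"Diameter": 1, "Shape": "round", "Seedcount": 1, "Color": "red"}
fired = set()
cycle = 0
while True:
    cycle += 1
    # Step 1: match -- which rules' premises are satisfied by working memory?
    applicable = [n for n, (test, _) in RULES.items() if test(wm)]
    runnable = [n for n in applicable if n not in fired]   # refraction
    if not runnable:
        break                      # cycle 4: nothing left to fire, halt
    # Step 2: conflict resolution -- pick the lowest-numbered rule
    n = min(runnable)
    # Step 3: execution -- derive the rule's new fact
    attr, val = RULES[n][1]
    wm[attr] = val
    fired.add(n)
    print(f"Cycle {cycle}: applicable {applicable}, fired {n}, derived {attr} = {val}")

print(wm["Fruit"])  # cherry
```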

  33. Rule-Based Architecture Two structures organize the knowledge in a rule-based system: 1. Inference networks 2. Pattern-matching systems Inference Network: An inference network is represented as a graph. A node represents a fact, a parameter, an intermediate parameter, or a proposition. The rules are represented by the interconnections between the nodes.

  34. Backward Reasoning It starts with a desired conclusion and decides whether the existing facts support the derivation of a value for this conclusion. The search consists of the following steps: 1. Form a stack composed of all the top-level goals defined in the system. 2. Gather all rules capable of satisfying the first goal. 3. Consider each rule’s premises: a) If all premises are satisfied, fire the rule, derive the conclusion, and remove this goal.

  35. b) If a premise is not satisfied, look for rules that have this premise as their conclusion. If found, set this premise as a subgoal and repeat step 2. c) If step b cannot find such a rule, query the user for its value and add it to the database. If this value satisfies the current premise, continue; otherwise consider the next rule. 4. If the current goal has been satisfied, remove it from the stack and continue with step 2 until the stack is empty; otherwise the current goal remains undetermined.

  36. Parameters and Rules for the Flood Warning and Evacuation Knowledge Base
PARAMETER – POSSIBLE VALUES
month – any month of the year
upstream precipitation – none, light, heavy
weather forecast – sunny, cloudy, stormy
river height – measurement in feet
season – dry, wet
local rain – none, light rain, heavy rain
river change – none, lower, higher
river level – low, normal, high
flood warning – yes, no
evacuation order – yes, no

  37. Rules Continued R1 IF month = may . . . oct THEN season = wet R2 IF month = nov . . . april THEN season = dry R3 IF upstream = none and season = dry THEN change = lower R4 IF upstream = none and season = wet THEN change = none R5 IF upstream = light THEN change = none R6 IF upstream = heavy THEN change = higher

  38. Rules Continued R7 IF level = low THEN flood = no AND evac = no R8 IF change = none or lower AND level = normal or low THEN flood = no AND evac = no R9 IF change = higher AND level = normal AND rain = heavy THEN flood = yes (CF = 0.4) AND evac = no R10 IF change = higher AND level = normal AND rain = light THEN flood = no AND evac = no

  39. Rules Continued R11 IF change = higher AND level = high AND rain = none or light THEN flood = yes (CF = 0.5) AND evac = yes (CF = 0.2) R12 IF change = higher AND level = high AND rain = heavy THEN flood = yes AND evac = yes (CF = 0.8) R13 IF height < 10 THEN level = low

  40. Rules Continued R14 IF height >= 10 and height <= 16 THEN level = normal R15 IF height > 16 THEN level = high R16 IF forecast = sunny THEN rain = none R17 IF forecast = cloudy THEN rain = light R18 IF forecast = stormy THEN rain = heavy

  41. Inference Network for the Flood Warning and Evacuation Knowledge Base [Diagram: input nodes forecast, upstream, month, and height feed rules R16–R18, R3–R6, R1–R2, and R13–R15 to set the intermediate parameters rain, change, season, and level; these feed rules R7–R12 to conclude flood and evac.]

  42. Advantages of Inference Network • The inference process is performed via these interconnections. • All of the interconnections can be stated explicitly prior to run-time. • It minimizes the matching process. (facts vs. premises) • They simplify the implementation of the inference engine and the handling of explanations. • The conflict resolution problem is reduced to simply maintaining a list of newly matched rules for subsequent firing.

  43. Forward-Reasoning Inference Network Algorithm 1. Assign values to all input nodes from the external sources providing information to the knowledge-based system. 2. Form a queue, Q, containing all rules that use the values of these input nodes in their premises. 3. Until there are no more rules in Q: a. Examine the first rule in Q, comparing its premises with the values of the appropriate parameters to decide if the premises of the rule are satisfied. b. If the premises of the rule are not satisfied, remove the rule from Q and go back to a. c. If the rule is matched: i. execute the rule, setting the rule’s downstream elements to the values specified by the rule, ii. decide which rules use the downstream elements just set within their premises, iii. add these rules at the end of Q if they are not already in Q, iv. delete the original rule from the front of Q and return to step a. 4. Output the values of the hypotheses that have been identified as conclusions. 5. If the application involves real-time process control, go back to step 1 and start the process again.

  44. Pattern-Matching Systems They use extensive searches to match and execute the rules, deriving new facts. The system depends on matching the premises of a rule against the existing facts to determine which rules have their premises satisfied and can be executed. Typical features are: 1) pattern connectives, 2) wildcards, 3) field constraints, 4) mathematical operators, 5) a test feature.

  45. Evaluation of the Architectures Pattern Matchers: • Pattern matchers are extremely flexible and powerful. • They do not easily support reasoning with uncertainty. • They are inefficient in large implementations. Inference Networks: • Useful for domains with a limited number of alternative solutions. • Easy to implement, but less powerful. • They easily allow explanation of the solutions derived by the system.

  46. Disadvantages of Rule-based Systems 1. Infinite chaining 2. Addition of new, contradictory knowledge 3. Modification of existing rules 4. Coverage of Domains More advantages 1. Modularity 2. Uniformity 3. Naturalness

  47. Conflict Resolution When the patterns of more than one rule are satisfied by the facts in working memory, a conflict resolution strategy is needed to decide which of the matched rules should be executed first. • Conflict resolution can be considered an additional heuristic control on the search algorithm in a production-rule system. • There are four broad categories, based on the following criteria: • number of rules to execute • order of the rules • complexity of the rules • order of the data

  48. Criterion I: Number of Rules to Execute • Define a specific number of rules to be executed in each cycle, such as: a single rule, multiple rules, or all rules. Criterion II: Order of the Rules • Define the position of the rule in relation to the other rules, e.g., select: the lowest-numbered rule; the first applicable rule following the one that fired on the last cycle; or the lowest-numbered rule that will derive a new fact. Criterion III: Complexity of the Rules • Determine complexity through the number of premises or conditions, e.g., selection based on the most complex rule, or selection based on the most general rule.

  49. Criterion IV: Order of the Data • Consider the order of the data, such as the oldest data, the newest data, or other criteria. Example of conflict resolution in OPS5 – Brownston et al. use three strategies: • Refraction – a rule can be executed only once until the working-memory elements that match its conditions have been modified, i.e., it discourages looping. • Recency – prefer rules whose conditions match the patterns most recently added to working memory, i.e., it narrows the search to a single line of reasoning. • Specificity – assume a more specific rule is preferable to a general one.
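The three strategies can be sketched as one selection function over matched rule instances. The match data below is illustrative, and the ordering (refraction first, then recency, with specificity as the tie-breaker) is one reasonable reading of the description above:

```python
# Each match records the rule name, its condition count (specificity), and
# the timestamps of the working-memory elements it matched (recency).

def select(matches, fired):
    # Refraction: drop instantiations that have already fired.
    matches = [m for m in matches if (m["rule"], m["wm_times"]) not in fired]
    # Recency, then specificity: newest matched WM element wins; ties go
    # to the rule with more conditions.
    matches.sort(key=lambda m: (max(m["wm_times"]), m["conditions"]),
                 reverse=True)
    return matches[0] if matches else None

matches = [
    {"rule": "R1", "conditions": 1, "wm_times": (3,)},
    {"rule": "R2", "conditions": 3, "wm_times": (5, 2, 1)},
    {"rule": "R3", "conditions": 2, "wm_times": (5, 4)},
]
best = select(matches, fired=set())
print(best["rule"])  # R2: matches the newest element (time 5) and is most specific
```

If R2's instantiation is put in the fired set, refraction removes it and R3 (same recency, fewer conditions) is chosen instead.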

  50. How to improve the speed of a forward-chained rule system? If we have r rules, f facts, and an average of p premises in each rule, then on every cycle we perform r × f^p comparisons against the fact base to determine the conflict set. Example: with 150 rules, 20 existing facts, and 4 premises per rule, 150 × 20^4 = 24,000,000 comparisons! • Based on two empirical observations: • I. Temporal redundancy – the firing of a rule usually changes only a few facts, so only a few rules are affected by each of those changes. • II. Structural similarity – the same pattern often appears in the left-hand side of more than one rule. • * The inefficiency comes from redetermining, on every cycle, which rules are satisfied by the facts in the database.
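The cost estimate above can be checked directly:

```python
# Naive match cost for a forward-chained system: each of r rules compares
# its p premises against combinations drawn from f facts on every cycle.

def comparisons(r, f, p):
    return r * f ** p

print(comparisons(150, 20, 4))  # 24000000, matching the slide's figure
```

This per-cycle cost is exactly what the Rete-style exploitation of temporal redundancy and structural similarity avoids, by recomputing matches only for the few facts that changed.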
