
CHAPTER 8 Decision Making CS267



Presentation Transcript


  1. CHAPTER 8: Decision Making (CS267) BY GAYATRI BOGINERI (12) & BHARGAV VADHER (14), October 22, 2007

  2. Outline of the chapter Introduction Core and Reduct Simplification of Decision Table Decision Algorithm Degree of Dependency

  3. Introduction Definition: For a given knowledge representation (KR) system K = (U, A), we can define a decision table as T = (U, A, C, D), where U = the universe, A = the set of attributes, C = the condition attributes, D = the decision attributes, and C, D are subsets of A. This decision table is also called a ‘CD-decision table’.
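As a sketch, the decision table T = (U, A, C, D) can be represented directly in Python. The three rows below are a hypothetical toy universe for illustration, not the book's example:

```python
# Sketch of a decision table T = (U, A, C, D).
# The rows here are hypothetical toy data, not the optician example.
C = ["a", "b"]          # condition attributes
D = ["e"]               # decision attributes
A = C + D               # the full attribute set
U = [                   # the universe: one dict per object / decision rule
    {"a": 1, "b": 1, "e": 1},
    {"a": 1, "b": 2, "e": 2},
    {"a": 2, "b": 1, "e": 2},
]

assert set(C) <= set(A) and set(D) <= set(A)   # C, D are subsets of A
```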

  4. Core and Reduct A core is a condition-attribute value that is indispensable, i.e., it is needed to discern the value of the decision attribute. A reduct is a decision rule that must satisfy the following conditions: 1. The rule must be true (consistent). 2. The predecessor of the rule must be independent.

  5. Simplification of the decision table This means removing conditions that are unnecessary for making the decision. We will show it with the following steps: 1. Check the dependency condition attributes → decision attributes. 2. Reduce the condition attributes, if possible. 3. Find the core values of all decision rules. 4. Find the reducts of each decision rule. 5. Combine the rules and assign a unique rule number to each.

  6. Let us take the example of an optician’s decision on whether a patient is suitable for contact lenses. We have the condition attributes a, b, c, and d, shown below: a. age: 1. young, 2. pre-presbyopic, 3. presbyopic; b. spectacle prescription: 1. myope, 2. hypermetrope; c. astigmatic: 1. no, 2. yes; d. tear production rate: 1. reduced, 2. normal.

  7. Based on the given conditions, the optician has to take one of the following three decisions (the decision attribute): 1. hard contact lens, 2. soft contact lens, 3. no contact lens. So we have a total of 24 (3 × 2 × 2 × 2) possible combinations of the given conditions, which are shown with their decisions in tabular form below.
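As a quick check of the count, the 24 condition combinations can be enumerated with `itertools.product`, with attribute values encoded as in the slides (the decisions e come from the table image and are omitted here):

```python
from itertools import product

# a: age (1-3), b: spectacle (1-2), c: astigmatic (1-2), d: tear rate (1-2).
ages, spectacles, astigmatic, tear_rate = (1, 2, 3), (1, 2), (1, 2), (1, 2)
combos = list(product(ages, spectacles, astigmatic, tear_rate))

print(len(combos))   # 3 * 2 * 2 * 2 = 24
```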

  8. Check the dependency condition attributes → decision attributes. Reduce the condition attributes, if possible. Find the core values of all decision rules. Find the reducts of each decision rule. Combine the rules and assign a unique rule number to each.

  9. Check the dependency condition attributes → decision attributes: every set of condition-attribute values must have a unique decision-attribute value. If this holds for all decision rules, we say the dependency condition attributes → decision attributes is valid. Logically it is represented as {a, b, c, d} ⇒ {e}.
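This dependency check can be sketched as follows: the function returns False as soon as two rules agree on all condition attributes but disagree on the decision. The rows are hypothetical toy data, not the 24-row lens table:

```python
# Check the dependency C -> D: every combination of condition-attribute
# values must determine a unique decision value.
def depends(rows, conds, decs):
    seen = {}
    for row in rows:
        key = tuple(row[c] for c in conds)
        val = tuple(row[d] for d in decs)
        if seen.setdefault(key, val) != val:
            return False   # same conditions, different decisions
    return True

rows = [
    {"a": 1, "b": 1, "e": 1},
    {"a": 1, "b": 2, "e": 2},
    {"a": 2, "b": 1, "e": 2},
]
print(depends(rows, ["a", "b"], ["e"]))   # True: {a, b} => {e}
print(depends(rows, ["a"], ["e"]))        # False: a alone does not decide e
```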

  10. Check the dependency condition attributes → decision attributes. Reduce the condition attributes, if possible. Find the core values of all decision rules. Find the reducts of each decision rule. Combine the rules and assign a unique rule number to each.

  11. Reduction of condition attributes, if any. Now check each condition attribute to see whether it is dispensable or indispensable with respect to the decision attribute. Suppose we are checking condition attribute a in the given example: remove a and look at the remaining condition attributes (shown in the figure). After the removal of a we have two pairs of inconsistent decision rules: 1. b2c2d2 → e1 (rule 2) vs. b2c2d2 → e3 (rules 18 and 24); 2. b1c1d2 → e2 (rule 5) vs. b1c1d2 → e3 (rule 20). So condition attribute a is indispensable, i.e., not removable. Checking every condition attribute, we find that each one is indispensable, so none is removable. We can therefore say that the given table is e-independent.
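The dispensability test can be sketched in the same style: drop the attribute and see whether any pair of rules becomes inconsistent. The toy pair below is modeled on the slide's conflict b2c2d2 → e1 vs. b2c2d2 → e3; the a-values (1 and 2) are hypothetical:

```python
# An attribute is indispensable if dropping it makes some pair of rules
# inconsistent (same remaining conditions, different decisions).
def indispensable(rows, conds, decs, attr):
    rest = [c for c in conds if c != attr]
    seen = {}
    for row in rows:
        key = tuple(row[c] for c in rest)
        val = tuple(row[d] for d in decs)
        if seen.setdefault(key, val) != val:
            return True    # an inconsistency appeared: attr was needed
    return False           # no conflict: attr is dispensable here

rows = [
    {"a": 1, "b": 2, "c": 2, "d": 2, "e": 1},
    {"a": 2, "b": 2, "c": 2, "d": 2, "e": 3},
]
C, D = ["a", "b", "c", "d"], ["e"]
print(indispensable(rows, C, D, "a"))   # True: dropping a merges the pair
print(indispensable(rows, C, D, "b"))   # False: a still tells them apart
```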

  12. After removal of condition attribute a

  13. Check the dependency condition attributes → decision attributes. Reduce the condition attributes, if possible. Find the core values of all decision rules. Find the reducts of each decision rule. Combine the rules and assign a unique rule number to each.

  14. Find the core values of all decision rules. We have to find the core values of each decision rule in order to determine the reducts of that rule, because the core is the intersection of all reducts of the same rule. Consider the condition attributes and the rules associated with decision value 1, i.e., e1. To find the core of the first rule, a1b1c2d2 → e1, first remove a and check the remaining attributes: we find b1c2d2 → e1 (rule 1) and b1c2d2 → e1 (rule 3), which are consistent under decision value e1, so attribute a of rule 1 is not a core attribute. Similarly, b is not a core attribute, but c and d are core attributes for rule 1.
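The per-rule core check can be sketched as follows: an attribute belongs to a rule's core if dropping that one attribute makes the rule clash with another rule. The five toy rows are hypothetical but arranged, like the slide's rule 1, so that c and d come out as core:

```python
# Core of one decision rule: attribute attr is core if removing only attr
# makes the rule match another rule that has a different decision.
def core_of_rule(rows, conds, decs, rule):
    core = []
    for attr in conds:
        rest = [c for c in conds if c != attr]
        key = tuple(rule[c] for c in rest)
        val = tuple(rule[d] for d in decs)
        for other in rows:
            if other is not rule \
               and tuple(other[c] for c in rest) == key \
               and tuple(other[d] for d in decs) != val:
                core.append(attr)   # dropping attr causes an inconsistency
                break
    return core

# Hypothetical toy rows; r1 plays the role of rule a1b1c2d2 -> e1.
r1 = {"a": 1, "b": 1, "c": 2, "d": 2, "e": 1}
rows = [
    r1,
    {"a": 2, "b": 1, "c": 2, "d": 2, "e": 1},   # agrees with r1 minus a
    {"a": 1, "b": 2, "c": 2, "d": 2, "e": 1},   # agrees with r1 minus b
    {"a": 1, "b": 1, "c": 1, "d": 2, "e": 2},   # clashes with r1 minus c
    {"a": 1, "b": 1, "c": 2, "d": 1, "e": 3},   # clashes with r1 minus d
]
print(core_of_rule(rows, ["a", "b", "c", "d"], ["e"], r1))   # ['c', 'd']
```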

  15. Check the dependency condition attributes → decision attributes. Reduce the condition attributes, if possible. Find the core values of all decision rules. Find the reducts of each decision rule. Combine the rules and assign a unique rule number to each.

  16. Find the reducts of each decision rule. To the core values of each decision rule we add condition-attribute values of that rule such that: 1. the predecessor of the rule is independent, and 2. the rule is true. Suppose we are checking the first rule, a1b1c2d2 → e1. We already know that this rule has two core attributes, c and d, i.e., rule 1 reduces to (a = _, b = _, c = 2, d = 2) → e = 1. So we have three candidate reducts for the first rule: b1c2d2 → e1, a1c2d2 → e1, and c2d2 → e1. Now check each candidate rule to see whether it is true.
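This search can be sketched by extending the core with the smallest sets of remaining attribute values that make the shortened rule true. A sixth hypothetical row is added so that, as on the slide, c2d2 alone is not a true rule and two value reducts survive:

```python
from itertools import combinations

# A candidate attribute set yields a value reduct of a rule if the shortened
# rule is still true (consistent) over all rows and no smaller candidate
# already works (minimality = independence of the predecessor).
def rule_is_true(rows, attrs, decs, rule):
    key = tuple(rule[c] for c in attrs)
    val = tuple(rule[d] for d in decs)
    return all(tuple(r[c] for c in attrs) != key
               or tuple(r[d] for d in decs) == val for r in rows)

def reducts_of_rule(rows, conds, decs, rule, core):
    extras = [c for c in conds if c not in core]
    reducts = []
    for k in range(len(extras) + 1):
        for combo in combinations(extras, k):
            attrs = core + list(combo)
            if rule_is_true(rows, attrs, decs, rule) \
               and not any(set(r) <= set(attrs) for r in reducts):
                reducts.append(attrs)
    return reducts

# Hypothetical toy rows arranged like the slide's rule 1 (core = {c, d}).
r1 = {"a": 1, "b": 1, "c": 2, "d": 2, "e": 1}
rows = [
    r1,
    {"a": 2, "b": 1, "c": 2, "d": 2, "e": 1},
    {"a": 1, "b": 2, "c": 2, "d": 2, "e": 1},
    {"a": 1, "b": 1, "c": 1, "d": 2, "e": 2},
    {"a": 1, "b": 1, "c": 2, "d": 1, "e": 3},
    {"a": 2, "b": 2, "c": 2, "d": 2, "e": 3},   # makes c2d2 alone untrue
]
# Two value reducts survive, mirroring a1c2d2 -> e1 and b1c2d2 -> e1.
print(reducts_of_rule(rows, ["a", "b", "c", "d"], ["e"], r1, ["c", "d"]))
```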

  17. Check the dependency condition attributes → decision attributes. Reduce the condition attributes, if possible. Find the core values of all decision rules. Find the reducts of each decision rule. Combine the rules and assign a unique rule number to each.

  18. Combine the rules and assign a unique rule number to each. In the last step we categorize the reducts that lead to the same decision, and then assign a unique number to each of these decision rules. The result is the minimal decision table.

  19. After categorizing the rules, the table looks like this:

  20. And finally our minimal decision table is as follows:

  21. Decision algorithm To derive the decision algorithm from the decision table, follow two steps: 1. replace the minimal decision table by its rules; 2. combine the rules per decision-attribute value. From the given minimal decision table we have the following rules: a1c2d2 → e1, b1c2d2 → e1, a1c1d2 → e2, a2c1d2 → e2, b2c1d2 → e2, d1 → e3, a2b2c2 → e3, a3b1c1 → e3, a3b2c2 → e3.

  22. Now combine the rules per decision-attribute value to obtain the decision algorithm: (a1 V b1)c2d2 → e1, (a1 V a2 V b2)c1d2 → e2, d1 V (a3b1c1) V ((a2 V a3)b2c2) → e3. This is our decision algorithm.
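The grouping step can be sketched by collecting the nine minimal rules (as listed on slide 21) under their decision values; the disjunction of each group's predecessors is one line of the decision algorithm:

```python
from collections import defaultdict

# The nine minimal rules from slide 21, as (conditions, decision) pairs.
rules = [
    ({"a": 1, "c": 2, "d": 2}, "e1"), ({"b": 1, "c": 2, "d": 2}, "e1"),
    ({"a": 1, "c": 1, "d": 2}, "e2"), ({"a": 2, "c": 1, "d": 2}, "e2"),
    ({"b": 2, "c": 1, "d": 2}, "e2"),
    ({"d": 1}, "e3"), ({"a": 2, "b": 2, "c": 2}, "e3"),
    ({"a": 3, "b": 1, "c": 1}, "e3"), ({"a": 3, "b": 2, "c": 2}, "e3"),
]

# Combine per decision value: each group becomes one disjunctive rule.
algorithm = defaultdict(list)
for conditions, decision in rules:
    algorithm[decision].append(conditions)

print({d: len(preds) for d, preds in algorithm.items()})
# {'e1': 2, 'e2': 3, 'e3': 4}
```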

  23. Degree of dependency The degree of dependency can be calculated with the formula: degree of dependency = (number of rules unaffected by the lack of the attribute) / (total number of rules). Degree of dependency of a = 19/24, i.e., 5 rules affected (2, 5, 18, 20, 24); of b = 3/4, i.e., 6 rules affected; of c = 1/2, i.e., 12 rules affected; of d = 1/4, i.e., 18 rules affected. So attribute d is the most significant one in our decision table, because with the lack of attribute d, 18 rules are affected.
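The formula above can be sketched as a count over rows: a rule is unaffected if, after dropping the attribute, its remaining conditions still determine a single decision. The six rows are the same hypothetical toy data as earlier, not the 24-row lens table:

```python
# Degree of dependency of attribute attr: the fraction of rules whose
# decision is still uniquely determined after attr is dropped.
def degree_of_dependency(rows, conds, decs, attr):
    rest = [c for c in conds if c != attr]
    groups = {}
    for r in rows:
        key = tuple(r[c] for c in rest)
        groups.setdefault(key, set()).add(tuple(r[d] for d in decs))
    unaffected = sum(1 for r in rows
                     if len(groups[tuple(r[c] for c in rest)]) == 1)
    return unaffected, len(rows)

# Hypothetical six toy rows (not the 24-row lens table).
rows = [
    {"a": 1, "b": 1, "c": 2, "d": 2, "e": 1},
    {"a": 2, "b": 1, "c": 2, "d": 2, "e": 1},
    {"a": 1, "b": 2, "c": 2, "d": 2, "e": 1},
    {"a": 1, "b": 1, "c": 1, "d": 2, "e": 2},
    {"a": 1, "b": 1, "c": 2, "d": 1, "e": 3},
    {"a": 2, "b": 2, "c": 2, "d": 2, "e": 3},
]
C, D = ["a", "b", "c", "d"], ["e"]
print(degree_of_dependency(rows, C, D, "a"))   # (4, 6): 2 rules affected
```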

  24. Q & A
