CHAPTER 8 Decision Making CS267



CHAPTER 8 Decision Making CS267

BY

GAYATRI BOGINERI (12)

&

BHARGAV VADHER (14)

October 22, 2007


Outline of the chapter

Introduction

Core and Reduct

Simplification of Decision Table

Decision Algorithm

Degree of Dependency


Introduction

Definition:

For a given KR (knowledge representation) system K = (U, A), we can define a decision table as follows:

T = (U, A, C, D)

where U = the universe

A = the set of attributes

C = the condition attributes

D = the decision attributes

and C, D are subsets of A.

This decision table is also called a 'CD-decision table'.
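As an illustration, such a table can be represented directly in code. This is a minimal Python sketch under names of my own choosing; the attribute names and rows here are hypothetical placeholders, not the chapter's example.

```python
# A decision table T = (U, A, C, D): the universe U is a list of rows,
# and the attribute set A splits into condition attributes C and
# decision attributes D. Names and values are hypothetical.
C = ("a", "b")   # condition attributes
D = ("e",)       # decision attributes

U = [
    {"a": 1, "b": 2, "e": 1},  # each row is one decision rule
    {"a": 2, "b": 1, "e": 2},
]

def conditions(row):
    """Project a row onto the condition attributes C."""
    return tuple(row[c] for c in C)

def decision(row):
    """Project a row onto the decision attributes D."""
    return tuple(row[d] for d in D)

print(conditions(U[0]), decision(U[0]))  # (1, 2) (1,)
```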


Core and Reduct

The core is the set of condition attribute values that are indispensable, meaning they are needed to discern the value of the decision attribute.

A reduct is a minimal decision rule which must satisfy the following conditions:

1. The rule must be true (consistent).

2. The predecessor of the rule must be independent.


Simplification of decision table

It means the removal of conditions that are unnecessary for making the decision. We will show this with the following steps.

Check dependency of condition attribute → decision attribute

Reduction of condition attribute, if any.

Find core values of all decision rules.

Find reducts of each decision rule.

Combine and assign unique rule number to each rule.


Let us take the example of an optician's decision on whether a patient is suitable for contact lenses or not.

We have a set of condition attributes a, b, c and d, shown below:

a. age

1. young

2. pre-presbyopic

3. presbyopic

b. spectacle prescription

1. myope

2. hypermetrope

c. astigmatic

1. no

2. yes

d. tear production rate

1. reduced

2. normal

(a, b, c and d are the condition attributes.)


Based on the given conditions, the optician has to take one of the following three decisions (the values of the decision attribute):

1. hard contact lens

2. soft contact lens

3. no contact lens

So we have a total of 24 (3 × 2 × 2 × 2) possible combinations of the given conditions, which are shown with their decisions in tabular form below.
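The 24-combination count can be verified with a few lines of Python. The attribute letters follow the chapter; the decisions themselves sit in the table and are not reproduced here.

```python
from itertools import product

# Domains of the condition attributes from the optician example:
# a = age, b = spectacle prescription, c = astigmatic, d = tear rate.
domains = {"a": (1, 2, 3), "b": (1, 2), "c": (1, 2), "d": (1, 2)}

# Enumerate every possible combination of condition values.
combos = list(product(*domains.values()))
print(len(combos))  # 24 possible combinations (3 * 2 * 2 * 2)
```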



Check dependency of condition attribute → decision attribute

Every set of condition attribute values must have a unique decision attribute value.

If this condition holds for every decision rule, then we say the dependency condition attributes → decision attributes is valid.

Logically it is represented as:

{a, b, c, d} → {e}
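This dependency check can be sketched in Python: group the rules by their condition values and require a single decision value per group. The rows shown are hypothetical stand-ins, since the full 24-rule table lives in the figure rather than the text.

```python
from collections import defaultdict

def dependency_holds(rows, cond, dec):
    """True iff equal condition tuples always carry equal decision
    tuples, i.e. the dependency cond -> dec is valid."""
    decisions = defaultdict(set)
    for row in rows:
        key = tuple(row[c] for c in cond)
        decisions[key].add(tuple(row[d] for d in dec))
    return all(len(vals) == 1 for vals in decisions.values())

# Hypothetical, mutually consistent rows:
rows = [
    {"a": 1, "b": 1, "c": 2, "d": 2, "e": 1},
    {"a": 2, "b": 1, "c": 1, "d": 2, "e": 2},
]
print(dependency_holds(rows, ["a", "b", "c", "d"], ["e"]))  # True
```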



Reduction of condition attribute, if any.

Now check each condition attribute to see whether it is dispensable or indispensable with respect to the decision attribute.

Suppose we are checking condition attribute a in the given example.

Remove a and look at the rest of the condition attributes (shown in the figure).

After the removal of a we have two pairs of decision rules which are inconsistent:

1. b2c2d2 → e1 (rule 2)

b2c2d2 → e3 (rules 18 and 24)

2. b1c1d2 → e2 (rule 5)

b1c1d2 → e3 (rule 20)

So condition attribute a is indispensable, which means it is not removable.

Checking every condition attribute, we find that each one is indispensable, so none is removable.

So now we can say that the given table is e-independent.
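The dispensability test can be sketched as follows. Rules 2, 18 and 24 from the text supply the conflicting pair b2c2d2 → e1 / b2c2d2 → e3; the a-values shown are hypothetical stand-ins, since the full table is not reproduced here.

```python
from collections import defaultdict

def dependency_holds(rows, cond, dec):
    """True iff equal condition tuples always carry equal decisions."""
    seen = defaultdict(set)
    for row in rows:
        seen[tuple(row[c] for c in cond)].add(tuple(row[d] for d in dec))
    return all(len(vals) == 1 for vals in seen.values())

def is_dispensable(rows, cond, dec, attr):
    """A condition attribute is dispensable iff dropping it keeps the
    dependency cond -> dec valid (no inconsistent rule pairs appear)."""
    return dependency_holds(rows, [c for c in cond if c != attr], dec)

# Rules 2, 18 and 24 (b2c2d2 -> e1 vs. b2c2d2 -> e3);
# the a-values are hypothetical.
rows = [
    {"a": 1, "b": 2, "c": 2, "d": 2, "e": 1},  # rule 2
    {"a": 2, "b": 2, "c": 2, "d": 2, "e": 3},  # rule 18
    {"a": 3, "b": 2, "c": 2, "d": 2, "e": 3},  # rule 24
]
# Dropping a makes the pair collide, so a is indispensable:
print(is_dispensable(rows, ["a", "b", "c", "d"], ["e"], "a"))  # False
```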



Find core values of all decision rules.

We have to find the core values of each decision rule in order to determine the reducts of that rule, because the core is the intersection of all reducts of the same rule.

Consider the condition attributes and rules associated with decision attribute value 1, that is, e1.

Suppose we want to find the core of the first rule, a1b1c2d2 → e1.

First remove a and check the rest of the attributes; you find that

b1c2d2 → e1 (rule 1) and b1c2d2 → e1 (rule 3)

which are consistent rules under decision attribute e1.

So attribute a of rule 1 is not in the core. Similarly b is not in the core, but c and d are core attributes for rule 1.
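The core of a single rule can be computed by dropping each condition value in turn and checking for a collision with another rule. A sketch follows; only rule 1's values a1b1c2d2 → e1 come from the text, the other rows are hypothetical, so this mini-table exposes only d as core rather than the {c, d} of the full example.

```python
def core_of_rule(rows, rule, cond, dec):
    """Condition attributes of `rule` whose removal makes it collide
    with another rule (same remaining conditions, different decision);
    their values form the core of the rule."""
    core = []
    for attr in cond:
        rest = [c for c in cond if c != attr]
        for other in rows:
            if other is rule:
                continue
            if all(other[c] == rule[c] for c in rest) and \
               any(other[d] != rule[d] for d in dec):
                core.append(attr)
                break
    return core

rule1 = {"a": 1, "b": 1, "c": 2, "d": 2, "e": 1}  # rule 1 (from the text)
rows = [
    rule1,
    {"a": 2, "b": 1, "c": 2, "d": 2, "e": 1},     # hypothetical, same decision
    {"a": 1, "b": 1, "c": 2, "d": 1, "e": 3},     # hypothetical, d1 -> e3
]
print(core_of_rule(rows, rule1, ["a", "b", "c", "d"], ["e"]))  # ['d']
```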



Find reducts of each decision rule.

To the core values of each decision rule we have to add those condition attribute values of that rule for which:

1. The predecessor of the rule is independent.

2. The rule is true.

Suppose we are checking the first rule, a1b1c2d2 → e1.

We already know that the first rule has two core values, c2 and d2, i.e.

U a b c d e

1 _ _ 2 2 1

So here we have three possible reducts of the first rule:

1 X 1 2 2 1, i.e. rule b1c2d2 → e1

1 1 X 2 2 1, i.e. rule a1c2d2 → e1

1 X X 2 2 1, i.e. rule c2d2 → e1

Now check each of these rules to see whether it is true or not.
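This search can be sketched as enumerating supersets of the core and keeping the minimal ones that leave the rule true. The four rows are hypothetical stand-ins (only rule 1's values come from the text), and the core {d} passed in is the one this mini-table produces, not the {c, d} of the full example.

```python
from itertools import combinations

def rule_is_true(rows, rule, attrs, dec):
    """A shortened rule is true iff no other row matches its remaining
    condition values while carrying a different decision."""
    for other in rows:
        if other is rule:
            continue
        if all(other[c] == rule[c] for c in attrs) and \
           any(other[d] != rule[d] for d in dec):
            return False
    return True

def reducts_of_rule(rows, rule, cond, dec, core):
    """Minimal supersets of the core whose values keep the rule true.
    Candidates are visited smallest-first, so any superset of an
    already-found reduct is skipped."""
    extra = [c for c in cond if c not in core]
    found = []
    for k in range(len(extra) + 1):
        for combo in combinations(extra, k):
            attrs = list(core) + list(combo)
            if rule_is_true(rows, rule, attrs, dec) and \
               not any(set(f) <= set(attrs) for f in found):
                found.append(attrs)
    return found

rule1 = {"a": 1, "b": 1, "c": 2, "d": 2, "e": 1}  # rule 1 (from the text)
rows = [
    rule1,
    {"a": 2, "b": 1, "c": 2, "d": 2, "e": 1},     # hypothetical
    {"a": 1, "b": 1, "c": 2, "d": 1, "e": 3},     # hypothetical
    {"a": 2, "b": 2, "c": 2, "d": 2, "e": 3},     # hypothetical
]
print(reducts_of_rule(rows, rule1, ["a", "b", "c", "d"], ["e"], ["d"]))
# [['d', 'a'], ['d', 'b']]
```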



Combine and assign unique rule number to each rule.

In the last step we group the reducts which lead to the same decision, and then assign a unique number to each of these decision rules; the result is the minimal decision table.


Decision algorithm

To obtain the decision algorithm from the decision table, follow two steps:

1. Replace all the values by those of the minimal decision table.

2. Combine all the rules per decision attribute value.

From the given minimal decision table we have the following rules:

a1c2d2 → e1

b1c2d2 → e1

a1c1d2 → e2

a2c1d2 → e2

b2c1d2 → e2

d1 → e3

a2b2c2 → e3

a3b1c1 → e3

a3b2c2 → e3


Now combine the rules per decision attribute value, and we will have the decision algorithm.

By combining the rules we get

(a1 V b1)c2d2 → e1

(a1 V a2 V b2)c1d2 → e2

d1 V (a3b1c1) V ((a2 V a3)b2c2) → e3

which is our decision algorithm.
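The combined algorithm can be transcribed directly as a classifier. This is a sketch; the return value None for combinations not covered by the minimal rules is my own addition.

```python
def classify(a, b, c, d):
    """Decision algorithm from the minimal decision table:
    1 = hard lens, 2 = soft lens, 3 = no lens."""
    if (a == 1 or b == 1) and c == 2 and d == 2:
        return 1                      # (a1 V b1)c2d2 -> e1
    if (a in (1, 2) or b == 2) and c == 1 and d == 2:
        return 2                      # (a1 V a2 V b2)c1d2 -> e2
    if d == 1 or (a == 3 and b == 1 and c == 1) \
            or (a in (2, 3) and b == 2 and c == 2):
        return 3                      # d1 V a3b1c1 V (a2 V a3)b2c2 -> e3
    return None                       # not covered by the minimal rules

print(classify(1, 1, 2, 2))  # 1 (hard contact lens)
print(classify(3, 1, 1, 1))  # 3 (no contact lens)
```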


Degree of dependency

The degree of dependency can be calculated with the formula

degree of dependency = (number of rules unaffected by the lack of this attribute) / (total number of rules)

degree of dependency of a = 19/24, i.e. 5 rules affected (2, 5, 18, 20, 24)

degree of dependency of b = 3/4, i.e. 6 rules affected

degree of dependency of c = 1/2, i.e. 13 rules affected

degree of dependency of d = 1/4, i.e. 18 rules affected

So attribute d is the most significant one in our decision table, because in the absence of attribute d, 18 rules are affected.
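As a check on the arithmetic, the formula can be evaluated with Python's exact fractions, using the affected-rule counts for a, b and d stated above:

```python
from fractions import Fraction

def degree_of_dependency(total, affected):
    """(rules unaffected by dropping the attribute) / (total rules)."""
    return Fraction(total - affected, total)

print(degree_of_dependency(24, 5))   # 19/24 for a (5 rules affected)
print(degree_of_dependency(24, 6))   # 3/4 for b (6 rules affected)
print(degree_of_dependency(24, 18))  # 1/4 for d (18 rules affected)
```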
