Rule based multicriteria decision support using rough set approach

RULE-BASED MULTICRITERIA DECISION SUPPORT USING ROUGH SET APPROACH

Roman Slowinski

Laboratory of Intelligent Decision Support Systems

Institute of Computing Science

Poznan University of Technology

 Roman Slowinski 2006



Plan

  • Rough Set approach to preference modeling

  • Classical Rough Set Approach (CRSA)

  • Dominance-based Rough Set Approach (DRSA) for multiple-criteria sorting

    • Granular computing with dominance cones

    • Induction of decision rules from dominance-based rough approximations

    • Examples of multiple-criteria sorting

  • Decision rule preference model vs. utility function and outranking relation

  • DRSA for multicriteria choice and ranking

    • Dominance relation for pairs of objects

    • Induction of decision rules from dominance-based rough approximations

    • Examples of multiple-criteria ranking

  • Extensions of DRSA dealing with preference-ordered data

  • Conclusions


Inconsistencies in data – Rough Set Theory

  • Zdzisław Pawlak (1926 – 2006)

  Student | Mathematics (M) | Physics (Ph) | Literature (L) | Overall class
  --------|-----------------|--------------|----------------|--------------
  S1      | good            | medium       | bad            | bad
  S2      | medium          | medium       | bad            | medium
  S3      | medium          | medium       | medium         | medium
  S4      | medium          | medium       | medium         | good
  S5      | good            | medium       | good           | good
  S6      | good            | good         | good           | good
  S7      | bad             | bad          | medium         | bad
  S8      | bad             | bad          | medium         | bad

  • Objects with the same description are indiscernible and create blocks (granules) of knowledge

  • Another piece of information assigns objects to some classes (sets, concepts) – here the overall class

  • The granules of indiscernible objects are used to approximate the classes

  • Lower and upper approximation of class „good”: the granule {S3, S4} is inconsistent (same description, different classes), so the lower approximation of „good” contains only {S5} and {S6}, while the upper approximation adds {S3, S4}
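To make the mechanics concrete, here is a minimal Python sketch (my own naming and encoding, not part of the original presentation) that builds the indiscernibility granules of the student table and computes the lower and upper approximation of the class „good”:

```python
# Minimal sketch, not from the presentation: CRSA granules and rough
# approximations of the class "good" for the student table above.

students = {          # (Mathematics, Physics, Literature)
    "S1": ("good", "medium", "bad"),
    "S2": ("medium", "medium", "bad"),
    "S3": ("medium", "medium", "medium"),
    "S4": ("medium", "medium", "medium"),
    "S5": ("good", "medium", "good"),
    "S6": ("good", "good", "good"),
    "S7": ("bad", "bad", "medium"),
    "S8": ("bad", "bad", "medium"),
}
overall = {"S1": "bad", "S2": "medium", "S3": "medium", "S4": "good",
           "S5": "good", "S6": "good", "S7": "bad", "S8": "bad"}

def granules(objects):
    """Indiscernibility classes: objects sharing the same description."""
    blocks = {}
    for name, description in objects.items():
        blocks.setdefault(description, set()).add(name)
    return list(blocks.values())

def approximations(objects, decision, target):
    X = {o for o, c in decision.items() if c == target}
    lower, upper = set(), set()
    for g in granules(objects):
        if g <= X:      # granule entirely included in the class
            lower |= g
        if g & X:       # granule intersecting the class
            upper |= g
    return lower, upper

print(approximations(students, overall, "good"))
# -> ({'S5', 'S6'}, {'S3', 'S4', 'S5', 'S6'})
```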


CRSA – decision rules induced from rough approximations

  • Certain decision rule, supported by objects from the lower approximation of Clt (discriminant rule)

  • Possible decision rule, supported by objects from the upper approximation of Clt (partly discriminant rule)

  • Approximate decision rule, supported by objects from the boundary of Clt,

    where Clt, Cls, ..., Clu are the classes to which the inconsistent objects supporting this rule belong

R. Słowiński, D. Vanderpooten: A generalized definition of rough approximations based on similarity. IEEE Transactions on Knowledge and Data Engineering, 12 (2000) no. 2, 331-336


Rough Set approach to preference modeling

  • The preference information has the form of decision examples and the preference model is a set of decision rules

  • “People make decisions by searching for rules that provide good justification of their choices” (Slovic, 1975)

  • Rules make evidence of a decision policy and can be used for both explanation of past decisions & recommendation of future decisions

  • Construction of decision rules is done by induction (kind of regression)

  • Induction is a paradigm of AI: machine learning, data mining, knowledge discovery

  • Regression approach has also been used for other preference models:

    • utility (value) function, e.g. UTA methods, MACBETH

    • outranking relation, e.g. ELECTRE TRI Assistant


Rough Set approach to preference modeling

  • Advantages of preference modeling by induction from examples:

    • it requires less cognitive effort from the agent,

    • agents are more confident exercising their decisions than explaining them,

    • it is concordant with the principle of posterior rationality (March, 1988)

  • Problem: inconsistencies in the set of decision examples, due to:

    • uncertainty of information – hesitation, unstable preferences,

    • incompleteness of the family of attributes and criteria,

    • granularity of information


Rough Set approach to preference modeling

  • Inconsistency w.r.t. the dominance principle (semantic correlation)

  • Example of inconsistencies in preference information:

  • Examples of classification of S1 and S2 are inconsistent


Rough Set approach to preference modeling

  • Handling these inconsistencies is of crucial importance for knowledge discovery and decision support

  • Rough set theory (RST), proposed by Pawlak (1982), provides an excellent framework for dealing with inconsistency in knowledge representation

  • The philosophy of RST is based on the observation that information about objects (actions) is granular, thus their representation needs a granular approximation

  • In the context of multicriteria decision support, the concept of granular approximation has to be adapted so as to handle semantic correlation between condition attributes and decision classes

Greco, S., Matarazzo, B., Słowiński, R.: Rough sets theory for multicriteria decision analysis. European Journal of Operational Research, 129 (2001) no. 1, 1-47


Classical Rough Set Approach (CRSA)

  • Let U be a finite universe of discourse composed of objects (actions) described by a finite set of attributes

  • Sets of objects indiscernible w.r.t. attributes create granules of knowledge (elementary sets)

  • Any subset X ⊆ U may be expressed in terms of these granules:

    • either precisely – as a union of the granules

    • or roughly – by two ordinary sets, called lower and upper approximations

  • The lower approximation of X consists of all the granules included in X

  • The upper approximation of X consists of all the granules having a non-empty intersection with X


CRSA – formal definitions

  • Approximation space

    U = finite set of objects (universe)

    C = set of condition attributes

    D = set of decision attributes

    C ∩ D = ∅

    XC = Cartesian product of the value sets of attributes from C – condition attribute space

    XD = Cartesian product of the value sets of attributes from D – decision attribute space


CRSA – formal definitions

  • Indiscernibility relation in the approximation space

    x is indiscernible with y by P ⊆ C in XP iff xq = yq for all q ∈ P

    x is indiscernible with y by R ⊆ D in XR iff xq = yq for all q ∈ R

    IP(x), IR(x) – equivalence classes including x

    ID makes a partition of U into decision classes Cl = {Clt, t = 1,...,m}

  • Granules of knowledge are bounded sets:

    IP(x) in XP and IR(x) in XR (P ⊆ C and R ⊆ D)

  • Classification patterns to be discovered are functions representing granules IR(x) by granules IP(x)


CRSA – illustration of formal definitions

  • Example: objects = firms, described by attribute 1 (Investment) and attribute 2 (Sales), both on a 0–40 scale

[Figure series: firms plotted in the condition attribute space a1 = Investment, a2 = Sales; quantitative attributes are discretized according to the perception of the user]

  • Objects in condition attribute space

  • Indiscernibility sets – granules of knowledge are the bounded sets IP(x)

  • Lower approximation of class High

  • Upper approximation of class High

  • Lower approximation of class Medium

  • Upper approximation of class Medium

  • Boundary set of classes High and Medium

  • Lower = Upper approximation of class Low

CRSA – formal definitions

  • Basic properties of rough approximations: the lower approximation of X is included in X, which is included in its upper approximation

  • Accuracy measures

    • Accuracy and quality of approximation of X ⊆ U by attributes P ⊆ C – e.g. accuracy αP(X) = card(P-lower(X)) / card(P-upper(X))

    • Quality of approximation of classification Cl = {Clt, t = 1,...,m} by attributes P ⊆ C: γP(Cl) = Σt card(P-lower(Clt)) / card(U)

    • Rough membership of x ∈ U to X ⊆ U, given P ⊆ C: μXP(x) = card(X ∩ IP(x)) / card(IP(x))


CRSA – formal definitions

  • Cl-reduct of P ⊆ C, denoted by REDCl(P), is a minimal subset P' of P which keeps the quality of classification Cl unchanged, i.e. γP'(Cl) = γP(Cl)

  • Cl-core is the intersection of all the Cl-reducts of P: CORECl(P) = ∩ REDCl(P)
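On a table this small, the quality of classification and the reducts can be found by exhaustive search. Below is a hedged sketch under my own naming; note it uses the classical, indiscernibility-based notions, while the reducts discussed later in the presentation are dominance-based and may differ.

```python
# Hedged sketch (exhaustive search, my own naming): quality of classification
# and reducts in the classical, indiscernibility-based sense.
from itertools import combinations

TABLE = {
    "S1": {"M": "good", "Ph": "medium", "L": "bad"},
    "S2": {"M": "medium", "Ph": "medium", "L": "bad"},
    "S3": {"M": "medium", "Ph": "medium", "L": "medium"},
    "S4": {"M": "medium", "Ph": "medium", "L": "medium"},
    "S5": {"M": "good", "Ph": "medium", "L": "good"},
    "S6": {"M": "good", "Ph": "good", "L": "good"},
    "S7": {"M": "bad", "Ph": "bad", "L": "medium"},
    "S8": {"M": "bad", "Ph": "bad", "L": "medium"},
}
CLASS = {"S1": "bad", "S2": "medium", "S3": "medium", "S4": "good",
         "S5": "good", "S6": "good", "S7": "bad", "S8": "bad"}

def quality(attrs):
    """gamma: fraction of objects whose indiscernibility block is consistent."""
    blocks = {}
    for name, row in TABLE.items():
        blocks.setdefault(tuple(row[a] for a in attrs), set()).add(name)
    consistent = [b for b in blocks.values() if len({CLASS[o] for o in b}) == 1]
    return sum(len(b) for b in consistent) / len(TABLE)

def reducts(attrs=("M", "Ph", "L")):
    """Minimal subsets of attributes preserving the full quality of classification."""
    full = quality(attrs)
    ok = [set(s) for k in range(1, len(attrs) + 1)
          for s in combinations(attrs, k) if quality(s) == full]
    return [s for s in ok if not any(t < s for t in ok)]

print(quality(("M", "Ph", "L")), reducts())
```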



CRSA – summary of useful results

  • Characterization of decision classes (even in case of inconsistency) in terms of chosen attributes by lower and upper approximation

  • Measure of the quality of approximation indicating how good the chosen set of attributes is for approximation of the classification

  • Reduction of knowledge contained in the table to the description by relevant attributes belonging to reducts

  • The core of attributes indicating indispensable attributes

  • Decision rules induced from lower and upper approximations of decision classes show classification patterns existing in data


Dominance-based Rough Set Approach (DRSA) to MCDA

  • Sets of condition (C) and decision (D) criteria are semantically correlated

  • ≽q – weak preference relation (outranking) on U w.r.t. criterion q ∈ C ∪ D (complete preorder)

  • xq ≽q yq : “xq is at least as good as yq on criterion q”

  • xDPy: x dominates y with respect to P ⊆ C in condition space XP if xq ≽q yq for all criteria q ∈ P

  • DP is a partial preorder

  • Analogously, we define xDRy in decision space XR, R ⊆ D


Dominance-based Rough Set Approach (DRSA)

  • For simplicity: D = {d}

  • Id makes a partition of U into decision classes Cl = {Clt, t = 1,...,m}

  • [x ∈ Clr, y ∈ Cls, r > s] ⇒ x ≻ y (x ≽ y and not y ≽ x)

  • In order to handle semantic correlation between condition and decision criteria:

    Clt≥ = union of classes Cls with s ≥ t – upward union of classes, t = 2,...,m („at least” class Clt)

    Clt≤ = union of classes Cls with s ≤ t – downward union of classes, t = 1,...,m-1 („at most” class Clt)

  • Clt≥ and Clt≤ are positive and negative dominance cones in XD, with D reduced to the single dimension d


Dominance-based Rough Set Approach (DRSA)

  • Granular computing with dominance cones

  • Granules of knowledge are open sets in condition space XP (P ⊆ C)

    DP+(x) = {y ∈ U: yDPx} : P-dominating set

    DP−(x) = {y ∈ U: xDPy} : P-dominated set

  • P-dominating and P-dominated sets are positive and negative dominance cones in XP

  • Sorting patterns (preference model) to be discovered are functions representing granules Clt≥, Clt≤ by granules DP+(x), DP−(x)
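A minimal sketch of these dominance cones and of the resulting approximations of an upward union, using my own toy encoding of the student example on the criteria {M, L} (not the authors' software):

```python
# Minimal sketch, my own toy encoding of the student example on {M, L}:
# dominance cones and DRSA approximations of the upward union "at least good".
GRADE = {"bad": 0, "medium": 1, "good": 2}

profiles = {  # (Mathematics, Literature) and overall class, coded with GRADE
    "S1": ((2, 0), 0), "S2": ((1, 0), 1), "S3": ((1, 1), 1), "S4": ((1, 1), 2),
    "S5": ((2, 2), 2), "S6": ((2, 2), 2), "S7": ((0, 1), 0), "S8": ((0, 1), 0),
}

def dominates(a, b):
    """a dominates b iff a is at least as good on every criterion."""
    return all(x >= y for x, y in zip(a, b))

def dominating(x):   # D_P+(x): objects whose profile dominates that of x
    return {y for y, (p, _) in profiles.items() if dominates(p, profiles[x][0])}

def dominated(x):    # D_P-(x): objects whose profile is dominated by that of x
    return {y for y, (p, _) in profiles.items() if dominates(profiles[x][0], p)}

def approx_upward(t):
    union = {y for y, (_, cls) in profiles.items() if cls >= t}
    lower = {x for x in profiles if dominating(x) <= union}
    upper = {x for x in profiles if dominated(x) & union}
    return lower, upper

print(approx_upward(GRADE["good"]))
# -> ({'S5', 'S6'}, {'S3', 'S4', 'S5', 'S6'})
```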


DRSA – illustration of formal definitions

  • Example: objects evaluated on criterion 1 (Investment) and criterion 2 (Sales), both with preference-ordered 0–40 scales

[Figure series: objects plotted in the condition criteria space c1 = Investment, c2 = Sales; dominance cones drawn around a selected object x]

  • Objects in condition criteria space

  • Granular computing with dominance cones

  • Lower approximation of the upward union of class High

  • Upper approximation and the boundary of the upward union of class High

  • Lower = Upper approximation of the upward union of class Medium

  • Lower = Upper approximation of the downward union of class Low

  • Lower approximation of the downward union of class Medium

  • Upper approximation and the boundary of the downward union of class Medium

Dominance-based Rough Set Approach vs. Classical RSA

  • Comparison of CRSA and DRSA

[Figure: the same decision classes approximated side by side – CRSA indiscernibility blocks in the a1 × a2 grid vs. DRSA dominance cones in the c1 × c2 space, both axes scaled 0–40]

Variable-Consistency DRSA (VC-DRSA)

  • Variable-Consistency DRSA for consistency level l ∈ [0,1], e.g. l = 0.8

  • An object x enters the l-lower approximation of a union of classes if at least l·100% of the objects in its dominance cone belong to that union (strict consistency is relaxed)
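A hedged sketch of the relaxed, variable-consistency lower approximation, based on the usual VC-DRSA formulation from the literature (helper names are mine):

```python
# Hedged sketch of a variable-consistency lower approximation: an object of the
# union enters the l-lower approximation when at least l*100% of the objects
# dominating it also belong to the union (my own formulation and names).
def vc_lower(union, dominating_sets, l=0.8):
    return {x for x in union
            if len(dominating_sets[x] & union) / len(dominating_sets[x]) >= l}

# toy example: "b" is dominated by an object outside the union, so it is dropped
union = {"a", "b", "c"}
dominating_sets = {"a": {"a", "c"}, "b": {"b", "d"}, "c": {"c"}}
print(vc_lower(union, dominating_sets, l=0.8))   # {'a', 'c'}
```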


DRSA – formal definitions

  • Basic properties of rough approximations: the lower approximation of a union of classes is included in the union, which is included in its upper approximation, for t = 2,…,m

  • Identity of boundaries: BnP(Clt≥) = BnP(Clt-1≤), for t = 2,…,m

  • Quality of approximation of sorting Cl = {Clt, t = 1,...,m} by criteria P ⊆ C

  • Cl-reducts and Cl-core of P ⊆ C


DRSA – induction of decision rules from rough approximations

  • Induction of decision rules from rough approximations

    • certain D≥-decision rules, supported by objects from the lower approximation of Clt≥, without ambiguity:

      if xq1 ≽q1 rq1 and xq2 ≽q2 rq2 and … xqp ≽qp rqp, then x ∈ Clt≥

    • possible D≥-decision rules, supported by objects from the upper approximation of Clt≥, with or without ambiguity:

      if xq1 ≽q1 rq1 and xq2 ≽q2 rq2 and … xqp ≽qp rqp, then x possibly ∈ Clt≥

    • approximate D≥≤-decision rules, supported by objects from Cls ∪ Cls+1 ∪ … ∪ Clt without possibility of discerning to which class they belong:

      if xq1 ≽q1 rq1 and ... xqk ≽qk rqk and xqk+1 ≼qk+1 rqk+1 and ... xqp ≼qp rqp, then x ∈ Cls ∪ Cls+1 ∪ … ∪ Clt


DRSA – decision rules

[Figure series: rules visualized in the c1 × c2 space (0–40), with the base objects of each rule marked]

  • Certain D≥-decision rule for the class High:

    If f(x,c1) ≥ 35, then x belongs to class „High”

  • Possible D≥-decision rule for the class High:

    If f(x,c1) ≥ 22.5 & f(x,c2) ≥ 20, then x possibly belongs to class „High”

  • Approximate D≥≤-decision rule for the class Medium or High:

    If f(x,c1) ∈ [22.5, 27] & f(x,c2) ∈ [20, 25], then x belongs to class „Medium” or „High”


Measures characterizing decision rules in system S = ⟨U, C, D⟩

For a rule “if Φ, then Ψ” (premise Φ, conclusion Ψ):

  • Support of a rule: suppS(Φ,Ψ) = number of objects of U satisfying both Φ and Ψ

  • Strength of a rule: σS(Φ,Ψ) = suppS(Φ,Ψ) / card(U)

  • Certainty (or confidence) factor of a rule (Łukasiewicz, 1913): cerS(Φ,Ψ) = suppS(Φ,Ψ) / card(‖Φ‖S)

  • Coverage factor of a rule: covS(Φ,Ψ) = suppS(Φ,Ψ) / card(‖Ψ‖S)

  • Relation between certainty, coverage and Bayes theorem: cerS(Φ,Ψ) · PrS(Φ) = covS(Φ,Ψ) · PrS(Ψ)
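These measures are simple frequencies and can be computed directly from a decision table; a hedged sketch with my own function names:

```python
# Hedged sketch (function names are mine): support, strength, certainty and
# coverage of a rule "if premise(x) then conclusion(x)" over a decision table.
def rule_measures(table, premise, conclusion):
    phi = [x for x in table if premise(x)]          # objects matching the premise
    psi = [x for x in table if conclusion(x)]       # objects matching the conclusion
    supp = sum(1 for x in phi if conclusion(x))     # objects matching both
    return {"support": supp,
            "strength": supp / len(table),
            "certainty": supp / len(phi) if phi else 0.0,
            "coverage": supp / len(psi) if psi else 0.0}

# toy usage on the student table: "if Mathematics is good then class is good"
table = [
    {"M": "good", "class": "bad"},      {"M": "medium", "class": "medium"},
    {"M": "medium", "class": "medium"}, {"M": "medium", "class": "good"},
    {"M": "good", "class": "good"},     {"M": "good", "class": "good"},
    {"M": "bad", "class": "bad"},       {"M": "bad", "class": "bad"},
]
print(rule_measures(table,
                    premise=lambda x: x["M"] == "good",
                    conclusion=lambda x: x["class"] == "good"))
# -> support 2, strength 0.25, certainty 2/3, coverage 2/3
```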


Measures characterizing decision rules in system S = ⟨U, C, D⟩

  • For a given decision table S, the probability (frequency) is calculated as: PrS(Φ) = card(‖Φ‖S) / card(U)

  • With respect to data from decision table S, the concepts of a priori & a posteriori probability are no longer meaningful

  • Bayesian confirmation measure c(Φ,Ψ) adapted to decision rules:

  • „Ψ is verified more often when Φ is verified than when Φ is not verified”

  • Monotonicity: c(Φ,Ψ) = F[suppS(Φ,Ψ), suppS(¬Φ,Ψ), suppS(Φ,¬Ψ), suppS(¬Φ,¬Ψ)] is a function non-decreasing with respect to suppS(Φ,Ψ) & suppS(¬Φ,¬Ψ) and non-increasing with respect to suppS(¬Φ,Ψ) & suppS(Φ,¬Ψ)

    S. Greco, Z. Pawlak, R. Słowiński: Can Bayesian confirmation measures be useful for rough set decision rules? Engineering Applications of Artificial Intelligence (2004)
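As an illustration only (not necessarily the measure analysed in the cited paper), one well-known confirmation measure is the difference d(Φ,Ψ) = Pr(Ψ|Φ) − Pr(Ψ), computable from the same support counts:

```python
# Illustration only: the "difference" confirmation measure
# d = Pr(conclusion | premise) - Pr(conclusion); names are mine.
def confirmation_d(supp_both, n_premise, n_conclusion, n_total):
    return supp_both / n_premise - n_conclusion / n_total

# rule "if M is good then class is good" on the student table:
# premise matched by 3 objects, conclusion by 3, both by 2, |U| = 8
print(confirmation_d(2, 3, 3, 8))   # about 0.29 > 0: the premise confirms the conclusion
```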


DRSA – decision rules

[Figure: certain D≥-decision rules for the union “at least Medium”, shown in the c1 × c2 space (0–40), with their base objects]

  • Decision rule with a hyperplane for “at least Medium”:

    If f(x,c1) ≥ 9.75 and f(x,c2) ≥ 9.5 and 1.5·f(x,c1) + 1.0·f(x,c2) ≥ 22.5,

    then x is at least Medium


DRSA – application of decision rules

  • Application of decision rules: „intersection” of the rules matching object x

  • Final assignment: the intersection of the unions of classes suggested by all matching rules gives the (possibly imprecise) class assignment of x
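A hedged sketch of this intersection step (rule encoding and names are mine): each matching „at least” rule raises the lowest admissible class, each matching „at most” rule lowers the highest admissible class.

```python
# Hedged sketch (rule encoding and names are mine): apply "at least"/"at most"
# rules to a new object and return the interval of remaining classes.
def classify(x, rules, classes=("bad", "medium", "good")):
    lo, hi = 0, len(classes) - 1
    for premise, kind, cls in rules:          # kind is ">=" or "<="
        if premise(x):
            idx = classes.index(cls)
            if kind == ">=":
                lo = max(lo, idx)
            else:
                hi = min(hi, idx)
    return list(classes[lo:hi + 1]) or ["contradictory assignment"]

rules = [
    (lambda x: x["M"] == "good" and x["L"] != "bad", ">=", "good"),
    (lambda x: x["M"] == "bad", "<=", "bad"),
]
print(classify({"M": "good", "L": "medium"}, rules))   # ['good']
```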


Rough Set approach to multiple-criteria sorting

  • Example of preference information about students:

  • Examples of classification of S1 and S2 are inconsistent

  • If we eliminate Literature, then more inconsistencies appear: the examples of classification of S1, S2, S3 and S5 are inconsistent

  • Elimination of Mathematics does not increase inconsistencies: the subset of criteria {Ph,L} is a reduct of {M,Ph,L}

  • Elimination of Physics also does not increase inconsistencies: the subset of criteria {M,L} is a reduct of {M,Ph,L}

  • Intersection of reducts {M,L} and {Ph,L} gives the core {L}


Rough Set approach to multiple-criteria sorting

  • Let us represent the students in the condition space {M,L} (P = {M,L}):

[Figure: students plotted on a grid with Mathematics on one axis (bad, medium, good) and Literature on the other (bad, medium, good); S7 and S8 at (bad, medium), S2 at (medium, bad), S3 and S4 at (medium, medium), S1 at (good, bad), S5 and S6 at (good, good)]

  • Dominance cones in condition space {M,L}

  • The successive slides highlight, on this figure:

    • lower and upper approximations of „at least good” students (lower: {S5, S6}; upper: {S3, S4, S5, S6})

    • lower and upper approximations and the boundary region of „at least medium” students

    • lower and upper approximations of „at most medium” students

    • lower and upper approximations and the boundary region of „at most bad” students

  • From these approximations, decision rules are induced in terms of {M,L}: certain D≥ rules, certain D≤ rules and one approximate D≥≤ rule – the full rule set is listed on the next slide

Rough Set approach to multiple-criteria sorting

  • Set of decision rules in terms of {M, L} representing preferences:

    If M ≥ good & L ≥ medium, then student ≥ good {S4,S5,S6}

    If M ≥ medium & L ≥ medium, then student ≥ medium {S3,S4,S5,S6}

    If M ≥ medium & L ≤ bad, then student is bad or medium {S1,S2}

    If M ≤ medium, then student ≤ medium {S2,S3,S7,S8}

    If L ≤ bad, then student ≤ medium {S1,S2,S7}

    If M ≤ bad, then student ≤ bad {S7,S8}


Rough Set approach to multiple-criteria sorting

  • Set of decision rules in terms of {M,Ph,L} representing preferences:

    If M ≥ good & L ≥ medium, then student ≥ good {S4,S5,S6}

    If M ≥ medium & L ≥ medium, then student ≥ medium {S3,S4,S5,S6}

    If M ≥ medium & L ≤ bad, then student is bad or medium {S1,S2}

    If Ph ≤ medium & L ≤ medium, then student ≤ medium {S1,S2,S3,S7,S8}

    If M ≤ bad, then student ≤ bad {S7,S8}

  • The preference model involving all three criteria is more concise


Rough Set approach to multiple-criteria sorting

  • Importance and interaction among criteria

  • The quality of approximation of sorting γP(Cl) (P ⊆ C) is a fuzzy measure with the property of a Choquet capacity (γ∅(Cl) = 0, γC(Cl) = r and γR(Cl) ≤ γP(Cl) ≤ r for any R ⊆ P ⊆ C)

  • Such a measure can be used to calculate the Shapley value or Banzhaf index, i.e. an average „contribution” of criterion q in all coalitions of criteria, q ∈ {1,…,m}

  • Fuzzy measure theory permits, moreover, to calculate interaction indices (Murofushi & Soneda, Grabisch or Roubens) for pairs (or larger subsets) of criteria, i.e. an average „added value” resulting from putting together q and q’ in all coalitions of criteria, q, q’ ∈ {1,…,m}


Rough Set approach to multiple-criteria sorting

  • Quality of approximation of sorting the students:

    γC(Cl) = [8 − card({S1,S2})] / 8 = 0.75


DRSA – preference modeling by decision rules

  • A set of (D≥, D≤, D≥≤)-rules induced from rough approximations represents a preference model of a Decision Maker

  • Traditional preference models:

    • utility function (e.g. additive, multiplicative, associative, Choquet integral, Sugeno integral),

    • binary relation (e.g. outranking relation, fuzzy relation)

  • The decision rule model is the most general model of preferences: a general utility function, Sugeno or Choquet integral, or outranking relation exists if and only if the decision rule model exists

    Słowiński, R., Greco, S., Matarazzo, B.: “Axiomatization of utility, outranking and decision-rule preference models for multiple-criteria classification problems under partial inconsistency with the dominance principle”, Control and Cybernetics, 31 (2002) no. 4, 1005-1035


DRSA – preference modeling by decision rules

  • Representation axiom (cancellation property): for every dimension i = 1,…,n, for every evaluation xi, yi ∈ Xi and a-i, b-i ∈ X-i, and for every pair of decision classes Clr, Cls ∈ {Cl1,…,Clm}:

    {xia-i ∈ Clr and yib-i ∈ Cls} ⇒ {yia-i ∈ Clr or xib-i ∈ Cls}

  • The above axiom constitutes a minimal condition that makes the weak preference relation ≽i a complete preorder

  • This axiom does not require pre-definition of criteria scales gi, nor of the dominance relation, in order to derive the 3 preference models: general utility function, outranking relation, set of decision rules D≥ or D≤

    Greco, S., Matarazzo, B., Słowiński, R.: Conjoint measurement and rough set approach for multicriteria sorting problems in presence of ordinal criteria. [In]: A. Colorni, M. Paruccini, B. Roy (eds.), A-MCD-A: Aide Multi Critère à la Décision – Multiple Criteria Decision Aiding, European Commission Report EUR 19808 EN, Joint Research Centre, Ispra, 2001, pp. 117-144


Comparison of decision rule preference model and utility function

[Figure: decision classes Cl1, Cl2, …, Clp-1, Clp separated by thresholds z1, z2, …, zp-2, zp-1 on the utility scale U; the utility values U[g1(a),…,gn(a)] and U[g1(d),…,gn(d)] of actions a and d are placed on this scale]

  • Value-driven methods

  • The preference model is a utility function U and a set of thresholds zt, t = 1,…,p-1, on U, separating the decision classes Clt, t = 0,1,…,p

  • A value of the utility function U is calculated for each action a ∈ A

  • e.g. a ∈ Cl2, d ∈ Clp-1


Comparison of decision rule preference model and outranking relation

[Figure: decision classes Cl1, Cl2, …, Clp-1, Clp delimited by limit profiles b0, b1, b2, …, bp-2, bp-1, bp on criteria g1, g2, g3, …, gn; actions a and d plotted against the profiles]

  • ELECTRE TRI

  • Decision classes Clt are characterized by limit profiles bt, t = 0,1,…,p

  • The preference model, i.e. the outranking relation S, is constructed for each couple (a, bt), for every a ∈ A and bt, t = 0,1,…,p


Comparison of decision rule preference model and outranking relation

[Figure: comparison of action a to the profiles bt – classes Cl1, Cl2, …, Clp-1, Clp delimited by limit profiles b0, …, bp on criteria g1, g2, g3, …, gn]

  • ELECTRE TRI

  • Decision classes Clt are characterized by limit profiles bt, t = 0,1,…,p

  • Compare action a successively to each profile bt, t = p-1,…,1,0; if bt is the first profile such that aSbt, then a ∈ Clt+1

  • e.g. a ∈ Cl1, d ∈ Clp-1


Comparison of decision rule preference model and outranking relation

[Figure: profile of action a compared, on criteria g1, g2, g3, …, gn, with the partial profiles of rules r1, r2, r3 for the union Cl2≥]

  • Rule-based classification

  • The preference model is a set of decision rules for unions Clt≥, t = 2,…,p

  • A decision rule compares an action profile to a partial profile using a dominance relation

  • e.g. a ∈ Cl2≥, because the profile of a dominates the partial profiles of r2 and r3


An impact of DRSA on Knowledge Discovery from databases

  • Example of diagnostic data

  • 76 buses (objects)

  • 8 symptoms (attributes)

  • Decision = technical state:

    3 – good state (in use)

    2 – minor repair

    1 – major repair (out of use)

  • Discover patterns = find relationships between symptoms and the technical state

  • Patterns explain expert’s decisions and support diagnosis of new buses



DRSA – example of technical diagnostics

  • 76 vehicles (objects)

  • 8 attributes with preference scale

  • decision = technical state:

    3 – good state (in use)

    2 – minor repair

    1 – major repair (out of use)

  • all criteria are semantically correlated with the decision

  • inconsistent objects:

    11, 12, 39


DRSA for multiple-criteria choice and ranking

  • Input (preference) information:

    AR – set of reference actions

    B ⊆ AR × AR – pairs of reference actions compared by the Decision Maker

    S – outranking, Sc – non-outranking

    Δi – difference of evaluations on criterion qi

    These pairwise comparisons form the Pairwise Comparison Table (PCT)


DRSA for multiple-criteria choice and ranking

  • Dominance for pairs of actions (x,y), (w,z) ∈ U × U

    For a cardinal criterion q ∈ C:

    (x,y)Dq(w,z) if the strength hq with which x is preferred to y on q is not smaller than the strength kq with which w is preferred to z, i.e. hq ≥ kq

    For an ordinal criterion q ∈ C:

    (x,y)Dq(w,z) if xq ≽q wq and zq ≽q yq

  • For a subset P ⊆ C of criteria: P-dominance relation on pairs of actions:

    (x,y)DP(w,z) if (x,y)Dq(w,z) for all q ∈ P, i.e.,

    if x is preferred to y at least as much as w is preferred to z for all q ∈ P

  • Dq is reflexive and transitive, but not necessarily complete (partial preorder); DP is a partial preorder on A × A
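A minimal sketch of the pair-dominance test, using a plain difference-of-evaluations encoding (a simplification of the graded preference relations above; names and data are mine):

```python
# Minimal sketch (plain difference-of-evaluations encoding, a simplification of
# the graded preference relations above; names and data are mine).
def pair_dominates(pair1, pair2, criteria):
    """(x, y) D_P (w, z): on every criterion, x is preferred to y at least as
    strongly as w is preferred to z (here: the difference is not smaller)."""
    (x, y), (w, z) = pair1, pair2
    return all(x[q] - y[q] >= w[q] - z[q] for q in criteria)

A = {"w1": {"q1": 10, "q2": 30}, "w2": {"q1": 20, "q2": 25}, "w3": {"q1": 5, "q2": 40}}
print(pair_dominates((A["w1"], A["w2"]), (A["w3"], A["w2"]), ("q1", "q2")))   # False
```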


DRSA for multiple-criteria choice and ranking

  • Let B ⊆ U × U be a set of pairs of reference objects in a given PCT

  • Granules of knowledge

    • positive dominance cone: DP+(x,y) = {(w,z) ∈ U × U: (w,z)DP(x,y)}

    • negative dominance cone: DP−(x,y) = {(w,z) ∈ U × U: (x,y)DP(w,z)}


DRSA for multiple-criteria choice and ranking – formal definitions

  • P-lower and P-upper approximations of the outranking relation S, P ⊆ C:

    P-lower(S) = {(x,y) ∈ B: DP+(x,y) ⊆ S},  P-upper(S) = {(x,y) ∈ B: DP−(x,y) ∩ S ≠ ∅}

  • P-lower and P-upper approximations of the non-outranking relation Sc:

    P-lower(Sc) = {(x,y) ∈ B: DP−(x,y) ⊆ Sc},  P-upper(Sc) = {(x,y) ∈ B: DP+(x,y) ∩ Sc ≠ ∅}

  • P-boundaries of S and Sc: BnP(S) = P-upper(S) − P-lower(S), BnP(Sc) = P-upper(Sc) − P-lower(Sc)


DRSA for multiple-criteria choice and ranking – formal definitions

  • Basic properties of the approximations of S and Sc

  • Quality of approximation of S and Sc

  • (S,Sc)-reduct and (S,Sc)-core

  • Variable-consistency rough approximations of S and Sc (consistency level l ∈ (0,1])


DRSA for multiple-criteria choice and ranking – decision rules

  • Decision rules (for criteria with cardinal scales)

    • D≥-decision rules:

      if Δq1(x,y) ≥ hq1 and Δq2(x,y) ≥ hq2 and ... Δqp(x,y) ≥ hqp, then xSy

    • D≤-decision rules:

      if Δq1(x,y) ≤ hq1 and Δq2(x,y) ≤ hq2 and ... Δqp(x,y) ≤ hqp, then xScy

    • D≥≤-decision rules:

      if Δq1(x,y) ≥ hq1 and ... Δqk(x,y) ≥ hqk and Δqk+1(x,y) ≤ hqk+1 and ... Δqp(x,y) ≤ hqp, then xSy or xScy


Example of application of DRSA

[Figure: four reference objects x, y, z, u with their evaluation vectors, e.g. q1 = 2.3, q2 = 18.0, …, qm = 3.0]

  • Acquiring reference objects

  • Preference information on reference objects:

    • making a ranking

    • pairwise comparison of the objects (x S y) or (x Sc y)

  • Building the Pairwise Comparison Table (PCT)

  • Inducing rules from rough approximations of relations S and Sc


Induction of decision rules from rough approximations of S and Sc

[Figure: pairs of reference objects plotted in the plane (Δq1, Δq2), labelled y S u, y S z, u S z, y S x, x Sc y, t Sc q, u Sc z, z Sc y, u Sc y; the induced rules cover these regions]

  • if Δq1(a,b) ≥ -2.4 & Δq2(a,b) ≥ 4.0 then aSb

  • if Δq1(a,b) ≥ 4.1 & Δq2(a,b) ≥ 1.9 then aSb

  • if Δq1(a,b) ≤ -1.1 & Δq2(a,b) ≤ 1.0 then aScb

  • if Δq1(a,b) ≤ 2.4 & Δq2(a,b) ≤ -4.0 then aScb


Application of decision rules for decision support

[Figure: the four possible situations for a pair (x,y) – matched by S-rules only (xSTy), by Sc-rules only (xSFy), by both (xSKy), or by neither (xSUy)]

  • Application of decision rules (preference model) on the whole set A induces a specific preference structure on A

  • Any pair of objects (x,y) ∈ A × A can match the decision rules in one of four ways:

    • xSy and not xScy, that is true outranking (xSTy),

    • xScy and not xSy, that is false outranking (xSFy),

    • xSy and xScy, that is contradictory outranking (xSKy),

    • not xSy and not xScy, that is unknown outranking (xSUy).

  • The 4-valued outranking underlines the presence and the absence of positive and negative reasons of outranking
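A hedged sketch of this four-way classification of pairs (rule encoding and names are mine):

```python
# Hedged sketch (rule encoding and names are mine): classify a pair of actions
# by which rule sets it matches.
def outranking_situation(pair, s_rules, sc_rules):
    s = any(rule(pair) for rule in s_rules)     # some S-rule fires (positive reasons)
    sc = any(rule(pair) for rule in sc_rules)   # some Sc-rule fires (negative reasons)
    if s and not sc:
        return "true outranking (ST)"
    if sc and not s:
        return "false outranking (SF)"
    if s and sc:
        return "contradictory outranking (SK)"
    return "unknown outranking (SU)"

# toy rules on the vector of criterion-wise differences, in the spirit of the example above
s_rules = [lambda d: d[0] >= -2.4 and d[1] >= 4.0]
sc_rules = [lambda d: d[0] <= -1.1 and d[1] <= 1.0]
print(outranking_situation((-1.0, 5.0), s_rules, sc_rules))   # true outranking (ST)
```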


DRSA for multicriteria choice and ranking – Net Flow Score

  • S – positive argument, Sc – negative argument

[Figure: positive and negative arguments (S and Sc) concerning x, summarized as the strength and the weakness of x]

  • NFS(x) = strength(x) – weakness(x)


DRSA for multiple-criteria choice and ranking – final recommendation

  • Exploitation of the preference structure by the Net Flow Score procedure for each action x ∈ A:

    NFS(x) = strength(x) – weakness(x)

  • Final recommendation:

    ranking: complete preorder determined by NFS(x) in A

    best choice: action(s) x* ∈ A such that NFS(x*) = max x∈A {NFS(x)}



DRSA for multiple-criteria choice and ranking – example

  • Decision table with reference objects (warehouses)



DRSA for multicriteria choice and ranking – example

  • Pairwise comparison table (PCT)


DRSA for multicriteria choice and ranking – example

  • Quality of approximation of S and Sc by criteria from set C is 0.44

  • REDPCT = COREPCT = {A1,A2,A3}

  • D≥-decision rules and D≤-decision rules



DRSA for multicriteria choice and ranking – example

  • D-decision rules


DRSA for multicriteria choice and ranking – example

  • Ranking of warehouses for sale by decision rules and the NFS procedure

  • Final ranking: (2',6') ≻ (8') ≻ (9') ≻ (1') ≻ (4') ≻ (5') ≻ (3') ≻ (7',10')

  • Best choice: select warehouses 2' and 6', having the maximum score (11)



Other extensions of DRSA

  • DRSA for choice and ranking with graded preference relations

  • DRSA for choice and ranking with Lorenz dominance relation

  • DRSA for decision with multiple decision makers

  • DRSA with missing values of attributes and criteria

  • DRSA for hierarchical decision making

  • Fuzzy set extensions of DRSA

  • DRSA for decision under risk

  • Discovering association rules in preference-ordered data sets

  • Building a strategy of efficient intervention based on decision rules



Conclusions

  • Preference information is given in terms of decision examples on reference actions

  • Preference model induced from rough approximations of unions of decision classes (or preference relations S and Sc) is expressed in a natural and comprehensible language of "if..., then..." decision rules

  • Decision rules do not convert ordinal information into numeric one and can represent inconsistent preferences

  • Heterogeneous information (attributes, criteria) and scales of preference (ordinal, cardinal) can be processed within DRSA

  • DRSA supplies useful elements of knowledge about decision situation:

    • certain and doubtful knowledge distinguished by lower and upper approximations

    • relevance of particular attributes or criteria and information about their interaction

    • reducts of attributes or criteria conveying important knowledge contained in data

    • the core of indispensable attributes and criteria

    • decision rules can be used for both explanation of past decisions and decision support


Free software available

Free software available

ROSE

ROugh Set data Explorer

4eMka

Dominance-based Rough Set Approach to Multicriteria Classification

JAMM

A New Decision Support Tool for Analysis and Solving of Multicriteria Classification Problems

http://idss.cs.put.poznan.pl/site/software.html

