
Constructing Associative Classifiers from Decision Tables

Presentation Transcript


  1. Constructing Associative Classifiers from Decision Tables. Jianchao Han (California State University Dominguez Hills, USA), T. Y. Lin (San Jose State University, USA), Jiye Li (University of Waterloo, Canada), Nick Cercone (York University, Canada). RSFDGrC 2007

  2. Agenda • Introduction • Related Work • Our Approach • Algorithm Description • An Example Demonstration • Conclusion

  3. Introduction • Associative classifier: a set of classification rules • Classification rules as a special form of association rules • Classifier formed by finding constrained association rules • Rough set theory used to reduce the data set

  4. Related Work • Rough set theory • Attribute reduct • Association rules • Rule template • Constrained rules • Significant rules • Rule importance measurement • Michalski’s coverage method

  5. Our Approach • Combining three strategies • Association rules • Rough set theory to find attribute reducts • Coverage method to form a classifier • Four steps • Finding attribute reducts • Finding constrained association rules • Measuring association rules • Selecting important rules to cover instances

  6. Generate All Attribute Reducts • Use an existing reduct finding algorithm, such as the genetic reduct generation algorithm in ROSETTA
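The deck delegates reduct generation to ROSETTA's genetic algorithm. For readers without ROSETTA, below is a minimal brute-force sketch of reduct enumeration; it assumes a small, consistent decision table stored as a list of attribute-value dicts with a parallel list of class labels (the data layout and function names are illustrative assumptions, not the paper's implementation), and its exponential search is only meant to show what a reduct is.

```python
from itertools import combinations

def partition(rows, attrs):
    """Group row indices into indiscernibility blocks on the given attributes."""
    blocks = {}
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in attrs)
        blocks.setdefault(key, []).append(i)
    return blocks.values()

def preserves_classification(rows, labels, attrs):
    """True if every indiscernibility block on attrs is pure w.r.t. the class
    label (assumes the full decision table is consistent)."""
    return all(len({labels[i] for i in block}) == 1
               for block in partition(rows, attrs))

def find_reducts(rows, labels, attributes):
    """Enumerate all minimal attribute subsets that preserve the classification."""
    reducts = []
    for size in range(1, len(attributes) + 1):
        for subset in combinations(attributes, size):
            if any(set(r) <= set(subset) for r in reducts):
                continue  # contains a smaller reduct, so it is not minimal
            if preserves_classification(rows, labels, subset):
                reducts.append(subset)
    return reducts
```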

  7. Find Classification Rules • For each attribute reduct • Use the adapted Apriori algorithm to find constrained association rules, where the right-hand side of each rule is constrained to be a class label • Carefully determine the support and confidence thresholds for the association rules
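As a rough illustration of this class-constrained mining step, here is a naive level-wise sketch: it enumerates candidate antecedents directly rather than using Apriori's candidate pruning, and the row format, function name, and parameters are assumptions, not the authors' adapted algorithm.

```python
from itertools import combinations
from collections import defaultdict

def mine_class_rules(rows, labels, min_support, min_confidence, max_len=3):
    """Mine rules whose consequent is always a class label.
    rows: list of attribute -> value dicts; labels: parallel list of class labels.
    Naive enumeration for clarity; a real Apriori adaptation would prune
    candidates level by level using the frequent itemsets already found."""
    n = len(rows)
    items = sorted({(a, v) for row in rows for a, v in row.items()})
    rules = []
    for size in range(1, max_len + 1):
        for itemset in combinations(items, size):
            if len({a for a, _ in itemset}) < size:
                continue  # skip antecedents that reuse an attribute
            covered = [i for i, row in enumerate(rows)
                       if all(row.get(a) == v for a, v in itemset)]
            support = len(covered) / n
            if not covered or support < min_support:
                continue
            class_counts = defaultdict(int)
            for i in covered:
                class_counts[labels[i]] += 1
            for cls, count in class_counts.items():
                confidence = count / len(covered)
                if confidence >= min_confidence:
                    rules.append({"condition": dict(itemset), "class": cls,
                                  "support": support, "confidence": confidence})
    return rules
```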

  8. Measure Importance of Rules • Rule importance definition • Properties of rule importance • 0 < Importance(Rule) ≤ 1 • If a rule contains only core attributes, its importance is 1.
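The slide lists the properties but not the formula itself. The sketch below assumes one common formulation that is consistent with those properties: a rule's importance is the fraction of reduct-specific rule sets that generate it, so a rule built only from core attributes (which belong to every reduct) scores 1. This is an assumption for illustration, not a transcription of the paper's definition.

```python
def rule_importance(rule, rule_sets_per_reduct):
    """Assumed measure: Importance(R) = (number of reduct rule sets containing R)
    / (number of reducts). Any generated rule appears in at least one rule set,
    so 0 < Importance(R) <= 1; rules over core attributes appear in every
    reduct's rule set and score 1. Each rule set can be any container that
    supports membership tests (e.g., a list or set of rule tuples)."""
    hits = sum(1 for rule_set in rule_sets_per_reduct if rule in rule_set)
    return hits / len(rule_sets_per_reduct)
```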

  9. Rule Precedence • Given two generated rules R1 and R2, R1 precedes R2 (R1 has a higher precedence than R2), denoted Precedence(R1) > Precedence(R2), if • Importance(R1) > Importance(R2); or • Importance(R1) = Importance(R2), and Confidence(R1) > Confidence(R2); or • Importance(R1) = Importance(R2), Confidence(R1) = Confidence(R2), and Support(R1) > Support(R2). • Otherwise R1 and R2 are considered to have the same precedence, denoted Precedence(R1) = Precedence(R2).
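This precedence order can be implemented directly as a descending sort on the triple (importance, confidence, support). The dict representation and the toy rules R1, R2, R3 below are purely illustrative.

```python
def precedence_key(rule):
    """Sort key encoding the precedence order: importance first, then
    confidence, then support (all compared in descending order)."""
    return (rule["importance"], rule["confidence"], rule["support"])

# Hypothetical rules, only to show the ordering; higher key = higher precedence.
rules = [
    {"id": "R1", "importance": 0.75, "confidence": 1.00, "support": 0.10},
    {"id": "R2", "importance": 0.75, "confidence": 0.90, "support": 0.20},
    {"id": "R3", "importance": 1.00, "confidence": 1.00, "support": 0.05},
]
sorted_rules = sorted(rules, key=precedence_key, reverse=True)
print([r["id"] for r in sorted_rules])
# ['R3', 'R1', 'R2']: R3 has the highest importance; R1 ties R2 on importance
# but wins on confidence. Ties on all three keys keep their original order.
```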

  10. Property of Rule Precedence • The precedence relationship is a total order relation • Thus all rules can be sorted based on their precedence

  11. Find Associative Classifier • Sort all rules by their precedence, which combines importance, confidence, and support • Select the next rule in the sorted sequence • If this rule covers some rows • Delete all rows covered by this rule • Put this rule in the classifier • Repeat until all rules are exhausted
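A compact sketch of this covering loop, assuming the rules arrive already sorted by precedence as (condition, class label) pairs with dict-style conditions (the representation is an assumption, not the paper's):

```python
def build_classifier(sorted_rules, rows):
    """Covering step: walk the precedence-sorted rules, keep a rule only if it
    covers at least one still-uncovered row, then remove the rows it covers.
    sorted_rules: list of (condition, class_label) pairs, condition being an
    attribute -> value dict; rows: list of attribute -> value dicts."""
    uncovered = set(range(len(rows)))
    classifier = []
    for condition, cls in sorted_rules:
        covered = {i for i in uncovered
                   if all(rows[i].get(a) == v for a, v in condition.items())}
        if covered:
            classifier.append((condition, cls))
            uncovered -= covered
        if not uncovered:
            break  # every row is covered; later rules cannot change the result
    return classifier
```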

  12. An Example

  13. Find All Attribute Reducts • Genetic reducer in ROSETTA • Four attribute reducts

  14. Constrained Association Rules • Support threshold = 1% • Confidence threshold = 100% • Apply the adapted Apriori algorithm

  15. Construct Classifier • Covering method • Rule 1: covers rows 9 through 13 (5 rows) • Rule 2: covers rows 8 and 14 (2 rows) • Rule 3: covers rows 1, 3, 6, and 7 (4 rows) • Rule 4: covers rows 2, 4, and 5 (3 rows) • Since all rows in the original decision table are covered by Rules 1 through 4, the final associative classifier contains only these four class association rules

  16. Conclusion • Introduce an approach to constructing associative classifiers based on • Rough set theory to find attribute reducts • An association rule mining algorithm • A covering method to build the classifier • Present the rule importance and precedence measures used in the proposed approach • Demonstrate the approach with an example
