
CSCI 3202: Introduction to AI Decision Trees



Presentation Transcript


  1. CSCI 3202: Introduction to AI: Decision Trees. Greg Grudic (notes borrowed from Thomas G. Dietterich and Tom Mitchell)

  2. Outline • Decision Tree Representations • ID3 (Quinlan 1986) and C4.5 (Quinlan 1993) learning algorithms • CART learning algorithm (Breiman et al. 1984) • Entropy, Information Gain • Overfitting

  3. Training Data Example: the goal is to predict when this player will play tennis. (The slide shows the training table.)
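
Given that the notes are borrowed from Dietterich and Mitchell, the table is presumably Mitchell's (1997) classic PlayTennis data. A sketch of it as Python data follows, so the later algorithm sketches have something to run on; the names EXAMPLES and FEATURES are my own:

    # Assumed training table: Mitchell's (1997) PlayTennis data, 14 days
    # described by four discrete features and labeled Yes/No.
    EXAMPLES = [
        {"Outlook": "Sunny",    "Temp": "Hot",  "Humidity": "High",   "Wind": "Weak",   "Play": "No"},
        {"Outlook": "Sunny",    "Temp": "Hot",  "Humidity": "High",   "Wind": "Strong", "Play": "No"},
        {"Outlook": "Overcast", "Temp": "Hot",  "Humidity": "High",   "Wind": "Weak",   "Play": "Yes"},
        {"Outlook": "Rain",     "Temp": "Mild", "Humidity": "High",   "Wind": "Weak",   "Play": "Yes"},
        {"Outlook": "Rain",     "Temp": "Cool", "Humidity": "Normal", "Wind": "Weak",   "Play": "Yes"},
        {"Outlook": "Rain",     "Temp": "Cool", "Humidity": "Normal", "Wind": "Strong", "Play": "No"},
        {"Outlook": "Overcast", "Temp": "Cool", "Humidity": "Normal", "Wind": "Strong", "Play": "Yes"},
        {"Outlook": "Sunny",    "Temp": "Mild", "Humidity": "High",   "Wind": "Weak",   "Play": "No"},
        {"Outlook": "Sunny",    "Temp": "Cool", "Humidity": "Normal", "Wind": "Weak",   "Play": "Yes"},
        {"Outlook": "Rain",     "Temp": "Mild", "Humidity": "Normal", "Wind": "Weak",   "Play": "Yes"},
        {"Outlook": "Sunny",    "Temp": "Mild", "Humidity": "Normal", "Wind": "Strong", "Play": "Yes"},
        {"Outlook": "Overcast", "Temp": "Mild", "Humidity": "High",   "Wind": "Strong", "Play": "Yes"},
        {"Outlook": "Overcast", "Temp": "Hot",  "Humidity": "Normal", "Wind": "Weak",   "Play": "Yes"},
        {"Outlook": "Rain",     "Temp": "Mild", "Humidity": "High",   "Wind": "Strong", "Play": "No"},
    ]
    FEATURES = ["Outlook", "Temp", "Humidity", "Wind"]  # "Play" is the class label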

  4.–7. (Figure-only slides; no text survives in the transcript.)

  8. Learning Algorithm for Decision Trees. What happens if features are not binary? What about regression?
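
The transcript does not preserve the algorithm itself. A minimal ID3-style sketch in Python, assuming the standard top-down greedy recursion with the information-gain criterion of slides 10–13 (EXAMPLES and FEATURES refer to the data sketch above):

    import math
    from collections import Counter

    def entropy(examples, label="Play"):
        # H(S) = -sum_i p_i log2 p_i over the class distribution of S (slides 10-12)
        counts = Counter(e[label] for e in examples)
        total = len(examples)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def info_gain(examples, feature, label="Play"):
        # entropy reduction achieved by splitting on `feature` (slide 13)
        total = len(examples)
        remainder = 0.0
        for value in set(e[feature] for e in examples):
            subset = [e for e in examples if e[feature] == value]
            remainder += len(subset) / total * entropy(subset, label)
        return entropy(examples, label) - remainder

    def id3(examples, features, label="Play"):
        # top-down greedy induction; returns a nested-dict tree or a leaf label
        labels = [e[label] for e in examples]
        if len(set(labels)) == 1:            # pure node -> leaf
            return labels[0]
        if not features:                     # no tests left -> majority leaf
            return Counter(labels).most_common(1)[0][0]
        best = max(features, key=lambda f: info_gain(examples, f, label))
        tree = {best: {}}
        for value in set(e[best] for e in examples):
            subset = [e for e in examples if e[best] == value]
            tree[best][value] = id3(subset, [f for f in features if f != best], label)
        return tree

    # e.g. id3(EXAMPLES, FEATURES) puts Outlook at the root for the data above

Multi-valued features fall out naturally as multi-way splits; regression variants (as in CART) replace entropy with a mean-squared-error criterion, sketched under slide 17.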

  9. Choosing the Best Attribute. A1 and A2 are "attributes" (i.e., features or inputs). Count the + and – examples before and after each candidate split. Many different criteria for choosing the BEST split have been proposed; we will look at entropy gain (information gain).

  10.–12. Entropy. Entropy measures the impurity of a set of labeled examples: it is 0 when every example has the same label, and maximal for an even class mix.
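
The formulas on these slides survive only as images; the standard definition, presumably what was shown, for a set S with a fraction p_+ of positive and p_- of negative examples:

    H(S) = -p_{+}\log_2 p_{+} - p_{-}\log_2 p_{-},
    \qquad\text{and for $k$ classes}\qquad
    H(S) = -\sum_{i=1}^{k} p_i \log_2 p_i .

So a pure node (p_+ = 1 or p_+ = 0) has entropy 0 and a 50/50 node has entropy 1 bit, with 0 log 0 taken to be 0.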

  13. Information Gain.
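
The slide's formula is again an image; the standard definition it presumably shows is the expected reduction in entropy from splitting S on attribute A:

    \mathrm{Gain}(S, A) = H(S) - \sum_{v \in \mathrm{Values}(A)} \frac{|S_v|}{|S|}\, H(S_v),
    \qquad S_v = \{\, s \in S : A(s) = v \,\}.

At each node the greedy learner evaluates every remaining attribute and splits on the one with the largest gain.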

  14. Training Example.

  15. Selecting the Next Attribute.
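
For the PlayTennis data above (9 positive, 5 negative examples), the standard worked numbers from Mitchell (1997), presumably what these two slides step through:

    H(S) = -\tfrac{9}{14}\log_2\tfrac{9}{14} - \tfrac{5}{14}\log_2\tfrac{5}{14} \approx 0.940,
    \mathrm{Gain}(S,\mathrm{Outlook}) \approx 0.246,\quad
    \mathrm{Gain}(S,\mathrm{Humidity}) \approx 0.151,\quad
    \mathrm{Gain}(S,\mathrm{Wind}) \approx 0.048,\quad
    \mathrm{Gain}(S,\mathrm{Temperature}) \approx 0.029.

Outlook therefore becomes the root; its Overcast branch is pure (all positive) and becomes a Yes leaf, and the recursion continues on the Sunny and Rain subsets.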

  16. (Figure-only slide; no text survives in the transcript.)

  17. Non-Boolean Features. • Features with multiple discrete values: use multi-way splits, test one value versus the rest, or group values into disjoint sets. • Real-valued features: use threshold splits. • Regression: choose splits by a mean-squared-error criterion.
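
A minimal sketch of the real-valued and regression cases, assuming the usual approach of scanning midpoint thresholds and minimizing the weighted sum of squared errors (the name best_threshold is my own):

    def best_threshold(xs, ys):
        # xs: one real-valued feature; ys: real-valued targets (regression).
        # Returns the split "x <= t" minimizing total squared error of the two sides.
        def sse(vals):
            # squared error around the mean; a regression leaf predicts the mean
            if not vals:
                return 0.0
            m = sum(vals) / len(vals)
            return sum((v - m) ** 2 for v in vals)
        best_t, best_err = None, float("inf")
        order = sorted(set(xs))
        for lo, hi in zip(order, order[1:]):   # midpoints between adjacent values
            t = (lo + hi) / 2
            left = [y for x, y in zip(xs, ys) if x <= t]
            right = [y for x, y in zip(xs, ys) if x > t]
            if sse(left) + sse(right) < best_err:
                best_t, best_err = t, sse(left) + sse(right)
        return best_t, best_err

    # e.g. best_threshold([1.0, 2.0, 3.0, 10.0], [1.1, 0.9, 1.0, 5.0]) -> (6.5, ~0.02)

For classification with real-valued features, the same threshold scan is used with information gain in place of squared error.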

  18. Hypothesis Space Search. • Greedy search does not, in general, return the globally optimal tree. • The space of possible trees grows exponentially with the number of features, so exhaustive search is intractable; ID3/C4.5/CART therefore commit to one attribute at a time and never backtrack.

  19. Overfitting.

  20. Overfitting in Decision Trees.

  21. Validation Data Is Used to Control Overfitting. • Prune the tree to reduce error on a held-out validation set.
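
A minimal reduced-error-pruning sketch, assuming the nested-dict trees built by the id3 sketch above; for simplicity the replacement leaf takes the majority label of the validation examples routed to the node, whereas Quinlan's variant uses the training-set majority:

    from collections import Counter

    def classify(tree, example):
        # walk a nested-dict tree down to a leaf label
        while isinstance(tree, dict):
            feature = next(iter(tree))
            tree = tree[feature].get(example[feature])  # unseen value -> None (a miss)
        return tree

    def accuracy(tree, examples, label="Play"):
        return sum(classify(tree, e) == e[label] for e in examples) / len(examples)

    def prune(tree, val_examples, label="Play"):
        # bottom-up: collapse a subtree to a leaf whenever that does not hurt
        # accuracy on the validation examples reaching it
        if not isinstance(tree, dict) or not val_examples:
            return tree
        feature = next(iter(tree))
        for value in list(tree[feature]):
            subset = [e for e in val_examples if e[feature] == value]
            tree[feature][value] = prune(tree[feature][value], subset, label)
        majority = Counter(e[label] for e in val_examples).most_common(1)[0][0]
        if accuracy(majority, val_examples, label) >= accuracy(tree, val_examples, label):
            return majority
        return tree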

  22. Quiz 7 • What is overfitting? • Are decision trees universal approximators? • What do decision tree boundaries look like? • Why are decision trees not globally optimal? • Do Support Vector Machines give the globally optimal solution (given the SVM optimization criterion and a specific kernel)? • Does the K Nearest Neighbor algorithm give the globally optimal solution if K is allowed to be as large as the number of training examples (N), N-fold cross validation is used, and the distance metric is fixed?
