Presentation Transcript


  1. Stat 231. A.L. Yuille. Fall 2004 • AdaBoost: Summary and Extensions. • Read the Viola and Jones handout.

  2. Basic AdaBoost Review • Data. • Set of weak classifiers. • Weights. • Parameters. • Strong classifier.
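The formulas on this slide were images in the original deck and did not survive the transcript. As an assumption, here is the standard discrete-AdaBoost notation the bullets most likely refer to (the symbols x_i, y_i, h_t, D_t, alpha_t are the conventional choices, not copied from the slide):

```latex
% Assumed standard discrete-AdaBoost setup (notation is conventional, not from the slide)
\begin{align*}
\text{Data: }              & \{(x_i, y_i)\}_{i=1}^{N}, \qquad y_i \in \{-1,+1\} \\
\text{Weak classifiers: }  & h_t : \mathcal{X} \to \{-1,+1\}, \qquad t = 1,\dots,T \\
\text{Weights: }           & D_t(i) \ \text{(a distribution over the data points)} \\
\text{Parameters: }        & \alpha_t \ge 0 \ \text{(one coefficient per weak classifier)} \\
\text{Strong classifier: } & H(x) = \operatorname{sign}\Big(\sum_{t=1}^{T} \alpha_t\, h_t(x)\Big)
\end{align*}
```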

  3. Basic AdaBoost Algorithm • Initialize. • Update rule, where Z is the normalization constant. • Let • Pick the classifier to minimize • Set • Repeat.
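A minimal runnable sketch of the loop described above, assuming the standard discrete AdaBoost choices: uniform initialization, update D_{t+1}(i) proportional to D_t(i) exp(-alpha_t y_i h_t(x_i)) with Z_t as the normalizer, and alpha_t = (1/2) ln((1-eps_t)/eps_t). The decision-stump weak learner and the function names are illustrative choices, not taken from the lecture:

```python
import numpy as np

def adaboost(X, y, T=50):
    """Basic discrete AdaBoost with decision-stump weak classifiers.

    X : (N, d) array of features; y : (N,) array of labels in {-1, +1}.
    Returns the chosen stumps and their alpha coefficients.
    """
    N, d = X.shape
    D = np.full(N, 1.0 / N)                     # Initialize: uniform weights
    stumps, alphas = [], []

    for t in range(T):
        # Pick the weak classifier (stump) with the smallest weighted error.
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for sign in (+1, -1):
                    pred = np.where(X[:, j] > thr, sign, -sign)
                    err = np.sum(D[pred != y])
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign, pred)
        eps, j, thr, sign, pred = best

        if eps >= 0.5:                          # no better than chance: stop
            break
        eps = np.clip(eps, 1e-12, 1 - 1e-12)    # guard against log(0)
        alpha = 0.5 * np.log((1 - eps) / eps)   # Set the coefficient

        # Update rule: up-weight misclassified points, then renormalize (Z).
        D *= np.exp(-alpha * y * pred)
        D /= D.sum()

        stumps.append((j, thr, sign))
        alphas.append(alpha)

    return stumps, alphas

def strong_classify(X, stumps, alphas):
    """Strong classifier: sign of the alpha-weighted vote of the stumps."""
    score = np.zeros(X.shape[0])
    for (j, thr, sign), alpha in zip(stumps, alphas):
        score += alpha * np.where(X[:, j] > thr, sign, -sign)
    return np.sign(score)
```

Each round greedily picks the stump with the smallest weighted error, sets its coefficient, and re-normalizes the weights, matching the Initialize / Update / Pick / Set / Repeat structure of the slide.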

  4. Basic AdaBoost Algorithm • Errors: • Bounded by, which equals • AdaBoost is a greedy algorithm that tries to minimize the bound by minimizing the Z's in order, w.r.t. the parameters chosen at each round.
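The bound itself was a figure; the standard statement it presumably corresponds to (assuming uniform initial weights D_1(i) = 1/N) is:

```latex
% Assumed form of the error bound sketched on this slide (standard AdaBoost result)
\begin{align*}
\frac{1}{N}\sum_{i=1}^{N} \mathbf{1}\big[H(x_i) \ne y_i\big]
  \;\le\; \frac{1}{N}\sum_{i=1}^{N} \exp\!\Big(-y_i \sum_{t} \alpha_t h_t(x_i)\Big)
  \;=\; \prod_{t=1}^{T} Z_t,
\qquad
Z_t = \sum_{i} D_t(i)\, e^{-\alpha_t y_i h_t(x_i)}.
\end{align*}
```

Minimizing each Z_t over alpha_t (and over the choice of h_t) therefore tightens the bound one factor at a time, which is the sense in which AdaBoost is greedy.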

  5. AdaBoost Variant 1 • In preparation for Viola and Jones: a new parameter. • Strong classifier. • Modified update rule. • Define the sum of weights for which the weak classifier outputs p and the true class is q. • Pick the weak classifier to minimize, and set.
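The missing formulas are presumably the Viola-Jones re-parameterization that this variant prepares for. As an assumption, here is that standard formulation, with weak classifiers and labels taking values in {0,1} and the new parameter written as beta_t (my notation, not the slide's):

```latex
% Assumed Viola--Jones formulation of Variant 1 (not copied from the slide)
\begin{align*}
\epsilon_t &= \sum_i w_{t,i}\,\big|h_t(x_i)-y_i\big|,
\qquad \beta_t = \frac{\epsilon_t}{1-\epsilon_t},
\qquad \alpha_t = \log\frac{1}{\beta_t},\\
w_{t+1,i} &= w_{t,i}\,\beta_t^{\,1-e_i}
\quad\text{where } e_i = 0 \text{ if } h_t(x_i)=y_i,\ e_i = 1 \text{ otherwise},\\
H(x) &= \begin{cases}
1 & \text{if } \sum_t \alpha_t h_t(x) \;\ge\; \tfrac{1}{2}\sum_t \alpha_t,\\
0 & \text{otherwise.}
\end{cases}
\end{align*}
```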

  6. AdaBoost Variant 1 • As before, the error is bounded by • Same “trick”: if the weak classifier is right, then; if the weak classifier is wrong, then.
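The “same trick” is presumably the case split on whether the weak classifier is right or wrong; in the standard ±1 notation (an assumption here) it reads:

```latex
% Assumed form of the case split (standard AdaBoost bookkeeping)
\begin{align*}
\text{if } h_t(x_i) = y_i   &: \quad e^{-\alpha_t y_i h_t(x_i)} = e^{-\alpha_t},\\
\text{if } h_t(x_i) \ne y_i &: \quad e^{-\alpha_t y_i h_t(x_i)} = e^{+\alpha_t},\\
\text{so}\qquad Z_t &= (1-\epsilon_t)\, e^{-\alpha_t} + \epsilon_t\, e^{+\alpha_t}
  \;=\; 2\sqrt{\epsilon_t(1-\epsilon_t)}
  \quad\text{at } \alpha_t = \tfrac12\ln\tfrac{1-\epsilon_t}{\epsilon_t}.
\end{align*}
```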

  7. AdaBoost Variant 2 • So far we have assumed a loss function that pays equal penalties for false positives and false negatives. • But we may want false negatives to cost more (Viola and Jones). • Use an asymmetric loss function.
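The loss function itself was an image. A common asymmetric choice, written here with my own symbols C_+ > C_- for the two penalties (an assumption, not necessarily the slide's exact form), is:

```latex
% One common asymmetric exponential loss (my notation; the slide's exact form is not shown)
\begin{align*}
\mathcal{L} \;=\; \sum_{i:\,y_i=+1} C_{+}\, e^{-y_i F(x_i)}
            \;+\; \sum_{i:\,y_i=-1} C_{-}\, e^{-y_i F(x_i)},
\qquad C_{+} > C_{-},
\end{align*}
```

so that missing a positive example (a false negative) costs more than a false alarm.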

  8. AdaBoost Variant 2 • Modify the update rule. • Verify the bound on the loss. • Same update rule as for Variant 1, except:
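One plausible reading of the missing bullet, stated as an assumption rather than the slide's own rule, is that the asymmetric costs are folded into the example weights, after which the Variant 1 machinery runs unchanged:

```latex
% Assumed reading: the costs C_{+}, C_{-} enter through the weights
% (e.g. at initialization), and the per-round update is as in Variant 1.
\begin{align*}
D_1(i) \;\propto\; C_{y_i},
\qquad
D_{t+1}(i) \;\propto\; D_t(i)\, e^{-\alpha_t y_i h_t(x_i)}.
\end{align*}
```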

  9. AdaBoost Extensions • AdaBoost can be extended to multiple classes (Singer and Schapire). • The weak classifiers can take multiple values. • The conditional probability interpretation applies to these extensions.
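The conditional probability interpretation referred to here, in its usual form (the logistic view of the exponential loss due to Friedman, Hastie and Tibshirani; stated as background, not taken from the slide), is:

```latex
% The population minimizer of the exponential loss is half the log-posterior ratio,
% so the strong classifier's score has a conditional-probability reading.
\begin{align*}
F(x) = \sum_t \alpha_t h_t(x) \;\approx\; \tfrac12 \log\frac{P(y=+1\mid x)}{P(y=-1\mid x)},
\qquad
P(y=+1\mid x) \;\approx\; \frac{1}{1+e^{-2F(x)}}.
\end{align*}
```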

  10. AdaBoost Summary • Basic AdaBoost: combine weak classifiers to make a strong classifier. • Dynamically re-weight the data so that misclassified data weighs more (like SVMs, pay more attention to hard-to-classify data). • Exponential convergence of the empirical risk (under weak conditions). • Useful for combining weak cues in visual detection tasks. • Probabilistic interpretation / multiclass / multivalued classifiers.
