A Brief Introduction to Adaboost

Hongbo Deng

6 Feb, 2007

Some of the slides are borrowed from Derek Hoiem & Jan Šochman.

Outline
  • Background
  • Adaboost Algorithm
  • Theory/Interpretations
What’s So Good About Adaboost
  • Can be used with many different classifiers
  • Improves classification accuracy
  • Commonly used in many areas
  • Simple to implement
  • Empirically resistant to overfitting
A Brief History

Resampling for estimating a statistic

  • Bootstrapping

Resampling for classifier design

  • Bagging
  • Boosting (Schapire 1989)
  • Adaboost (Schapire 1995)

Bootstrap Estimation
  • Repeatedly draw n samples from D with replacement
  • For each set of samples, estimate a statistic
  • The bootstrap estimate is the mean of the individual estimates
  • Used to estimate a statistic (parameter) and its variance
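
A minimal Python sketch of this procedure (function and variable names are ours, not from the slides; `data` is a 1-D NumPy array and `statistic` any callable such as np.mean or np.median):

```python
# A minimal sketch of bootstrap estimation; names are illustrative.
import numpy as np

def bootstrap_estimate(data, statistic, n_rounds=1000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(data)
    estimates = np.array([
        statistic(rng.choice(data, size=n, replace=True))  # draw n with replacement
        for _ in range(n_rounds)
    ])
    # The bootstrap estimate is the mean of the individual estimates;
    # the spread of the estimates also gives the statistic's variance.
    return estimates.mean(), estimates.var()

# Example: estimate the median and its variance for a skewed sample
sample = np.random.default_rng(1).exponential(size=200)
est, var = bootstrap_estimate(sample, np.median)
```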
Bagging - Aggregate Bootstrapping
  • For i = 1 .. M
    • Draw n* < n samples from D with replacement
    • Learn classifier Ci
  • Final classifier is a vote of C1 .. CM
  • Increases classifier stability/reduces variance

[Figure: bootstrap samples D1, D2, D3 drawn from the original set D]
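
A minimal bagging sketch (our names; X and y are assumed to be NumPy arrays with labels in {-1, +1}, and scikit-learn decision trees stand in for the base classifier, though any fit/predict learner would do):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, M=25, frac=0.8, seed=0):
    rng = np.random.default_rng(seed)
    n_sub = int(frac * len(X))        # n* < n samples per classifier
    models = []
    for _ in range(M):
        idx = rng.choice(len(X), size=n_sub, replace=True)  # with replacement
        models.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    # Final classifier is an unweighted vote of C1 .. CM
    votes = np.stack([m.predict(X) for m in models])
    return np.sign(votes.sum(axis=0))   # ties (sum == 0) map to 0
```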

Boosting (Schapire 1989)
  • Consider creating three component classifiers for a two-category problem through boosting.
  • Randomly select n1 < n samples from D without replacement to obtain D1
    • Train weak learner C1
  • Select n2 < n samples from D, with half of the samples misclassified by C1, to obtain D2
    • Train weak learner C2
  • Select all remaining samples from D that C1 and C2 disagree on
    • Train weak learner C3
  • The final classifier is a vote of the three weak learners

[Figure: D split into subsets D1, D2, D3; + and − mark the two classes]
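
A sketch of this three-classifier construction (our code; `train_weak` is a hypothetical routine that fits any weak classifier with a predict method; labels are assumed in {-1, +1}):

```python
import numpy as np

def boost_three(X, y, n1, train_weak, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)

    # D1: n1 < n samples drawn without replacement; train C1
    idx1 = rng.choice(n, size=n1, replace=False)
    c1 = train_weak(X[idx1], y[idx1])

    # D2: half misclassified by C1, half correctly classified; train C2
    rest = np.setdiff1d(np.arange(n), idx1)
    pred1 = c1.predict(X[rest])
    wrong, right = rest[pred1 != y[rest]], rest[pred1 == y[rest]]
    k = min(len(wrong), len(right))
    idx2 = np.concatenate([rng.choice(wrong, size=k, replace=False),
                           rng.choice(right, size=k, replace=False)])
    c2 = train_weak(X[idx2], y[idx2])

    # D3: remaining samples on which C1 and C2 disagree; train C3
    remaining = np.setdiff1d(rest, idx2)
    dis = remaining[c1.predict(X[remaining]) != c2.predict(X[remaining])]
    c3 = train_weak(X[dis], y[dis])

    def predict(Xq):
        # Final classifier is a vote of the three weak learners
        return np.sign(c1.predict(Xq) + c2.predict(Xq) + c3.predict(Xq))
    return predict
```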

Adaboost - Adaptive Boosting
  • Instead of resampling, uses training set re-weighting
    • Each training sample carries a weight that determines its probability of being selected for a training set.
  • AdaBoost is an algorithm for constructing a “strong” classifier as a linear combination of simple “weak” classifiers
  • Final classification based on weighted vote of weak classifiers
Adaboost Terminology
  • ht(x) … “weak” or basis classifier (Classifier = Learner = Hypothesis)
  • H(x) = sign( Σt αt ht(x) ) … “strong” or final classifier
  • Weak Classifier: < 50% error over any distribution
  • Strong Classifier: thresholded linear combination of weak classifier outputs
Discrete Adaboost Algorithm

Each training sample has a weight, which determines the probability of being selected for training the component classifier
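
The per-round logic is easiest to see in code. Below is a minimal Python sketch of Discrete AdaBoost using decision stumps from scikit-learn as the weak learners (all names are ours; X and y are assumed to be NumPy arrays with labels in {-1, +1}):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, T=50):
    n = len(X)
    w = np.full(n, 1.0 / n)            # uniform initial weights
    stumps, alphas = [], []
    for _ in range(T):
        # Depth-1 tree (decision stump) trained on the current weights
        h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = h.predict(X)
        err = w[pred != y].sum()       # weighted training error
        if err >= 0.5:                 # weak learner no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred) # up-weight mistakes, down-weight correct
        w /= w.sum()                   # renormalize to a distribution
        stumps.append(h)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    # Strong classifier: H(x) = sign( sum_t alpha_t * h_t(x) )
    scores = sum(a * h.predict(X) for a, h in zip(alphas, stumps))
    return np.sign(scores)
```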

Reweighting

Each round, the weight of sample i is updated as

wt+1(i) = wt(i) · exp( −αt · yi · ht(xi) ) / Zt

where Zt is a normalizer so the weights sum to one. For a correctly classified sample, y · h(x) = 1 and the weight shrinks by a factor of exp(−αt); for a misclassified sample, y · h(x) = −1 and the weight grows by a factor of exp(αt).
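
For a concrete feel of the update (our numbers, not from the slides): if the weighted error in round t is εt = 0.2, then αt = ½ ln(0.8 / 0.2) ≈ 0.693, so correctly classified weights are multiplied by exp(−0.693) = 0.5 and misclassified weights by exp(0.693) = 2.0, before renormalization by Zt.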

In this way, AdaBoost focuses on the informative or “difficult” examples.


Pros and cons of AdaBoost

Advantages

  • Very simple to implement
  • Performs feature selection, resulting in a relatively simple classifier
  • Fairly good generalization

Disadvantages

  • Greedy, stagewise training can give a suboptimal solution
  • Sensitive to noisy data and outliers
References
  • Duda, Hart, Stork – Pattern Classification
  • Freund – “An Adaptive Version of the Boost by Majority Algorithm”
  • Freund – “Experiments with a New Boosting Algorithm”
  • Freund, Schapire – “A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting”
  • Friedman, Hastie, Tibshirani – “Additive Logistic Regression: A Statistical View of Boosting”
  • Jin, Liu, et al. (CMU) – “A New Boosting Algorithm Using Input-Dependent Regularizer”
  • Li, Zhang, et al. – “FloatBoost Learning for Classification”
  • Opitz, Maclin – “Popular Ensemble Methods: An Empirical Study”
  • Rätsch, Warmuth – “Efficient Margin Maximization with Boosting”
  • Schapire, Freund, et al. – “Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods”
  • Schapire, Singer – “Improved Boosting Algorithms Using Confidence-rated Predictions”
  • Schapire – “The Boosting Approach to Machine Learning: An Overview”
  • Zhang, Li, et al. – “Multi-view Face Detection with FloatBoost”
Appendix
  • Bound on training error (stated below)
  • Adaboost Variants
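
For reference, the bound in question is the standard one from Freund & Schapire’s analysis: writing εt for the weighted error of ht and γt = ½ − εt for its edge over chance, the training error of the final classifier H satisfies

training error(H) ≤ Πt Zt = Πt 2√( εt (1 − εt) ) ≤ exp( −2 Σt γt² )

so the training error falls exponentially fast as long as every weak learner beats chance by some margin γt > 0.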
Adaboost Variants Proposed By Friedman
  • LogitBoost
    • Solves min over F of E[ log(1 + exp(−y F(x))) ] (additive logistic regression)
    • Requires care to avoid numerical problems
  • GentleBoost
    • Update is fm(x) = Pw(y=1 | x) − Pw(y=0 | x) instead of the half log-ratio fm(x) = ½ log( Pw(y=1 | x) / Pw(y=0 | x) )
      • The probabilities are bounded in [0, 1], so the update is bounded in [−1, 1] and numerically stable
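
To make the bounded update concrete, here is a minimal GentleBoost sketch following Friedman et al.’s description (our code, not the authors’; regression stumps serve as the weighted least-squares fit, and labels are assumed in {-1, +1}):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gentleboost_fit(X, y, T=50):
    n = len(X)
    w = np.full(n, 1.0 / n)
    models = []
    for _ in range(T):
        # Weighted least-squares fit of y on x; for a stump, each leaf value
        # is the weighted mean of y there, i.e. P_w(y=1|x) - P_w(y=0|x)
        f = DecisionTreeRegressor(max_depth=1).fit(X, y, sample_weight=w)
        fm = f.predict(X)              # bounded: |fm| <= 1 by construction
        w *= np.exp(-y * fm)           # gentle, bounded reweighting step
        w /= w.sum()
        models.append(f)
    return models

def gentleboost_predict(models, X):
    # Final classifier: sign of the accumulated regression outputs
    return np.sign(sum(f.predict(X) for f in models))
```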