Boosting
LING 572, Fei Xia, 02/01/06

Outline:
Basic concepts
Theoretical validity
Case study: POS tagging
Summary

Basic concepts

Overview of boosting:
Introduced by Freund and Schapire in the 1990s.
"Boosting": a general method for converting a weak learning algorithm into a strong one.
At round t, example xi has the weight Dt(i); initially D1(i) = 1/m, and the weights of examples misclassified by ht are increased for round t+1.
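The weight update above can be sketched as follows. This is a minimal illustration, not the lecture's own code, assuming binary labels and predictions in {-1, +1} and the standard AdaBoost choice of alpha; the function name `adaboost_round` is made up for this sketch.

```python
import math

def adaboost_round(D, preds, labels):
    """One AdaBoost round: given current weights D and a weak learner's
    predictions/labels in {-1,+1}, return alpha and the reweighted D."""
    # Weighted error of the weak hypothesis under the current distribution.
    eps = sum(d for d, p, y in zip(D, preds, labels) if p != y)
    alpha = 0.5 * math.log((1 - eps) / eps)
    # Correctly classified examples are down-weighted, mistakes up-weighted.
    new_D = [d * math.exp(-alpha * y * p) for d, p, y in zip(D, preds, labels)]
    Z = sum(new_D)  # normalizer Zt; equals 2*sqrt(eps*(1-eps)) for this alpha
    return alpha, [d / Z for d in new_D]

# Start from the uniform distribution D1(i) = 1/m.
m = 4
D = [1.0 / m] * m
```

After one round, the single misclassified example carries half the total weight, which is exactly why the next weak learner must attend to it.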
[Figure: error rate on a set of 27 benchmark problems]
Training error is defined to be the fraction of training examples misclassified by the final hypothesis H: (1/m) |{ i : H(xi) ≠ yi }|.
It can be proved that the training error ≤ Πt Zt ≤ exp(-2 Σt γt²), where γt = 1/2 - εt measures how much better ht does than random guessing.
Training error drops exponentially fast in the number of rounds, provided each γt is bounded away from zero.
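The bound can be checked numerically. A small sketch (the per-round weak-learner errors below are made-up values for illustration) computing Πt Zt with Zt = 2·sqrt(εt(1-εt)) and comparing it to exp(-2 Σt γt²):

```python
import math

# Assumed per-round weak-learner errors, each below 1/2.
eps = [0.3, 0.25, 0.4, 0.35]
# Zt = 2*sqrt(eps_t*(1-eps_t)) is the normalizer from the weight update.
Z = [2 * math.sqrt(e * (1 - e)) for e in eps]
prod_Z = math.prod(Z)
# The exponential bound with gamma_t = 1/2 - eps_t.
bound = math.exp(-2 * sum((0.5 - e) ** 2 for e in eps))
assert prod_Z <= bound < 1.0
```

Since each Zt < 1 whenever εt < 1/2, the product, and hence the training error, shrinks geometrically with T.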
The generalization error is bounded, with high probability, by the training error plus a term on the order of sqrt(Td/m), where:
T: the number of rounds of boosting
m: the size of the training sample
d: the VC-dimension of the base classifier space
h(x) = p1 if Φ(x) is true; h(x) = p0 otherwise.
At each round, compute Zt for each candidate weak hypothesis and choose the ht that minimizes Zt.
Variants of AdaBoost include “Gentle AdaBoost” and “BrownBoost”.