
Lecture 5-6 Object Detection – Boosting





  1. Lecture 5-6 Object Detection – Boosting Tae-Kyun Kim

  2. Face Detection Demo

  3. Multiclass object detection [Torralba et al PAMI 07]

  4. Object Detection From TUD dataset

  5. Object Detection From TUD dataset

  6. Object Detection From TUD dataset

  7. Number of windows = (# of pixel locations) × (# of scales). Number of windows: 747,666

  8. Time per window: each window gives a raw-pixel feature vector of dimension D. Number of feature vectors: 747,666

  9. Number of feature vectors (dimension D each): 747,666. To finish the task in 1 sec, the time allowed per window (or vector) is 0.00000134 sec. A neural network? A nonlinear SVM?
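The per-window budget is simple arithmetic; a quick check using the window count from the slide:

```python
# Per-window time budget for real-time detection (window count from the slide).
num_windows = 747_666            # windows over all positions and scales
budget_sec = 1.0                 # finish the whole image in one second
per_window = budget_sec / num_windows
print(f"{per_window:.8f} sec")   # → 0.00000134 sec
```

At roughly 1.3 microseconds per window, per-window classifiers as heavy as neural networks or nonlinear SVMs are ruled out, which motivates the cheap boosted features that follow.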

  10. Examples of face detection From Viola, Jones, 2001

  11. More traditionally… narrow down the search space • Integrating Visual Cues [Darrell et al IJCV 00] • Face pattern detection output (left) • Connected components recovered from stereo range data (middle) • Flesh hue regions from skin hue classification (right)

  12. Since about 2001… Boosting Simple Features [Viola &amp; Jones 01] • AdaBoost classification: the strong classifier is a weighted sum of weak classifiers • Weak classifiers: Haar-basis-like functions (45,396 in total)

  13. Introduction to Boosting Classifiers – AdaBoost

  14. Sorry for the inconsistent notation…

  15. Boosting

  16. Boosting

  17. Boosting • Iteratively reweighting training samples. • Higher weights to previously misclassified samples. [Figure: toy classification result after 1, 2, 3, 4, 5, and 50 rounds]
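The reweighting loop described above can be sketched in a few lines. This is a toy illustration with made-up 1-D data and threshold stumps as the weak learners, not the lecture's code:

```python
import math

# Toy sketch of the AdaBoost reweighting loop (data and stump thresholds are
# hypothetical; weak learners are 1-D threshold stumps, not Haar features).
X = [0.1, 0.3, 0.4, 0.6, 0.8, 0.9]
y = [-1, -1, +1, -1, +1, +1]            # not separable by any single stump
w = [1.0 / len(X)] * len(X)             # uniform initial sample weights

def stump(t, s):
    # h(x) = s if x > t else -s, with sign s in {+1, -1}
    return lambda x: s if x > t else -s

# Candidate weak learners: thresholds midway between neighbours, both signs.
candidates = [stump((a + b) / 2, s)
              for a, b in zip(X, X[1:]) for s in (+1, -1)]

ensemble = []                           # list of (alpha, weak learner)
for _ in range(5):                      # boosting rounds
    # 1) Pick the stump with the lowest weighted error.
    errs = [sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
            for h in candidates]
    err, h = min(zip(errs, candidates), key=lambda p: p[0])
    alpha = 0.5 * math.log((1 - err) / max(err, 1e-10))
    ensemble.append((alpha, h))
    # 2) Reweight: previously misclassified samples get higher weight.
    w = [wi * math.exp(-alpha * yi * h(xi)) for wi, xi, yi in zip(w, X, y)]
    Z = sum(w)
    w = [wi / Z for wi in w]            # renormalise to a distribution

def H(x):
    # Strong classifier: sign of the weighted vote of the weak learners.
    return 1 if sum(a * h(x) for a, h in ensemble) > 0 else -1

print([H(xi) for xi in X])              # → [-1, -1, 1, -1, 1, 1]
```

No single stump classifies this data correctly, but the weighted vote of five reweighted stumps does, which is exactly the effect the round-by-round figures illustrate.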

  18. AdaBoost

  19. Boosting

  20. Boosting as an optimisation framework

  21. Minimising Exponential Error

  22. Existence of weak learners • Definition of a baseline learner: given the data weights, the baseline classifier predicts a single fixed label for all x, so its error is at most ½. • Each weak learner in boosting is required to beat this baseline (weighted error strictly below ½); the error of the composite hypothesis then goes to zero as the boosting rounds increase [Duffy et al 00].
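The exponential decrease can be illustrated numerically. Assuming each weak learner beats the baseline by some edge γ (weighted error at most ½ − γ), a standard bound (not stated on the slide) limits the training error to (1 − 4γ²)^(T/2) after T rounds:

```python
# Training-error bound when every weak learner has weighted error <= 1/2 - gamma:
# bound(T) = (1 - 4 * gamma**2) ** (T / 2), which goes to 0 as T grows.
gamma = 0.1      # hypothetical edge of each weak learner over random guessing
bounds = [(1 - 4 * gamma ** 2) ** (T / 2) for T in (10, 50, 100)]
print(bounds)    # strictly decreasing toward 0
```

Even a small edge of 0.1 per round drives the bound below 13% of the samples within 100 rounds, which is why only "slightly better than random" weak learners are demanded.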

  23. Robust real-time object detector

  24. Boosting Simple Features [Viola and Jones CVPR 01] • AdaBoost classification: the strong classifier is a weighted sum of weak classifiers • Weak classifiers: Haar-basis-like functions (45,396 in total)
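As a rough sketch of what one such Haar-basis-like weak classifier computes (the rectangle layout, threshold, and toy patch are illustrative, not one of the 45,396 actual features):

```python
# Sketch of a two-rectangle Haar-basis-like feature and its weak classifier.
def rect_sum(img, x, y, w, h):
    # Brute-force rectangle sum; in practice this uses the integral image.
    return sum(img[r][c] for r in range(y, y + h) for c in range(x, x + w))

def haar_two_rect(img, x, y, w, h):
    # Two-rectangle feature: left (w x h) half minus right (w x h) half of a
    # (2w x h) window; it responds strongly to vertical intensity edges.
    return rect_sum(img, x, y, w, h) - rect_sum(img, x + w, y, w, h)

def weak_classifier(img, theta, polarity=1):
    # Decision stump on the feature value:
    # +1 if polarity * f < polarity * theta, else -1.
    f = haar_two_rect(img, 0, 0, 2, 4)
    return 1 if polarity * f < polarity * theta else -1

patch = [[9, 9, 1, 1]] * 4       # bright left half, dark right half
print(haar_two_rect(patch, 0, 0, 2, 4))   # → 64
```

AdaBoost's job is then to select, from the large pool of such features, the few whose thresholded responses best separate faces from non-faces.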

  25. Evaluation (testing) From Viola, Jones, 2001

  26. Boosting Simple Features [Viola and Jones CVPR 01] • Integral image • A value at (x,y) is the sum of the pixel values above and to the left of (x,y). • The integral image can be computed in one pass over the original image.

  27. Boosting Simple Features [Viola and Jones CVPR 01] • Integral image • The sum of original image values within a rectangle can be computed from four corner values of the integral image: Sum = A - B - C + D • This allows fast evaluation of the Haar-basis-like features
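A minimal sketch of the integral image and the four-lookup rectangle sum (variable names are mine; the array is zero-padded so corner lookups need no bounds checks):

```python
# Integral image: ii[y][x] holds the sum of all pixels above and to the left
# of (x, y). Built in one pass; any rectangle sum then costs four lookups.
def integral_image(img):
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]   # extra zero row and column
    for y in range(h):
        for x in range(w):
            ii[y + 1][x + 1] = (img[y][x] + ii[y][x + 1]
                                + ii[y + 1][x] - ii[y][x])
    return ii

def rect_sum(ii, x0, y0, x1, y1):
    # Sum of img[y0:y1][x0:x1] from the four corner values.
    return ii[y1][x1] - ii[y0][x1] - ii[y1][x0] + ii[y0][x0]

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
ii = integral_image(img)
print(rect_sum(ii, 1, 1, 3, 3))   # 5 + 6 + 8 + 9 → 28
```

Because every rectangle sum is constant-time regardless of its size, each Haar-basis-like feature costs only a handful of additions per window.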

  28. Evaluation (testing) From Viola, Jones, 2001

  29. Boosting as a Tree-structured Classifier

  30. Boosting (very shallow network) • The strong classifier H, as boosted decision stumps, has a flat structure. • Cf. decision “ferns” have been shown to outperform “trees” [Zisserman et al 07] [Fua et al 07]. [Figure: stumps as one-level trees, each splitting x into leaves c0 / c1]

  31. Boosting – continued • Good generalisation by a flat structure • Fast evaluation • Sequential optimisation. A strong boosting classifier: • Boosting cascade [Viola &amp; Jones 04], boosting chain [Xiao et al] • A very imbalanced tree • Speeds up unbalanced binary problems • Hard to design

  32. A cascade of classifiers • The detection system requires a good detection rate and an extremely low false positive rate. • The cascade's false positive rate and detection rate are products of the per-stage rates, F = ∏ f_i and D = ∏ d_i, where f_i is the false positive rate of the i-th classifier on the examples that get through to it. • The expected number of features evaluated is N = n_0 + Σ_i n_i ∏_{j&lt;i} p_j, where p_j is the positive rate of the j-th classifier, so ∏_{j&lt;i} p_j is the proportion of windows input to the i-th classifier.
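The two cascade quantities can be checked with a short sketch; the stage sizes and rates below are made-up illustrative numbers, not the detector's:

```python
# Cascade arithmetic sketch: overall false-positive rate is the product of
# per-stage rates; the expected number of features evaluated per window
# depends on how many windows survive to each stage.
stages = [
    # (features in stage, false-positive rate f_i, pass-through rate p_i)
    (2,  0.5, 0.5),
    (10, 0.4, 0.4),
    (25, 0.3, 0.3),
]
F = 1.0          # overall false-positive rate
expected = 0.0   # expected number of features evaluated per window
reach = 1.0      # proportion of windows reaching the current stage
for n_i, f_i, p_i in stages:
    expected += n_i * reach
    F *= f_i
    reach *= p_i
print(F, expected)   # F ≈ 0.06, expected ≈ 12 features
```

Although the full cascade here holds 37 features, a typical window is rejected early and costs only about 12 evaluations, which is the source of the cascade's speed.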

  33. Demo video: Fast evaluation

  34. Object Detection by a Cascade of Classifiers Pictures from Romdhani et al. ICCV01
