
Adaboost (Adaptive boosting)


Presentation Transcript


  1. Adaboost (Adaptive boosting). 2013-07-31, Jo Yeong-Jun. Schapire, Robert E., and Yoram Singer. "Improved boosting algorithms using confidence-rated predictions." Machine Learning 37.3 (1999): 297-336.

  2. Content • Introduction of Learning Classifier (discriminative model) • Adaboost (Adaptive boosting)

  3. Introduction of Learning Classifier

  4. Introduction of classifier • How to classify fish? • Goal: design a classifier that can distinguish bass from salmon (Figure: a fish image is fed to the classifier as input, and the classifier outputs "Bass!")

  5. Introduction of classifier • What information do we use to classify? • Raw B, G, R image data • Too much data • Contains useless information

  6. Introduction of classifier • What information do we classify with? • Extract meaningful feature information for classification (Feature Extraction) • Represent the extracted features as a vector and assign a label to each object • Raw B, G, R image data is too large and contains useless information (Figure: example feature vectors, Width: 8, Brightness: 2 for one fish and Width: 7, Brightness: 9 for the other)
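
A minimal sketch of this feature-extraction step, assuming a NumPy image array; the feature definitions and scalings below are illustrative, not taken from the slides:

```python
import numpy as np

def extract_features(image):
    """Reduce a raw BGR image (H x W x 3 array) to the 2-D feature
    vector (width, brightness) used in the slides. The scalings here
    are made up for illustration."""
    h_px, w_px, _ = image.shape
    width = w_px / 10.0                  # crude proxy for fish width
    brightness = image.mean() / 25.5     # mean intensity mapped to roughly 0..10
    return np.array([width, brightness])

# Each training sample then becomes a (feature vector, label) pair,
# e.g. label +1 for bass and -1 for salmon.
```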

  7. Introduction of classifier • What do we learn from? • Training samples collected for learning, passed through Feature Extraction (Figure: bass and salmon samples plotted in the Width-Brightness plane, both axes running 0 to 10)

  8. Introduction of classifier • What is learned? • A line that can classify the training samples well, generally called a hyperplane • Classification rule: h(x) = weight · input + bias • Test phase: the input is passed to the classifier h(x), which outputs e.g. "Bass!" (Figure: the separating line in the Width-Brightness plane)
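
As a concrete illustration of h(x), a minimal linear-classifier sketch; the weight and bias values are made up, not learned from real data:

```python
import numpy as np

def h(x, weight, bias):
    """Linear classifier: evaluate the hyperplane and return its sign.
    +1 means bass, -1 means salmon."""
    return 1 if np.dot(weight, x) + bias >= 0 else -1

# Illustrative parameters:
weight = np.array([0.4, -0.6])
bias = 1.0
print(h(np.array([8.0, 2.0]), weight, bias))   # +1: wide, dark sample -> bass
print(h(np.array([7.0, 9.0]), weight, bias))   # -1: bright sample -> salmon
```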

  9. Introduction of classifier • How is it learned? • Each training sample is either correctly classified or misclassified • The per-sample penalty is the loss function; summing it over all training samples gives the cost function that learning minimizes (Figure: correctly and incorrectly classified samples in the feature plane)
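
A minimal sketch of the 0/1 cost described above, under the same linear-classifier assumption as the previous sketch:

```python
import numpy as np

def cost(samples, labels, weight, bias):
    """0/1 cost: the number of misclassified training samples.
    A sample is misclassified when sign(w.x + b) differs from its label."""
    errors = 0
    for x, y in zip(samples, labels):
        pred = 1 if np.dot(weight, x) + bias >= 0 else -1
        errors += (pred != y)            # per-sample 0/1 loss
    return errors
```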

  10. Introduction of classifier • Summary • Training: training samples → Feature Extraction → find h • Test: input → Feature Extraction → evaluate h(x) (e.g. h(x) = -2.1) → decision

  11. Adaboost (Adaptive Boosting)

  12. Adaboost • Introduction • AdaBoost was proposed by Freund and Schapire in 1995 • The final strong classifier is built as a weighted combination of weak classifiers whose error rates are below 50% (Figure: strong classifier = weighted sum of weak classifiers)
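
A minimal sketch of that weighted combination; the decision stumps and alpha values below are made up for illustration:

```python
import numpy as np

def strong_classify(x, weak_classifiers, alphas):
    """Strong classifier: sign of the alpha-weighted sum of weak votes.
    Each weak classifier maps a sample to +1 or -1."""
    score = sum(a * clf(x) for clf, a in zip(weak_classifiers, alphas))
    return 1 if score >= 0 else -1

# Illustrative weak classifiers (decision stumps with made-up thresholds):
stumps = [lambda x: 1 if x[0] > 5 else -1,
          lambda x: 1 if x[1] < 6 else -1]
alphas = [0.8, 0.4]
print(strong_classify(np.array([8.0, 2.0]), stumps, alphas))   # -> 1
```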

  13. Adaboost • Example (Figure: three individual weak classifiers with training errors 1, 2, and 3; their weighted combination classifies every sample correctly, error 0)

  14. Adaboost • Main issue 1: how to select the weak classifier • At each step, select the weak classifier that minimizes the classification error • Each sample carries an adaptive weight • The weights are updated according to the classification result at step t: samples in error get their weight increased (see the sketch below) • t = 1, t = 2, ..., t = T
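
A minimal sketch of that adaptive re-weighting, using the standard AdaBoost multiplicative update:

```python
import numpy as np

def update_weights(w, alpha, y_true, y_pred):
    """AdaBoost sample-weight update. y_true * y_pred is +1 for a correct
    prediction and -1 for an error, so misclassified samples are scaled
    up by exp(alpha) and correct ones down by exp(-alpha); the weights
    are then renormalized to sum to 1."""
    w = w * np.exp(-alpha * y_true * y_pred)
    return w / w.sum()
```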

  15. Adaboost • Main issue 2: how to choose the classifier weight α_t • α_t varies inversely with the classification error of the weak classifier at each step • t = 1, t = 2, ..., t = T

  16. Adaboost • Finding h_t and α_t: minimize the cost function at step t, J_t = Σ_i w_i^(t) · exp(−α_t · y_i · h_t(x_i))
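
Minimizing that exponential cost yields the standard closed form for the classifier weight; a small sketch (the example error values are made up):

```python
import numpy as np

def alpha_from_error(eps):
    """Closed-form weight of a weak classifier with weighted error eps:
    alpha = 0.5 * ln((1 - eps) / eps). Lower error -> larger vote;
    eps = 0.5 (chance level) gives alpha = 0."""
    return 0.5 * np.log((1.0 - eps) / eps)

print(alpha_from_error(0.1))   # ~1.10: accurate weak classifier, big vote
print(alpha_from_error(0.4))   # ~0.20: barely better than chance, small vote
```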

  17. Adaboost

  18. Adaboost • Find h_t (Figure: worked example at t = 1 and t = 2)

  19. Adaboost • Find α_t = 0.5 · ln((1 − ε_t)/ε_t) (worked example at t = 1)

  20. Adaboost • Update the sample weights (Figure: weights before and after the update, t = 1 and t = 2)

  21. Adaboost • Algorithm (see the sketch below)
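
Putting the pieces together, a minimal runnable sketch of the full training loop, using decision stumps as the weak classifiers; the toy data at the end is made up:

```python
import numpy as np

def adaboost_train(X, y, T):
    """AdaBoost with decision stumps. X: (n, d) features;
    y: (n,) labels in {-1, +1}; T: number of boosting rounds."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                  # uniform initial sample weights
    stumps, alphas = [], []
    for _ in range(T):
        best = None
        # Exhaustively pick the stump (feature, threshold, polarity)
        # with the smallest weighted error.
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = np.where(s * (X[:, j] - thr) >= 0, 1, -1)
                    eps = w[pred != y].sum()
                    if best is None or eps < best[0]:
                        best = (eps, j, thr, s, pred)
        eps, j, thr, s, pred = best
        alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-10))
        w = w * np.exp(-alpha * y * pred)    # re-weight: errors go up
        w /= w.sum()
        stumps.append((j, thr, s))
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    """Strong classifier: sign of the alpha-weighted stump votes."""
    score = np.zeros(len(X))
    for (j, thr, s), a in zip(stumps, alphas):
        score += a * np.where(s * (X[:, j] - thr) >= 0, 1, -1)
    return np.where(score >= 0, 1, -1)

# Toy usage on made-up width/brightness data:
X = np.array([[8, 2], [7, 9], [6, 3], [5, 8]], dtype=float)
y = np.array([1, -1, 1, -1])
model = adaboost_train(X, y, T=5)
print(adaboost_predict(X, *model))           # recovers y on this toy set
```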

  22. Adaboost • Summary • A strong classifier is designed as the weighted combination of weak classifiers, each with accuracy above 50% • Advantages • Very simple to implement • Fairly good generalization • Disadvantages • Suboptimal solution • Sensitive to noisy data and outliers

  23. Q & A
