
Robust supervised image classifiers by spatial AdaBoost based on robust loss functions



  1. Robust supervised image classifiers by spatial AdaBoost based on robust loss functions Ryuei Nishii and Shinto Eguchi Proc. of SPIE Vol. 5982, 59820D

  2. Terminology • Supervised image classification: prior knowledge of the image area (a labeled training region) is required. • Loss function: a function that maps an event onto a real number representing the cost or regret associated with that event. • Contextual: related to the neighborhood of a pixel; spatial dependence of this kind is often modeled by Markov random fields (MRF).

  3. Real AdaBoost with multiclass • AdaBoost combines weak classifiers into a weighted voting machine. • There are $g$ possible categories; the category label set is $G = \{1, 2, \dots, g\}$. • Let $D = \{(x_i, y_i) : i = 1, \dots, n\}$ be a training region with $n$ pixels. • Let $x_i \in \mathbb{R}^m$ be the $m$-dimensional feature vector of pixel $i$; $y_i \in G$ is its true label.

  4. Loss function • $F(x, k)$: classification function of the feature vector $x \in \mathbb{R}^m$ and a label $k$ in the label set $G$. • Exponential loss function: $L_{\exp}(F \mid x, y) = \sum_{k \in G,\, k \neq y} \exp\{F(x, k) - F(x, y)\}$.
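A minimal numeric sketch of the loss above, assuming the scores live in a NumPy vector; the paper's exact multiclass form may differ in normalization, so treat `exp_loss` and its arguments as illustrative names rather than the authors' implementation.

```python
import numpy as np

def exp_loss(scores: np.ndarray, y: int) -> float:
    """Multiclass exponential loss for one pixel.

    scores[k] plays the role of F(x, k); y is the index of the true label.
    Implements: sum over k != y of exp(F(x, k) - F(x, y)).
    """
    diff = scores - scores[y]                  # F(x, k) - F(x, y) for every k
    return float(np.sum(np.exp(diff)) - 1.0)   # drop the k == y term (exp(0) = 1)
```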

  5. Empirical risks • Empirical risk: the average of the loss function evaluated over the training data set, $R(F) = \frac{1}{n} \sum_{i=1}^{n} L(F \mid x_i, y_i)$. • Exponential risk: $R_{\exp}(F) = \frac{1}{n} \sum_{i=1}^{n} L_{\exp}(F \mid x_i, y_i)$. • AdaBoost aims to minimize the exponential risk $R_{\exp}(F)$.
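The empirical risk is then just the mean of that loss over the training pixels. This sketch reuses the hypothetical `exp_loss` above and assumes a `score_fn` that returns the score vector $(F(x, 1), \dots, F(x, g))$ for one pixel.

```python
def empirical_exp_risk(score_fn, X: np.ndarray, y: np.ndarray) -> float:
    """Exponential risk: average exponential loss over the n training pixels.

    score_fn(x) returns the vector (F(x, 1), ..., F(x, g)) for one pixel x.
    """
    losses = [exp_loss(score_fn(x), int(label)) for x, label in zip(X, y)]
    return float(np.mean(losses))
```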

  6. Real AdaBoost procedure • Find the weak classifier and coefficient that minimize the empirical risk, say $f_1$ and $\beta_1$. • Consider $F = \beta_1 f_1 + \beta f$. Then find the optimal classifier and coefficient that minimize the empirical risk, say $f_2$ and $\beta_2$. • This is repeated $T$ times. Final classifier: $F(x) = \sum_{t=1}^{T} \beta_t f_t(x)$ (see the sketch below).
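A rough sketch of this stagewise loop, not the paper's exact algorithm: the coefficient is found by a crude grid search over $\beta$, and the candidate pool, the class count `g`, and the reuse of `empirical_exp_risk` from the previous sketch are all assumptions for illustration.

```python
def real_adaboost(candidates, X, y, T: int, g: int):
    """Greedy stagewise fitting: T rounds, each adding one (beta_t, f_t) pair."""
    chosen = []                               # [(beta_1, f_1), (beta_2, f_2), ...]
    betas = np.linspace(0.05, 2.0, 40)        # crude 1-D search grid for beta

    def F(x):                                 # current combined score vector
        return sum((b * f(x) for b, f in chosen), start=np.zeros(g))

    for _ in range(T):
        best = None                           # (risk, beta, weak classifier)
        for f in candidates:
            for b in betas:
                risk = empirical_exp_risk(lambda x, f=f, b=b: F(x) + b * f(x), X, y)
                if best is None or risk < best[0]:
                    best = (risk, b, f)
        chosen.append((best[1], best[2]))     # commit this round's best pair
    return F                                  # classify x by the argmax of F(x)
```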

  7. Neighborhoods and contextual classifiers • We add contextual classifiers to the set of noncontextual classifiers. • Define a subset $U_r(i)$ of the pixel lattice around pixel $i$. • First-order neighborhood: the 4 pixels sharing an edge with pixel $i$. • Second-order neighborhood: the 8 pixels surrounding pixel $i$.

  8. Neighborhoods and contextual classifiers (figure: first- and second-order neighborhoods on the pixel lattice)

  9. Contextual classifiers • Contextual classifier: the average of the posterior probabilities over the subset $U_r(i)$, $f_r(x_i, k) = \frac{1}{|U_r(i)|} \sum_{j \in U_r(i)} p(k \mid x_j)$. • Taking $r = 0$ (the pixel itself) gives the usual noncontextual classification. • The fitted coefficients weight the importance of the posteriors at each neighborhood order (see the sketch below).
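A sketch of the contextual weak classifier, assuming the posteriors have already been computed into an `(H, W, g)` NumPy array; the neighborhood definitions follow slide 7, and boundary pixels simply average over the neighbors that exist.

```python
def contextual_classifier(post: np.ndarray, i: tuple, order: int) -> np.ndarray:
    """Average posterior vector over the neighborhood U_r(i).

    post[r, c, k] holds p(label k | x at pixel (r, c)); shape (H, W, g).
    order 0: the pixel itself (noncontextual classification);
    order 1: the 4 edge-sharing pixels; order 2: the 8 surrounding pixels.
    """
    r, c = i
    if order == 0:
        offsets = [(0, 0)]
    elif order == 1:
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:
        offsets = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0)]
    H, W, _ = post.shape
    neighbours = [post[r + dr, c + dc] for dr, dc in offsets
                  if 0 <= r + dr < H and 0 <= c + dc < W]
    return np.mean(neighbours, axis=0)        # averaged posteriors over U_r(i)
```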

  10. Improvement • The classifier gives poor results for some data because the exponential loss puts too heavy a penalty on misclassified data. • The logit loss function gives an approximately linear penalty for misclassified data.

  11. Example: two-category case (1) • For the two-category case ($g = 2$), take the label set $\{1, -1\}$. • True label $y \in \{1, -1\}$. • If $F(x) > 0$, then $x$ is classified into the label 1, otherwise into $-1$. • If $y F(x) < 0$, the vector $x$ is misclassified.

  12. Example: two-category case (2) • Loss functions of the margin $y F(x)$: exponential loss $L_{\exp} = \exp\{-y F(x)\}$ and logit loss $L_{\mathrm{logit}} = \log(1 + \exp\{-y F(x)\})$; the two are compared numerically below.
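A small numeric comparison of the two penalties as functions of the margin $y F(x)$; negative margins mean misclassification. The function names are ours, but the formulas are the standard two-class losses.

```python
import numpy as np

def exp_loss_2class(margin: np.ndarray) -> np.ndarray:
    """Exponential loss exp(-y F(x)) as a function of the margin y F(x)."""
    return np.exp(-margin)

def logit_loss_2class(margin: np.ndarray) -> np.ndarray:
    """Logit loss log(1 + exp(-y F(x))): roughly linear for large negative margins."""
    return np.log1p(np.exp(-margin))

margins = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])  # y F(x) < 0 means misclassified
print(exp_loss_2class(margins))    # [148.41, 2.72, 1.00, 0.37, 0.0067] - explodes
print(logit_loss_2class(margins))  # [  5.01, 1.31, 0.69, 0.31, 0.0067] - ~linear
```

At a margin of $-5$ the exponential penalty is roughly 30 times the logit penalty, which is exactly the robustness argument of slide 10: a few badly misclassified pixels dominate the exponential risk but not the logit risk.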

  13. Conclusion • Replace the exponential loss function with a more robust loss function, e.g., the logit loss function, so that misclassified training pixels do not dominate the fit.
