
Support Vector Machines



Presentation Transcript


  1. Support Vector Machines Presented By Sherwin Shaidaee

  2. Papers • Vladimir N. Vapnik, “The Statistical Learning Theory”, Springer, 1998 • Yunqiang Chen, Xiang Zhou, and Thomas S. Huang (University of Illinois), “One-Class SVM for Learning in Image Retrieval”, 2001 • A. Ganapathiraju, J. Hamaker, and J. Picone, “Applications of Support Vector Machines to Speech Recognition”, 2003

  3. Introduction to Support Vector Machines • SVMs are based on statistical learning theory; the aim is to solve the problem of interest directly, without solving a more difficult problem as an intermediate step • SVMs follow the structural risk minimisation principle, which incorporates capacity control to prevent over-fitting

  4. Introduction to Support Vector Machines

  5. The Separable Case • Two-class classification with labels yi ∈ {+1, -1} (P: positive, N: negative) • The support vector algorithm looks for the separating hyperplane with the largest margin.
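The equations on this slide were images and did not survive transcription; the standard hard-margin formulation of the largest-margin problem is:

```latex
% Maximising the margin 2/||w|| is equivalent to:
\min_{\mathbf{w},\,b}\; \tfrac{1}{2}\,\|\mathbf{w}\|^{2}
\quad\text{subject to}\quad
y_{i}\,(\mathbf{w}\cdot\mathbf{x}_{i} + b) \ge 1,
\qquad i = 1,\dots,l .
```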

  6. Convex Quadratic Problem • Lagrangian for this problem, where the αi ≥ 0 are the Lagrange multipliers.
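The Lagrangian itself was an image on the slide; its standard form for the hard-margin primal is:

```latex
L_{P} = \tfrac{1}{2}\,\|\mathbf{w}\|^{2}
      - \sum_{i=1}^{l} \alpha_{i}\,\bigl[\,y_{i}(\mathbf{w}\cdot\mathbf{x}_{i} + b) - 1\,\bigr],
\qquad \alpha_{i} \ge 0 .
```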

  7. Convex Quadratic Problem Differentiation with respect to w & b:
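The differentiation the slide refers to gives the standard stationarity conditions (the slide's formulas were images):

```latex
\frac{\partial L_{P}}{\partial \mathbf{w}} = 0
\;\Rightarrow\;
\mathbf{w} = \sum_{i=1}^{l} \alpha_{i}\, y_{i}\, \mathbf{x}_{i},
\qquad
\frac{\partial L_{P}}{\partial b} = 0
\;\Rightarrow\;
\sum_{i=1}^{l} \alpha_{i}\, y_{i} = 0 .
```

Substituting these back into L_P eliminates w and b and yields the dual problem in the αi alone.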

  8. Support vectors • Lie closest to the separating hyperplane. • Optimal Weights: • Optimal Bias:
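The optimal weights and bias on this slide were also images; a common reconstruction, in the notation above, is:

```latex
\mathbf{w}^{*} = \sum_{i \in SV} \alpha_{i}^{*}\, y_{i}\, \mathbf{x}_{i},
\qquad
b^{*} = y_{s} - \mathbf{w}^{*}\cdot\mathbf{x}_{s}
\quad \text{for any support vector } \mathbf{x}_{s},
```

where SV is the set of support vectors, i.e. the points with αi* > 0, which are exactly those lying closest to the hyperplane.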

  9. Types of Support Vectors • (a) two-class linear, (b) one-class, (c) non-linear • Decision function
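For the two-class linear case, the decision function (missing from the transcript) is, in standard form:

```latex
f(\mathbf{x})
= \operatorname{sgn}\!\bigl(\mathbf{w}^{*}\cdot\mathbf{x} + b^{*}\bigr)
= \operatorname{sgn}\!\Bigl(\sum_{i \in SV} \alpha_{i}^{*}\, y_{i}\,(\mathbf{x}_{i}\cdot\mathbf{x}) + b^{*}\Bigr).
```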

  10. Kernel feature Spaces • Feature space • Decision Function
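The feature-space map and kernelised decision function this slide shows are, in the standard notation:

```latex
\Phi : \mathbb{R}^{n} \to F,
\qquad
K(\mathbf{x}_{i}, \mathbf{x}_{j}) = \Phi(\mathbf{x}_{i})\cdot\Phi(\mathbf{x}_{j}),
\qquad
f(\mathbf{x}) = \operatorname{sgn}\!\Bigl(\sum_{i \in SV} \alpha_{i}^{*}\, y_{i}\, K(\mathbf{x}_{i}, \mathbf{x}) + b^{*}\Bigr).
```

The kernel lets the algorithm work with dot products in F without ever computing Φ explicitly.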

  11. Kernel Function
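The specific kernels on this slide were images; a minimal sketch of the commonly used choices (function names and default parameters here are illustrative, not from the slides):

```python
import numpy as np

def linear_kernel(x, z):
    """K(x, z) = x . z"""
    return np.dot(x, z)

def polynomial_kernel(x, z, degree=3, c=1.0):
    """K(x, z) = (x . z + c)^degree"""
    return (np.dot(x, z) + c) ** degree

def rbf_kernel(x, z, gamma=0.5):
    """K(x, z) = exp(-gamma * ||x - z||^2)"""
    diff = np.asarray(x) - np.asarray(z)
    return np.exp(-gamma * np.dot(diff, diff))

x = np.array([1.0, 2.0])
z = np.array([2.0, 0.0])
print(linear_kernel(x, z))   # 2.0
print(rbf_kernel(x, x))      # 1.0 -- a point is maximally similar to itself
```

Any of these can be dropped into the kernelised decision function in place of the plain dot product.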

  12. Non-Separable Data
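For non-separable data, the standard soft-margin formulation (the slide's equations were images) introduces slack variables ξi and a penalty C:

```latex
\min_{\mathbf{w},\,b,\,\boldsymbol{\xi}}\;
\tfrac{1}{2}\,\|\mathbf{w}\|^{2} + C \sum_{i=1}^{l} \xi_{i}
\quad\text{subject to}\quad
y_{i}(\mathbf{w}\cdot\mathbf{x}_{i} + b) \ge 1 - \xi_{i},
\qquad \xi_{i} \ge 0 .
```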

  13. Image Retrieval: An Application of SVMs • Relevance feedback • Problem: a small number of training samples combined with the high dimensionality of the feature space

  14. Image Retrieval • One-Class SVM • Estimate the distribution of the target images in the feature space without over-fitting to the user feedback.

  15. One-Class SVM – Decision Function (LOC-SVM)
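The LOC-SVM formulas were images on this slide; assuming the standard one-class SVM formulation (Schölkopf et al.), which separates the target data from the origin with margin ρ, the problem and decision function are:

```latex
\min_{\mathbf{w},\,\rho,\,\boldsymbol{\xi}}\;
\tfrac{1}{2}\,\|\mathbf{w}\|^{2} + \frac{1}{\nu l}\sum_{i=1}^{l}\xi_{i} - \rho
\quad\text{s.t.}\quad
\mathbf{w}\cdot\mathbf{x}_{i} \ge \rho - \xi_{i},\;\; \xi_{i} \ge 0,
\qquad
f(\mathbf{x}) = \operatorname{sgn}\bigl(\mathbf{w}\cdot\mathbf{x} - \rho\bigr),
```

where ν controls the fraction of training points allowed outside the estimated support.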

  16. Non-linear Case Using Kernel (KOC-SVM)
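A sketch of the kernelised one-class decision function f(x) = sgn(Σi αi K(xi, x) − ρ); the support vectors, αi, and ρ below are illustrative placeholders, not values learned by the paper's method:

```python
import numpy as np

def rbf(a, b, gamma=0.5):
    """RBF kernel K(a, b) = exp(-gamma * ||a - b||^2)."""
    d = np.asarray(a) - np.asarray(b)
    return np.exp(-gamma * np.dot(d, d))

def koc_decision(x, support_vectors, alphas, rho, gamma=0.5):
    """Kernel one-class decision: +1 inside the estimated support, -1 outside."""
    s = sum(a * rbf(sv, x, gamma) for sv, a in zip(support_vectors, alphas))
    return 1 if s - rho >= 0 else -1

# Toy "target class" support: two points near the origin.
svs = [np.array([0.0, 0.0]), np.array([0.2, 0.1])]
alphas = [0.5, 0.5]
rho = 0.3

print(koc_decision(np.array([0.1, 0.0]), svs, alphas, rho))  # 1  (near the data)
print(koc_decision(np.array([5.0, 5.0]), svs, alphas, rho))  # -1 (far from the data)
```

The point of the kernel version is exactly this: the region labelled +1 is no longer a half-space but can wrap tightly around the target images.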

  17. Experiment • Five classes: airplanes, cars, horses, eagles, glasses • 100 images per class • 10 images are randomly drawn as training samples • Hit rates in the first 20 and 100 retrieved images are used as the performance measure.
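The hit-rate measure can be sketched as counting relevant images among the top-k results (the paper may normalise by k; the function name and toy data here are illustrative):

```python
def hit_rate(ranked_relevance, k):
    """ranked_relevance: 0/1 flags in retrieval order; returns hits in the top k."""
    return sum(ranked_relevance[:k])

# Toy ranking: 1 = image from the query class, 0 = any other class.
ranking = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
print(hit_rate(ranking, 5))  # 3 hits in the top 5
```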

  18. Results
