Support Vector Machines (part 1)

Presentation Transcript


  1. Support Vector Machines (part 1) Face Recognition & Biometric Systems

  2. Plan of the lecture • Problem of classification • SVM for solving linear problems • training • classification • Application of convolution kernels

  3. Bibliography • Corinna Cortes, Vladimir Vapnik, Support-Vector Networks, Machine Learning, 1995

  4. Classification problem • Aim: assignment of an element to one of the defined classes • Two stages: • training • classification of samples • Available solutions: • Artificial Neural Networks • Support Vector Machines • other classifiers

  5. Classification problem • Training set - requirements: • classified • representative • Training process: • aims at finding general rules • a risk of overfitting to the training set (especially when it is not representative)

  6. Classification problem • Classification of samples: • must be preceded by the training stage • applies rules derived from the training • Number of classes: • SVM solves two-class problems • multi-class problems can be solved by combining two-class classifiers (see the sketch below)
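
A multi-class problem can be reduced to several two-class problems, for example in a one-vs-rest scheme. The sketch below illustrates this idea with scikit-learn's SVC on a made-up three-class data set; the library, the data and the decision rule (pick the classifier with the highest score) are illustrative assumptions, not part of the lecture.

```python
import numpy as np
from sklearn.svm import SVC

# One-vs-rest sketch: one binary SVM per class; the class whose
# classifier returns the highest score wins. Toy data set (assumption).
X = np.array([[0.0, 0.0], [0.2, 0.1],
              [5.0, 5.0], [5.1, 4.9],
              [0.0, 5.0], [0.2, 5.1]])
y = np.array([0, 0, 1, 1, 2, 2])

classifiers = []
for c in np.unique(y):
    clf = SVC(kernel="linear")
    clf.fit(X, (y == c).astype(int))            # class c vs. the rest
    classifiers.append(clf)

scores = np.column_stack([clf.decision_function(X) for clf in classifiers])
print(scores.argmax(axis=1))                    # predicted class for each sample
```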

  7. Classification problem • Linearly separable (figure)

  8. Classification problem • Non-linearly separable (figure)

  9. Classification problem • Training with error – soft margin (figure)

  10. Classification problem • Margin maximisation (figure)

  11. Linear separability • Data set: (y1, x1), ..., (yl, xl), yi ∈ {-1, 1} • Vector w, scalar value b: w·xi + b ≥ 1 for yi = 1, w·xi + b ≤ -1 for yi = -1, hence yi(w·xi + b) ≥ 1 • The condition must be fulfilled for the whole data set
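
As a quick illustration of the condition yi(w·xi + b) ≥ 1, the sketch below checks it for a toy 2D data set; the vector w and the scalar b are hand-picked assumptions, not the result of training.

```python
import numpy as np

# Checking the separability condition y_i * (w . x_i + b) >= 1
# for a hand-picked hyperplane on toy data (w, b are assumptions).
X = np.array([[2.0, 2.0], [3.0, 3.0],        # class +1
              [-1.0, -1.0], [-2.0, -2.0]])   # class -1
y = np.array([1, 1, -1, -1])

w = np.array([1.0, 1.0])   # hypothetical weight vector
b = -1.0                   # hypothetical bias

margins = y * (X @ w + b)
print(margins)                        # every value should be >= 1
print(bool(np.all(margins >= 1)))     # True: the condition holds for the whole set
```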

  12. SVM – training • SVM solves linearly separable two-class problems • other cases are transformed to this basic problem • Optimal hyperplane: • margin between the samples of the two classes • margin maximisation

  13. SVM – training • Optimal hyperplane: w0·x + b0 = 0 • 2D example – the hyperplane is a line • Margin width (independent of b): 2 / ||w0|| = 2 / √(w0·w0)
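
A small numeric check of the margin width 2 / ||w0||, using the same toy hyperplane as in the previous sketch (w0 and b0 are illustrative values, not trained ones):

```python
import numpy as np

# Margin width 2 / ||w0|| for an illustrative hyperplane.
w0 = np.array([1.0, 1.0])
b0 = -1.0

width = 2.0 / np.linalg.norm(w0)
print(width)   # 2 / sqrt(2) ~ 1.414

# The distance between the margin lines w0.x + b0 = +1 and w0.x + b0 = -1,
# measured along the unit normal w0 / ||w0||, gives the same value.
x_plus = np.array([1.0, 1.0])    # lies on w0.x + b0 = +1
x_minus = np.array([0.0, 0.0])   # lies on w0.x + b0 = -1
print(np.dot(x_plus - x_minus, w0 / np.linalg.norm(w0)))
```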

  14. SVM – training • Optimal width: maximisation of the margin 2 / ||w0|| is equivalent to minimisation of w0·w0 • Constraint: yi(w·xi + b) ≥ 1

  15. SVM – training • Margin: 2 / ||w0|| • Optimal hyperplane: w0 = Σi αi yi xi • yi – class identifier • αi – Lagrange multipliers • A problem: how to find αi?

  16. SVM – training • Function to maximise: W(α) = α·1 - ½ α·D·α • 1 – vector of ones (l-dimensional) • D – l × l matrix: Dij = yi yj (xi·xj)
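
The sketch below builds the D matrix and evaluates the dual objective W(α) = α·1 - ½ α·D·α for a toy data set; the data and the candidate α are assumptions chosen only to show the quantities involved.

```python
import numpy as np

# Dual objective W(alpha) = alpha.1 - 1/2 * alpha.D.alpha
# with D_ij = y_i * y_j * (x_i . x_j); toy data (assumption).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

D = np.outer(y, y) * (X @ X.T)        # the l x l matrix from the slide

def W(alpha):
    return alpha.sum() - 0.5 * alpha @ D @ alpha

alpha = np.array([0.1, 0.0, 0.1, 0.0])   # a feasible candidate: alpha >= 0, sum(alpha * y) = 0
print(W(alpha))
```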

  17. SVM – training • Optimisation constraints: αi ≥ 0, Σi αi yi = 0 • Optimisation based on the gradient method
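
As an illustration of maximising W(α) under the constraints αi ≥ 0 and Σi αi yi = 0, the sketch below hands the problem to a generic gradient-based optimiser (SciPy's SLSQP). This is a stand-in for the gradient method mentioned on the slide, not the lecture's actual procedure; the data are the same toy assumptions as above.

```python
import numpy as np
from scipy.optimize import minimize

# Maximise W(alpha) subject to alpha_i >= 0 and sum(alpha_i * y_i) = 0
# by minimising -W(alpha) with a constrained gradient-based solver.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
D = np.outer(y, y) * (X @ X.T)

def objective(a):
    return -(a.sum() - 0.5 * a @ D @ a)                  # -W(alpha)

constraints = {"type": "eq", "fun": lambda a: a @ y}     # sum(alpha_i * y_i) = 0
bounds = [(0.0, None)] * len(y)                          # alpha_i >= 0

res = minimize(objective, x0=np.zeros(len(y)), bounds=bounds,
               constraints=constraints, method="SLSQP")
print(res.x)    # non-zero entries only for the support vectors
```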

  18. SVM – training • Lagrange multipliers αi: • non-zero values for support vectors • equal to zero for the other vectors (the majority) • Training set after the training: • support vectors (a small subset of the training set) • α coefficients for every support vector
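
The same effect can be observed with an off-the-shelf implementation: after fitting scikit-learn's SVC (used here purely as an illustration, on the toy data from above), only the support vectors and their coefficients are retained.

```python
import numpy as np
from sklearn.svm import SVC

# After training, only a small subset of the training set survives
# as support vectors, each with its coefficient.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, -2.0]])
y = np.array([1, 1, -1, -1])

clf = SVC(kernel="linear", C=1e6)   # very large C approximates a hard margin
clf.fit(X, y)

print(clf.support_vectors_)   # the support vectors themselves
print(clf.dual_coef_)         # alpha_i * y_i for each support vector
print(clf.intercept_)         # the bias b0
```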

  19. SVM – classification • Calculate y for the vector x which is to be classified: y(x) = w0·x + b0 = Σi αi yi (xi·x) + b0, where b0 = -½ (w0·xr + w0·xs) • xr, xs – support vectors from both classes • Classification decision: the sign of y(x)
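
A minimal sketch of the classification step for the toy problem used above: w0 is assembled from the support vectors, b0 is placed halfway between one support vector of each class (xr, xs), and the decision is the sign of w0·x + b0. The α values are the ones found for this toy data (α = 1/9 for each support vector); treat the whole example as illustrative.

```python
import numpy as np

# Classification: y(x) = sign(w0 . x + b0), with w0 = sum_i alpha_i y_i x_i
# over the support vectors and b0 halfway between x_r and x_s.
X_sv  = np.array([[2.0, 2.0], [-1.0, -1.0]])   # support vectors (toy data)
y_sv  = np.array([1.0, -1.0])
alpha = np.array([1.0 / 9.0, 1.0 / 9.0])       # multipliers for this toy problem

w0 = (alpha * y_sv) @ X_sv
x_r, x_s = X_sv[0], X_sv[1]                    # one support vector per class
b0 = -0.5 * (w0 @ x_r + w0 @ x_s)

def classify(x):
    return int(np.sign(w0 @ x + b0))

print(classify(np.array([4.0, 1.0])))    # expected +1
print(classify(np.array([-3.0, 0.0])))   # expected -1
```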

  20. SVM – limitations • SVM conditions: • solves two-class problems • requires linear separability of the data • The XOR problem (figure): not linearly separable

  21. SVM – limitations • Possibilities of enhancement: • SVM for non-linear data – calculations too complicated • transformation of the data so that they become linearly separable • Mapping into a higher dimension • example: XOR in 2D mapped into 3D
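
The XOR example can be made concrete with a simple 2D → 3D mapping; the particular mapping (x1, x2) → (x1, x2, x1·x2) is an assumption chosen for illustration, but it shows how the mapped data become linearly separable.

```python
import numpy as np

# XOR data are not linearly separable in 2D, but adding the third
# coordinate x1 * x2 makes them separable by the plane z = 0.
X = np.array([[1.0, 1.0], [-1.0, -1.0],    # class +1
              [1.0, -1.0], [-1.0, 1.0]])   # class -1
y = np.array([1, 1, -1, -1])

X3 = np.column_stack([X, X[:, 0] * X[:, 1]])   # map into 3D
print(X3)

# The third coordinate is +1 for class +1 and -1 for class -1,
# so w = (0, 0, 1), b = 0 separates the mapped data linearly.
print(y * X3[:, 2])    # all values are +1, i.e. y_i * (w . phi(x_i) + b) >= 1
```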

  22. Convolution kernels • Function Φ – mapping into a higher dimension: x → Φ(x) • Calculations use only scalar products of vectors, not the vectors themselves • Convolution kernels K(xi, xj) = Φ(xi)·Φ(xj) may be used instead of the scalar products • No need to find the Φ function explicitly
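
A numeric illustration of why the Φ function is never needed explicitly: for the polynomial kernel K(x, z) = (x·z)² in 2D (an assumed example), an explicit mapping Φ exists whose scalar product gives exactly the same value as the kernel evaluated directly in the input space.

```python
import numpy as np

# Kernel trick demonstration: K(x, z) = (x . z)^2 equals the scalar
# product of phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2) in 3D.
def phi(x):
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

def K(x, z):
    return (x @ z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])

print(K(x, z))            # kernel evaluated directly in 2D
print(phi(x) @ phi(z))    # scalar product in the feature space: same value
```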

  23. Convolution kernels • Training with convolution kernels: the scalar product xi·xj in the D matrix is replaced by K(xi, xj)

  24. Convolution kernels • Classification with convolution kernels: y(x) = Σi αi yi K(xi, x) + b0 • xr, xs – support vectors from both classes (used to compute b0 in the same way)
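
A sketch of the kernelised decision function: the scalar products are simply replaced by kernel evaluations. The RBF kernel, the toy support vectors and the α and b0 values below are assumptions for illustration only.

```python
import numpy as np

# Kernelised decision: y(x) = sign(sum_i alpha_i y_i K(x_i, x) + b0).
def rbf(x, z, sigma=1.0):
    return np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2))

X_sv  = np.array([[1.0, 1.0], [-1.0, -1.0]])   # toy support vectors
y_sv  = np.array([1.0, -1.0])
alpha = np.array([1.0, 1.0])                   # illustrative multipliers
b0 = 0.0                                       # illustrative bias

def classify(x):
    s = sum(a * yi * rbf(xi, x) for a, yi, xi in zip(alpha, y_sv, X_sv)) + b0
    return int(np.sign(s))

print(classify(np.array([0.8, 1.2])))     # expected +1
print(classify(np.array([-1.1, -0.7])))   # expected -1
```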

  25. Convolution kernels • Linear • Polynomial • RBF (radial basis functions)
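
The slide lists the kernels without their formulas; the sketch below implements the common textbook forms (the exact parameterisation used in the lecture is not shown, so these particular forms are assumptions).

```python
import numpy as np

# Common textbook forms of the three kernels listed on the slide.
def linear_kernel(x, z):
    return x @ z                                  # K(x, z) = x . z

def polynomial_kernel(x, z, d=3, c=1.0):
    return (x @ z + c) ** d                       # K(x, z) = (x . z + c)^d

def rbf_kernel(x, z, sigma=1.0):
    # K(x, z) = exp(-||x - z||^2 / (2 * sigma^2))
    return np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2))

x, z = np.array([1.0, 2.0]), np.array([0.5, -1.0])
print(linear_kernel(x, z), polynomial_kernel(x, z), rbf_kernel(x, z))
```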

  26. Summary • Classifiers • Basic problem: • two-class, linearly separable data set • solved by the SVM • Enhancement: • convolution kernels – SVM for non-linearly separable data

  27. Thank you for your attention! • Next week: Support Vector Machines – continued... • multi-class cases • soft-margin training • applications to face recognition
