Chapter 10 The Support Vector Method For Estimating Indicator Functions



  1. Chapter 10 The Support Vector Method For Estimating Indicator Functions jpzhang@fudan.edu.cn Intelligent Information Processing Laboratory, Fudan University

  2. Optimal hyperplane • The optimal hyperplane has remarkable statistical properties. • It is used to construct a new class of learning machines: support vector machines.

  3. The optimal hyperplane • The optimal hyperplane for nonseparable sets • Statistical properties of the optimal hyperplane • Proof of the theorems • The Idea of the Support Vector Machine

  4. One More Approach to the Support Vector Method • Selection of SV Machine Using Bounds • Examples of SV Machines For Pattern Recognition • SV Method for transductive inference • Multiclass classification • Remarks on generalization of the SV method

  5. Properties • The objective function does not depend explicitly on the dimensionality of the vector x. • It depends only on inner products of pairs of vectors. • This allows us to construct separating hyperplanes in a high-dimensional space (the dual form below makes this explicit).
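
To make the inner-product dependence concrete, here is the standard dual form of the optimal-hyperplane problem (the slide shows no formulas, so the notation is supplied here: $\ell$ training pairs $(x_i, y_i)$ with $y_i \in \{-1, +1\}$ and Lagrange multipliers $\alpha_i$):

$$\max_{\alpha}\ W(\alpha) = \sum_{i=1}^{\ell} \alpha_i - \frac{1}{2} \sum_{i=1}^{\ell} \sum_{j=1}^{\ell} \alpha_i \alpha_j y_i y_j \, (x_i \cdot x_j)$$

$$\text{subject to}\quad \sum_{i=1}^{\ell} \alpha_i y_i = 0, \qquad \alpha_i \ge 0, \quad i = 1, \dots, \ell.$$

The training vectors enter only through the inner products $(x_i \cdot x_j)$, never through their coordinates individually.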

  6. The optimal hyperplane for nonseparable sets

  7. The Basic Solution: Soft Margin Generalization
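
The transcript omits the slide body; a standard statement of the soft-margin generalization (slack variables $\xi_i$ and penalty parameter $C$ as conventionally defined) is:

$$\min_{w,\, b,\, \xi}\ \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{\ell} \xi_i \quad \text{subject to} \quad y_i (w \cdot x_i + b) \ge 1 - \xi_i, \quad \xi_i \ge 0.$$

Points with $0 < \xi_i \le 1$ lie inside the margin but on the correct side; points with $\xi_i > 1$ are misclassified, so $\sum_i \xi_i$ upper-bounds the number of training errors.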

  8. Statistical properties of the optimal hyperplane
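
The slide body is missing from the transcript. For orientation, the central result of this section of Vapnik's chapter is a bound of roughly the following form (quoted here from the standard presentation; treat the book's exact statement as authoritative):

$$E[P(\text{error})] \le \frac{E\left[\min\left(\#\mathrm{SV},\ (D/\rho)^2,\ n\right)\right]}{\ell + 1},$$

where $\#\mathrm{SV}$ is the number of support vectors, $D$ the diameter of the smallest sphere containing the training points, $\rho$ the margin, $n$ the dimensionality, and $\ell + 1$ the training-set size.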

  9. Proof of the theorems • (Omitted.)

  10. The Idea of the Support Vector Machine • A support vector machine: • Maps the input vectors x into a high-dimensional feature space Z through a nonlinear mapping chosen a priori. • In this space, an optimal separating hyperplane is constructed; the resulting decision rule is given below.
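
Writing the a priori mapping as $z = \Phi(x)$, the decision rule takes the standard form (notation supplied here, since the slide gives none):

$$f(x) = \operatorname{sign}\left( \sum_{i} \alpha_i y_i \, (\Phi(x_i) \cdot \Phi(x)) + b \right) = \operatorname{sign}\left( \sum_{i} \alpha_i y_i \, K(x_i, x) + b \right),$$

where $K(x, x') = \Phi(x) \cdot \Phi(x')$ is the kernel that evaluates inner products in $Z$ directly from the input vectors.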

  11. Problem • Conceptual problem: how to find a separating hyperplane that generalizes well when the dimensionality of the feature space is huge. • Technical problem: how to treat such high-dimensional spaces computationally, given the curse of dimensionality.

  12. Generalization in high-dimensional space • Conceptual: the optimal hyperplane generalizes well even if the feature space has a high dimensionality. • Technical: one does not need to consider the feature space in explicit form; it suffices to calculate the inner products between support vectors and the vectors of the feature space (see the kernel sketch below).
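
A minimal Python sketch of the technical point (the feature map phi below, for 2-D inputs and the homogeneous quadratic kernel, is an illustrative choice, not something from the slides): the kernel computed in input space equals the inner product in the induced feature space, so that space is never constructed explicitly.

```python
import numpy as np

def phi(x):
    """Explicit feature map for the homogeneous quadratic kernel on R^2:
    phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)."""
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

def k(x, y):
    """The same kernel evaluated directly in input space: (x . y)^2."""
    return np.dot(x, y) ** 2

rng = np.random.default_rng(0)
x, y = rng.normal(size=2), rng.normal(size=2)

# Both routes give the same number: the kernel delivers the
# feature-space inner product without materializing the feature space.
assert np.isclose(np.dot(phi(x), phi(y)), k(x, y))
print(np.dot(phi(x), phi(y)), k(x, y))
```

The same identity is what lets the dual problem above be solved with $K(x_i, x_j)$ in place of $(x_i \cdot x_j)$, at no extra cost, however large the feature space is.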

  13. Mercer Theorem
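
The slide body did not survive in the transcript; the standard statement the title refers to is: a continuous symmetric function $K(u, v)$ admits an expansion

$$K(u, v) = \sum_{k=1}^{\infty} a_k \, \psi_k(u) \, \psi_k(v), \qquad a_k > 0$$

(i.e., $K$ describes an inner product in some feature space) if and only if

$$\iint K(u, v) \, g(u) \, g(v) \, du \, dv \ge 0$$

for every function $g$ with $\int g^2(u) \, du < \infty$.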

  14. Constructing SV Machines
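
A minimal sketch of constructing an SV machine in practice, using scikit-learn's SVC (an illustration added here, not part of the original slides); the kernel and the penalty C are exactly the choices the chapter discusses.

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Toy nonseparable problem; the RBF kernel plays the role of the
# a priori chosen nonlinear mapping into the feature space Z.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X, y)

# The solution is carried entirely by the support vectors.
print("support vectors:", clf.n_support_.sum())
print("training accuracy:", clf.score(X, y))
```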

  15. One More Approach to the Support Vector Method • Minimizing the Number of Support Vectors • Generalization for the Nonseparable Case • Linear Optimization Method for SV Machines

  16. Minimizing the Number of Support Vectors • The optimal hyperplane has an expansion on the support vectors. • If a method of constructing the hyperplane has a unique solution, then the generalization ability of the constructed hyperplane depends on the number of support vectors, as the bound below makes precise.
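
The reasoning behind the last bullet, in its usual leave-one-out form (stated here for completeness; the sample-size bookkeeping is glossed over): removing any training point that is not a support vector leaves the solution unchanged, so the leave-one-out error cannot exceed the fraction of support vectors, giving

$$E[P(\text{error})] \le \frac{E[\#\mathrm{SV}]}{\ell}.$$

Hence a construction that needs fewer support vectors can be expected to generalize better.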

  17. Selection of SV Machine Using Bounds

  18. Examples of SV Machines For Pattern Recognition
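
The examples themselves are absent from the transcript; the kernels standardly listed at this point (polynomial, radial basis function, and two-layer sigmoid, with the conventional parameter names $d$, $\sigma$, $\kappa$, $\theta$) are:

$$K(x, x') = \big((x \cdot x') + 1\big)^d, \qquad K(x, x') = \exp\left(-\frac{\|x - x'\|^2}{2\sigma^2}\right), \qquad K(x, x') = \tanh\big(\kappa\,(x \cdot x') - \theta\big),$$

where the sigmoid kernel satisfies Mercer's condition only for some values of $\kappa$ and $\theta$.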

  19. SV Method for transductive inference
