
Kernels


Presentation Transcript


  1. Kernels CMPUT 466/551 Nilanjan Ray

  2. Agenda • Kernel functions in SVM: A quick recapitulation • Kernels in regression • Kernels in k-nearest neighbor classifier • Kernel function: a deeper understanding • A case study

  3. Kernel Functions: SVM The dual cost function: $L_D = \sum_{i=1}^{N} \alpha_i - \frac{1}{2} \sum_{i=1}^{N} \sum_{j=1}^{N} \alpha_i \alpha_j y_i y_j K(x_i, x_j)$. The non-linear classifier in dual variables: $f(x) = \operatorname{sign}\left( \sum_{i=1}^{N} \alpha_i y_i K(x, x_i) + b \right)$. The kernel function K is symmetric and positive (semi)definite by definition.
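As an illustrative sketch (not from the slides), the snippet below rebuilds this dual-form decision function from a fitted scikit-learn SVC, whose dual_coef_ attribute stores the products $\alpha_i y_i$ over the support vectors; the toy dataset and the RBF gamma are made-up choices.

```python
# Sketch: reassemble f(x) = sign( sum_i alpha_i * y_i * K(x_i, x) + b )
# from a fitted SVM; the sum runs only over the support vectors.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] ** 2 + X[:, 1] ** 2 - 1.0)  # non-linearly separable toy labels

gamma = 1.0
clf = SVC(kernel="rbf", gamma=gamma).fit(X, y)

def rbf(A, B, gamma):
    # K(a, b) = exp(-gamma * ||a - b||^2), computed for all pairs
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# sklearn stores alpha_i * y_i in clf.dual_coef_ and the support
# vectors in clf.support_vectors_; b is clf.intercept_.
K = rbf(clf.support_vectors_, X, gamma)
f = clf.dual_coef_ @ K + clf.intercept_
assert np.allclose(f.ravel(), clf.decision_function(X))
```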

  4. Input Space to Feature Space Picture taken from: Kernel Methods for Pattern Analysis by Shawe-Taylor and Cristianini

  5. Input Space to Feature Space: Example
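A standard worked example of this mapping (a stand-in, since the slide itself is a figure): in 2-D, the polynomial kernel $K(x, y) = (x^T y)^2$ corresponds to the explicit feature map $h(x) = (x_1^2, \sqrt{2}\,x_1 x_2, x_2^2)$, which the snippet below verifies numerically.

```python
# Verify K(x, y) = (x^T y)^2 equals the inner product h(x)^T h(y)
# for the explicit 2-D -> 3-D feature map h.
import numpy as np

def h(x):
    # explicit map from the 2-D input space to the 3-D feature space
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])

K_xy = (x @ y) ** 2          # kernel evaluated in the input space
hh = h(x) @ h(y)             # inner product in the feature space
assert np.isclose(K_xy, hh)  # identical, without forming h inside the kernel
```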

  6. Kernel Ridge Regression Consider the regression problem: fit the function $f(x) = \sum_{m=1}^{M} \beta_m h_m(x) = h(x)^T \beta$ to N data points $(x_i, y_i)$. The basis functions $h_m$ are non-linear in x. Form the cost function: $J(\beta) = (y - H\beta)^T (y - H\beta) + \lambda \beta^T \beta$, where H is the N-by-M matrix with rows $h(x_i)^T$. The solution is given by: $\beta = (H^T H + \lambda I)^{-1} H^T y$. Using the identity $(H^T H + \lambda I)^{-1} H^T = H^T (H H^T + \lambda I)^{-1}$ (Ex. Prove this identity) we have $\beta = H^T (K + \lambda I)^{-1} y$, where we have defined $K = H H^T$. Note that K is the kernel matrix: $K_{ij} = h(x_i)^T h(x_j) = K(x_i, x_j)$. Finally the solution is given by $f(x) = h(x)^T \beta = \sum_{i=1}^{N} \alpha_i K(x, x_i)$ with $\alpha = (K + \lambda I)^{-1} y$. The basis functions h have disappeared!
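A minimal sketch of this closed-form solution, assuming an RBF kernel and made-up 1-D data: compute $\alpha = (K + \lambda I)^{-1} y$ once, then predict using kernel evaluations alone.

```python
# Kernel ridge regression via alpha = (K + lambda*I)^{-1} y and
# f(x) = sum_i alpha_i K(x, x_i); kernel and data are illustrative.
import numpy as np

def rbf(A, B, gamma=10.0):
    # K(a, b) = exp(-gamma * (a - b)^2) for 1-D inputs, all pairs
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-gamma * d2)

rng = np.random.default_rng(1)
x_train = rng.uniform(0, 1, size=50)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.normal(size=50)

lam = 1e-2
K = rbf(x_train, x_train)
alpha = np.linalg.solve(K + lam * np.eye(50), y_train)  # (K + lambda I)^{-1} y

x_test = np.linspace(0, 1, 200)
f_test = rbf(x_test, x_train) @ alpha  # prediction needs only kernel values
```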

  7. Kernel k-Nearest Neighbor Classifier Consider the k-nn classification problem in the feature space, where the basis functions h are typically non-linear in x. The Euclidean distance in the feature space can be written as follows: $\|h(x) - h(y)\|^2 = h(x)^T h(x) - 2\,h(x)^T h(y) + h(y)^T h(y) = K(x, x) - 2K(x, y) + K(y, y)$. Once again, the basis functions h have disappeared! Note also that a kernel function essentially provides similarity between two points in the input space (the opposite of a distance measure!)
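A minimal sketch of k-NN under this identity, with a made-up polynomial kernel and toy data: every feature-space distance comes from kernel evaluations only.

```python
# k-NN with the kernelized squared distance
# d^2(x, y) = K(x, x) - 2 K(x, y) + K(y, y).
import numpy as np

def poly_kernel(A, B, degree=2):
    return (1.0 + A @ B.T) ** degree

def kernel_knn_predict(X_train, y_train, X_test, k=3, kernel=poly_kernel):
    Ktt = np.diag(kernel(X_train, X_train))   # K(x_i, x_i)
    Kqq = np.diag(kernel(X_test, X_test))     # K(x, x)
    Kqt = kernel(X_test, X_train)             # K(x, x_i)
    d2 = Kqq[:, None] - 2 * Kqt + Ktt[None, :]
    nearest = np.argsort(d2, axis=1)[:, :k]   # indices of k smallest distances
    votes = y_train[nearest]
    return np.sign(votes.sum(axis=1))         # majority vote for +/-1 labels

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 2))
y = np.sign(X[:, 0] * X[:, 1])                # XOR-like toy labels
pred = kernel_knn_predict(X, y, X, k=3)
print((pred == y).mean())                     # training-set accuracy
```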

  8. The Kernel Architecture Picture taken from: Learning with Kernels by Schölkopf and Smola

  9. Inside Kernels Picture taken from: Learning with Kernels

  10. Inside Kernels… Given a point x in the input space, $k(\cdot, x)$ is itself a function, so x is mapped into a function space known as a Reproducing Kernel Hilbert Space (RKHS). When we measure the similarity of two points x and y in the input space, we are actually measuring the similarity between the two functions $k(\cdot, x)$ and $k(\cdot, y)$ in the RKHS. How is this similarity defined in the RKHS? By a (defined) inner product in the RKHS: $\langle k(\cdot, x), k(\cdot, y) \rangle = k(x, y)$, and more generally $\langle f, k(\cdot, x) \rangle = f(x)$ (the reproducing property). All the solutions we have obtained so far have the form $f(x) = \sum_{i=1}^{N} \alpha_i K(x, x_i)$, which means these solutions are functions in the RKHS. Functions in the RKHS are nice: they are smooth and have a finite-dimensional representation, which is good for computations and practical solutions. See "Learning with Kernels" for more; read G. Wahba's work to learn more about RKHS vis-à-vis machine learning.
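A small numeric illustration of the reproducing property (made-up data, and a kernel chosen to have a finite-dimensional feature map so the RKHS inner product can be computed explicitly as a dot product of feature vectors):

```python
# For k(x, y) = (x^T y)^2 with explicit map h, k(., x) corresponds to h(x)
# and the RKHS inner product reduces to a feature-space dot product, so
# <f, k(., x)> = f(x) can be checked directly.
import numpy as np

def h(x):
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

def k(x, y):
    return (x @ y) ** 2

rng = np.random.default_rng(3)
xs = rng.normal(size=(5, 2))       # expansion points x_i
alpha = rng.normal(size=5)         # coefficients alpha_i

# f = sum_i alpha_i k(., x_i), represented in feature space by the vector w
w = sum(a * h(xi) for a, xi in zip(alpha, xs))

x = np.array([0.5, -1.0])
f_x = sum(a * k(xi, x) for a, xi in zip(alpha, xs))  # evaluate f at x
assert np.isclose(w @ h(x), f_x)   # <f, k(., x)> = f(x): reproducing property
```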
