
Kernel Methods Part 2



  1. Kernel Methods Part 2 Bing Han June 26, 2008

  2. Local Likelihood • Logistic Regression

  3. Logistic Regression • After a simple calculation, we get Pr(G = j | X = x) = exp(βj0 + βjᵀx) / (1 + Σk=1..J−1 exp(βk0 + βkᵀx)), for j = 1, …, J−1 • We denote the probabilities pj(x) = Pr(G = j | X = x) • Logistic regression models are usually fit by maximum likelihood

  4. Local Likelihood • The data have features xi and classes {1, 2, …, J} • The linear model is log[ Pr(G = j | X = x) / Pr(G = J | X = x) ] = βj0 + βjᵀx, for j = 1, …, J−1

  5. Local Likelihood • Local logistic regression fits this model at a target point x0, with locally varying coefficients βj0(x0), βj(x0) • The local log-likelihood for this J-class model is Σi=1..N Kλ(x0, xi) { βgi,0(x0) + βgi(x0)ᵀ(xi − x0) − log[ 1 + Σk=1..J−1 exp(βk0(x0) + βk(x0)ᵀ(xi − x0)) ] }
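The local-likelihood idea on this slide can be sketched in code. Below is a minimal illustration for the binary (J = 2) special case, assuming a Gaussian kernel and plain weighted gradient ascent; the function name, bandwidth, learning rate, and step count are all illustrative choices, not part of the original slides.

```python
import numpy as np

def local_logistic_prob(x0, X, y, lam=0.5, steps=200, lr=0.5):
    """Estimate Pr(Y = 1 | x0) by locally weighted logistic regression.

    Binary special case of the local log-likelihood: each observation's
    contribution is weighted by a Gaussian kernel K_lam(x0, xi), and a
    local-linear expansion in (xi - x0) is fit by gradient ascent.
    """
    w = np.exp(-0.5 * ((X - x0) / lam) ** 2)        # kernel weights K_lam(x0, xi)
    Z = np.column_stack([np.ones_like(X), X - x0])  # local intercept + slope
    beta = np.zeros(2)
    for _ in range(steps):                          # weighted gradient ascent
        p = 1.0 / (1.0 + np.exp(-Z @ beta))
        beta += lr * Z.T @ (w * (y - p)) / w.sum()
    return 1.0 / (1.0 + np.exp(-beta[0]))           # fitted probability at x0
```

Because the expansion is centered at x0, the intercept beta[0] is directly the local logit, so only the sigmoid of it is returned.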

  6. Kernel Density Estimation

  7. Kernel Density Estimation • We have a random sample x1, x2, …, xN and want to estimate the probability density fX(x) • A natural local estimate is f̂(x0) = #{xi ∈ N(x0)} / (Nλ), where N(x0) is a small neighborhood of width λ around x0 • The smooth Parzen estimate replaces the counts with kernel weights: f̂(x0) = (1/(Nλ)) Σi=1..N Kλ(x0, xi)

  8. Kernel Density Estimation • A popular choice is the Gaussian kernel Kλ(x0, x) = φ(|x − x0| / λ) • A natural generalization of the Gaussian density estimate to p dimensions uses the Gaussian product kernel: f̂(x0) = (1 / (N(2λ²π)^{p/2})) Σi=1..N exp(−½ (‖xi − x0‖ / λ)²)
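The Parzen estimate with a Gaussian kernel, as on the last two slides, is a few lines of code. This is a one-dimensional sketch; the function name and the bandwidth default are illustrative.

```python
import numpy as np

def gaussian_kde(x0, x, lam=0.3):
    """Parzen estimate f_hat(x0) = (1/(N*lam)) * sum_i phi((x0 - xi)/lam)
    with the standard Gaussian density phi; lam is the bandwidth."""
    u = (x0 - x[:, None]) / lam                    # scaled distances, shape (N, m)
    phi = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    return phi.mean(axis=0) / lam                  # average kernel mass per query
```

`x0` may be an array of query points, so the whole density curve can be evaluated in one vectorized call.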

  9. Kernel Density Classification • Fit nonparametric density estimates f̂j(x) separately in each of the J classes • Estimate the class priors π̂j by the sample proportions • By Bayes’ theorem, Pr(G = j | X = x0) = π̂j f̂j(x0) / Σk=1..J π̂k f̂k(x0)

  10. Kernel Density Classification
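The Bayes'-theorem classification of slide 9 can be sketched by combining per-class kernel density estimates with sample-proportion priors. The function name and bandwidth below are illustrative choices.

```python
import numpy as np

def kde_classify_prob(x0, X, g, lam=0.3):
    """Posterior Pr(G = j | x0) = pi_j * f_j(x0) / sum_k pi_k * f_k(x0),
    with each class density f_j a Gaussian kernel estimate and the
    prior pi_j the sample proportion of class j."""
    classes = np.unique(g)
    post = []
    for j in classes:
        xj = X[g == j]
        pi_j = len(xj) / len(X)                    # class prior estimate
        u = (x0 - xj) / lam
        f_j = np.mean(np.exp(-0.5 * u ** 2)) / (lam * np.sqrt(2 * np.pi))
        post.append(pi_j * f_j)
    post = np.array(post)
    return post / post.sum()                       # normalize over classes
```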

  11. Naïve Bayes Classifier • Assume that, given a class G = j, the features Xk are independent: fj(X) = Πk=1..p fjk(Xk)

  12. Naïve Bayes Classifier • The logit transform then has the form of a generalized additive model: log[ Pr(G = j | X) / Pr(G = J | X) ] = log(πj / πJ) + Σk=1..p log[ fjk(Xk) / fJk(Xk) ] = αj + Σk=1..p gjk(Xk)

  13. This is similar to logistic regression, but here the additive terms gjk are derived nonparametrically from the estimated class densities
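The additive log-odds form of slides 11–13 can be sketched with one-dimensional kernel density estimates per feature and class. This is a binary-class illustration; the function name and bandwidth are assumptions, not from the slides.

```python
import numpy as np

def naive_bayes_log_odds(x0, X, g, lam=0.4):
    """Naive Bayes log posterior odds log Pr(G=1|x0)/Pr(G=0|x0).

    Independence gives f_j(x) = prod_k f_jk(x_k), so the log-odds is a
    prior term plus one additive term per feature (a generalized
    additive model). Each f_jk is a 1-D Gaussian kernel estimate.
    """
    def kde1(q, xk):                               # 1-D Parzen estimate at q
        u = (q - xk) / lam
        return np.mean(np.exp(-0.5 * u ** 2)) / (lam * np.sqrt(2 * np.pi))
    log_odds = np.log((g == 1).mean() / (g == 0).mean())   # prior odds
    for k in range(X.shape[1]):                    # additive per-feature terms
        log_odds += np.log(kde1(x0[k], X[g == 1, k]) / kde1(x0[k], X[g == 0, k]))
    return log_odds
```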

  14. Radial Basis Functions • Functions can be represented as expansions in basis functions: f(x) = Σj=1..M βj hj(x) • Radial basis functions treat kernel functions as basis functions, each indexed by a prototype location ξj and a scale λj. This leads to the model f(x) = Σj=1..M Kλj(ξj, x) βj

  15. Method of learning parameters • Optimize the sum of squares with respect to all the parameters {λj, ξj, βj}: min Σi=1..N ( yi − β0 − Σj=1..M βj exp[ −(xi − ξj)ᵀ(xi − ξj) / λj² ] )²
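A common simplification of the sum-of-squares problem on this slide is to hold the centers ξj and scale λ fixed, so only the coefficients βj remain and the fit reduces to ordinary least squares. A minimal one-dimensional sketch under that assumption (function names and defaults are illustrative):

```python
import numpy as np

def fit_rbf(X, y, centers, lam=0.5):
    """Least-squares fit of f(x) = sum_j beta_j * K_lam(xi_j, x) with
    Gaussian basis functions; centers and width are held fixed."""
    H = np.exp(-0.5 * ((X[:, None] - centers[None, :]) / lam) ** 2)  # basis matrix
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return beta

def rbf_predict(x, centers, beta, lam=0.5):
    """Evaluate the fitted RBF expansion at query points x."""
    H = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / lam) ** 2)
    return H @ beta
```

Optimizing the centers and scales as well makes the criterion nonconvex; that full problem is what the slide's "all the parameters" refers to.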

  16. Radial Basis Functions • Reducing the parameter set by assuming a single constant scale λ for all basis functions can produce an undesirable effect: regions of the input space where no kernel has appreciable support (“holes”) • Renormalized radial basis functions avoid this: hj(x) = D(‖x − ξj‖ / λ) / Σk=1..M D(‖x − ξk‖ / λ)

  17. Radial Basis Functions

  18. Mixture models • The Gaussian mixture model for density estimation is f(x) = Σm=1..M αm φ(x; μm, Σm), with mixing proportions Σm αm = 1 • In general, mixture models can use any component densities; the Gaussian mixture model is the most popular

  19. Mixture models • If the covariance matrices are constrained to be scalar, Σm = σm²I, the mixture has the form of a radial basis expansion • If in addition σm = σ > 0 is fixed and M = N with α̂m = 1/N and μ̂m = xm, we recover the kernel density estimate f̂(x) = (1/N) Σi=1..N φ(x; xi, σ²I)

  20. Mixture models • The parameters are usually fit by maximum likelihood, e.g. with the EM algorithm • The mixture model also provides an estimate of the probability that observation i belongs to component m: r̂im = α̂m φ(xi; μ̂m, Σ̂m) / Σk=1..M α̂k φ(xi; μ̂k, Σ̂k)
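The EM fit and the component-membership probabilities described on this slide can be sketched for a one-dimensional Gaussian mixture. The quantile-based initialization and iteration count are illustrative choices, not part of the slides.

```python
import numpy as np

def em_gaussian_mixture(x, M=2, iters=100):
    """Fit a 1-D Gaussian mixture by EM (maximum likelihood).

    Returns mixing weights alpha, means mu, variances var, and the
    responsibilities r[i, m] = estimated probability that observation i
    belongs to component m.
    """
    mu = np.quantile(x, np.linspace(0.1, 0.9, M))  # spread-out initial means
    var = np.full(M, x.var())
    alpha = np.full(M, 1.0 / M)
    for _ in range(iters):
        # E-step: responsibilities via Bayes' theorem
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = alpha * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted maximum-likelihood updates
        nm = r.sum(axis=0)
        alpha = nm / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nm
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nm
    return alpha, mu, var, r
```

The E-step is exactly the responsibility formula on the slide, specialized to scalar Gaussians.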

  21. Questions?
