Kernel Methods Part 2



### Kernel Methods Part 2

Bing Han

June 26, 2008

Local Likelihood
• Logistic Regression
Logistic Regression
• Starting from the linear log-odds model log[ Pr(G = j | X = x) / Pr(G = J | X = x) ] = β_{j0} + β_j^T x, after a simple calculation we get Pr(G = j | X = x) = exp(β_{j0} + β_j^T x) / (1 + Σ_{k=1}^{J−1} exp(β_{k0} + β_k^T x)), j = 1, …, J−1
• We denote the probabilities p_j(x) = Pr(G = j | X = x)
• Logistic regression models are usually fit by maximum likelihood
Local Likelihood
• The data consist of features x_i with class labels g_i ∈ {1, 2, …, J}
• The linear model is log[ Pr(G = j | X = x) / Pr(G = J | X = x) ] = β_{j0} + β_j^T x, j = 1, …, J−1
Local Likelihood
• Local logistic regression fits this model locally: at each target point x0, replace β_{j0} + β_j^T x by θ_{j0}(x0) + θ_j(x0)^T (x − x0)
• The local log-likelihood for this J-class model is Σ_{i=1}^N K_λ(x0, x_i) { θ_{g_i 0}(x0) + θ_{g_i}(x0)^T (x_i − x0) − log[ 1 + Σ_{k=1}^{J−1} exp(θ_{k0}(x0) + θ_k(x0)^T (x_i − x0)) ] }
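The kernel-weighted likelihood above can be maximized numerically at each x0. A minimal sketch for the binary (J = 2) case, using a Gaussian kernel and plain gradient ascent; the function names are illustrative, not from the slides:

```python
import numpy as np

def gaussian_kernel(x0, x, lam):
    # Gaussian kernel weight K_lambda(x0, x_i)
    return np.exp(-0.5 * ((x - x0) / lam) ** 2)

def local_logistic_fit(x, y, x0, lam=0.5, n_iter=200, lr=0.1):
    """Fit a binary local logistic model at x0 by maximizing the
    kernel-weighted log-likelihood with gradient ascent.
    Returns theta = (theta0, theta1) of the local linear logit
    theta0 + theta1 * (x - x0)."""
    w = gaussian_kernel(x0, x, lam)   # weights K_lambda(x0, x_i)
    z = x - x0                        # feature centered at x0
    theta = np.zeros(2)
    for _ in range(n_iter):
        logit = theta[0] + theta[1] * z
        p = 1.0 / (1.0 + np.exp(-logit))     # local class-1 probabilities
        g0 = np.sum(w * (y - p))             # weighted score, intercept
        g1 = np.sum(w * (y - p) * z)         # weighted score, slope
        theta += lr * np.array([g0, g1]) / len(x)
    return theta
```

The fitted local probability at x0 itself is sigmoid(theta0), since the centered feature vanishes there.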
Kernel Density Estimation
• Given a random sample x1, x2, …, xN, we want to estimate the probability density f(x)
• A natural local estimate is f̂(x0) = #{x_i ∈ N(x0)} / (N λ), where N(x0) is a small metric neighborhood of width λ around x0
• The smooth Parzen estimate replaces the hard counts by kernel weights: f̂(x0) = (1 / (N λ)) Σ_{i=1}^N K_λ(x0, x_i)
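The Parzen estimate is a one-liner in practice. A minimal sketch with a Gaussian kernel (the kernel choice and function name are assumptions for illustration):

```python
import numpy as np

def parzen_estimate(x0, sample, lam):
    # smooth Parzen density estimate at x0: average of the kernel
    # mass each observation x_i contributes, with Gaussian kernel
    # K_lambda(x0, x) = phi((x - x0) / lambda) / lambda
    u = (x0 - sample) / lam
    return np.mean(np.exp(-0.5 * u ** 2)) / (lam * np.sqrt(2 * np.pi))
```

On a standard normal sample, parzen_estimate(0.0, sample, 0.3) should land near the true density value 1/√(2π) ≈ 0.399, up to smoothing bias and sampling noise.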
Kernel Density Estimation
• A popular choice is the Gaussian kernel K_λ(x0, x) = φ(|x − x0| / λ)
• A natural generalization to R^p is the Gaussian product-kernel density estimate: f̂(x0) = (1 / (N (2 λ² π)^{p/2})) Σ_{i=1}^N exp(−½ (‖x_i − x0‖ / λ)²)
Kernel Density Classification
• Fit nonparametric density estimates f̂_j(X) separately in each of the J classes
• Estimate the class priors π̂_j = N_j / N, the sample proportion of class j
• By Bayes’ theorem, P̂r(G = j | X = x0) = π̂_j f̂_j(x0) / Σ_{k=1}^J π̂_k f̂_k(x0)
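Putting the per-class density estimates and priors through Bayes’ theorem gives a classifier. A minimal one-dimensional sketch, assuming a Gaussian Parzen estimate for each class (names illustrative):

```python
import numpy as np

def kde(x0, sample, lam):
    # 1-D Gaussian Parzen density estimate at x0
    u = (x0 - sample) / lam
    return np.mean(np.exp(-0.5 * u ** 2)) / (lam * np.sqrt(2 * np.pi))

def kde_classify(x0, samples_by_class, lam=0.4):
    """Posterior class probabilities at x0 via Bayes' theorem:
    P(G=j | x0) proportional to pi_j * f_j(x0), priors pi_j = N_j / N."""
    counts = np.array([len(s) for s in samples_by_class], dtype=float)
    priors = counts / counts.sum()                      # pi_hat_j = N_j / N
    dens = np.array([kde(x0, s, lam) for s in samples_by_class])
    post = priors * dens
    return post / post.sum()                            # normalize over classes
```

With two well-separated classes, a query point near one class mean should get nearly all of the posterior mass.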
Naïve Bayes Classifier
• Assume that, given a class G = j, the features X_k are independent: f_j(X) = Π_{k=1}^p f_{jk}(X_k)
Radial Basis Functions
• Functions can be represented as expansions in basis functions: f(x) = Σ_{j=1}^M β_j h_j(x)
• Radial basis functions treat kernel functions K_λ(ξ, x) as basis functions. This leads to the model f(x) = Σ_{j=1}^M K_{λ_j}(ξ_j, x) β_j, with prototype centers ξ_j and scales λ_j
Method of learning parameters
• Optimize the sum-of-squares with respect to all the parameters: min over {λ_j, ξ_j, β_j} of Σ_{i=1}^N ( y_i − Σ_{j=1}^M K_{λ_j}(ξ_j, x_i) β_j )²
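If the centers ξ_j and widths λ_j are held fixed (e.g. chosen by spacing them over the data), only the β_j remain and the sum-of-squares criterion reduces to ordinary least squares. A minimal sketch with Gaussian kernels and a shared width (names illustrative):

```python
import numpy as np

def rbf_design(x, centers, lam):
    # design matrix: column j holds the basis function K_lambda(xi_j, x_i)
    return np.exp(-0.5 * ((x[:, None] - centers[None, :]) / lam) ** 2)

def fit_rbf(x, y, centers, lam):
    """Solve min_beta sum_i (y_i - sum_j K(xi_j, x_i) beta_j)^2
    by ordinary least squares, with centers and width held fixed."""
    H = rbf_design(x, centers, lam)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return beta

def predict_rbf(x, centers, lam, beta):
    return rbf_design(x, centers, lam) @ beta
```

Fixing the kernel parameters turns a hard nonconvex problem into a linear one; optimizing the ξ_j and λ_j as well requires general nonlinear least squares.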