
Kernel Methods Part 2

Bing Han

June 26, 2008

Local Likelihood
  • Logistic Regression
Logistic Regression
  • After a simple calculation, we get
  • We denote the probabilities
  • Logistic regression models are usually fit by maximum likelihood
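The equations on this slide did not survive extraction. For the two-class case they are presumably the standard logistic model and its binomial log-likelihood (the material appears to follow Hastie, Tibshirani and Friedman, *The Elements of Statistical Learning*, ch. 6):

```latex
% Two-class logistic model: the log-odds are linear in x
\Pr(G = 1 \mid X = x) = p(x) = \frac{e^{\beta_0 + \beta^T x}}{1 + e^{\beta_0 + \beta^T x}},
\qquad
\log\frac{p(x)}{1 - p(x)} = \beta_0 + \beta^T x .
% Maximum likelihood maximizes the binomial log-likelihood
\ell(\beta) = \sum_{i=1}^{N} \Bigl\{ y_i \log p(x_i) + (1 - y_i)\log\bigl(1 - p(x_i)\bigr) \Bigr\}.
```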
Local Likelihood
  • The data consist of features xi and class labels {1, 2, …, J}
  • The linear model is
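The linear model the bullet refers to was lost with the slide image; the usual multinomial form is:

```latex
% Multiclass (multinomial) logistic model with J classes:
\Pr(G = j \mid X = x) = \frac{e^{\beta_{j0} + \beta_j^T x}}
     {1 + \sum_{k=1}^{J-1} e^{\beta_{k0} + \beta_k^T x}}, \qquad j = 1, \dots, J-1,
% and the last class takes the remainder:
\Pr(G = J \mid X = x) = \frac{1}{1 + \sum_{k=1}^{J-1} e^{\beta_{k0} + \beta_k^T x}} .
```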
Local Likelihood
  • Local logistic regression
  • The local log-likelihood for this J-class model
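A plausible reconstruction of the missing local log-likelihood:

```latex
% Local logistic regression: maximize a kernel-weighted log-likelihood at x_0
\ell\bigl(\beta(x_0)\bigr) = \sum_{i=1}^{N} K_\lambda(x_0, x_i)\,
    \log \Pr\bigl(G = g_i \mid X = x_i\bigr),
% with each linear term \beta_{j0} + \beta_j^T x replaced locally by
\beta_{j0}(x_0) + \beta_j(x_0)^T (x_i - x_0).
```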
Kernel Density Estimation
  • We have a random sample x1, x2, …, xN and want to estimate the probability density
  • A natural local estimate
  • Smooth Parzen estimate
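The two estimates referred to above, reconstructed in their standard form:

```latex
% Natural local estimate: fraction of points in a neighborhood of width \lambda
\hat f_X(x_0) = \frac{\#\{x_i \in \mathcal{N}(x_0)\}}{N\lambda} .
% Smooth Parzen estimate: replace hard counting by kernel weights
\hat f_X(x_0) = \frac{1}{N\lambda} \sum_{i=1}^{N} K_\lambda(x_0, x_i) .
```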
Kernel Density Estimation
  • A popular choice is the Gaussian kernel
  • A natural generalization of the Gaussian density estimate uses the Gaussian product kernel
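With the Gaussian kernel K_λ(x0, x) = φ(|x0 − x| / λ), the Parzen estimate has a direct implementation. A minimal one-dimensional sketch (the function name is my own, not from the slides):

```python
import math

def gaussian_kernel_density(x0, samples, lam):
    """Parzen estimate with a Gaussian kernel in one dimension:
    f_hat(x0) = (1 / (N * lam)) * sum_i phi((x0 - x_i) / lam),
    where phi is the standard normal density."""
    n = len(samples)
    total = sum(
        math.exp(-0.5 * ((x0 - xi) / lam) ** 2) / math.sqrt(2.0 * math.pi)
        for xi in samples
    )
    return total / (n * lam)

# Density estimate at 0 from three sample points, bandwidth 0.5
f0 = gaussian_kernel_density(0.0, [-1.0, 0.0, 1.0], 0.5)
```

In p dimensions the natural generalization uses the Gaussian product kernel, f̂(x0) = (1 / (N(2λ²π)^{p/2})) Σᵢ exp{−½(‖xᵢ − x0‖/λ)²}.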
Kernel Density Classification
  • Density estimates
  • Estimates of class priors
  • By Bayes’ theorem
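The missing formula is presumably the usual plug-in Bayes rule, with π̂_j the sample class proportions and f̂_j the per-class density estimates:

```latex
\widehat{\Pr}(G = j \mid X = x_0) =
    \frac{\hat\pi_j \hat f_j(x_0)}{\sum_{k=1}^{J} \hat\pi_k \hat f_k(x_0)} .
```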
Naïve Bayes Classifier
  • Assume that, given the class G = j, the features Xk are independent
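In symbols, the independence assumption reads:

```latex
% Conditional independence of the features given the class:
f_j(X) = \prod_{k=1}^{p} f_{jk}(X_k) .
```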
Naïve Bayes Classifier
  • A generalized additive model
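The generalized additive form likely shown on the slide, obtained by taking the logit against a base class J:

```latex
\log\frac{\Pr(G = l \mid X)}{\Pr(G = J \mid X)}
  = \log\frac{\pi_l f_l(X)}{\pi_J f_J(X)}
  = \log\frac{\pi_l}{\pi_J} + \sum_{k=1}^{p} \log\frac{f_{lk}(X_k)}{f_{Jk}(X_k)}
  = \alpha_l + \sum_{k=1}^{p} g_{lk}(X_k) .
```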
Radial Basis Functions
  • Functions can be represented as expansions in basis functions
  • Radial basis functions treat kernel functions as basis functions. This leads to the model
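The model the bullet announces, in its standard form, with prototype parameters ξj:

```latex
% Expansion in kernel (radial) basis functions:
f(x) = \sum_{j=1}^{M} K_{\lambda_j}(\xi_j, x)\,\beta_j,
% e.g. with the Gaussian kernel
f(x) = \sum_{j=1}^{M} \beta_j
       \exp\Bigl\{-\tfrac{(x - \xi_j)^T (x - \xi_j)}{2\lambda_j^2}\Bigr\} .
```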
Method of Learning Parameters
  • Optimize the sum-of-squares criterion with respect to all the parameters:
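The criterion, presumably:

```latex
% Optimize over all parameters \{\lambda_j, \xi_j, \beta_j\} jointly:
\min_{\{\lambda_j,\, \xi_j,\, \beta_j\}_{1}^{M}}
  \sum_{i=1}^{N} \Bigl( y_i - \beta_0 - \sum_{j=1}^{M} \beta_j
  \exp\Bigl\{-\tfrac{(x_i - \xi_j)^T (x_i - \xi_j)}{2\lambda_j^2}\Bigr\} \Bigr)^{2} .
```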
Radial Basis Functions
  • Reducing the parameter set, e.g. by assuming a constant bandwidth λj = λ, can produce an undesirable effect: regions where none of the kernels has appreciable support ("holes")
  • Renormalized radial basis functions avoid this problem
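The renormalized basis functions, in the usual notation (D the standard Gaussian density):

```latex
h_j(x) = \frac{D\bigl(\lVert x - \xi_j \rVert / \lambda\bigr)}
              {\sum_{k=1}^{M} D\bigl(\lVert x - \xi_k \rVert / \lambda\bigr)},
\qquad
f(x) = \sum_{j=1}^{M} h_j(x)\,\beta_j .
```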
Mixture models
  • Gaussian mixture model for density estimation
  • In general, mixture models can use any component densities. The Gaussian mixture model is the most popular.
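The Gaussian mixture density, reconstructed:

```latex
% Gaussian mixture model with mixing proportions \alpha_m:
f(x) = \sum_{m=1}^{M} \alpha_m\, \phi(x;\, \mu_m, \boldsymbol\Sigma_m),
\qquad \sum_{m=1}^{M} \alpha_m = 1 .
```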
Mixture models
  • If the covariance matrices are restricted to be scalar, Σm = σm²I, the model reduces to a radial basis expansion
  • If in addition σm = σ > 0 is fixed and each observation is its own component (M = N, μm = xm, αm = 1/N), it becomes a kernel density estimate
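The two special cases, written out under the standard assumptions:

```latex
% Scalar covariances \Sigma_m = \sigma_m^2 I give a radial basis expansion:
f(x) = \sum_{m=1}^{M} \alpha_m\, \phi(x;\, \mu_m, \sigma_m^2 I).
% With \sigma_m = \sigma fixed, M = N, \alpha_m = 1/N and \mu_m = x_m,
% this is exactly the kernel density estimate:
\hat f(x) = \frac{1}{N} \sum_{m=1}^{N} \phi(x;\, x_m, \sigma^2 I).
```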
Mixture models
  • The parameters are usually fit by maximum likelihood, e.g. with the EM algorithm
  • The mixture model also provides an estimate of the probability that observation i belongs to component m
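The component-membership probability ("responsibility"), presumably:

```latex
\hat r_{im} = \frac{\hat\alpha_m\, \phi(x_i;\, \hat\mu_m, \hat{\boldsymbol\Sigma}_m)}
                   {\sum_{k=1}^{M} \hat\alpha_k\, \phi(x_i;\, \hat\mu_k, \hat{\boldsymbol\Sigma}_k)} .
```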