## Pattern Classification via Density Estimation


**Pattern Classification via Density Estimation**

ECE 539, Jin Woo Yim, 12/14/2010

**Motivation**

In class, we discussed the MAP and ML classifiers in the context of pattern classification. In practice, the underlying densities of the training data are often unknown, and we covered several approaches to that situation. The most fundamental approach, however, is to estimate the densities of the training sets directly. With two classes, for example, the estimated densities can then be plugged into a binary hypothesis test.

**How to estimate the density?**

The histogram density estimator is the most intuitive method: partition the feature space into bins and count the training samples that fall into each one (e.g., feature dimension = 2, bins = 16). For example, with 100 training samples, if 15 of them land in bin (3, 2), the estimated density over that bin is 15/100 divided by the bin volume.

**How good is this estimate?**

Quality is measured by the mean squared error, the expected squared L2 distance between the true density and the estimate. The MSE decomposes into a bias term and a variance term, and both are affected by the feature-space dimension, the number of training samples, and the number of bins.

**Comparison between theory and practice**

Write MATLAB code to estimate a Gaussian density (and possibly other densities) from n Gaussian training samples. Observe how the estimated density behaves as the number of samples and the feature dimension increase, and compare the simulated results with the theoretical results.
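The histogram estimator described above can be sketched in a few lines. The report proposes MATLAB; the sketch below uses Python/NumPy instead, and it reads the slide's "16 bins" in 2-D as a 4×4 grid (an assumption, since the slide does not spell out the grid shape). The support `[0, 1)^2` and the uniform test data are also illustrative choices, not from the report.

```python
import numpy as np

def histogram_density(samples, bins_per_axis, lo=0.0, hi=1.0):
    """2-D histogram density estimate on the square [lo, hi)^2.

    Each bin's estimate is count / (n * bin_area), so the estimate
    integrates to 1 over the support (samples outside it are ignored).
    """
    n = len(samples)
    edges = np.linspace(lo, hi, bins_per_axis + 1)
    counts, _, _ = np.histogram2d(samples[:, 0], samples[:, 1],
                                  bins=[edges, edges])
    bin_area = ((hi - lo) / bins_per_axis) ** 2
    return counts / (n * bin_area)

# Mirroring the slide's example: 100 training samples in 2-D, 16 bins
# read as a 4x4 grid. If bin (3, 2) held 15 samples, its estimate would
# be 15 / (100 * bin_area), as in the text above.
rng = np.random.default_rng(0)
data = rng.uniform(0.0, 1.0, size=(100, 2))
density = histogram_density(data, bins_per_axis=4)
print(density.shape)                 # (4, 4)
print(density.sum() * (1 / 4) ** 2)  # integrates to 1 over the unit square
```

Dividing the count by both n and the bin volume is what makes this a density estimate rather than a relative-frequency table: the resulting step function integrates to one.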

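The MSE experiment can likewise be sketched without MATLAB. The Python/NumPy snippet below estimates the integrated squared error of a 1-D histogram against a standard normal density by Monte Carlo averaging over repeated training sets; the bin count, support, and trial count are illustrative assumptions, not parameters from the report. The bias-plus-variance behavior predicts that, for a fixed bin width, the error falls as n grows, which the comparison at the end checks.

```python
import numpy as np

def histogram_mse(n, bins=32, trials=200, lo=-4.0, hi=4.0, seed=0):
    """Monte Carlo estimate of the integrated squared error between a
    histogram density estimate and the true standard-normal density,
    averaged over `trials` independent training sets of size n."""
    rng = np.random.default_rng(seed)
    edges = np.linspace(lo, hi, bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    width = (hi - lo) / bins
    true_pdf = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)
    errs = []
    for _ in range(trials):
        x = rng.standard_normal(n)              # n Gaussian training samples
        counts, _ = np.histogram(x, bins=edges)
        est = counts / (n * width)              # histogram density estimate
        # Riemann-sum approximation of the L2 error over the support.
        errs.append(np.sum((est - true_pdf) ** 2) * width)
    return float(np.mean(errs))

# With the bin width fixed, the variance term shrinks as n grows, so the
# MSE at n = 10000 should be well below the MSE at n = 100.
print(histogram_mse(100) > histogram_mse(10000))  # True
```

Sweeping n (and, in higher dimensions, the bin count) in this loop reproduces the theory-versus-practice comparison the report calls for: the simulated MSE curve can be plotted against the theoretical bias and variance terms.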