Feature Selection using Mutual Information



  1. Feature Selection using Mutual Information. SYDE 676 Course Project, Eric Hui, November 28, 2002

  2. Outline • Introduction … prostate cancer project • Definition of ROI and Features • Estimation of PDFs … using Parzen Density Estimation • Feature Selection … using MI Based Feature Selection • Evaluation of Selection … using Generalized Divergence • Conclusions

  3. Ultrasound Image of Prostate

  4. Prostate Outline

  5. “Guesstimated” Cancerous Region

  6. Regions of Interest (ROI)

  7. Features as Mapping Functions • Mapping from image space to feature space…

  8. Parzen Density Estimation • Histogram bins: bad estimation with the limited data available! • Parzen density estimation: reasonable approximation with limited data.
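As a minimal sketch of the idea, here is a one-dimensional Parzen estimator with a Gaussian window (the function name, window width, and synthetic data are illustrative assumptions, not taken from the project). With only a handful of samples, a histogram is jagged, while the Parzen estimate is a smooth approximation of the underlying PDF:

```python
import numpy as np

def parzen_density(samples, query_points, h=0.5):
    """Parzen (kernel) density estimate with a Gaussian window of width h:
    p_hat(x) = 1/(n*h) * sum_i K((x - x_i)/h), K = standard normal pdf."""
    samples = np.asarray(samples, dtype=float)
    query_points = np.asarray(query_points, dtype=float)
    u = (query_points[:, None] - samples[None, :]) / h
    kernel = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)
    return kernel.sum(axis=1) / (len(samples) * h)

# Limited data, as with a small number of ROIs per class.
rng = np.random.default_rng(0)
samples = rng.normal(size=30)
xs = np.linspace(-4.0, 4.0, 200)
pdf_hat = parzen_density(samples, xs, h=0.5)
```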

  9. Features • Gray-Level Difference Matrix (GLDM): Contrast, Mean, Entropy, Inverse Difference Moment (IDM), Angular Second Moment (ASM) • Fractal Dimension (FD) • Linearized Power Spectrum: Slope, Y-Intercept
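To make the GLDM features concrete, here is a hedged sketch using the standard GLDM definitions: p(d) is the normalized histogram of absolute gray-level differences at one displacement, and the five statistics are computed from it. The function name, displacement, and 8-bit gray-level assumption are illustrative, not taken from the project:

```python
import numpy as np

def gldm_features(img, dx=1, dy=0, levels=256):
    """GLDM texture features for one ROI at displacement (dx, dy).
    p(d) is the normalized histogram of |I(x, y) - I(x + dx, y + dy)|,
    assuming an 8-bit image (gray levels 0..255)."""
    img = np.asarray(img, dtype=int)
    a = img[:img.shape[0] - dy, :img.shape[1] - dx]
    b = img[dy:, dx:]
    p = np.bincount(np.abs(a - b).ravel(), minlength=levels).astype(float)
    p /= p.sum()
    d = np.arange(levels)
    nz = p > 0                                 # avoid log(0) in the entropy
    return {
        "contrast": np.sum(d ** 2 * p),
        "mean": np.sum(d * p),
        "entropy": -np.sum(p[nz] * np.log2(p[nz])),
        "idm": np.sum(p / (d ** 2 + 1.0)),     # inverse difference moment
        "asm": np.sum(p ** 2),                 # angular second moment
    }
```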

  10. P(X|C=Cancerous), P(X|C=Benign), and P(X)

  11. Entropy and Mutual Information • Mutual Information I(C;X) measures the degree of interdependence between X and C. • Entropy H(C) measures the degree of uncertainty of C. • I(X;C) = H(C) – H(C|X). • Since H(C|X) ≥ 0, it follows that I(X;C) ≤ H(C), i.e. H(C) is an upper bound on the mutual information.
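A small numeric sketch of these identities on a discretized joint distribution (the joint probabilities below are made up for illustration):

```python
import numpy as np

def mutual_information(joint):
    """I(C;X) from a discretized joint P(C,X); rows = classes, cols = feature bins.
    Equivalently I(C;X) = H(C) - H(C|X); H(C) is returned as the upper bound."""
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()
    p_c = joint.sum(axis=1)                    # marginal P(C)
    p_x = joint.sum(axis=0)                    # marginal P(X)
    nz = joint > 0
    mi = np.sum(joint[nz] * np.log2(joint[nz] / np.outer(p_c, p_x)[nz]))
    h_c = -np.sum(p_c[p_c > 0] * np.log2(p_c[p_c > 0]))
    return mi, h_c

# A feature X that carries some, but not all, information about the class C:
joint = np.array([[0.30, 0.10, 0.05],          # C = cancerous
                  [0.05, 0.15, 0.35]])         # C = benign
mi, h_c = mutual_information(joint)            # 0 <= mi <= h_c holds
```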

  12. Results: Mutual Information I(C;X)

  13. Feature Images - GLDM

  14. Feature Images - Fractal Dim.

  15. Feature Images - PSD

  16. Interdependence between Features • It is expensive to compute all features. • Some features might be similar to each other. • Thus, we need to measure the interdependence between features: I(Xi; Xj).
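One way to estimate I(Xi; Xj) for every feature pair is to bin the sampled feature values and apply the same mutual-information formula to each pairwise joint histogram. A sketch under that assumption (the binning scheme and names are illustrative):

```python
import numpy as np

def pairwise_mi(features, bins=10):
    """I(Xi;Xj) for all pairs, from an (n_samples, n_features) value array."""
    n, m = features.shape
    codes = np.empty((n, m), dtype=int)
    for j in range(m):                         # discretize each feature
        edges = np.histogram_bin_edges(features[:, j], bins=bins)
        codes[:, j] = np.digitize(features[:, j], edges[1:-1])
    mi = np.zeros((m, m))
    for i in range(m):
        for j in range(i, m):
            joint, _, _ = np.histogram2d(codes[:, i], codes[:, j],
                                         bins=(bins, bins))
            joint /= joint.sum()
            pi, pj = joint.sum(axis=1), joint.sum(axis=0)
            nz = joint > 0
            mi[i, j] = mi[j, i] = np.sum(
                joint[nz] * np.log2(joint[nz] / np.outer(pi, pj)[nz]))
    return mi
```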

  17. Results:Interdependence between Features

  18. Mutual Information Based Feature Selection (MIFS) • Select the first feature with the highest I(C;X). • Select the next feature X with the highest I(C;X) − β Σ I(X;Xs), summed over the already-selected features Xs in S. • Repeat until the desired number of features is selected.

  19. Mutual Information Based Feature Selection (MIFS) • This method takes into account both: • the interdependence between class and features, and • the interdependence between selected features. • The parameter β controls the amount of interdependence between selected features.
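A minimal sketch of the greedy MIFS loop in the style of Battiti's criterion stated above; `relevance` would hold the I(C;Xi) values and `redundancy` the pairwise I(Xi;Xj) matrix, e.g. as computed by the earlier sketches (both inputs and the function name are assumptions for illustration):

```python
import numpy as np

def mifs(relevance, redundancy, k, beta=0.5):
    """Greedy MIFS: repeatedly pick the feature X maximizing
    I(C;X) - beta * sum over already-selected Xs of I(X;Xs)."""
    relevance = np.asarray(relevance, dtype=float)
    selected = [int(np.argmax(relevance))]     # first: highest I(C;X)
    remaining = set(range(len(relevance))) - set(selected)
    while len(selected) < k and remaining:
        score = {j: relevance[j] - beta * sum(redundancy[j][s] for s in selected)
                 for j in remaining}
        best = max(score, key=score.get)
        selected.append(best)
        remaining.discard(best)
    return selected

# beta = 0 ranks by relevance alone; larger beta increasingly penalizes
# features interdependent with those already chosen, which is why
# different beta values can yield different selected subsets.
```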

  20. Varying β in MIFS • Candidate features: {X1, X2, X3, …, X8} • β = 1: S = {X2, X7} • β = 0: S = {X2, X4} • β = 0.5: S = {X2, X3}

  21. Generalized Divergence J • If the features are “biased” towards a class, J is large. • A good set of features should have small J.

  22. Results: J with respect to β • First feature selected: GLDM ASM • Second feature selected: …

  23. Conclusions • Mutual Information Based Feature Selection (MIFS): selects features X1, …, XN so as to maximize the interdependence I(C;X) with the class C while minimizing the interdependence I(Xi;Xj) among the selected features, with β trading off the two. • Generalized Divergence J: evaluates the selected feature set; a good set is not biased towards either class and has small J.

  24. Questions and Comments • …
