
Polyhedral Classifier for Target Detection: A Case Study: Colorectal Cancer

This case study explores the use of a polyhedral classifier for detecting colorectal cancer, specifically polyps, with computer-aided diagnosis (CAD) techniques. It discusses the challenges of classifying multi-mode data and proposes a solution that combines multiple linear classifiers through an AND framework. Experimental results show improved performance in automatic polyp detection.



Presentation Transcript


  1. Polyhedral Classifier for Target Detection: A Case Study: Colorectal Cancer. Murat Dundar, Matthias Wolf, Sarang Lakare, Marcos Salganicoff, Vikas C. Raykar. Siemens Medical Solutions USA, Inc., Malvern, PA 19355

  2. Computer-Aided Diagnosis (CAD) for Colon Cancer • Identify suspicious regions (candidates) • Extract features for each candidate • Classify each candidate as polyp or non-polyp (a minimal pipeline sketch follows)
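A minimal sketch of the three stages above, in Python. The helpers `find_candidates` and `extract_features` and the trained `classifier` are hypothetical placeholders; the slide names the stages but no concrete components.

```python
# Hypothetical sketch of the three CAD stages listed above; none of these
# helper names come from the original work.

def cad_pipeline(ct_volume, find_candidates, extract_features, classifier):
    candidates = find_candidates(ct_volume)               # 1. suspicious regions
    features = [extract_features(c) for c in candidates]  # 2. per-candidate features
    labels = [classifier.predict(f) for f in features]    # 3. polyp vs. non-polyp
    return list(zip(candidates, labels))
```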

  3. Multi-mode nature of CAD data • The only ground truth available is the location of each polyp. • All candidates that do not point to a known polyp are pooled into the negative class. • Variation among the different negatives is large.

  4. A CAD Example: Colorectal Cancer. Polyps vs. common false positives. [Figure: sessile and pedunculated polyps alongside common false positives: stool, noise, rectal tube, fold]

  5. State-of-the-Art – Finite Mixture Models • Model each class distribution by a mixture model, one mode per subclass, then design a maximum a posteriori or maximum likelihood classifier • Too few positives, too many redundant features! Robust estimation of the model parameters for the positive class is very difficult, if not impractical

  6. State-of-the-Art – Discriminative Techniques • Pool all negative candidates into a single class and learn a binary classifier, i.e., polyps vs. negatives • Kernel-based discriminative techniques (SVM, RVM, KFD) can yield nonlinear decision boundaries suitable for classifying multi-mode data • Too few positive candidates, too many redundant features! Data can easily be overfit by a nonlinear classifier

  7. State-of-the-Art – One-Class Classifiers • Omit the negative class and learn a model from positive samples only • Kernel-based and neural-network implementations yield nonlinear decision boundaries suitable for classifying multi-mode data • Like other nonlinear classifiers, susceptible to overfitting

  8. State-of-the-Art in a Nutshell • Linear classifiers: less prone to overfitting, but not enough capacity to deal with multi-mode data • Finite mixture models: parameter estimation is an issue! • Discriminative & one-class classifiers: good capacity, but more prone to overfitting

  9. A Viable Solution • A series of linear classifiers, one for each subclass of the negatives • More capacity than a single linear classifier, yet less prone to overfitting than a nonlinear classifier • An unseen sample is classified as positive only if all the classifiers classify it as positive (see the sketch below)
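A minimal sketch of this AND decision rule, assuming K trained linear classifiers represented by a hypothetical weight matrix `alphas` (K x d) and bias vector `b`. The intersection of the K positive half-spaces is exactly the polyhedral positive region the deck refers to.

```python
import numpy as np

def polyhedral_predict(alphas, b, x):
    """Label x positive (+1) only if every one of the K hyperplanes
    alphas[k] . x + b[k] > 0 accepts it; otherwise negative (-1)."""
    scores = alphas @ x + b            # one score per linear classifier
    return 1 if np.all(scores > 0) else -1

# Example: two hyperplanes carving out a wedge-shaped positive region.
alphas = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([0.0, 0.0])
print(polyhedral_predict(alphas, b, np.array([1.0, 1.0])))   # +1: both accept
print(polyhedral_predict(alphas, b, np.array([1.0, -1.0])))  # -1: one rejects
```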

  10. Training Multiple Linear Classifiers • Train each classifier independently: negative subclass k vs. positives, for k = 1, …, K • Inefficient! A misclassified positive sample is potentially penalized K times, once by every classifier

  11. Proposed Approach • Optimize the classifiers jointly • One classifier for each subclass of the negative data • The objective function is penalized only once for a misclassified positive sample • Yields a polyhedral decision surface

  12. A Toy Example

  13. Hyperplane Classifiers with Hinge Loss [Figure: a hyperplane separating true positives (TP) from false positives (FP), with slack ξ marking samples on the wrong side of the margin]

  14. Polyhedral Classifier with the AND Framework • If the hinge loss = 0, the example is correctly classified; if the hinge loss > 0, the example is misclassified • Let $\xi_{ik} = \max(0,\, 1 - y_i\, \alpha_k^\top x_i)$ be the hinge loss of the i-th example induced by classifier k • i-th positive example: penalized by $\max_{k=1,\dots,K} \xi_{ik}$ (the "AND": it must be accepted by all K classifiers) • i-th negative example of subclass k: penalized by $\xi_{ik}$ alone
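A small numpy sketch of these losses, using the ξ notation from the slide (for brevity the bias terms are folded into the α's here, an assumption on my part):

```python
import numpy as np

def hinge_losses(alphas, X, y):
    """xi[i, k] = max(0, 1 - y_i * (alpha_k . x_i)) for every example i
    and classifier k (X is n x d, alphas is K x d, y in {-1, +1})."""
    margins = y[:, None] * (X @ alphas.T)      # (n, K) signed margins
    return np.maximum(0.0, 1.0 - margins)

# AND rule on a positive example: it is charged the worst loss over all K
# classifiers, since every one of them must accept it.
xi = hinge_losses(np.eye(2), np.array([[0.5, 2.0]]), np.array([1]))
print(xi.max(axis=1))   # loss of this positive example under the AND rule
```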

  15. Objective Function with the AND Framework
$$\min_{\alpha_1,\dots,\alpha_K}\; \sum_{k=1}^{K} \sum_{i \in N_k} \xi_{ik} \;+\; \sum_{i \in P} \max_{k} \xi_{ik} \;+\; \lambda \sum_{k=1}^{K} P(\alpha_k)$$
Error on negative examples + error on positive examples + regularization to control complexity. Convex problem!
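A direct transcription of this objective as a numpy function; taking the regularizer P(α_k) to be the squared L2 norm is my assumption, since this slide only names a generic penalty.

```python
import numpy as np

def and_objective(alphas, X_pos, negs_by_subclass, lam):
    """J = sum_k sum_{i in N_k} xi_ik + sum_{i in P} max_k xi_ik
           + lam * sum_k P(alpha_k), with P taken as ||.||^2 here."""
    J = lam * np.sum(alphas ** 2)                         # regularization term
    for k, X_neg in enumerate(negs_by_subclass):          # known-subclass negatives
        J += np.maximum(0.0, 1.0 + X_neg @ alphas[k]).sum()  # hinge with y = -1
    xi_pos = np.maximum(0.0, 1.0 - X_pos @ alphas.T)      # (n_pos, K)
    return J + xi_pos.max(axis=1).sum()                   # AND: penalize once
```

Every loss term is a pointwise maximum of affine functions of the α's, so the sum is convex, which is what the slide's "Convex problem!" refers to.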

  16. Incomplete Ground Truth for Subclasses • The AND algorithm assumes the subclass membership is known for all samples. Not realistic! • Annotate a small portion of the negatives: identify potential subclasses and pool training samples for each subgroup • Three different types of samples in the training data: positives, negatives with known subclass membership, and negatives with unknown subclass membership

  17. Objective Function with the AND-OR Framework
$$\min_{\alpha_1,\dots,\alpha_K}\; \sum_{k=1}^{K} \sum_{i \in N_k} \xi_{ik} \;+\; \sum_{i \in N_u} \min_{k} \xi_{ik} \;+\; \sum_{i \in P} \max_{k} \xi_{ik} \;+\; \lambda \sum_{k=1}^{K} P(\alpha_k)$$
Error on negatives with known subclasses + error on negatives with unknown subclasses (the OR operation) + error on positives (the AND operation) + regularization to control complexity. Not convex!
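The AND-OR variant only adds one term for the unannotated negatives; sketched here under the same assumptions as the `and_objective` sketch above:

```python
import numpy as np

def and_or_objective(alphas, X_pos, negs_by_subclass, X_neg_unknown, lam):
    """AND-OR objective: unknown-subclass negatives are charged only their
    smallest hinge loss (it suffices that SOME classifier rejects them)."""
    J = and_objective(alphas, X_pos, negs_by_subclass, lam)
    xi_unk = np.maximum(0.0, 1.0 + X_neg_unknown @ alphas.T)  # (n_u, K)
    return J + xi_unk.min(axis=1).sum()     # OR: min over the K classifiers
```

The min over k is what breaks convexity: a pointwise minimum of convex functions is generally not convex, hence the slide's "Not convex!".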

  18. Alternating Optimization • Iterative algorithm • Each iteration contains K steps, and each step optimizes a single classifier • At the k-th step: fix all classifiers (α's) but classifier k, and minimize J(α_1, …, α_k, …, α_K) for the optimal α_k
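A runnable sketch of one way to implement this scheme; the slides do not specify a subproblem solver, so using scipy's derivative-free Powell method here is an assumption.

```python
import numpy as np
from scipy.optimize import minimize

def alternating_optimization(J, alphas, n_iters=10):
    """Cyclically minimize J(alphas) over one row alpha_k at a time,
    holding the other K-1 classifiers fixed."""
    K, d = alphas.shape
    for _ in range(n_iters):
        for k in range(K):                   # the K steps per iteration
            def sub(v, k=k):                 # J as a function of alpha_k only
                A = alphas.copy()
                A[k] = v
                return J(A)
            res = minimize(sub, alphas[k], method="Powell")
            alphas[k] = res.x                # keep the improved classifier k
    return alphas
```

Here `J` would be a closure over the training data, e.g. `lambda A: and_or_objective(A, X_pos, negs, X_unk, lam)` from the previous sketch.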

  19. Cascaded Design [Figure: candidates flow through a cascade of K classifiers; at each stage k a candidate is either rejected (F_k) or passed on (T_k), and candidates accepted by all K stages are labeled true positives]

  20. Cascade Design with Sparse Linear Classifiers • Setting $P(\alpha_k) = \|\alpha_k\|_1$ yields K sparse classifiers, each with a varying number of non-zero coefficients • The run-time order of the classifiers does not change the outcome • Start with the classifier that has the fewest non-zero coefficients • Classify the sample: if negative, reject; if positive, pass it to the next classifier that requires the computation of the fewest additional features. Continue until all K classifiers have run (see the sketch below)
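A sketch of the cascade evaluation under these rules; ordering stages purely by non-zero count is a simplification of the slide's "fewest additional features" rule.

```python
import numpy as np

def cascade_predict(alphas, b, x):
    """Run the K stages sparsest-first; reject on the first negative score.
    Because the final label is the AND of all stages, the order affects
    only the expected cost, never the outcome."""
    order = np.argsort([np.count_nonzero(a) for a in alphas])
    for k in order:
        if alphas[k] @ x + b[k] <= 0:
            return -1               # rejected: later stages need not run
    return 1                        # accepted by every stage
```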

  21. Experiments – Automatic Polyp Detection Data • 98 numerical image features are computed • Out of 1249 negatives, 177 are annotated • 9 subclasses are identified

  22. ROC plots

  23. Run-time Performance • 25% gain in execution time over SVDD and RBF-SVM

  24. Conclusions • Polyhedral classifier for multi-mode data • AND framework when subclass information is fully available • AND-OR framework when subclass information is partially available • Cascade design as a by-product to speed up online execution. Thank you! Questions and comments?
