
An introduction to automatic classification Andy French February 2010


Presentation Transcript


  1. An introduction to automatic classification Andy French February 2010

  2. Target recognition “at a glance” “One of the most potent of human skills is the ability to rapidly recognize and classify environmental stimuli, often when such signals are severely corrupted. Of this toolkit of sensors and processing, the method of visual facial recognition is perhaps the most impressive. Typically, a successful recognition (i.e. a name attached) will occur in 120 ms, with cruder classifications (for example classification of a species group from a background) in as little as 50ms.” Can a machine be built with this level of performance?

  3. This ‘introduction’ is but a glance at a large and active research area! For a much more complete introduction see: Duda, R.O., Hart, P.E., Stork, D.G., Pattern Classification, 2nd Edition, John Wiley & Sons Inc., 2001; Webb, A., Statistical Pattern Recognition, 2nd Edition, John Wiley & Sons Ltd., 2002. Let us start in a similar fashion to Duda, with a practical example of a classification problem…

  4. A problem of cat classification… There are many cats out there. How can I be sure to let the right one in…? [Image: “Dorset Big Cat”]

  5. Mechanized Entrance Test Of Cats. [Feature-space plot: axes ‘Roar’ vs. ‘Hairiness’]

  6. [Classifier block diagram: feature measurements → classifier (discriminant functions) → class label]
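  The structure on this slide (feature measurements in, one discriminant function per class, class label out) can be sketched in a few lines of Python; the helper name classify and the toy ‘cat’ discriminants below are illustrative assumptions, not taken from the slides.

    from typing import Callable, Sequence

    def classify(features, discriminants: Sequence[Callable]) -> int:
        """Generic classifier: evaluate one discriminant function g_i(x) per class
        and return the index (class label) of the largest."""
        scores = [g(features) for g in discriminants]
        return max(range(len(scores)), key=scores.__getitem__)

    # Toy 'cat entrance test' over the (roar, hairiness) feature space of slide 5
    g_my_cat  = lambda f: -((f[0] - 2.0) ** 2 + (f[1] - 8.0) ** 2)  # quiet, very hairy
    g_big_cat = lambda f: -((f[0] - 9.0) ** 2 + (f[1] - 5.0) ** 2)  # loud, less hairy
    print(classify((2.5, 7.5), [g_my_cat, g_big_cat]))              # -> 0 (my cat)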

  7. [Feature-space plot: decision boundary, axes ‘Roar’ vs. ‘Hairiness’]

  8. Radar target classification: NCTR (Non-Cooperative Target Recognition) with MESAR2. [Illustration labelled ‘Start’ and ‘Finish’]

  9. Feature extraction: radar length. [Plot: inbound Falcon jet aircraft, with length threshold marked]

  10. Doppler processing: Jet Engine Modulation (JEM). [Doppler spectrum showing JEM lines]

  11. (Aside) Doppler processing: propeller modulation. [Doppler spectrum for a Dash 8 six-blade propeller aircraft; 32-pulse, 32-frequency-step waveform at 2.5 kHz PRF]

  12. Feature extraction: Doppler spectra. [Plot: inbound Boeing 777 jet aircraft]

  13. Aim: design a classifier based upon measured feature statistics. Example #1: a parametric (Gaussian) classifier, i.e. the feature data is assumed to follow a Gaussian distribution, characterized by mean and covariance parameters.

  14. Gaussian distribution of feature vectors x, given class ω_i, with mean μ_i and covariance Σ_i:

    p(\mathbf{x} \mid \omega_i) = \frac{1}{(2\pi)^{M/2}\,|\Sigma_i|^{1/2}} \exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{T}\Sigma_i^{-1}(\mathbf{x}-\boldsymbol{\mu}_i)\right)

  Apply Bayes’ theorem to determine the posterior probability, which will be proportional to our desired discriminant function:

    P(\omega_i \mid \mathbf{x}) = \frac{p(\mathbf{x} \mid \omega_i)\,P(\omega_i)}{p(\mathbf{x})} \qquad \text{(posterior} \propto \text{likelihood} \times \text{prior)}

  Voilà! The Gaussian classifier. But how do we compute the mean and covariance from training data?
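  Taking the logarithm of the unnormalised posterior (dropping the class-independent evidence p(x)) gives the usual quadratic discriminant; the expansion below is the standard textbook form rather than a transcription of the slide:

    g_i(\mathbf{x}) = \ln p(\mathbf{x}\mid\omega_i) + \ln P(\omega_i) = -\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{T}\Sigma_i^{-1}(\mathbf{x}-\boldsymbol{\mu}_i) - \tfrac{1}{2}\ln|\Sigma_i| - \tfrac{M}{2}\ln(2\pi) + \ln P(\omega_i)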

  15. Sample estimates from the N_i training vectors x_t of class ω_i, where M is the number of features (the dimension of x):

    Sample mean: \hat{\boldsymbol{\mu}}_i = \frac{1}{N_i}\sum_{t=1}^{N_i}\mathbf{x}_t

    Sample covariance: \hat{\Sigma}_i = \frac{1}{N_i}\sum_{t=1}^{N_i}(\mathbf{x}_t-\hat{\boldsymbol{\mu}}_i)(\mathbf{x}_t-\hat{\boldsymbol{\mu}}_i)^{T} \quad \text{(or } \tfrac{1}{N_i-1} \text{ for the unbiased estimate)}
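  Putting slides 13-15 together, a minimal NumPy sketch of the parametric (Gaussian) classifier; the function names, the two-class feature data and the equal priors are illustrative assumptions, not taken from the presentation.

    import numpy as np

    def fit_class_stats(X):
        """Sample mean and (maximum-likelihood) sample covariance for one class.
        X has shape (N_i, M): N_i training feature vectors of dimension M."""
        mu = X.mean(axis=0)
        diff = X - mu
        sigma = diff.T @ diff / X.shape[0]
        return mu, sigma

    def gaussian_log_discriminant(x, mu, sigma, prior):
        """g_i(x) = ln p(x | class i) + ln P(class i), dropping the evidence term."""
        M = len(mu)
        diff = x - mu
        _, logdet = np.linalg.slogdet(sigma)
        maha = diff @ np.linalg.solve(sigma, diff)      # squared Mahalanobis distance
        return -0.5 * maha - 0.5 * logdet - 0.5 * M * np.log(2 * np.pi) + np.log(prior)

    def classify(x, class_stats, priors):
        """Assign x to the class whose discriminant is largest."""
        scores = [gaussian_log_discriminant(x, mu, sig, p)
                  for (mu, sig), p in zip(class_stats, priors)]
        return int(np.argmax(scores))

    # Illustrative two-class example with made-up (length, Doppler-fraction) features
    rng = np.random.default_rng(0)
    jets  = rng.normal([30.0, 0.8], [3.0, 0.1], size=(100, 2))
    props = rng.normal([20.0, 0.3], [3.0, 0.1], size=(100, 2))
    stats = [fit_class_stats(jets), fit_class_stats(props)]
    print(classify(np.array([29.0, 0.75]), stats, priors=[0.5, 0.5]))   # -> 0 (jet-like)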

  16. (Bayesian) Friedman regularized discriminant function: the Bayesian FRD classifier, computed for the TRAINING vectors t, using the sample within-class covariance and the sample between-class covariance.
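  The slide’s Bayesian FRD expression is not reproduced in the transcript; for orientation, the sample within-class and between-class covariances it refers to are conventionally defined as follows (standard definitions, not transcribed from the slide), over N training vectors with class sample means \hat{\boldsymbol{\mu}}_i and overall mean \hat{\boldsymbol{\mu}}:

    \hat{\Sigma}_W = \frac{1}{N}\sum_i\sum_{t\in\omega_i}(\mathbf{x}_t-\hat{\boldsymbol{\mu}}_i)(\mathbf{x}_t-\hat{\boldsymbol{\mu}}_i)^{T}, \qquad \hat{\Sigma}_B = \frac{1}{N}\sum_i N_i\,(\hat{\boldsymbol{\mu}}_i-\hat{\boldsymbol{\mu}})(\hat{\boldsymbol{\mu}}_i-\hat{\boldsymbol{\mu}})^{T}

  Friedman-style regularization then shrinks each class covariance towards a pooled term, e.g. \hat{\Sigma}_i(\lambda) = (1-\lambda)\hat{\Sigma}_i + \lambda\hat{\Sigma}_W; this is a simplified statement of Friedman (1989), not necessarily the Bayesian variant used on the slide.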

  17. Example #2: a k-means non-parametric classifier

  18. k-means classification does not assume an a priori feature distribution. Instead, the k-means clustering algorithm is used to automatically group the training data for class i into K clusters, described by a (binary) cluster membership matrix U. Start with random assignments! Each iteration then computes the centres of the cluster hyperspheres, the distances between the training feature vectors and those centres, and updates U by assigning each training feature vector to its nearest hypersphere centre; the radii of the cluster hyperspheres follow from the final assignments (see the sketch below).
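  A minimal sketch of the clustering loop described above, assuming Euclidean distance and NumPy; the helper name kmeans and the empty-cluster fallback are illustrative choices, not from the presentation.

    import numpy as np

    def kmeans(X, K, n_iter=100, seed=0):
        """Basic k-means for one class's training data X (shape (N, M)).
        Returns the cluster centres, a binary (N, K) membership matrix U and the
        hypersphere radii. Starts from random assignments, as on the slide."""
        rng = np.random.default_rng(seed)
        N = X.shape[0]
        labels = rng.integers(0, K, size=N)                 # random initial assignments
        for _ in range(n_iter):
            # Centres of the cluster hyperspheres (random point if a cluster is empty)
            centres = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                                else X[rng.integers(N)] for k in range(K)])
            # Distances between training vectors and cluster centres
            d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
            new_labels = d.argmin(axis=1)                   # nearest centre wins
            if np.array_equal(new_labels, labels):
                break                                       # memberships stable: done
            labels = new_labels
        U = np.eye(K, dtype=int)[labels]                    # binary membership matrix
        radii = np.array([d[labels == k, k].max() if np.any(labels == k) else 0.0
                          for k in range(K)])               # radii of the hyperspheres
        return centres, U, radii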

  19. K-means classifier discriminant function; alternatively, a “fuzzy” membership matrix can be used in place of the binary one.
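  The transcript does not give the fuzzy membership expression; a common choice is the fuzzy c-means membership (shown here as an assumed illustration), where c_k are the cluster centres and m > 1 is the “fuzzifier”:

    u_{kt} = \left[\sum_{j=1}^{K}\left(\frac{\lVert\mathbf{x}_t-\mathbf{c}_k\rVert}{\lVert\mathbf{x}_t-\mathbf{c}_j\rVert}\right)^{2/(m-1)}\right]^{-1}

  so that each training vector’s memberships sum to one over the K clusters.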

  20. Radar example: Gaussian and fuzzy-logic classification methods employed. Define the membership function (an illustrative example is sketched below).
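  The actual membership functions used with the MESAR2 data are not given in the transcript; the trapezoidal function below is a hypothetical example of how a fuzzy-logic membership over a radar feature (here, length in metres) might be defined.

    import numpy as np

    def trapezoidal_membership(x, a, b, c, d):
        """Hypothetical trapezoidal membership: 0 below a, rising to 1 between a and b,
        flat at 1 until c, falling back to 0 at d."""
        x = np.asarray(x, dtype=float)
        rise = np.clip((x - a) / (b - a), 0.0, 1.0)
        fall = np.clip((d - x) / (d - c), 0.0, 1.0)
        return np.minimum(rise, fall)

    # e.g. membership of a measured radar length (metres) in a notional 'Large' class
    print(trapezoidal_membership([12.0, 25.0, 45.0], a=15.0, b=20.0, c=35.0, d=40.0))
    # -> [0. 1. 0.]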

  21. Radar target classification: truth assignments

  22. Radar length-feature-based classification

  23. The Confusion matrix and its ‘off-diagonal-extent’?
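  The transcript does not define the ‘off-diagonal-extent’ precisely; the sketch below builds a confusion matrix and one plausible off-diagonal measure (errors weighted by their distance from the diagonal), as an assumed illustration.

    import numpy as np

    def confusion_matrix(truth, assigned, n_classes):
        """Rows = true class, columns = assigned class."""
        C = np.zeros((n_classes, n_classes), dtype=int)
        for t, a in zip(truth, assigned):
            C[t, a] += 1
        return C

    def off_diagonal_extent(C):
        """One plausible 'off-diagonal extent': misclassifications weighted by how far
        they fall from the diagonal, normalised by the total number of assignments."""
        n = C.shape[0]
        weights = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])   # |row - col|
        return (C * weights).sum() / max(C.sum(), 1)

    truth    = [0, 0, 1, 1, 2, 2, 2, 3]     # e.g. four length classes VS, S, L, VL
    assigned = [0, 1, 1, 1, 2, 3, 2, 3]
    C = confusion_matrix(truth, assigned, n_classes=4)
    print(C, off_diagonal_extent(C))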

  24. Prop, JEM or No-Non-Skin-Doppler (NNSD) classification using the “Doppler fraction” feature. The classes are visually separable.

  25. Confusion matrix for Prop, JEM or No-Non-Skin-Doppler (NNSD) classification

  26. Four length classes: VS, S, L, VL. Classification performance vs. length threshold and Q.

  27. Classification performance vs. dfrac (Doppler fraction) threshold and P.

  28. Classification based on combined length & dfrac features

  29. Classification performance vs. frequency jitter. Maximum P and Q used for all waveforms.

  30. Any questions?
