
Pattern Recognition Concepts




Presentation Transcript


  1. Pattern Recognition Concepts (CSE803, Fall 2013) • Chapter 4: Shapiro and Stockman • How should objects be represented? • Algorithms for recognition/matching: nearest neighbors, decision trees, decision functions, artificial neural networks • How should learning/training be done?

  2. Feature Vector Representation • x = [x1, x2, … , xn], where each xj is a real number • xj may be an object measurement • xj may be a count of object parts • Example object representation: [#holes, area, moments, …]
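A minimal sketch of this idea in Python. The particular measurements below (area and second central moments of a binary region mask) are illustrative choices, and the hole count is left as a placeholder; none of these helpers come from the chapter itself:

```python
import numpy as np

def feature_vector(region_mask: np.ndarray) -> np.ndarray:
    """Build a feature vector [#holes, area, mu20, mu02] for a binary
    region mask. The individual measurements are illustrative stand-ins."""
    area = float(region_mask.sum())       # number of object pixels
    ys, xs = np.nonzero(region_mask)
    cy, cx = ys.mean(), xs.mean()         # region centroid
    mu20 = ((xs - cx) ** 2).mean()        # second central moments:
    mu02 = ((ys - cy) ** 2).mean()        # simple shape descriptors
    holes = 0.0  # placeholder: a real system would count holes topologically
    return np.array([holes, area, mu20, mu02])
```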

  3. Possible features for character recognition

  4. Some Terminology • Classes: a set of m known classes of objects; (a) we might have a known description for each, or (b) a set of samples for each • Reject class: a generic class for objects not in any of the designated known classes • Classifier: assigns an object to a class based on its features

  5. Classification paradigms

  6. Discriminant functions • Functions f(x, K) perform some computation on the feature vector x • The knowledge K comes from training or programming • A final stage compares the function outputs and assigns the class
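A hedged sketch of the scheme: one discriminant per class scores the feature vector, and the highest-scoring class wins. The linear form chosen here is an assumption for illustration; the slide only requires some computation f(x, K):

```python
import numpy as np

def classify(x: np.ndarray, weights: np.ndarray, biases: np.ndarray) -> int:
    """Assign x to the class whose linear discriminant
    f_i(x) = w_i . x + b_i is largest. The knowledge K is the
    (weights, biases) pair, obtained from training or programming."""
    scores = weights @ x + biases    # one score per class
    return int(np.argmax(scores))    # final stage: pick the best score
```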

  7. Decision-Tree Classifier • Uses subsets of features in sequence • Feature extraction may be interleaved with classification decisions • Can be easy to design and efficient in execution (a code sketch follows the tree figure on the next slide)

  8. Decision Trees • (Figure: an example decision tree for character recognition. The root tests #holes with branches 0, 1, and 2; deeper nodes test the moment of inertia against a threshold t, the number of strokes, and the best axis direction (0°, 60°, 90°); the nine leaves are the characters -, /, 1, x, w, 0, A, 8, and B.)
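The transcript does not preserve which branch leads to which leaf, so the sketch below is an assumed reconstruction of such a tree: the features (holes, strokes, moment of inertia, axis direction) come from the figure, but the branch structure and the threshold t are guesses made for illustration:

```python
def classify_char(holes: int, strokes: int, moment: float,
                  axis_deg: float, t: float = 1.0) -> str:
    """Illustrative decision tree over hand-designed character features.
    The branch structure is an assumed reconstruction, not the figure's."""
    if holes == 2:                      # two holes: 8 vs B
        return '8' if strokes == 0 else 'B'
    if holes == 1:                      # one hole: 0 vs A
        return '0' if strokes == 0 else 'A'
    # no holes: thin marks, separated by moment, stroke count, orientation
    if moment < t:
        return '-' if axis_deg == 0 else '/'
    if strokes == 1:
        return '1' if axis_deg == 90 else '/'
    return 'x' if strokes == 2 else 'w'
```

Note how feature extraction is interleaved with the decisions: strokes and axis direction only need to be computed on the branches that test them.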

  9. Classification using nearest class mean • Compute the Euclidean distance between the feature vector X and the mean of each class • Choose the closest class, if it is close enough (otherwise reject) • The figure at left shows a case with a low error rate
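A minimal sketch of this classifier, including the reject option from the terminology slide; the reject_dist threshold is an assumed parameter, not a value from the text:

```python
import numpy as np

def nearest_mean_classify(x, class_means, reject_dist=None):
    """Assign x to the class whose mean vector is nearest in
    Euclidean distance. Returns -1 (the reject class) if even the
    best distance exceeds reject_dist."""
    dists = [np.linalg.norm(x - m) for m in class_means]
    best = int(np.argmin(dists))
    if reject_dist is not None and dists[best] > reject_dist:
        return -1  # reject: not close enough to any known class
    return best
```

For the two-mode situation on the next slide, the same function can be run over subclass means, with each subclass mapped back to its parent class.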

  10. Nearest mean might yield poor results with complex structure • Class 2 has two modes • If the modes are detected, two subclass mean vectors can be used

  11. Scaling coordinates by standard deviation

  12. Another problem for nearest mean classification • If the features are unscaled, object X is equidistant from the two class means • With scaling by standard deviation, X is closer to the left distribution • The coordinate axes are not natural for this data • 1-D discrimination is possible with PCA
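A sketch of the scaling step, assuming the per-feature standard deviations are estimated from a training set:

```python
import numpy as np

def scaled_distance(x, mean, std):
    """Euclidean distance after dividing each coordinate by its
    standard deviation, so no single feature dominates the metric."""
    return np.linalg.norm((x - mean) / std)

# The scaling factors come from training data (rows = samples):
#   std = X_train.std(axis=0)
```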

  13. Receiver Operating Characteristic (ROC) curve • Plots the correct-detection rate against the false-alarm rate • Generally, false alarms go up with attempts to detect higher percentages of the known objects
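One common way to trace such a curve is to sweep a decision threshold over detector scores; a sketch, where the score and label arrays are assumed inputs (labels are 1 for known objects, 0 otherwise):

```python
import numpy as np

def roc_points(scores, labels, thresholds):
    """For each threshold, return a (false-alarm rate, detection rate)
    pair: lowering the threshold detects more objects but also
    raises the false-alarm rate."""
    scores, labels = np.asarray(scores), np.asarray(labels)
    pts = []
    for t in thresholds:
        detected = scores >= t
        det_rate = detected[labels == 1].mean()   # correct detections
        fa_rate = detected[labels == 0].mean()    # false alarms
        pts.append((fa_rate, det_rate))
    return pts
```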

  14. Confusion matrix shows empirical performance
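A minimal sketch of how such a matrix is tallied from labeled test data (the label encoding as integers 0..n_classes-1 is an assumption for illustration):

```python
import numpy as np

def confusion_matrix(true_labels, predicted_labels, n_classes):
    """Entry [i, j] counts objects of true class i assigned to class j;
    a perfect classifier puts all counts on the diagonal."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(true_labels, predicted_labels):
        cm[t, p] += 1
    return cm
```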

  15. Bayesian decision-making
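Bayesian decision-making assigns x to the class with the highest posterior probability p(c|x), which is proportional to p(x|c)·p(c). A generic sketch, assuming class-conditional densities and priors are available (for example from the fitted parametric models of slide 18):

```python
import numpy as np

def bayes_classify(x, class_densities, priors):
    """Pick the class maximizing p(c|x) ∝ p(x|c) * p(c).
    class_densities is a list of callables, one per class,
    each returning the likelihood p(x | class i)."""
    posteriors = [p_x_given_c(x) * prior
                  for p_x_given_c, prior in zip(class_densities, priors)]
    return int(np.argmax(posteriors))   # shared p(x) cancels in the argmax
```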

  16. Normal distribution • Zero mean and unit standard deviation • A table of this distribution lets us fit histograms and represent them simply • A new observation of the variable x can then be translated into a probability
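A sketch of how a new observation is translated through the standard normal, standardizing to a z-score and using the error function in place of a printed lookup table:

```python
import math

def standard_normal_cdf(z: float) -> float:
    """P(Z <= z) for the standard normal, via the error function
    (this replaces the table mentioned on the slide)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def observation_probability(x: float, mu: float, sigma: float) -> float:
    """Translate an observation x from a fitted N(mu, sigma^2)
    into the probability P(X <= x) via its z-score."""
    return standard_normal_cdf((x - mu) / sigma)
```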

  17. Cherry with bruise • Intensities at about 750 nanometers wavelength • Some overlap in the intensity distributions is caused by the cherry surface turning away from the light

  18. Parametric models
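As one concrete (assumed) instance of a parametric model: fit a 1-D Gaussian to each class's training samples and use the resulting density as the class-conditional likelihood in the Bayesian rule sketched above:

```python
import math

def fit_gaussian(samples):
    """Estimate the parameters (mean, std) of a 1-D Gaussian
    from a list of training samples."""
    n = len(samples)
    mu = sum(samples) / n
    var = sum((s - mu) ** 2 for s in samples) / (n - 1)
    return mu, math.sqrt(var)

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x, usable as p(x | class)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))
```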
