
Lecture 20 Object recognition I



  1. Lecture 20 Object recognition I
  • Pattern and pattern classes
  • Classifiers based on Bayes Decision Theory
  • Recognition based on decision-theoretical methods
  • Optimum statistical classifiers
  • Pattern recognition with Matlab

  2. Patterns and Pattern classes
  • A pattern is an arrangement of descriptors (features).
  • Three commonly used pattern arrangements: vectors, strings, and trees.
  • A pattern class is a family of patterns that share some common properties.
  • Pattern recognition assigns a given pattern to its respective class.

  3. Example 1 Represent flower petals by the features width and length. The three types of iris flowers then fall into different pattern classes.
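In Matlab terms, each flower becomes a point in the feature plane; a minimal sketch (the numbers are illustrative, not taken from the slide):

x = [1.4; 0.2];   % pattern vector: [petal length; petal width] in cm (made-up values)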

  4. Example 2 Use a signature as the pattern vector.

  5. Example 3 Represent a pattern by a string.

  6. Example 4 Represent a pattern by a tree.

  7. 2. Classifiers based on Bayesian Decision Theory
  • A fundamental statistical approach.
  • Assumes the relevant probabilities are known; compute the probability of the observed event, then make the optimal decision.
  • Bayes’ Theorem: P(A|B) = P(B|A)P(A) / P(B).
  • Example: Suppose that at Laurier, 50% of students are girls and 30% are science students, and among science students, 20% are girls. If one meets a girl student at Laurier, what is the probability that she is a science student? Let B – girl students, A – science students. Then
  P(A|B) = P(B|A)P(A) / P(B) = (0.2 × 0.3) / 0.5 = 0.12.
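A quick numeric check of the example in Matlab (the variable names are ours):

P_A = 0.3;                              % P(science student)
P_B = 0.5;                              % P(girl student)
P_B_given_A = 0.2;                      % P(girl | science)
P_A_given_B = P_B_given_A * P_A / P_B   % Bayes' theorem: 0.12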

  8. Bayes theory Given x ∈ Rl and a set of classes ωi, i = 1, 2, . . . , c, the Bayes theory states that
  P(ωi|x) = p(x|ωi)P(ωi) / p(x), with p(x) = Σj p(x|ωj)P(ωj),
  where P(ωi) is the a priori probability of class ωi, i = 1, 2, . . . , c; P(ωi|x) is the a posteriori probability of class ωi given the value of x; p(x) is the probability density function (pdf) of x; and p(x|ωi), i = 1, 2, . . . , c, is the class-conditional pdf of x given ωi (sometimes called the likelihood of ωi with respect to x).

  9. Bayes classifier Let x ≡ [x(1), x(2), . . . , x(l)]T ∈ Rl be the feature vector corresponding to a pattern, which results from some measurements. Also, let the number of possible classes be equal to c, that is, ω1, . . . , ωc. Bayes decision theory: x is assigned to the class ωi if
  P(ωi|x) > P(ωj|x) for every j ≠ i,
  or equivalently, since p(x) is common to all classes, if
  p(x|ωi)P(ωi) > p(x|ωj)P(ωj) for every j ≠ i.

  10. Multidimensional Gaussian PDF For x ∈ Rl with mean vector m and covariance matrix S,
  p(x) = (1 / ((2π)^(l/2) |S|^(1/2))) exp(−(1/2)(x − m)T S−1 (x − m)).

  11. Example Consider a 2-class classification task in the 2-dimensional space, where the data in both classes, ω1, ω2, are distributed according to the Gaussian distributions N(m1,S1) and N(m2,S2), respectively. Let
  m1 = [1, 1]T, m2 = [3, 3]T, S1 = S2 = I (the 2 × 2 identity matrix).
  Assuming that P(ω1) = P(ω2) = 1/2, classify x = [1.8, 1.8]T into ω1 or ω2.

  12. Solution
  P1=0.5; P2=0.5;                      % a priori probabilities
  m1=[1 1]'; m2=[3 3]'; S=eye(2);      % class means and common covariance
  x=[1.8 1.8]';                        % the point to classify
  p1=P1*comp_gauss_dens_val(m1,S,x);   % P(w1)p(x|w1)
  p2=P2*comp_gauss_dens_val(m2,S,x);   % P(w2)p(x|w2)
  The resulting values are p1 = 0.042 and p2 = 0.0189. Since p1 > p2, the Bayesian classifier assigns x to ω1.
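Note that comp_gauss_dens_val is not a built-in Matlab function but comes with the course's pattern-recognition toolkit. If it is unavailable, a minimal sketch consistent with its use above is:

function p = comp_gauss_dens_val(m, S, x)
% Value of the multivariate Gaussian pdf N(m, S) at the point x
% (a re-implementation sketch, not the original toolkit code).
l = length(m);                       % dimensionality
d = x - m;                           % deviation from the mean
p = exp(-0.5 * d' * (S \ d)) / ((2*pi)^(l/2) * sqrt(det(S)));
end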

  13. Decision-theoretic methods
  • Decision (discriminant) functions: find c functions d1(x), . . . , dc(x) such that x is assigned to class ωi if di(x) > dj(x) for all j ≠ i.
  • Decision boundary: the boundary between classes ωi and ωj is the locus where di(x) = dj(x), i.e. di(x) − dj(x) = 0.

  14. Minimum distance classifier Compute the mean vector of each class, mj = (1/Nj) Σx∈ωj x, and assign x to the class whose mean is closest, i.e. the one minimizing the Euclidean distance Dj(x) = ||x − mj||.

  15. Example

  16. Minimum Mahalanobis distance classifier Assign x to the class whose mean mj minimizes the Mahalanobis distance dj(x) = sqrt((x − mj)T S−1 (x − mj)), where S is the common covariance matrix.

  17. Example
  x=[0.1 0.5 0.1]';
  m1=[0 0 0]'; m2=[0.5 0.5 0.5]';
  m=[m1 m2];
  z1=euclidean_classifier(m,x)                      % minimum Euclidean distance
  S=[0.8 0.01 0.01; 0.01 0.2 0.01; 0.01 0.01 0.2];
  z2=mahalanobis_classifier(m,S,x)                  % minimum Mahalanobis distance
  The results are z1 = 1 and z2 = 2: the Euclidean classifier assigns x to ω1, while the Mahalanobis classifier, which accounts for the covariance S, assigns it to ω2.
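Like comp_gauss_dens_val, the functions euclidean_classifier and mahalanobis_classifier belong to the course toolkit. Minimal sketches consistent with their use above (each would live in its own .m file):

function z = euclidean_classifier(m, x)
% Index of the class mean (column of m) closest to x in Euclidean distance.
c = size(m, 2);
d = zeros(c, 1);
for j = 1:c
    d(j) = norm(x - m(:, j));       % distance to the j-th class mean
end
[~, z] = min(d);
end

function z = mahalanobis_classifier(m, S, x)
% Index of the class mean minimizing the Mahalanobis distance under
% the common covariance matrix S.
c = size(m, 2);
d = zeros(c, 1);
for j = 1:c
    dev = x - m(:, j);
    d(j) = sqrt(dev' * (S \ dev));  % Mahalanobis distance to the j-th mean
end
[~, z] = min(d);
end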

  18. 4. Matching by correlation Given a template w(s,t) (or mask), i.e. an m × n matrix, find the m × n subimage of f(x,y) that best matches w, i.e. the one with the largest correlation.

  19. Correlation theorem Correlation in the spatial domain corresponds to multiplication by the complex conjugate in the frequency domain: the DFT of the correlation of w and f is W*(u,v)F(u,v). The correlation surface can therefore be computed with FFTs:
  [M, N] = size(f);
  f = fft2(f);               % transform the image
  w = conj(fft2(w, M, N));   % conjugate transform of the zero-padded template
  g = real(ifft2(w.*f));     % inverse transform gives the correlation
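The location of the best match (an assumed usage pattern, not shown on the slide) is the peak of the correlation surface g:

[~, idx] = max(g(:));                % largest correlation value
[row, col] = ind2sub(size(g), idx);  % position where w best matches f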

  20. Example

  21. Case study: optical character recognition (OCR)
  • Preprocessing: digitization and binarization; noise elimination, thinning, normalization
  • Feature extraction (by character, word part, or word): segmentation (explicit or implicit); detection of major features (top-down approach)
  • Matching: recognition of characters; context verification from a knowledge base
  • Understanding and action
  • See the reference

  22. Example

  23. 3. Optimum statistical classifiers

  24. Bayes classifier for Gaussian pattern classes Consider two pattern classes with Gaussian distributions; the Bayes decision functions are dj(x) = p(x|ωj)P(ωj), j = 1, 2.

  25. N-dimensional case For n-dimensional Gaussian classes the decision function is usually taken in logarithmic form: dj(x) = ln P(ωj) − (1/2) ln |Cj| − (1/2)(x − mj)T Cj−1 (x − mj), where mj and Cj are the mean vector and covariance matrix of class ωj.
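This decision function is straightforward to evaluate in Matlab; a one-line sketch (the names are ours, and the mean m, covariance C, and prior P must be supplied):

% d_j(x) for class mean m, covariance C, and prior P (illustrative sketch)
d = @(x, m, C, P) log(P) - 0.5*log(det(C)) - 0.5*(x - m)' * (C \ (x - m));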

  26. Example

  27. A real example

  28. Linear classifier
  • Two classes.
  • f(x) = wTx + w0 = 0 defines the separating hyperplane.
  • How to obtain the coefficients, or weights wi? By the perceptron algorithm (a sketch follows slide 30).

  29. How to obtain the coefficients, or weights wi

  30. The Online Form of the Perceptron Algorithm Present the training vectors one at a time, cycling through the set; whenever the current weight vector misclassifies a vector x with label y, update w ← w + ρyx, and repeat until all vectors are classified correctly.
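A minimal Matlab sketch of the online perceptron under the usual conventions (training vectors as columns of X in extended form [x; 1], labels y in {−1, +1}, learning rate rho; all names are ours):

function w = perceptron_online(X, y, rho, n_epochs)
% Online (pattern-by-pattern) perceptron training sketch.
[l, N] = size(X);                          % l features, N samples
w = zeros(l, 1);                           % initial weight vector
for epoch = 1:n_epochs
    for i = 1:N
        if y(i) * (w' * X(:, i)) <= 0      % misclassified (or on the boundary)
            w = w + rho * y(i) * X(:, i);  % move the hyperplane toward x_i
        end
    end
end
end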

  31. The Multiclass LS Classifier The classification rule is now as follows: given x, classify it to class ωi if wiTx > wjTx for all j ≠ i, where the weight vectors wj are estimated by least squares.
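A compact Matlab sketch of the least-squares weights (X is l × N with one extended training vector per column; Y is c × N with Y(i,n) = 1 when sample n belongs to ωi and 0 otherwise; the names are ours):

W = Y * pinv(X);            % least-squares weight matrix; row i is wi'
[~, label] = max(W * x);    % classify a new extended vector x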
