
Pattern Recognition: Statistical and Neural


Presentation Transcript


  1. Nanjing University of Science & Technology. Pattern Recognition: Statistical and Neural. Lonnie C. Ludeman. Lecture 18, Oct 21, 2005.

  2. Lecture 18 Topics
     1. Example – Generalized Linear Discriminant Function
     2. Weight Space
     3. Potential Function Approach – 2-class case
     4. Potential Function Example – 2-class case
     5. Potential Function Algorithm – M-class case

  3. Classes Not Linearly Separable. [Figure: samples from C1 and C2 in the (x1, x2) plane; no straight line separates the two classes.] Q: How can we find decision boundaries? Answers: (1) Use generalized linear discriminant functions. (2) Use nonlinear discriminant functions.

  4. Example: Generalized Linear Discriminant Functions. Given samples from 2 classes. [Figure: the C1 and C2 training samples plotted in the (x1, x2) plane.]

  5. Find a generalized linear discriminant function that separates the classes. Solution: $d(\mathbf{x}) = w_1 f_1(\mathbf{x}) + w_2 f_2(\mathbf{x}) + w_3 f_3(\mathbf{x}) + w_4 f_4(\mathbf{x}) + w_5 f_5(\mathbf{x}) + w_6 f_6(\mathbf{x}) = \mathbf{w}^T \mathbf{f}(\mathbf{x})$, which is linear in the $\mathbf{f}$ space.

  6. where in the original pattern space the functions $f_k(\mathbf{x})$ are nonlinear in $\mathbf{x}$; a six-term basis consistent with the elliptical boundary obtained below is the quadratic basis $f_1 = x_1^2$, $f_2 = x_1 x_2$, $f_3 = x_2^2$, $f_4 = x_1$, $f_5 = x_2$, $f_6 = 1$. A sketch of this mapping follows.
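
A minimal sketch of the pattern-to-f-space mapping, assuming the quadratic basis named above (the basis is our assumption, not a stated choice of the lecture):

```python
import numpy as np

def f_map(x):
    """Map a 2-D pattern x = (x1, x2) into the 6-D f space.

    Assumes the quadratic basis (x1^2, x1*x2, x2^2, x1, x2, 1);
    any other admissible basis would be substituted here.
    """
    x1, x2 = x
    return np.array([x1**2, x1 * x2, x2**2, x1, x2, 1.0])

def d(w, x):
    """Generalized linear discriminant d(x) = w^T f(x)."""
    return w @ f_map(x)
```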

  7. Use the perceptron algorithm in the f space; the iterations follow. [Table of iterations: Iteration #, Sample, Action, Weights.] A sketch of the update rule appears below.
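
A minimal sketch of the fixed-increment perceptron rule run in the f space, reusing f_map from the sketch above (the learning rate, zero initialization, and pass limit are our assumptions; the slide's iteration table is not reproduced):

```python
import numpy as np

def perceptron_in_f_space(samples, labels, c=1.0, max_passes=100):
    """Fixed-increment perceptron in the f space.

    samples: 2-D patterns; labels: +1 for C1, -1 for C2.
    Whenever y * w^T f(x) <= 0 the sample is misclassified and the
    weights are corrected by w <- w + c * y * f(x).
    """
    w = np.zeros(6)                        # one weight per basis function
    for _ in range(max_passes):
        errors = 0
        for x, y in zip(samples, labels):
            if y * (w @ f_map(x)) <= 0:    # wrong side (or on the boundary)
                w = w + c * y * f_map(x)   # fixed-increment correction
                errors += 1
        if errors == 0:                    # a full clean pass: converged
            return w
    return w
```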

  8. [Iteration table continued: iterations continue while corrections occur and stop after a full error-free pass, fixing the final weights of d(x).]

  9. The discriminant function is as follows. [Equation: d(x) with the converged weights.] The decision boundary is the set d(x) = 0; putting it in standard form gives the boundary as an ellipse. [Equation: the ellipse in standard form.]

  10. Decision boundary in the original pattern space. [Figure: the C1 and C2 samples in the (x1, x2) plane with the elliptical boundary d(x) = 0.]

  11. Weight Space. To separate two pattern classes C1 and C2 by a hyperplane we must satisfy the conditions $\mathbf{w}^T\mathbf{x} > 0$ for all $\mathbf{x} \in C_1$ and $\mathbf{w}^T\mathbf{x} < 0$ for all $\mathbf{x} \in C_2$, where $\mathbf{w}^T\mathbf{x} = 0$ specifies the boundary between the classes.

  12. But we know that $\mathbf{w}^T\mathbf{x} = \mathbf{x}^T\mathbf{w}$, so the same conditions can be written in the weight space, with the samples as coefficients: $\mathbf{x}^T\mathbf{w} > 0$ for $\mathbf{x} \in C_1$ and $\mathbf{x}^T\mathbf{w} < 0$ for $\mathbf{x} \in C_2$. Each inequality gives a hyperplane boundary in the weight space such that weights on the positive side satisfy the inequality; a sketch of this feasibility test follows.
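
A small sketch of the weight-space view (the +1/-1 label encoding is our convention): each training sample carves out a half-space of admissible weights, and a weight vector separates the classes only if it lies in the intersection of all of them.

```python
import numpy as np

def in_solution_region(w, samples, labels):
    """Test whether w lies in the weight-space solution region.

    Each sample x with label y (+1 for C1, -1 for C2) contributes
    the half-space constraint y * x^T w > 0; a solution weight
    vector must satisfy every constraint simultaneously.
    """
    return all(y * (np.asarray(x) @ w) > 0
               for x, y in zip(samples, labels))
```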

  13. In the weight space. [Figure: the sample inequalities drawn as hyperplanes in the weight space.]

  14. View of the Perceptron algorithm in the weight space.

  15. Potential Function Approach – motivated by electromagnetic theory. [Figure: the sample space with positive charges (+) on samples from C1 and negative charges (−) on samples from C2.]

  16. Given samples x from two classes C1 and C2, with training sets $S_1 \subset C_1$ and $S_2 \subset C_2$, define the total potential function
$$K(\mathbf{x}) = \sum_{\mathbf{x}_k \in S_1} K(\mathbf{x}, \mathbf{x}_k) \;-\; \sum_{\mathbf{x}_k \in S_2} K(\mathbf{x}, \mathbf{x}_k)$$
The potential function decision boundary is $K(\mathbf{x}) = 0$. A sketch of this classifier follows.
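
A minimal sketch of the total potential function used as a classifier; the Gaussian kernel and its width alpha are assumptions standing in for whichever admissible potential function is chosen on the next slide:

```python
import numpy as np

def K_pair(x, xk, alpha=1.0):
    """Potential contributed at x by a stored sample xk.

    The Gaussian form exp(-alpha * ||x - xk||^2) is one standard
    choice; both the form and alpha are assumptions here.
    """
    diff = np.asarray(x, float) - np.asarray(xk, float)
    return np.exp(-alpha * (diff @ diff))

def K_total(x, S1, S2):
    """Total potential: C1 samples contribute positively, C2 negatively."""
    return (sum(K_pair(x, xk) for xk in S1)
            - sum(K_pair(x, xk) for xk in S2))

def classify(x, S1, S2):
    """Decide C1 when K(x) > 0, C2 when K(x) < 0; K(x) = 0 is the boundary."""
    return "C1" if K_total(x, S1, S2) > 0 else "C2"
```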

  17. Choices for the potential function $K(\mathbf{x}, \mathbf{x}_k)$; two common forms are shown below.
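
Two forms that appear widely in the potential-function literature (for instance in Tou and Gonzalez, cited on slide 25):

$$K(\mathbf{x}, \mathbf{x}_k) = e^{-\alpha \lVert \mathbf{x} - \mathbf{x}_k \rVert^2}, \qquad K(\mathbf{x}, \mathbf{x}_k) = \frac{1}{1 + \alpha \lVert \mathbf{x} - \mathbf{x}_k \rVert^2}$$

Both peak at $\mathbf{x} = \mathbf{x}_k$ and decay with distance, mimicking the potential of a point charge.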

  18. Graphs of potential functions. [Figure: each potential function peaks at $\mathbf{x}_k$ and decays with distance from it.]

  19. Example – using potential functions. Given the following patterns from two classes, find a nonlinear discriminant function, built from potential functions, that separates the classes. [Table: the given patterns from the two classes.]

  20. [Figure: plot of the samples from the two classes.]

  21. [Table: trace of the iterations of the potential function algorithm.]

  22. The algorithm converged in 1.75 passes through the data, giving the final discriminant function shown on the next slide. A sketch of the underlying training loop follows.
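
A minimal sketch of the two-class potential function training loop that such a trace records, reusing K_pair from the sketch above (the error-correction form and cyclic presentation order are our assumptions):

```python
def train_potential_2class(samples, labels, max_passes=10):
    """Two-class potential function algorithm (sketch).

    Start from K(x) = 0 and present samples cyclically.  When a
    sample xk with label y (+1 for C1, -1 for C2) falls on the wrong
    side of the current K, add the correcting term y * K_pair(., xk).
    """
    terms = []                              # accumulated (y, xk) corrections
    K = lambda x: sum(y * K_pair(x, xk) for y, xk in terms)
    for _ in range(max_passes):
        errors = 0
        for xk, y in zip(samples, labels):
            if y * K(xk) <= 0:              # misclassified (or on boundary)
                terms.append((y, xk))       # add a correcting potential
                errors += 1
        if errors == 0:                     # a clean pass: converged
            break
    return K                                # final discriminant K_FINAL(x)
```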

  23. [Equation: the final discriminant function $K_{\mathrm{FINAL}}(\mathbf{x})$, a signed sum of the potential functions retained during training.]

  24. [Figure: the decision boundary $K_{\mathrm{FINAL}}(\mathbf{x}) = 0$ plotted over the $(x_1, x_2)$ pattern space.]

  25. Potential Function Algorithm for M Classes. Reference (3): Tou and Gonzalez. A sketch of the M-class loop follows.
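
A minimal sketch of the M-class potential function algorithm in the style described by Tou and Gonzalez, reusing K_pair from above (the exact bookkeeping on the flow charts that follow may differ):

```python
def train_potential_mclass(samples, labels, M, max_passes=10):
    """M-class potential function algorithm (sketch).

    One cumulative potential K_i is kept per class.  A sample x of
    class i is correctly classified when K_i(x) exceeds every other
    K_j(x); otherwise K_i gains the term +K_pair(., x) and every
    offending K_j gains -K_pair(., x).
    """
    terms = [[] for _ in range(M)]          # per-class correction terms
    def K(i, x):
        return sum(s * K_pair(x, xk) for s, xk in terms[i])
    for _ in range(max_passes):
        errors = 0
        for x, i in zip(samples, labels):
            offenders = [j for j in range(M)
                         if j != i and K(j, x) >= K(i, x)]
            if offenders:
                terms[i].append((+1.0, x))  # reinforce the true class
                for j in offenders:
                    terms[j].append((-1.0, x))
                errors += 1
        if errors == 0:                     # a full pass with no errors
            break
    return K                                # K(i, x): potential for class i
```

A new pattern x is then assigned to the class with the largest K(i, x).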

  26. [Flow chart: the potential function method for the M-class case.]

  27. [Flow chart continued.]

  28. [Flow chart continued.]

  29. Summary
     1. Example – Generalized Linear Discriminant Function
     2. Weight Space
     3. Potential Function Approach – 2-class case
     4. Potential Function Example – 2-class case
     5. Potential Function Algorithm – M-class case

  30. End of Lecture 18
