
GEOMETRY IN PERCEPTRON LEARNING


Presentation Transcript


  1. GEOMETRY IN PERCEPTRON LEARNING Reference: GEOMETRY IN LEARNING, tech. report by KRISTIN P. BENNETT AND ERIN J. BREDENSTEINER, RENSSELAER POLYTECHNIC INSTITUTE IE 5970, SPRING 2000

  2. PRESENTATION OUTLINE 1. INTRODUCTION 2. A SIMPLE LEARNING MODEL: CONCEPT OF A PERCEPTRON 3. GEOMETRY OF A PERCEPTRON 4. TRAINING: LINEARLY SEPARABLE CASE 4.1. BASIC PROBLEM (PRIMAL-DUAL) 4.2. FIRST SIMPLIFICATION: THE MULTISURFACE METHOD (MSM) 4.3. SECOND SIMPLIFICATION: THE OPTIMAL PLANE 5. TRAINING: LINEARLY INSEPARABLE CASE 5.1. ROBUST LINEAR PROGRAMMING APPROACH (RLP) 5.2. COMBINATIONS OF MSM AND RLP: GENERALIZED OPTIMAL PLANE (GOP) 5.3. COMBINATIONS OF MSM AND RLP: PERTURBED ROBUST LINEAR PROGRAMMING (RLP-P) 6. APPLICATIONS 7. COMPUTATIONAL RESULTS 8. CONCLUSION 9. REMARKS

  3. 1. INTRODUCTION • CLASSIFICATION PROBLEM (TWO CLASSES A, B) • IDENTIFY ELEMENTS OF EACH CLASS (EXAMPLE: CANCER DIAGNOSIS, TUMOR BENIGN OR MALIGNANT?) • EACH ELEMENT OF CLASS A OR B IS DESCRIBED BY A VECTOR (N ATTRIBUTES, EXAMPLE: PATIENT’S AGE, BLOOD PRESSURE, SMOKING HABITS) • TRAINING PHASE: CLASS IS KNOWN, FUNCTION F(X) IS CONSTRUCTED • TESTING PHASE: CLASS IS NOT KNOWN, F(X) CLASSIFIES FUTURE POINTS

  4. 2. A SIMPLE LEARNING MODEL: CONCEPT OF A PERCEPTRON • PERCEPTRON IS TYPE OF CLASSIFICATION FUNCTION (MOTIVATED BIOLOGICALLY) • PERCEPTRON IS STIMULATED BY (n - DIMENSIONAL) INPUT VECTORS (n ATTRIBUTES, CAN BE SEEN AS COORDINATES)
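  To make the perceptron model concrete before the geometric discussion, here is a minimal sketch (not part of the original slides) of the perceptron as a threshold classification function f(x) = sign(w·x − γ). The weight vector w and threshold gamma are assumed to have been obtained already by one of the training formulations presented later; the specific numbers below are purely illustrative.

      import numpy as np

      def perceptron_classify(w, gamma, x):
          """Perceptron output for an n-dimensional input vector x:
          +1 (class A) if w.x - gamma > 0, otherwise -1 (class B)."""
          return 1 if np.dot(w, x) - gamma > 0 else -1

      # Illustrative use with two attributes and hypothetical weights.
      w, gamma = np.array([0.8, -0.5]), 0.3
      print(perceptron_classify(w, gamma, np.array([1.0, 0.2])))  # prints 1 (class A)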

  5. 2. A SIMPLE LEARNING MODEL: CONCEPT OF A PERCEPTRON

  6. 3. GEOMETRY OF A PERCEPTRON GEOMETRIC INTERPRETATION:
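  The geometric interpretation developed on slides 6-10 can be summarized as follows (a reconstruction in standard notation; w is the weight vector and γ the threshold, consistent with the perceptron model above):

      f(x) = \operatorname{sign}\!\left(w^{\top}x - \gamma\right), \qquad
      P = \{\, x \in \mathbb{R}^{n} : w^{\top}x = \gamma \,\}, \qquad
      w^{\top}x > \gamma \;\Rightarrow\; x \text{ assigned to class } A, \qquad
      w^{\top}x < \gamma \;\Rightarrow\; x \text{ assigned to class } B .

  The perceptron therefore corresponds to a plane P in attribute space with normal vector w and offset γ. Training amounts to positioning this plane so that the two classes fall into opposite halfspaces, and such a plane exists exactly when the convex hulls of the two training sets do not intersect.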

  7. 3. GEOMETRY OF A PERCEPTRON

  8. 3. GEOMETRY OF A PERCEPTRON

  9. 3. GEOMETRY OF A PERCEPTRON

  10. 3. GEOMETRY OF A PERCEPTRON

  11. 4. TRAINING: LINEARLY SEPARABLE CASE 4.1. BASIC PROBLEM (PRIMAL-DUAL) PRIMAL PROBLEM:
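  A plausible reconstruction of the primal problem in standard notation (an assumption here: the m rows of the matrix A and the k rows of the matrix B hold the training points of the two classes, and e denotes a vector of ones) is the problem of finding a plane that strictly separates the two sets:

      \text{find } w \in \mathbb{R}^{n},\ \gamma \in \mathbb{R}
      \quad \text{such that} \quad
      Aw \ge (\gamma + 1)\,e, \qquad Bw \le (\gamma - 1)\,e .

  Any feasible (w, γ) gives a perceptron that classifies every training point correctly.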

  12. 4. TRAINING: LINEARLY SEPARABLE CASE 4.1. BASIC PROBLEM (PRIMAL-DUAL) GRAPHICAL INTERPRETATION OF THE PRIMAL PROBLEM:

  13. 4. TRAINING: LINEARLY SEPARABLE CASE 4.1. BASIC PROBLEM (PRIMAL-DUAL) DUAL PROBLEM:
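  The dual problem is the geometric counterpart emphasized by Bennett and Bredensteiner: find the closest pair of points in the convex hulls of the two classes. A hedged reconstruction in the same notation (u and v are the convex-combination weights):

      \min_{u,\,v}\ \tfrac{1}{2}\,\bigl\| A^{\top}u - B^{\top}v \bigr\|^{2}
      \quad \text{s.t.} \quad
      e^{\top}u = 1, \quad e^{\top}v = 1, \quad u \ge 0, \quad v \ge 0 .

  The separating plane is orthogonal to the segment joining the two closest points and bisects it; the hulls are disjoint, and the sets separable, exactly when the optimal value is positive.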

  14. 4. TRAINING: LINEARLY SEPARABLE CASE 4.1. BASIC PROBLEM (PRIMAL-DUAL) GRAPHICAL INTERPRETATION OF THE DUAL PROBLEM:

  15. 4. TRAINING: LINEARLY SEPARABLE CASE 4.2. FIRST SIMPLIFICATION: THE MULTISURFACE METHOD (MSM)

  16. 4. TRAINING: LINEARLY SEPARABLE CASE 4.2. FIRST SIMPLIFICATION: THE MULTISURFACE METHOD (MSM)
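  One common linear-programming statement of the multisurface method, offered here only as an assumption about what these slides show, maximizes the gap between two parallel supporting planes while bounding the weights so the problem stays bounded:

      \max_{w,\,\alpha,\,\beta}\ \alpha - \beta
      \quad \text{s.t.} \quad
      Aw \ge \alpha\,e, \qquad Bw \le \beta\,e, \qquad -e \le w \le e .

  Points of class A lie on or above the plane w^T x = α, points of class B on or below w^T x = β, and the objective pushes the two parallel planes as far apart as possible.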

  17. 4. TRAINING: LINEARLY SEPARABLE CASE 4.3. SECOND SIMPLIFICATION: THE OPTIMAL PLANE
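  The optimal-plane formulation (again a reconstruction rather than a copy of the slide) is the classical maximum-margin problem: among all separating planes, pick the one whose bounding planes w^T x = γ + 1 and w^T x = γ − 1 are farthest apart, which is achieved by minimizing the norm of w:

      \min_{w,\,\gamma}\ \tfrac{1}{2}\,\| w \|^{2}
      \quad \text{s.t.} \quad
      Aw \ge (\gamma + 1)\,e, \qquad Bw \le (\gamma - 1)\,e ,

  with resulting margin 2 / ||w|| between the two bounding planes.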

  18. 5. TRAINING: LINEARLY INSEPARABLE CASE • General approach: minimize misclassification error • Start from the Multisurface Method (MSM)

  19. 5. TRAINING: LINEARLY INSEPARABLE CASE 5.1. ROBUST LINEAR PROGRAMMING APPROACH (RLP) • minimize the sum of the misclassification errors

  20. 5. TRAINING: LINEARLY INSEPARABLE CASE 5.1. ROBUST LINEAR PROGRAMMING APPROACH (RLP)
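  A hedged reconstruction of the RLP linear program of Bennett and Mangasarian, in the notation introduced above (y and z are nonnegative slack variables measuring how far each point lies on the wrong side of its bounding plane, and m and k are the sizes of the two classes):

      \min_{w,\,\gamma,\,y,\,z}\ \frac{1}{m}\,e^{\top}y + \frac{1}{k}\,e^{\top}z
      \quad \text{s.t.} \quad
      Aw + y \ge (\gamma + 1)\,e, \qquad Bw - z \le (\gamma - 1)\,e, \qquad y \ge 0,\ z \ge 0 .

  The objective is the average violation per class; weighting each class by its own size is what makes the formulation robust, since it discourages the trivial solution w = 0.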

  21. 5. TRAINING: LINEARLY INSEPARABLE CASE 5.2. COMBINATIONS OF MSM AND RLP: GENERALIZED OPTIMAL PLANE (GOP)
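  A hedged sketch of the GOP model: it blends the error term of RLP with the margin term of the optimal plane through a trade-off parameter λ between 0 and 1 (the same parameter whose selection is flagged as important in the remarks slide):

      \min_{w,\,\gamma,\,y,\,z}\ (1-\lambda)\,\tfrac{1}{2}\,\|w\|^{2}
      + \lambda\left( \frac{1}{m}\,e^{\top}y + \frac{1}{k}\,e^{\top}z \right)
      \quad \text{s.t.} \quad
      Aw + y \ge (\gamma + 1)\,e, \qquad Bw - z \le (\gamma - 1)\,e, \qquad y \ge 0,\ z \ge 0 .

  Choosing λ close to 1 emphasizes the misclassification error (RLP-like behavior), while λ close to 0 emphasizes the margin term.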

  22. 5. TRAINING: LINEARLY INSEPARABLE CASE 5.3. COMBINATIONS OF MSM AND RLP: PERTURBED ROBUST LINEAR PROGRAMMING (RLP-P)
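  RLP-P keeps the RLP objective but adds a small perturbation term on the weights, so the model still favors a wide separation while remaining a linear program; a hedged sketch using the 1-norm of w and a small λ > 0:

      \min_{w,\,\gamma,\,y,\,z}\ \frac{1}{m}\,e^{\top}y + \frac{1}{k}\,e^{\top}z + \lambda\,\|w\|_{1}
      \quad \text{s.t.} \quad
      Aw + y \ge (\gamma + 1)\,e, \qquad Bw - z \le (\gamma - 1)\,e, \qquad y \ge 0,\ z \ge 0 .

  The 1-norm term can be rewritten with auxiliary bound variables, which keeps the whole model solvable by an ordinary linear-programming solver.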

  23. 6. APPLICATIONS • Determination of heart diseases • Diagnosis of breast cancer • Voting patterns of congressmen to determine party affiliation • Using sonar signals to distinguish between mines and rocks

  24. 7. COMPUTATIONAL RESULTS USING MINOS (The sonar data set is completely separable.)

  25. 8. CONCLUSION • Perceptron classifies points from two sets • Completely correct classification of the training data is only possible in the linearly separable case • Optimization models for training a perceptron were presented • Experiments showed best performance for Generalized Optimal Plane (GOP) and Perturbed Robust Linear Programming (RLP-P)

  26. 9. REMARKS • Interesting connection between learning processes and geometry • Limitation to two classes • For RLP-P and GOP, the right selection of the parameter lambda is very important
