
Artificial Intelligence 9. Perceptron



  1. Artificial Intelligence 9. Perceptron Japan Advanced Institute of Science and Technology (JAIST) Yoshimasa Tsuruoka

  2. Outline • Feature space • Perceptrons • The averaged perceptron • Lecture slides • http://www.jaist.ac.jp/~tsuruoka/lectures/

  3. Feature space • Instances are represented by vectors in a feature space

  4. Feature space • Instances are represented by vectors in a feature space Positive example <Outlook = sunny, Temperature = cool, Humidity = normal> Negative example <Outlook = rain, Temperature = high, Humidity = high>
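As a concrete illustration of mapping the attribute-value instances above into a feature space, each attribute=value pair can become one binary feature. The attribute lists and function below are illustrative, not from the lecture:

```python
# Hypothetical binary encoding of the weather attributes from the slide:
# each attribute=value pair becomes one 0/1 feature.
ATTRIBUTES = [
    ("Outlook", ["sunny", "overcast", "rain"]),
    ("Temperature", ["cool", "mild", "high"]),
    ("Humidity", ["normal", "high"]),
]

def to_feature_vector(instance):
    """instance: dict mapping attribute name -> value; returns a 0/1 vector."""
    vec = []
    for attr, values in ATTRIBUTES:
        for v in values:
            vec.append(1.0 if instance.get(attr) == v else 0.0)
    return vec
```

For example, the positive instance from the slide becomes an 8-dimensional vector with three 1s, one per attribute.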

  5. Separating instances with a hyperplane • Find a hyperplane that separates the positive and negative examples

  6. Perceptron learning • Can always find such a hyperplane if the given examples are linearly separable

  7. Linear classification • Binary classification with a linear model: f(x) = +1 if w · x + b ≥ 0, and f(x) = −1 otherwise, where x is the feature vector of the instance, w is the weight vector, and b is the bias. If the inner product of the feature vector with the weight vector, plus the bias, is greater than or equal to zero, the instance is classified as a positive example; otherwise it is classified as a negative example.
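The decision rule can be sketched in a few lines of Python (the function and parameter names are illustrative):

```python
def classify(w, x, b):
    """Linear classification rule: +1 if w . x + b >= 0, else -1."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1
```

Note that a score of exactly zero is classified as positive, matching the "greater than or equal to zero" rule on the slide.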

  8. The Perceptron learning algorithm • 1. Initialize the weight vector: w ← 0 • 2. Choose an example (randomly) from the training data • 3. If it is not classified correctly: • If it is a positive example, w ← w + x • If it is a negative example, w ← w − x • Steps 2 and 3 are repeated until all examples are correctly classified.
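A minimal sketch of the algorithm in Python, folding the bias into the weights by prepending a constant 1 feature (the function and parameter names are my own, not from the lecture):

```python
import random

def train_perceptron(examples, max_epochs=100, seed=0):
    """examples: list of (feature_vector, label) with label in {+1, -1}.
    The bias is folded in as a constant first feature."""
    rng = random.Random(seed)
    dim = len(examples[0][0]) + 1
    w = [0.0] * dim  # step 1: initialize the weight vector to zero
    for _ in range(max_epochs):
        mistakes = 0
        order = list(range(len(examples)))
        rng.shuffle(order)  # step 2: pick examples in random order
        for i in order:
            x, y = examples[i]
            phi = [1.0] + list(x)
            score = sum(wi * xi for wi, xi in zip(w, phi))
            pred = 1 if score >= 0 else -1
            if pred != y:  # step 3: update only on a mistake
                mistakes += 1
                # w <- w + x for a positive example, w <- w - x for a negative one
                w = [wi + y * xi for wi, xi in zip(w, phi)]
        if mistakes == 0:  # all examples correctly classified
            break
    return w
```

On linearly separable data such as OR, the loop terminates with a weight vector that classifies every training example correctly.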

  9. Learning the concept OR • Training data (each feature vector has a constant first element 1 for the bias; the second and third elements are the two inputs) • x1 = (1, 0, 0): Negative • x2 = (1, 0, 1): Positive • x3 = (1, 1, 0): Positive • x4 = (1, 1, 1): Positive

  10. Iteration 1 • x1 Wrong!

  11. Iteration 2 • x4 Wrong!

  12. Iteration 3 • x2 OK!

  13. Iteration 4 • x3 OK!

  14. Iteration 5 • x1 Wrong!

  15. Separating hyperplane • The final weight vector defines a separating hyperplane in the (s, t) plane, where s and t are the input (the second and the third elements of the feature vector). [Figure: separating hyperplane plotted over axes s and t]

  16. Why the update rule works • When a positive example has not been correctly classified, the score w · x was too small (below zero). The update adds x to the original weight vector, which increases the score by x · x; since x · x is always positive, the update makes it less likely for the perceptron to make the same mistake.
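The argument can be written out as a one-line derivation. After the update on a misclassified positive example, the new weight vector is w' = w + x, so

```latex
\mathbf{w}' \cdot \mathbf{x}
  = (\mathbf{w} + \mathbf{x}) \cdot \mathbf{x}
  = \mathbf{w} \cdot \mathbf{x} + \|\mathbf{x}\|^2
  > \mathbf{w} \cdot \mathbf{x}
```

since ‖x‖² > 0 for any nonzero feature vector. Symmetrically, for a misclassified negative example the update w' = w − x lowers the score by ‖x‖².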

  17. Convergence • The Perceptron training algorithm converges after a finite number of iterations to a hyperplane that perfectly classifies the training data, provided the training examples are linearly separable. • The number of iterations can be very large • The algorithm does not converge if the training data are not linearly separable

  18. Learning the PlayTennis concept • Feature space: 11 binary features • Perceptron learning: converged in 239 steps • [Figure: final weight vector]

  19. Averaged Perceptron • A variant of the Perceptron learning algorithm • Output the weight vector which is averaged over iterations rather than the final weight vector • Do not wait until convergence • Determine when to stop by observing the performance on the validation set • Practical and widely used
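One way to sketch the averaged perceptron is below. For simplicity it makes deterministic in-order passes over the data; the names and the default epoch count are illustrative, and in practice the stopping point would be chosen by watching performance on a validation set rather than fixed in advance:

```python
def train_averaged_perceptron(examples, epochs=10):
    """examples: list of (feature_vector, label) with label in {+1, -1}.
    Returns the weight vector averaged over all steps, with the bias
    folded in as a constant first feature."""
    dim = len(examples[0][0]) + 1
    w = [0.0] * dim      # current weight vector
    w_sum = [0.0] * dim  # running sum of the weight vector over steps
    steps = 0
    for _ in range(epochs):
        for x, y in examples:
            phi = [1.0] + list(x)
            score = sum(wi * xi for wi, xi in zip(w, phi))
            pred = 1 if score >= 0 else -1
            if pred != y:  # ordinary perceptron update on a mistake
                w = [wi + y * xi for wi, xi in zip(w, phi)]
            # accumulate the (possibly updated) weights at every step
            w_sum = [si + wi for si, wi in zip(w_sum, w)]
            steps += 1
    return [si / steps for si in w_sum]
```

Averaging smooths out the fluctuations of the mistake-driven updates, which is why the averaged weights tend to generalize better than the final weight vector.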

  20. Naive Bayes vs Perceptrons • The naive Bayes model assumes conditional independence between features • Adding informative features does not necessarily improve the performance • Perceptrons allow one to incorporate diverse types of features • The training takes longer
