
INC 551 Artificial Intelligence






Presentation Transcript


  1. INC 551 Artificial Intelligence Lecture 11 Machine Learning (Continued)

  2. Bayes Classifier: Bayes Rule
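
  The formula on this slide is an image that did not survive in the transcript; the standard form of Bayes rule, which the following slides apply, is

    P(h | e) = P(e | h) · P(h) / P(e)

  where h is a hypothesis (e.g., "John plays") and e is the observed evidence (the weather condition).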

  3. Play Tennis Example John wants to play tennis every day. However, on some days the conditions are not good, so he decides not to play. The following table is the record for the last 14 days.
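
  The table image is not in the transcript. The probabilities used on the later slides, and the final 58% answer, match the classic 14-day play-tennis data set from Mitchell's Machine Learning, reproduced here under that assumption:

    Day  Outlook   Temperature  Humidity  Wind    Play
    D1   Sunny     Hot          High      Weak    No
    D2   Sunny     Hot          High      Strong  No
    D3   Overcast  Hot          High      Weak    Yes
    D4   Rain      Mild         High      Weak    Yes
    D5   Rain      Cool         Normal    Weak    Yes
    D6   Rain      Cool         Normal    Strong  No
    D7   Overcast  Cool         Normal    Strong  Yes
    D8   Sunny     Mild         High      Weak    No
    D9   Sunny     Cool         Normal    Weak    Yes
    D10  Rain      Mild         Normal    Weak    Yes
    D11  Sunny     Mild         Normal    Strong  Yes
    D12  Overcast  Mild         High      Strong  Yes
    D13  Overcast  Hot          Normal    Weak    Yes
    D14  Rain      Mild         High      Strong  No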

  4. Question: Today’s condition is <Sunny, Mild Temperature, Normal Humidity, Strong Wind>. Do you think John will play tennis?

  5. Find P(play | condition) for play = yes and play = no. We need to use the naïve Bayes assumption: assume that, given the class, all features are independent. Now, let’s look at each property.
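
  Under the naïve Bayes assumption the likelihood factorizes as

    P(condition | play) = P(Sunny | play) · P(Mild | play) · P(Normal | play) · P(Strong | play)

  Counting events in the (assumed) table above gives the priors P(yes) = 9/14, P(no) = 5/14 and the per-feature probabilities:

    P(Sunny | yes)  = 2/9    P(Sunny | no)  = 3/5
    P(Mild | yes)   = 4/9    P(Mild | no)   = 2/5
    P(Normal | yes) = 6/9    P(Normal | no) = 1/5
    P(Strong | yes) = 3/9    P(Strong | no) = 3/5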

  6. Using Bayes rule, we combine the prior with the per-feature likelihoods.
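
  The slide's numbers are images; with the probabilities above, the unnormalized posterior scores are:

    P(yes | condition) ∝ P(yes) · P(condition | yes) = 9/14 · 2/9 · 4/9 · 6/9 · 3/9 ≈ 0.0141
    P(no | condition)  ∝ P(no)  · P(condition | no)  = 5/14 · 3/5 · 2/5 · 1/5 · 3/5 ≈ 0.0103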

  7. Since P(condition) is the same for both classes, we can already conclude that John is more likely to play tennis today. Note that we do not need to compute P(condition) to get the answer. However, if we want the actual probability, we can obtain P(condition) by normalizing, i.e., summing the unnormalized scores over both classes.
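
  Normalizing the scores computed above:

    P(condition) ≈ 0.0141 + 0.0103 = 0.0244
    P(yes | condition) ≈ 0.0141 / 0.0244 ≈ 0.58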

  8. Therefore, John is more likely to play tennis today, with a 58% chance.
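
  A minimal Python sketch of the whole computation, assuming the 14-day table above (the variable names are illustrative, not from the slides):

    # Naive Bayes for the play-tennis example (assumed 14-day table above).
    # Each record: (outlook, temperature, humidity, wind, play)
    data = [
        ("Sunny", "Hot", "High", "Weak", "No"),
        ("Sunny", "Hot", "High", "Strong", "No"),
        ("Overcast", "Hot", "High", "Weak", "Yes"),
        ("Rain", "Mild", "High", "Weak", "Yes"),
        ("Rain", "Cool", "Normal", "Weak", "Yes"),
        ("Rain", "Cool", "Normal", "Strong", "No"),
        ("Overcast", "Cool", "Normal", "Strong", "Yes"),
        ("Sunny", "Mild", "High", "Weak", "No"),
        ("Sunny", "Cool", "Normal", "Weak", "Yes"),
        ("Rain", "Mild", "Normal", "Weak", "Yes"),
        ("Sunny", "Mild", "Normal", "Strong", "Yes"),
        ("Overcast", "Mild", "High", "Strong", "Yes"),
        ("Overcast", "Hot", "Normal", "Weak", "Yes"),
        ("Rain", "Mild", "High", "Strong", "No"),
    ]

    query = ("Sunny", "Mild", "Normal", "Strong")

    scores = {}
    for label in ("Yes", "No"):
        rows = [r for r in data if r[-1] == label]
        score = len(rows) / len(data)            # prior P(label)
        for i, value in enumerate(query):        # naive Bayes: product of P(feature | label)
            score *= sum(1 for r in rows if r[i] == value) / len(rows)
        scores[label] = score

    # Normalize over both classes to get P(label | condition)
    total = sum(scores.values())
    for label in scores:
        print(label, round(scores[label] / total, 2))  # Yes 0.58, No 0.42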

  9. Learning and Bayes Classifier Learning is the adjustment of probability values to compute the posterior probability when new data is added.

  10. Classifying Object Example Suppose we want to classify objects into two classes, A and B. There are two features that we can measure from each object, f1 and f2. We sample four objects randomly to form a database and classify them by hand. Now, we have another sample with f1 = 3.2, f2 = 4.2, and we want to know what class it is.
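
  The database table is an image in the original; the feature values below are recovered from the means and standard deviations on slide 14, with the row order inferred from slide 22 (where the second sample is the nearest one):

    Sample  f1   f2   Class
    1       1.5  4.4  A
    2       2.3  5.4  A
    3       5.2  1.2  B
    4       4.5  2.1  B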

  11. We want to find P(class | f1, f2). Using Bayes rule, P(class | f1, f2) ∝ P(f1, f2 | class) · P(class). From the table, we count the number of events: each class contains 2 of the 4 samples, so P(A) = P(B) = 0.5.

  12. Find P(f1, f2 | class). Again, we use the naïve Bayes assumption: assume that the features are independent given the class, so P(f1, f2 | class) = P(f1 | class) · P(f2 | class). To find P(f1 | class), we need to assume a probability distribution because the features take continuous values. The most common choice is the Gaussian (normal) distribution.

  13. Gaussian distribution There are two parameters: the mean µ and the standard deviation σ (equivalently, the variance σ²). Using the maximum likelihood principle, the mean and the variance can be estimated from the samples in the database.
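
  The density and maximum-likelihood formulas are images in the original; they are the standard ones:

    N(x; µ, σ) = 1 / (σ√(2π)) · exp( −(x − µ)² / (2σ²) )

    µ̂ = (1/n) Σᵢ xᵢ,   σ̂² = (1/n) Σᵢ (xᵢ − µ̂)²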

  14. Class A
      f1: mean = (2.3 + 1.5)/2 = 1.9, SD = 0.4
      f2: mean = (5.4 + 4.4)/2 = 4.9, SD = 0.5
      Class B
      f1: mean = (5.2 + 4.5)/2 = 4.85, SD = 0.35
      f2: mean = (1.2 + 2.1)/2 = 1.65, SD = 0.45

  15. The object that we want to classify has f1 = 3.2, f2 = 4.2. We evaluate each class's Gaussians at these values.
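
  The slide's numeric results are images; evaluating the densities above (a straightforward calculation, shown here as a check) gives approximately:

    p(f1 = 3.2 | A) = N(3.2; 1.9, 0.4)   ≈ 5.1 × 10⁻³
    p(f2 = 4.2 | A) = N(4.2; 4.9, 0.5)   ≈ 0.30
    p(f1 = 3.2 | B) = N(3.2; 4.85, 0.35) ≈ 1.7 × 10⁻⁵
    p(f2 = 4.2 | B) = N(4.2; 1.65, 0.45) ≈ 9.4 × 10⁻⁸

    P(f1, f2 | A) ≈ 5.1 × 10⁻³ × 0.30 ≈ 1.5 × 10⁻³
    P(f1, f2 | B) ≈ 1.7 × 10⁻⁵ × 9.4 × 10⁻⁸ ≈ 1.6 × 10⁻¹²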

  16. Therefore, from Bayes rule (with equal priors), the posterior score for Class A is larger by roughly nine orders of magnitude. Therefore, we should classify the sample as Class A.
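
  A minimal Gaussian naive Bayes sketch in Python for this example, using the reconstructed four-sample database (names are illustrative):

    import math

    # Four-sample database recovered from slide 14: (f1, f2, class)
    samples = [(1.5, 4.4, "A"), (2.3, 5.4, "A"),
               (5.2, 1.2, "B"), (4.5, 2.1, "B")]

    def gaussian(x, mu, sigma):
        # Gaussian (normal) density N(x; mu, sigma)
        return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    query = (3.2, 4.2)
    scores = {}
    for label in ("A", "B"):
        rows = [s for s in samples if s[2] == label]
        score = len(rows) / len(samples)            # prior P(class) = 0.5
        for i in (0, 1):                            # one Gaussian per feature
            values = [r[i] for r in rows]
            mu = sum(values) / len(values)
            sigma = math.sqrt(sum((v - mu) ** 2 for v in values) / len(values))
            score *= gaussian(query[i], mu, sigma)  # naive Bayes factor P(f_i | class)
        scores[label] = score

    print(max(scores, key=scores.get))              # -> A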

  17. Nearest Neighbor Classification NN is considered a model-free classification method. Nearest Neighbor’s Principle: the unknown sample is classified to the same class as the training sample at the closest distance.

  18. [Figure: two-feature scatter plot; the sample at the closest distance to the unknown point is a circle.] We classify the sample as a circle.

  19. Distance between Samples Samples X and Y have multi-dimensional feature values. The distance between samples X and Y can be calculated by the following formula.
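
  The formula image is missing; given the special cases listed on the next slide, it is the Minkowski distance of order k:

    d_k(X, Y) = ( Σᵢ |xᵢ − yᵢ|ᵏ )^(1/k)

  where xᵢ and yᵢ are the i-th feature values of X and Y.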

  20. If k = 1, the distance is called the Manhattan distance. If k = 2, the distance is called the Euclidean distance. As k → ∞, the distance becomes the maximum absolute difference over the features (the Chebyshev distance). Euclidean is the best known and the preferred one.

  21. Classifying Object with NN Now we have another sample, f1 = 3.2, f2 = 4.2, and we want to know its class.

  22. Compute the Euclidean distance from it to all other samples. The unknown sample has the closest distance to the second sample. Therefore, we classify it to be the same class as the second sample, which is Class A.
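
  With the four-sample database reconstructed above, the distances from (3.2, 4.2) are:

    to sample 1 (1.5, 4.4): √(1.7² + 0.2²) = √2.93 ≈ 1.71
    to sample 2 (2.3, 5.4): √(0.9² + 1.2²) = √2.25 = 1.50   ← closest, Class A
    to sample 3 (5.2, 1.2): √(2.0² + 3.0²) = √13.0 ≈ 3.61
    to sample 4 (4.5, 2.1): √(1.3² + 2.1²) = √6.10 ≈ 2.47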

  23. K-Nearest Neighbor (KNN) Instead of using only the closest sample to decide the class, we use the k closest samples and take a majority vote among their classes.

  24. [Figure: example with k = 3; circles form the majority among the 3 nearest neighbors.] The data is classified as a circle.

  25. [Figure: example with k = 5; stars form the majority among the 5 nearest neighbors.] The data is classified as a star.
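
  A minimal k-nearest-neighbor sketch in Python, reusing the four-sample database from the Bayes example (names are illustrative):

    import math
    from collections import Counter

    # Same four-sample database as above: (f1, f2, class)
    samples = [(1.5, 4.4, "A"), (2.3, 5.4, "A"),
               (5.2, 1.2, "B"), (4.5, 2.1, "B")]

    def knn_classify(query, samples, k):
        # Sort the database by Euclidean distance to the query point,
        # then take a majority vote among the k nearest samples.
        nearest = sorted(samples, key=lambda s: math.dist(query, s[:2]))[:k]
        votes = Counter(s[2] for s in nearest)
        return votes.most_common(1)[0][0]

    print(knn_classify((3.2, 4.2), samples, k=1))   # -> A (plain nearest neighbor)
    print(knn_classify((3.2, 4.2), samples, k=3))   # -> A (two of three neighbors are A)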
