
K-Nearest Neighbor Learning


Presentation Transcript


  1. K-Nearest Neighbor Learning Dipanjan Chakraborty

  2. Different Learning Methods • Eager Learning • Constructs an explicit description of the target function from the whole training set • Instance-based Learning • Learning = storing all training instances • Classification = assigning a target function value to a new instance • Referred to as “Lazy” learning

  3. Different Learning Methods • Eager Learning: a rule is learned in advance (“Any random movement => it’s a mouse”) and then applied to each new observation (“I saw a mouse!”)

  4. Instance-based Learning: a new instance is classified by comparing it to the stored examples (“It’s very similar to a desktop!”)
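The contrast in slides 2–4 can be made concrete. Below is a minimal sketch, not taken from the slides: the toy 1-D threshold rule, the class names, and the data are illustrative assumptions. The eager learner compiles the training set into an explicit parameter at fit time; the lazy learner merely stores the data and defers all work to prediction time.

```python
class EagerLearner:
    """Eager: compiles the data into an explicit rule at training time."""
    def fit(self, X, y):
        # toy 1-D rule: threshold halfway between the two class means
        pos = [x for x, label in zip(X, y) if label == 1]
        neg = [x for x, label in zip(X, y) if label == 0]
        self.threshold = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    def predict(self, x):
        return 1 if x > self.threshold else 0

class LazyLearner:
    """Lazy: learning = storing all training instances."""
    def fit(self, X, y):
        self.X, self.y = list(X), list(y)
    def predict(self, x):
        # classification = compare the new instance to the stored ones
        nearest = min(range(len(self.X)), key=lambda i: abs(self.X[i] - x))
        return self.y[nearest]

X, y = [1.0, 2.0, 8.0, 9.0], [0, 0, 1, 1]
for model in (EagerLearner(), LazyLearner()):
    model.fit(X, y)
    print(type(model).__name__, model.predict(3.0))   # both print 0 here
```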

  5. Instance-based Learning • K-Nearest Neighbor Algorithm • Locally Weighted Regression • Case-based Reasoning

  6. K-Nearest Neighbor • Features • All instances correspond to points in an n-dimensional Euclidean space • Classification is delayed until a new instance arrives • Classification is done by comparing the feature vectors of the points • The target function may be discrete-valued or real-valued
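For a discrete-valued target, the classification step described above amounts to a majority vote among the k closest stored points. A minimal sketch, assuming Euclidean distance over numeric feature vectors (the function names are illustrative):

```python
from collections import Counter
import math

def euclidean(a, b):
    """Distance between two feature vectors."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def knn_classify(X_train, y_train, x_query, k=3):
    """Majority vote among the k training points closest to x_query."""
    # rank all stored instances by distance to the query point
    neighbors = sorted(zip(X_train, y_train),
                       key=lambda pair: euclidean(pair[0], x_query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Example: two classes in a 2-D feature space
X = [(1, 1), (1, 2), (6, 5), (7, 6)]
y = ["A", "A", "B", "B"]
print(knn_classify(X, y, (2, 2), k=3))   # -> "A"
```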

  7. 1-Nearest Neighbor

  8. 3-Nearest Neighbor

  9. K-Nearest Neighbor • An arbitrary instance x is represented by the feature vector (a_1(x), a_2(x), …, a_n(x)), where a_r(x) denotes the r-th attribute of x • Euclidean distance between two instances: d(x_i, x_j) = \sqrt{\sum_{r=1}^{n} \left( a_r(x_i) - a_r(x_j) \right)^2} • For a continuous-valued target function, predict the mean value of the k nearest training examples
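The two formulas on this slide translate directly into code. A minimal sketch (function names are illustrative): euclidean implements the distance above, and knn_regress returns the mean target value of the k nearest training examples, i.e. the continuous-valued rule.

```python
import math

def euclidean(xi, xj):
    """d(xi, xj) = sqrt of the sum over r of (a_r(xi) - a_r(xj))^2"""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj)))

def knn_regress(X_train, y_train, x_query, k=3):
    """Continuous target: mean value of the k nearest training examples."""
    neighbors = sorted(zip(X_train, y_train),
                       key=lambda pair: euclidean(pair[0], x_query))[:k]
    return sum(y for _, y in neighbors) / k

X = [(0.0,), (1.0,), (2.0,), (3.0,)]
y = [0.1, 1.1, 1.9, 3.2]
print(knn_regress(X, y, (1.5,), k=2))   # mean of the two closest targets -> 1.5
```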

  10. Voronoi Diagram • The decision surface induced by 1-Nearest Neighbor: the space is partitioned into convex cells, one per training example, each containing the query points closest to that example
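The diagram itself is easy to generate. A minimal sketch using SciPy (assuming scipy and matplotlib are available; the random points stand in for training examples):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.spatial import Voronoi, voronoi_plot_2d

rng = np.random.default_rng(0)
points = rng.random((10, 2))   # stand-ins for training examples in 2-D

vor = Voronoi(points)          # each cell = region closest to one training point
voronoi_plot_2d(vor)           # the cell boundaries are the 1-NN decision surface
plt.show()
```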

  11. Distance-Weighted Nearest Neighbor Algorithm • Assign each neighbor a weight based on its distance from the query point • A common choice of weight is the inverse square of the distance • If all training points are allowed to influence the query (a global method), this is known as Shepard’s method
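A sketch of the distance-weighted vote, assuming inverse-square weights as the slide suggests (the exact-match guard and the function names are illustrative choices, not prescribed by the slides):

```python
import math
from collections import defaultdict

def euclidean(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def weighted_knn_classify(X_train, y_train, x_query, k=3):
    """Vote weighted by 1/d^2; k = len(X_train) gives the global variant."""
    ranked = sorted(zip(X_train, y_train),
                    key=lambda pair: euclidean(pair[0], x_query))[:k]
    votes = defaultdict(float)
    for xi, yi in ranked:
        d = euclidean(xi, x_query)
        if d == 0.0:
            return yi               # exact match: return its label outright
        votes[yi] += 1.0 / d ** 2   # closer neighbors count more
    return max(votes, key=votes.get)
```

Setting k to the size of the training set lets every stored point influence the query, which is the global variant the slide attributes to Shepard’s method; the inverse-square weights keep distant points from swamping the vote.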

  12. Remarks + A highly effective inductive inference method for noisy training data and complex target functions + The target function for the whole space may be described as a combination of less complex local approximations + Learning is very simple − Classification is time-consuming

  13.–15. Remarks - Curse of Dimensionality (three figure slides) • As the number of attributes grows, stored points tend to become nearly equidistant from the query and irrelevant attributes can dominate the distance, so the “nearest” neighbors stop being informative
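The concentration effect behind these slides can be demonstrated numerically. A small sketch (the sample size of 500 points and the chosen dimensions are arbitrary illustrative values): it prints the relative contrast between the farthest and nearest neighbor of a random query, which shrinks as the dimension grows.

```python
import numpy as np

rng = np.random.default_rng(0)

for dim in (2, 10, 100, 1000):
    X = rng.random((500, dim))          # 500 random stored points
    q = rng.random(dim)                 # one query point
    d = np.linalg.norm(X - q, axis=1)   # Euclidean distances to the query
    # relative contrast: how much farther the farthest point is than the nearest
    print(dim, (d.max() - d.min()) / d.min())
```

The printed ratio falls toward zero as dim grows, which is why plain Euclidean kNN degrades in high dimensions unless attributes are weighted or selected.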

  16. Remarks • Efficient memory indexing is needed to retrieve the stored training examples quickly at query time • A common structure for this is the kd-tree
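As one concrete option, SciPy ships a kd-tree. A minimal sketch assuming scipy and numpy are installed (the data here is random stand-in material):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
X_train = rng.random((10_000, 5))        # stored training examples
y_train = rng.integers(0, 2, 10_000)     # their labels

tree = cKDTree(X_train)                  # build the index once

query = rng.random(5)
dist, idx = tree.query(query, k=3)       # indices of the 3 nearest neighbors
prediction = np.bincount(y_train[idx]).argmax()   # majority vote among them
print(prediction)
```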
