
Instance-Based Learners




  1. Instance-Based Learners
  • So far, the learning methods we have seen all create a model from a given representation and a training set.
  • Once the model has been created, the training set is no longer used in classifying unseen instances.
  • Instance-based learners are different: they have no training phase and build no model.
  • Instead, they consult the training set each time an instance must be classified.

  2. Instance-Based Learners
  • Although there are a number of instance-based approaches, two simple (but effective) methods are:
  • K-Nearest Neighbor
    • Discrete Target Functions
    • Continuous Target Functions
    • Distance Weighted
  • General Regression Neural Networks

  3. Instance-Based Learners: K-Nearest Neighbor (Discrete)
  • Given a training set of the form {(t1, d1), (t2, d2), … , (tn, dn)}.
  • Let tq represent an instance to be classified as dq.
  • Let Neighborhood = {(tc[1], dc[1]), (tc[2], dc[2]), … , (tc[k], dc[k])} represent the set of k training instances closest to instance q, where c is an array of the indexes of the instances closest to q under a distance function dist(q, i) that returns the distance between tq and ti.
  • Simply set dq = the most common di in Neighborhood.
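The discrete (majority-vote) procedure above can be sketched in a few lines of Python. This is a minimal illustration, not a reference implementation: the name `knn_classify`, the toy training set, and the choice of Euclidean distance for `dist` are all our own assumptions.

```python
import math
from collections import Counter

def knn_classify(train, t_q, k):
    """Classify t_q by majority vote over its k nearest training instances.

    train: list of (t_i, d_i) pairs, where t_i is a feature tuple
           and d_i is a discrete class label.
    """
    # Rank training instances by Euclidean distance to the query (assumed metric).
    neighborhood = sorted(train, key=lambda pair: math.dist(pair[0], t_q))[:k]
    # Return the most common label among the k nearest neighbors.
    labels = [d for _, d in neighborhood]
    return Counter(labels).most_common(1)[0][0]

# Toy data: two clusters, labeled "A" and "B".
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.1), "B")]
print(knn_classify(train, (0.2, 0.1), k=3))  # -> A
```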

  4. Instance-Based Learners: K-Nearest Neighbor (Continuous)
  • Given a training set of the form {(t1, d1), (t2, d2), … , (tn, dn)}.
  • Let tq represent an instance to be classified as dq.
  • Let Neighborhood = {(tc[1], dc[1]), (tc[2], dc[2]), … , (tc[k], dc[k])} represent the set of k training instances closest to instance q, where c is an array of the indexes of the instances closest to q under a distance function dist(q, i) that returns the distance between tq and ti.
  • Simply set dq = (Σi dc[i]) / k.
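For a continuous target, the same neighborhood is used but the prediction is the mean of the k neighbors' targets, as the slide's formula shows. A minimal sketch (the name `knn_regress` and Euclidean distance are assumptions):

```python
import math

def knn_regress(train, t_q, k):
    """Predict a continuous d_q as the mean target of the k nearest instances."""
    neighborhood = sorted(train, key=lambda pair: math.dist(pair[0], t_q))[:k]
    # Average the target values of the k nearest neighbors.
    return sum(d for _, d in neighborhood) / k

train = [((0.0,), 1.0), ((1.0,), 2.0), ((2.0,), 3.0), ((10.0,), 50.0)]
print(knn_regress(train, (1.5,), k=2))  # -> 2.5
```

Note that the distant outlier at t = 10 has no influence as long as it falls outside the k-neighborhood.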

  5. Instance-Based Learners: K-Nearest Neighbor (Distance Weighted)
  • Given a training set of the form {(t1, d1), (t2, d2), … , (tn, dn)}.
  • Let tq represent an instance to be classified as dq.
  • Let Neighborhood = {(tc[1], dc[1]), (tc[2], dc[2]), … , (tc[k], dc[k])} represent the set of k training instances closest to instance q, where c is an array of the indexes of the instances closest to q under a distance function dist(q, i) that returns the distance between tq and ti.
  • Let wi = dist(q, c[i])^(−b)
  • Set dq = (Σi=1..k wi dc[i]) / (Σi=1..k wi)
  • k < n (Local Method)
  • k = n (Global Method [Shepard's Method])
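The distance-weighted variant can be sketched as follows (again a minimal illustration with assumed names and Euclidean distance; the guard for an exact match avoids the division by zero that wi = dist^(−b) would otherwise cause):

```python
import math

def weighted_knn(train, t_q, k, b=2):
    """Distance-weighted k-NN: w_i = dist(q, c[i])**(-b).

    With k = len(train) this is the global form (Shepard's method).
    """
    ranked = sorted(train, key=lambda pair: math.dist(pair[0], t_q))[:k]
    # A zero-distance neighbor would get infinite weight; return its target directly.
    for t_i, d_i in ranked:
        if math.dist(t_i, t_q) == 0:
            return d_i
    weights = [math.dist(t_i, t_q) ** (-b) for t_i, _ in ranked]
    # Weighted average of the neighbors' targets.
    return sum(w * d for w, (_, d) in zip(weights, ranked)) / sum(weights)
```

Larger b makes the prediction more local (nearer neighbors dominate); setting k = n turns this into the global Shepard's method named on the slide.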

  6. Instance-Based Methods: General Regression Neural Networks (GRNNs)
  • GRNNs are global methods that consist of:
  • A hidden layer of Gaussian neurons (one neuron for each training instance ti)
  • A set of weights wi, where wi = di
  • A set of standard deviations, σi for each training instance i
  • dq = f(tq) = (Σi hi(tq, ti) di) / (Σi hi(tq, ti))
  • hi(tq, ti) = exp(−‖tq − ti‖² / (2σi²))
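The GRNN prediction formula above is a Gaussian-weighted average over all training instances. A minimal sketch, assuming for simplicity a single shared sigma rather than a per-instance σi (the slide allows one σi per training instance):

```python
import math

def grnn_predict(train, t_q, sigma=1.0):
    """GRNN output: Gaussian-weighted average of ALL training targets.

    Simplification: one shared sigma is assumed here; the general form
    uses a separate sigma_i for each training instance.
    """
    # Hidden-layer activations: one Gaussian neuron per training instance.
    h = [math.exp(-math.dist(t_i, t_q) ** 2 / (2 * sigma ** 2))
         for t_i, _ in train]
    # Normalized weighted sum of the targets d_i.
    return sum(hi * d for hi, (_, d) in zip(h, train)) / sum(h)
```

Unlike k-NN, every training instance contributes to every prediction, with influence decaying smoothly with distance; sigma controls how local that influence is.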

  7. Instance-Based Learning: General Regression Neural Networks (GRNNs)

  8. Instance-Based Learning: General Regression Neural Networks (GRNNs)
