
AUTOWISARD: Unsupervised Modes for the WISARD





Presentation Transcript


  1. AUTOWISARD: Unsupervised Modes for the WISARD Authors: Iuri Wickert (iwickert@yahoo.com) Felipe M. G. França (felipe@cos.ufrj.br) Computer Systems Engineering Program - COPPE Federal University of Rio de Janeiro - Brazil

  2. Presentation Structure • Introduction • Brief intro about the WISARD • Standard AUTOWISARD • Hierarchical AUTOWISARD • Illustrative Test • Conclusions

  3. Introduction • AUTOWISARD is a pair of unsupervised training algorithms for a standard, unmodified WISARD weightless neural network. • It consists of a “flat” algorithm and a hierarchical, recursive composition of the former one. • The base algorithm is able to learn unsorted input samples, reaching stabilization within a single pass. • The hierarchical version specializes classes which present excessive generalization.

  4. Introduction (cont’d) • Motivation: to unite the unsupervised learning properties of the ART1 model with the simplicity and good recognition power of the base WISARD model, without stepping outside it; no unsupervised learning algorithm existed for that model. • Related work: WIS-ART (Fulcher, 1992), a hybrid WISARD/ART1 network. • AUTOWISARD = AUTOmatic WISARD ...

  5. The WISARD Neural Network • The WISARD (Aleksander et al., 1985) is a classical weightless neural network. • This kind of network stores all its knowledge in simple RAM-memory neurons with n-bit addresses, initially filled with zeros. • Training consists of writing 1’s at the positions addressed by its inputs; recognition retrieves the value at the position addressed by the input.
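The RAM-neuron behaviour described on this slide might be sketched as follows (a minimal illustration; the class and method names are assumptions, not from the paper):

```python
# Hypothetical sketch of a single n-bit-address RAM neuron.
class RAMNeuron:
    def __init__(self, n_bits):
        # Memory with 2^n positions, initially filled with zeros.
        self.memory = [0] * (2 ** n_bits)

    def _address(self, bits):
        # Interpret the n input bits as a memory address.
        return int("".join(str(b) for b in bits), 2)

    def train(self, bits):
        # Training writes a 1 at the addressed position.
        self.memory[self._address(bits)] = 1

    def recognize(self, bits):
        # Recognition retrieves the value stored at the addressed position.
        return self.memory[self._address(bits)]
```

An untrained neuron answers 0 for every input; after training on a bit pattern it answers 1 for that exact pattern only.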

  6. The WISARD (cont’d) • An array of RAM neurons forms a class-like recognition device, a discriminator. • In a discriminator, each neuron sees only a subset of the input sample, addressed via an input-neuron mapping. • The outputs of the neurons are summed to represent the discriminator’s recognition of that sample.

  7. The WISARD (cont’d) • A WISARD network is an array of discriminators working in parallel, each one accessing the whole input pattern. • A winner function determines which discriminator has the best recognition for that sample, if any.
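Slides 6 and 7 can be combined into a short sketch of a discriminator and the winner function (simplified and illustrative; the random input-neuron mapping and sparse set-based RAM contents are assumptions for brevity):

```python
# Illustrative discriminator: an array of RAM neurons, each seeing a random
# subset of the input via an input-neuron mapping.
import random

class Discriminator:
    def __init__(self, input_size, n_bits, rng):
        # Randomly partition the input positions among the neurons.
        positions = list(range(input_size))
        rng.shuffle(positions)
        self.mappings = [positions[i:i + n_bits]
                         for i in range(0, input_size, n_bits)]
        # Each neuron stored as the set of addresses it has seen (sparse RAM).
        self.neurons = [set() for _ in self.mappings]

    def _addresses(self, sample):
        for mapping, neuron in zip(self.mappings, self.neurons):
            yield tuple(sample[p] for p in mapping), neuron

    def train(self, sample):
        for addr, neuron in self._addresses(sample):
            neuron.add(addr)

    def recognize(self, sample):
        # Sum of neuron outputs: how many neurons recognize their sub-pattern.
        return sum(addr in neuron for addr, neuron in self._addresses(sample))

def winner(discriminators, sample):
    # Winner function: index and score of the best-recognizing discriminator.
    scores = [d.recognize(sample) for d in discriminators]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best, scores[best]
```

A trained discriminator scores its own training sample with the full neuron count, while discriminators trained on very different patterns score low, which is what the winner function exploits.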

  8. The AUTOWISARD model • The Standard AUTOWISARD controls the creation of new classes (discriminators) and the training of existing ones in a WISARD. • It reaches stability within a single training pass, without relying on specific sample orderings or input-neuron mappings.

  9. The AUTOWISARD (cont’d) • It is centered on the learning-window policy. Given r_best as the best net recognition for a sample: • 0 <= r_best <= w_min: create a new class; • w_min < r_best < w_max: create or train a class; • w_max <= r_best: do nothing.

  10. The AUTOWISARD (cont’d) • To minimize saturation of the network, it uses partial learning, in which a discriminator learns just enough of an input sample to successfully recognize it. • A probabilistic function controls the actions inside the learning window, increasing network robustness. • The learning-window policy, partial training, and probabilistic class training/instantiation, together with the monotonic recognition function of the discriminator, ensure that any trained sample will be recognized with a value of at least w_max (thus reaching stability).
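The learning-window policy from slides 9 and 10 might be sketched as below. The window bounds w_min and w_max come from the slides, but the linear probability used inside the window is an illustrative assumption; the paper only states that a probabilistic function controls those actions:

```python
# Sketch of the learning-window decision for one sample, with r_best
# normalized to [0, 1]. The linear p_train below is an assumption.
import random

def learning_window_step(r_best, w_min, w_max, rng=random):
    if r_best <= w_min:
        # Recognition too weak: no existing class fits, create a new one.
        return "create_new_class"
    if r_best >= w_max:
        # Sample already recognized well enough: do nothing (stability).
        return "do_nothing"
    # Inside the window: probabilistically train the best class or create
    # a new one; higher r_best makes training the more likely choice.
    p_train = (r_best - w_min) / (w_max - w_min)
    return "train_best_class" if rng.random() < p_train else "create_new_class"
```

Training the chosen class would then use partial learning: writing 1's into only enough neurons to push that sample's recognition up to w_max, rather than training every neuron.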

  11. The Hierarchical AUTOWISARD • A recursive application of the AUTOWISARD over discriminators which seem to recognize more than one data cluster, also creating a hierarchical relationship between classes and their sub-classes. • This situation arises when the recognition interval of a given discriminator is vast, i.e. above a threshold:
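The specialization criterion on this slide could be expressed as a simple predicate (a hedged sketch: the function name and the definition of "recognition interval" as max minus min over a class's own samples are assumptions):

```python
# Illustrative criterion for recursive specialization: a class whose
# recognition interval over the samples it won exceeds a threshold is
# suspected of covering more than one data cluster.
def needs_specialization(recognitions, threshold):
    return (max(recognitions) - min(recognitions)) > threshold
```

A class flagged by this predicate would have a fresh AUTOWISARD run over its own samples, its new classes becoming sub-classes in the hierarchy.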

  12. The Hierarchical AUTOWISARD (cont’d) • A sample hierarchical AUTOWISARD:

  13. Illustrative Test • To show the classification skills of the Standard AUTOWISARD model, a sample handwritten character recognition application was developed. • The training set consisted of 1924 labeled images of the digits 0-9 (evenly distributed). 20 runs of the AUTOWISARD were made; each one was characterized by the number of classes generated, the classes containing multiple symbols (and their winners’ average recognition), and the classes containing less than 1% of the training set. • AUTOWISARD’s sensitivity to variation in the training parameters was not analysed.

  14. Illustrative Test (cont’d) • The first 10 runs:

  15. Illustrative Test (cont’d) • Even when training the network with randomly ordered samples and randomly generated input mappings, the runs seemed to converge to the same cluster configuration. • The classes that encompass more than one symbol still have a clear winner, indicating that they didn’t fall into saturation.

  16. Conclusions • AUTOWISARD is a simple yet powerful learning extension to the classic WISARD model that adds nothing to its architecture; this means its output is compatible with the many WISARD hardware implementations. • With AUTOWISARD, new knowledge is acquired on-the-fly, without the need for training loops or disturbing already consolidated knowledge. • Its learning mechanisms (and its multivector class representation) enable the creation of rather complex separation surfaces between classes.
