This document explores the concept of SANN (Subsequent Artificial Neural Networks) and outlines its advantages over traditional ANN approaches. Key topics include terminology, clustering methods, dimensionality reduction, and various training techniques such as backpropagation. The effectiveness of SANN in classifying complex datasets, including detailed comparisons with All Pairs methods, is evaluated. We discuss tools for implementation, highlight results demonstrating improved accuracy and speed, and provide insights into the impacts of hyperplanes and inputs on the neural network's learning processes.
Classification with Subsequent Artificial Neural Networks
Linder, Dew, Sudhoff, Theegarten, Remberger, Pöppl, Wagner
Presented by Brian Selinsky
Outline • Terminology • SANN vs Other ANN approaches • SANN vs All Pairs • Results
Terminology • Clustering • Dimensionality • NN vs. ANN • Neuron • Synapse • Thresholds • Weights • Training • Learning Rate • Backpropagation • Input • Standardization • Normalization • Hidden Layers • Approaches (MLP, One vs. All, All Pairs, SANN) • Tools
Clustering [Figures: scatter plots of O and X points, then of O, X, and Y points, partitioned into clusters]
Dimensionality • Inputs of interest • Hyperplanes • Dr Frisinas' data • 4 clusters • 22690 inputs of interest • 22690-dimensional data • 3 or 4 hyperplanes, each 22689-dimensional
Dimensionality • Neural nets collapse input dimensionality to a single number (e.g. 1, 0, or -1)
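The slide's point, that a hyperplane collapses even 22690-dimensional data down to one signed number, can be sketched as follows; `side_of_hyperplane` and its parameters are illustrative names, not from the paper:

```python
def side_of_hyperplane(point, normal, offset):
    """A hyperplane w.x = b collapses an n-dimensional point to one
    signed number: +1 on one side, -1 on the other, 0 on the plane."""
    value = sum(p * w for p, w in zip(point, normal)) - offset
    return (value > 0) - (value < 0)
```

The same code works unchanged whether `point` has 2 components or 22690: the output is always one of 1, 0, or -1.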
ANN vs. NN • Semantics • Artificial Neural Net • Meant to simulate how the brain functions • Brain is a network of neurons • Brain is the natural neural net • I use NN
Neural Net [Figure: inputs enter a black box ("some magic happens here") and come out classified as Class 1, Class 2, or Class 3]
Neuron [Figure: inside the net, each neuron calculates a summation of its inputs and compares it to a threshold on the way to Class 1, 2, or 3]
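A minimal sketch of the summation-and-threshold step a single neuron performs; the function name and example values are hypothetical:

```python
def neuron(inputs, weights, threshold):
    """Weighted sum of inputs compared against a threshold:
    fires (returns 1) when the summation exceeds the threshold,
    otherwise stays silent (returns 0)."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0
```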
Neural Network [Figure: a net of interconnected neurons (N) whose outputs feed classes C1, C2, and C3]
Neural Network [Figure: the same net, annotated with its two phases: inputs & processing, and learning]
Training [Figure: a training set is fed through the net's inputs & processing phase to drive learning]
What gets trained • Threshold • Categorization • Weight • Impact of an input on a neuron • Proportionality • Learning Rate • Effect on weights • Effect on training speed
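The three trained quantities above can be seen together in a perceptron-style update rule. This is a generic sketch, not the paper's exact procedure, and all names are illustrative:

```python
def train_step(weights, threshold, inputs, target, learning_rate):
    """One perceptron-style update: nudge the weights (and threshold)
    in proportion to the error, scaled by the learning rate."""
    total = sum(x * w for x, w in zip(inputs, weights))
    output = 1 if total > threshold else 0
    error = target - output  # -1, 0, or +1
    new_weights = [w + learning_rate * error * x
                   for w, x in zip(weights, inputs)]
    new_threshold = threshold - learning_rate * error
    return new_weights, new_threshold
```

A larger learning rate makes each step bigger, which is exactly the speed-vs.-stability trade-off the slide alludes to.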
How? - Backpropagation [Figure: the training diagram again; the output error is propagated backward through the net to adjust weights during learning]
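Backpropagation extends that idea through hidden layers. Below is a minimal sketch for a one-hidden-layer net with sigmoid activations and no biases, assuming a squared-error loss; all names are illustrative, not the authors' implementation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def backprop_step(w_hidden, w_out, inputs, target, lr):
    """One backpropagation step for a tiny one-hidden-layer net:
    forward pass, then propagate the output error back through
    both weight layers and descend the gradient."""
    # forward pass
    hidden = [sigmoid(sum(x * w for x, w in zip(inputs, row)))
              for row in w_hidden]
    out = sigmoid(sum(h * w for h, w in zip(hidden, w_out)))
    # output-layer error signal (squared-error loss, sigmoid derivative)
    delta_out = (out - target) * out * (1 - out)
    # hidden-layer error signals: output error pushed back through w_out
    delta_hid = [delta_out * w * h * (1 - h)
                 for w, h in zip(w_out, hidden)]
    # gradient-descent updates
    new_w_out = [w - lr * delta_out * h for w, h in zip(w_out, hidden)]
    new_w_hidden = [[w - lr * d * x for w, x in zip(row, inputs)]
                    for row, d in zip(w_hidden, delta_hid)]
    return new_w_hidden, new_w_out, out
```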
Input Data • Data 1 range: 12000 – 500000 • Data 2 range: 1.0 – 1.5 • Standardizing or normalizing puts inputs on comparable scales, making the trained weights more consistent and more accurate
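The two rescalings mentioned above can be sketched as follows (generic formulas, not tied to any particular toolkit):

```python
def standardize(values):
    """Z-score: rescale to mean 0 and standard deviation 1."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / std for v in values]

def normalize(values):
    """Min-max: rescale into the range [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]
```

Either transform brings a 12000–500000 feature and a 1.0–1.5 feature onto the same footing before training.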
Approaches • Multi-layer Perceptron (MLP) • Subdividing the problem • One vs. All • All Pairs • SANN
One vs. All [Figure: one ANN per class: Class A vs. not Class A, Class B vs. not Class B, Class C vs. not Class C]
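A minimal sketch of the one-vs.-all decision, assuming each per-class network reports a confidence in [0, 1]; the names are hypothetical:

```python
def one_vs_all_predict(classifiers, sample):
    """classifiers maps class label -> scoring function; the class
    whose 'this class vs. everything else' network is most confident
    wins."""
    return max(classifiers, key=lambda label: classifiers[label](sample))
```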
All Pairs [Figure: one ANN per pair of classes: Class A vs. Class B, Class A vs. Class C, Class B vs. Class C]
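The all-pairs decision can be sketched as a majority vote over the pairwise networks; again a generic sketch with hypothetical names:

```python
from itertools import combinations

def all_pairs_predict(pairwise, labels, sample):
    """pairwise maps (a, b) -> function returning the winning label
    of that head-to-head network; the class with the most wins is
    the prediction."""
    wins = {label: 0 for label in labels}
    for a, b in combinations(labels, 2):
        wins[pairwise[(a, b)](sample)] += 1
    return max(wins, key=wins.get)
```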
SANN [Figure: pairwise ANNs applied in sequence. ANN A vs. B: Class A .12, Class B .88. ANN A vs. C: Class C .91. ANN B vs. C: Class B .90, Class C .89. Final values: Class A .12, Class B .90, Class C .89]
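One plausible reading of the slide's numbers, assuming each subsequent pairwise network reports per-class confidences and later networks overwrite earlier ones; this is an illustrative sketch, not the authors' exact combination rule:

```python
def sann_predict(pairwise_scores):
    """pairwise_scores is an ordered list of dicts, one per subsequent
    pairwise network, each mapping class label -> confidence.
    Later networks overwrite earlier scores; the highest final
    value picks the class."""
    final = {}
    for scores in pairwise_scores:
        final.update(scores)
    return max(final, key=final.get), final
```

Feeding in the slide's values (.12/.88, then .91 for C, then .90/.89) reproduces its final values of A .12, B .90, C .89, so Class B wins.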
Results • Increased data & nodes • Increased noise • Subdividing NNs increases accuracy • All Pairs vs. SANN: All Pairs is more accurate, SANN is faster
Tools (FYI) • MATLAB (Neural Network Toolbox) • On CS system (Unix and Windows) • NeuroSolutions • 60-day free trial (Windows) • Joone • Free (platform independent)