Introduction to Business Analytics
Chapter 6: Neural Networks for Data Mining
Matthew J. Liberatore, Thomas Coghlan
Fall 2008

Learning Objectives
Understand the concept and different types of artificial neural networks (ANN)
Learn the advantages and limitations of ANN
Neurons: Cells (processing elements) of a biological or artificial neural network
Nucleus: The central processing portion of a neuron
Dendrites: The parts of a biological neuron that provide inputs to the cell
Axon: An outgoing connection (i.e., terminal) from a biological neuron
Synapse: The connection (where the weights are) between processing elements in a neural network
Backpropagation: The best-known learning algorithm in neural computing. Learning is done by comparing computed outputs to desired outputs of historical cases
Network information processing
Learning algorithm: The training procedure used by an artificial neural network
Supervised learning: A method of training artificial neural networks in which sample cases are shown to the network as input and the weights are adjusted to minimize the error in its outputs
At each hidden node and at the target node, compute:
Linear combination function: C = w0 + w1x1 +…+ wnxn
Logistic activation function: L = exp(C)/(1 + exp(C))
At the target node, compute the Bernoulli error function: sum the errors over all observations, where the error is -2 ln(L) if there is a response, or -2 ln(1 - L) if there is no response
In the first iteration, random weights are used
In subsequent iterations, the weights are changed by a small amount so that the error is reduced
The process continues until the error cannot be reduced further
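The forward pass and the Bernoulli error described above can be sketched in Python. This is a minimal illustration, not the actual backpropagation update: the toy data, the weight ranges, and the crude weight-nudging loop are assumptions made for the example (real backpropagation adjusts the weights along the error gradient).

```python
import math
import random

def forward(weights, x):
    """Linear combination C = w0 + w1*x1 + ... + wn*xn,
    then logistic activation L = exp(C)/(1+exp(C)),
    computed here in the equivalent form 1/(1+exp(-C))."""
    c = weights[0] + sum(w * xi for w, xi in zip(weights[1:], x))
    return 1.0 / (1.0 + math.exp(-c))

def bernoulli_error(weights, data):
    """Sum -2*ln(L) over responders and -2*ln(1-L) over non-responders."""
    total = 0.0
    for x, responded in data:
        L = forward(weights, x)
        total += -2.0 * math.log(L if responded else 1.0 - L)
    return total

# Toy observations: (inputs, response flag). Illustrative values only.
data = [([0.2, 1.0], 1), ([0.9, 0.1], 0), ([0.4, 0.8], 1), ([1.0, 0.3], 0)]

# First iteration: random weights (w0 is the bias term)
random.seed(1)
weights = [random.uniform(-0.5, 0.5) for _ in range(3)]
initial_error = bernoulli_error(weights, data)

# Subsequent iterations: change each weight by a small amount,
# keeping only changes that reduce the error (a crude stand-in
# for the gradient-based update used in practice).
step = 0.05
for _ in range(200):
    for i in range(len(weights)):
        base = bernoulli_error(weights, data)
        for delta in (step, -step):
            weights[i] += delta
            if bernoulli_error(weights, data) < base:
                break  # keep the improvement
            weights[i] -= delta  # revert

print(bernoulli_error(weights, data))  # lower than initial_error
```

Because a change is kept only when it lowers the summed error, the loop stops improving exactly when no small weight change reduces the error further, mirroring the stopping rule above.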
Comparing test results to actual results
In the Property Panel window, click the square to the right of Network and change the defaults for the Target Layer Combination, Activation, and Error functions as indicated. Note that we are using the default of three hidden units (nodes).
The results show an excellent fit: the cumulative lift equals the best possible cumulative lift, there are no misclassifications, and the average error is nearly zero.
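For reference, cumulative lift compares the response rate among the highest-scored cases against the overall response rate. The sketch below is a generic illustration with made-up scores, not SAS Enterprise Miner's own calculation; "best" lift is reached when, as here, a model ranks every responder ahead of every non-responder.

```python
def cumulative_lift(probs, actual, depth):
    """Cumulative lift at a given depth: response rate among the top
    `depth` highest-scored cases divided by the overall response rate."""
    ranked = sorted(zip(probs, actual), key=lambda pair: pair[0], reverse=True)
    top = [a for _, a in ranked[:depth]]
    overall_rate = sum(actual) / len(actual)
    return (sum(top) / depth) / overall_rate

# Hypothetical predicted probabilities and actual responses:
# a perfect model scores all responders above all non-responders.
probs  = [0.95, 0.90, 0.80, 0.40, 0.30, 0.10]
actual = [1,    1,    1,    0,    0,    0]

print(cumulative_lift(probs, actual, 3))  # 2.0: the top half captures all responders
```

A lift of 2.0 in the top half is the best achievable here, since only half the cases are responders; a model whose cumulative lift matches this best value fits the data perfectly.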
In the Property Panel, click the box to the right of Exported Data to see the individual predictions and probabilities. The logistic activation function at the target level provides the probabilities, like those obtained from logistic regression.
For details about the individual predictions, highlight the Score node and, in the left-hand panel, click the square to the right of Exported Data. In the box that appears, click the row whose Port entry is Score, then click Explore.