Artificial Neural Networks. Other terms/names: connectionist, parallel distributed processing, neural computation, adaptive networks
Artificial Neural Networks
The Brain
Brain vs. Computers
The Perceptron
Multilayer networks
Some Applications
Brain and Machine
The contrast in architecture
Features of the Brain
The biological inspiration
The Structure of Neurons
A neuron has a cell body, a branching input structure (the dendrite) and a branching output structure (the axon)
The Artificial Neuron (Perceptron)
[Diagram: inputs a1 … an plus a fixed bias input +1, multiplied by weights wj0 … wjn and summed to give Sj; the activation f(Sj) gives the unit's output Xj]
A Simple Model of a Neuron (Perceptron)
[Diagram: inputs y1 … yi with weights w1j … wij feeding a single unit with activation f(x) and output O]
An Artificial Neuron
output = 1 if Σi wi·xi > t, and 0 otherwise (the sum runs over all inputs, i = 0 … n)
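The threshold rule above can be sketched in a few lines of Python; the function name and the hand-picked AND weights below are illustrative, not from the slides:

```python
def perceptron_output(weights, inputs, t=0.0):
    """Threshold unit: fire (1) if the weighted input sum exceeds t, else 0."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s > t else 0

# Hand-picked weights so the unit computes AND of two inputs,
# using a bias input fixed at 1 with weight -1.5 and threshold t = 0.
print(perceptron_output([-1.5, 1, 1], [1, 0, 1]))  # 0
print(perceptron_output([-1.5, 1, 1], [1, 1, 1]))  # 1
```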
AND with a Biased input
[Diagram: bias input fixed at 1 with weight W1 = −1.5, inputs X and Y with weights W2 = 1 and W3 = 1, threshold t = 0. The bias weight must be negative (−1.5) for the unit to compute AND with t = 0: the unit then fires only when X and Y are both 1]
While the epoch produces an error
    Present the network with the next input pattern from the epoch
    Error = T – O
    If Error <> 0 Then
        Wj = Wj + LR * Ij * Error
    End If
End While
Epoch : Presentation of the entire training set to the neural network. In the case of the AND function an epoch consists of four sets of inputs being presented to the network (i.e. [0,0], [0,1], [1,0], [1,1])
Error: The error value is the amount by which the value output by the network differs from the target value. For example, if we required the network to output 0 and it output a 1, then Error = T – O = −1
Target Value, T: When we are training a network we not only present it with the input but also with a value that we require the network to produce. For example, if we present the network with [1,1] for the AND function the target value will be 1
Output , O: The output value from the neuron
Ij : Inputs being presented to the neuron
Wj: Weight from input neuron (Ij) to the output neuron
LR : The learning rate. This dictates how quickly the network converges. It is set by experimentation; a typical value is 0.1
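The epoch loop and the definitions above translate directly into Python. This is a sketch; the function name, zero initial weights, and the epoch cap are illustrative choices, not from the slides:

```python
def train_perceptron(examples, n_inputs, lr=0.1, t=0.0, max_epochs=100):
    """Perceptron learning rule Wj = Wj + LR * Ij * Error,
    repeated until an entire epoch produces no error."""
    w = [0.0] * n_inputs                      # illustrative initial weights
    for _ in range(max_epochs):
        any_error = False
        for inputs, target in examples:       # one epoch = whole training set
            output = 1 if sum(wi * xi for wi, xi in zip(w, inputs)) > t else 0
            error = target - output           # Error = T - O
            if error != 0:
                any_error = True
                w = [wi + lr * xi * error for wi, xi in zip(w, inputs)]
        if not any_error:                     # epoch was error-free: converged
            break
    return w

# AND, with a bias input fixed at 1 prepended to each pattern
and_set = [([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)]
w = train_perceptron(and_set, 3)
print(w)
```

After training, the learned weights classify all four AND patterns correctly.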
For AND
A B Output
0 0 0
0 1 0
1 0 0
1 1 1
[Diagrams: the same unit with weights still to be learned (W1 = ?, W2 = ?, W3 = ?, bias input 1, threshold t = 0), and with example initial weights W1 = 0.3, W2 = 0.5, W3 = 0.4]
Decision boundaries
[Figure: two plots in the (x1, x2) plane. Left: classes + and − separated by a single straight line (linearly separable). Right: an XOR-like arrangement of + and − that no single straight line can separate (non-linearly separable)]
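That XOR cannot be separated by a single straight line, while AND can, is easy to check by brute force. A sketch; the grid of candidate line parameters is an illustrative choice:

```python
import itertools

def separates(w1, w2, t, patterns):
    """True if the line w1*x1 + w2*x2 > t classifies every pattern correctly."""
    return all((w1 * x1 + w2 * x2 > t) == bool(target)
               for (x1, x2), target in patterns)

and_ = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
xor  = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

grid = [x / 4 for x in range(-8, 9)]   # candidate values -2.0 .. 2.0 in steps of 0.25
found_and = any(separates(w1, w2, t, and_)
                for w1, w2, t in itertools.product(grid, repeat=3))
found_xor = any(separates(w1, w2, t, xor)
                for w1, w2, t in itertools.product(grid, repeat=3))
print(found_and, found_xor)  # True False
```

A separating line exists for AND (e.g. x1 + x2 > 1.5), but no choice of w1, w2, t works for XOR.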
Linear Separability
[Figure: points of classes A and B scattered in the (X1, X2) plane, with a straight decision boundary separating the two classes]
[Figure: weight (kg, roughly 50–120) on the horizontal axis against height (m, roughly 1–2) on the vertical axis; rugby players and ballet dancers form two clusters separable by a straight line]
Hyperplane partitions
[Figure: intermingled points of classes A and B; a single hyperplane cannot separate them, but several hyperplanes together partition the space into regions, each assigned to one class]
Different Non-Linearly Separable Problems

Structure    | Types of Decision Regions
Single-Layer | Half plane bounded by a hyperplane
Two-Layer    | Convex open or closed regions
Three-Layer  | Arbitrary (complexity limited by the number of nodes)

[The original table also illustrates each structure on the exclusive-OR problem, on classes with meshed regions, and on the most general region shapes]
Multilayer Perceptron (MLP)
Types of layers: an input layer receiving the input signals (external stimuli), one or more hidden layers connected by adjustable weights, and an output layer producing the output values
[Diagram: feed-forward network with the input layer at the bottom, adjustable weights between layers, and the output layer at the top]
Activation functions
f(x) = 1 / (1 + e^(−ax))  — the sigmoid (logistic) function, the standard activation function
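The logistic function above is one line of Python; the slope parameter a corresponds to the a in the formula:

```python
import math

def sigmoid(x, a=1.0):
    """Logistic activation f(x) = 1 / (1 + e^(-a*x)): squashes any x into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-a * x))

print(sigmoid(0.0))   # 0.5
print(sigmoid(4.0))   # close to 1
print(sigmoid(-4.0))  # close to 0; note sigmoid(x) + sigmoid(-x) = 1
```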
Training Algorithms
Back-propagation
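A minimal back-propagation sketch for a tiny 2-2-1 network on XOR, using the sigmoid activation. The network size, learning rate, epoch count, and seed are all illustrative; with an unlucky seed such a small net can settle in a local minimum, so the sketch reports the squared error before and after training rather than promising convergence:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    """Logistic activation."""
    return 1.0 / (1.0 + math.exp(-x))

# 2 inputs -> 2 hidden units -> 1 output; each unit has a bias input fixed at 1
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(3)]

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(x1, x2):
    xs = [1.0, x1, x2]                                            # bias + inputs
    h = [sigmoid(sum(w * x for w, x in zip(wh, xs))) for wh in w_h]
    o = sigmoid(sum(w * x for w, x in zip(w_o, [1.0] + h)))
    return xs, h, o

def total_error():
    return sum((t - forward(x1, x2)[2]) ** 2 for (x1, x2), t in data)

err_before = total_error()
lr = 0.5
for _ in range(10000):
    for (x1, x2), t in data:
        xs, h, o = forward(x1, x2)
        hs = [1.0] + h
        d_o = o * (1 - o) * (t - o)                  # output delta: f'(S) * error
        d_h = [h[j] * (1 - h[j]) * w_o[j + 1] * d_o  # deltas propagated back
               for j in range(2)]                    # through the output weights
        w_o = [w + lr * d_o * x for w, x in zip(w_o, hs)]
        for j in range(2):
            w_h[j] = [w + lr * d_h[j] * x for w, x in zip(w_h[j], xs)]
err_after = total_error()
print(err_before, err_after)
```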
Applications
Engine management
Autonomous driving: a network that drives at 70 mph on a public highway, with 30×32 pixels as inputs, 30×32 weights into each of four hidden units, and 30 outputs for steering
Signature recognition
Sonar target recognition
Stock market prediction
Mortgage assessment
Neural Network Problems
Parameter setting
Overfitting
Training time