Learning with Neural Networks
Artificial Intelligence, CMSC 25000
February 19, 2002

Agenda:
- Neural networks: biological analogy
- Review: single-layer perceptrons
- Perceptron: pros & cons
- Neural networks: multilayer perceptrons
- Neural net training: backpropagation
- Strengths & limitations
[Figure: a biological neuron, labeling the dendrites, cell body (with nucleus), and axon.]
Neurons:
- Receive inputs from other neurons (via synapses)
- When the summed input exceeds a threshold, the neuron "fires"
- Send output along the axon to other neurons
The brain has roughly 10^11 neurons and 10^16 synapses.
Perceptron:
- A single neuron-like element
- Binary inputs and output
- Fires when the weighted sum of its inputs exceeds a threshold
[Figure: a perceptron unit. Inputs x0 = 1, x1, x2, x3, ..., xn with weights w0, w1, w2, w3, ..., wn feed a threshold unit that produces output y; the constant input x0 = 1, with weight w0, compensates for the threshold.]
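The thresholded sum can be sketched in a few lines of code (this sketch and its names are illustrative, not from the course): folding the threshold into a bias weight w0 on the constant input x0 = 1 lets the unit simply test whether the weighted sum is positive.

```python
# A minimal perceptron unit (illustrative sketch, names are mine).
# x0 = 1 is a constant input whose weight w0 compensates for the threshold.

def perceptron(x, w):
    """Binary output: fire (1) if the weighted sum exceeds 0."""
    total = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if total > 0 else 0

# AND of two binary inputs: fire only when x1 + x2 > 3/2.
# Input vector is [x0=1, x1, x2]; bias weight w0 = -3/2 plays the threshold role.
w_and = [-1.5, 1.0, 1.0]
print(perceptron([1, 1, 1], w_and))  # 1
print(perceptron([1, 1, 0], w_and))  # 0
```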
[Figure: two scatter plots in the (x1, x2) plane. In the first, a straight line separates the + points from the 0 points: the classes are linearly separable. But not XOR: in the second plot, no single line can separate the + points from the 0 points.]
[Figure: a multilayer network. Inputs X1, X2, X3, X4 feed hidden units, which feed outputs Y1 and Y2.]
Network topology: 2 hidden nodes, 1 output.
[Figure: inputs x1 and x2, plus constant inputs 1, feed hidden nodes o1 and o2 through weights w11, w21, w12, w22 (biases w01, w02); the hidden nodes feed the output y through weights w13 and w23 (bias w03).]
Desired behavior:

x1  x2 | o1  o2 | y
 0   0 |  0   0 | 0
 1   0 |  0   1 | 1
 0   1 |  0   1 | 1
 1   1 |  1   1 | 0

Weights:
w11 = w12 = 1
w21 = w22 = 1
w01 = 3/2; w02 = 1/2; w03 = 1/2
w13 = -1; w23 = 1
(The weight from the AND node o1 to the output must be negative for the table above to hold.)
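These weights can be checked mechanically. Below is a small sketch with step-threshold units (my own code, not the course's); note that the weight from the AND node to the output must be negative, w13 = -1, for the truth table to hold.

```python
def step(z):
    """Threshold unit: fire (1) if the net input is positive."""
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    # Hidden node o1: AND, fires when x1 + x2 > 3/2 (bias/threshold w01 = 3/2)
    o1 = step(1 * x1 + 1 * x2 - 1.5)
    # Hidden node o2: OR, fires when x1 + x2 > 1/2 (bias/threshold w02 = 1/2)
    o2 = step(1 * x1 + 1 * x2 - 0.5)
    # Output: fires when OR is on but AND is off (w13 = -1, w23 = 1, w03 = 1/2)
    y = step(-1 * o1 + 1 * o2 - 0.5)
    return o1, o2, y

for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print(x1, x2, *xor_net(x1, x2))
```

Each printed row matches the desired-behavior table: o1 acts as AND, o2 as OR, and the output fires exactly when OR is on and AND is off, i.e. XOR.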
[Figure: the 2-2-1 network with sigmoid units. Each hidden unit computes a weighted sum (z1 or z2) of x1, x2, and the constant input 1, and outputs y1 = s(z1) or y2 = s(z2); the output unit computes z3 from y1, y2, and the constant input 1, and outputs y3 = s(z3). Weights w11, w21, w01 feed unit 1; w12, w22, w02 feed unit 2; w13, w23, w03 feed unit 3.]
Notation:
- xi: the ith sample input vector
- w: the weight vector
- yi*: the desired output for the ith sample

Sum-of-squares error over the training samples:
E = Σ_i (yi* - yi)^2
From MIT 6.034 notes (Lozano-Pérez)
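As a quick illustration (the function name is mine), the sum-of-squares error just accumulates squared differences between the desired and actual outputs:

```python
def sse(desired, actual):
    """Sum-of-squares error: E = sum_i (y_i* - y_i)^2."""
    return sum((d - a) ** 2 for d, a in zip(desired, actual))

# Desired XOR outputs vs. one hypothetical network's outputs:
print(sse([0, 1, 1, 0], [0.1, 0.8, 0.9, 0.2]))  # about 0.10
```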
Gradient descent:
- Write the full expression for the output in terms of the inputs and weights
- Express the error as a function of the weights: E = G(w)
- Find the rate of change of the error, dG/dw
- Follow the steepest rate of change
- Change the weights so that the error is minimized

[Figure: an error surface E = G(w) plotted over the weights; gradient descent follows dG/dw downhill but can get stuck in local minima.]
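The procedure can be sketched in one dimension (with a toy surface of my choosing, not the network's actual error): repeatedly step against the derivative until the weight settles into a minimum.

```python
def G(w):
    """Toy error surface with its minimum at w = 3 (illustrative only)."""
    return (w - 3) ** 2

def dG(w):
    """Rate of change of the error, dG/dw."""
    return 2 * (w - 3)

w, rate = 0.0, 0.1          # arbitrary starting weight and learning rate
for _ in range(100):
    w -= rate * dG(w)       # follow the steepest rate of change downhill
print(round(w, 4))          # settles near the minimum at 3
```

On a surface with several valleys, the same procedure stops in whichever local minimum the starting point leads to, which is the limitation the figure illustrates.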
[Figure: the same 2-2-1 sigmoid network as above, repeated for the backpropagation derivation.]

Note: derivative of the sigmoid:
ds(z1)/dz1 = s(z1)(1 - s(z1))
From MIT 6.034 AI lecture notes (Lozano-Pérez, 2000)
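The identity is easy to verify numerically (a self-contained check, not course code): compare s(z)(1 - s(z)) against a central-difference estimate of ds/dz.

```python
import math

def s(z):
    """Logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-z))

z, h = 0.7, 1e-6                           # arbitrary test point and step size
analytic = s(z) * (1 - s(z))               # the claimed derivative
numeric = (s(z + h) - s(z - h)) / (2 * h)  # central-difference estimate
print(abs(analytic - numeric) < 1e-8)      # True
```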
[Figure: the 2-2-1 network annotated with layer indices i, j, k: inputs x1, x2 and constant inputs 1 feed hidden units (z1, y1) and (z2, y2), which feed the output unit (z3, y3), with weights as labeled above.]
Forward propagation: compute the zi and yi for every unit, given the inputs xk and the weights wl.
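Forward propagation for this 2-2-1 network can be sketched as follows (my own code; it assumes wkj names the weight from unit k into unit j, and the numeric weights are illustrative scaled-up XOR-style values, not values from the lecture):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x1, x2, weights):
    """Forward propagation through the 2-2-1 network: compute each zi, yi."""
    w = weights
    z1 = w["w11"] * x1 + w["w21"] * x2 + w["w01"]  # hidden unit 1 net input
    y1 = sigmoid(z1)
    z2 = w["w12"] * x1 + w["w22"] * x2 + w["w02"]  # hidden unit 2 net input
    y2 = sigmoid(z2)
    z3 = w["w13"] * y1 + w["w23"] * y2 + w["w03"]  # output unit net input
    y3 = sigmoid(z3)
    return z1, y1, z2, y2, z3, y3

# Illustrative weights (scaled up so the sigmoids saturate; mine, not the
# lecture's): unit 1 approximates AND, unit 2 OR, unit 3 computes OR-and-not-AND.
weights = {"w11": 10, "w21": 10, "w01": -15,
           "w12": 10, "w22": 10, "w02": -5,
           "w13": -10, "w23": 10, "w03": -5}

for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print(x1, x2, round(forward(x1, x2, weights)[-1]))
```

Scaling the weights up pushes each sigmoid toward 0 or 1, so rounding y3 reproduces the XOR table.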