
Learning Algorithm of MLP




Presentation Transcript


1. Learning Algorithm of MLP. Neural Networks, Multi Layer Perceptrons. [Diagram: signal-flow graph of an MLP; the function signal propagates forward through the activation functions f(.), the error signal propagates backward.] Computations at each neuron j: • the neuron output, y_j • the vector of error gradients, ∂E/∂w_ji. Forward propagation computes the outputs and backward propagation computes the gradients; together they constitute the "Backpropagation Learning Algorithm".
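For reference, the two per-neuron quantities named on this slide, written in a standard MLP form (the slide's own formulas are not reproduced in this transcript, so this is an assumed, conventional statement of them):

```latex
% Neuron output (forward pass): weighted sum of incoming signals, then the activation f
v_j = \sum_i w_{ji}\, y_i, \qquad y_j = f(v_j)
% Quantity needed for learning (backward pass): the error gradient with respect to each weight
\frac{\partial E}{\partial w_{ji}}
```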

2. Neural Networks, Multi Layer Perceptrons. Learning Algorithm of MLP. [Diagram: MLP with inputs x_1(i), x_2(i), ..., x_m(i) and output y_k(i).] Goal: minimize the cost function / performance index by means of the weight modification rule.
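The slide's equations did not survive the transcript; a standard choice that matches the quantities named here (sum-of-squared-errors cost and a gradient-descent weight update with learning rate η) is:

```latex
% Cost function / performance index at time instant i
E(i) = \tfrac{1}{2}\sum_{k} e_k^2(i), \qquad e_k(i) = d_k(i) - y_k(i)
% Weight modification rule: move each weight against its error gradient
\Delta w_{ji}(i) = -\,\eta\,\frac{\partial E(i)}{\partial w_{ji}}
```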

3. Neural Networks, Multi Layer Perceptrons. Learning Algorithm of MLP. [Diagram: MLP with inputs x_1(i), x_2(i), ..., x_m(i) and output y_k(i).] Backpropagation Learning Algorithm: • learning on the output neurons • learning on the hidden neurons.

4. Neural Networks, Multi Layer Perceptrons. Learning Algorithm of MLP. Notations: the output of the k-th neuron of the l-th layer at the i-th time instant, and the output of the j-th neuron of the (l-1)-th layer at the i-th time instant.

5. Neural Networks, Multi Layer Perceptrons. Back Propagation Learning Algorithm. Learning on the output neurons; one factor of the resulting update depends on the activation function.
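The derivation on this slide is given as equations that are missing from the transcript; the standard output-neuron update that this step arrives at, in the layer notation of slide 4 (k indexes the output layer l, j the preceding layer l-1, i is the time instant, η the learning rate), is:

```latex
e_k(i) = d_k(i) - y_k(i)
\delta_k(i) = e_k(i)\, f'\big(v_k(i)\big)          % f' is the activation-dependent factor
\Delta w_{kj}(i) = \eta\, \delta_k(i)\, y_j^{(l-1)}(i)
```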

6. Neural Networks, Multi Layer Perceptrons. Back Propagation Learning Algorithm. Learning on the hidden neurons.
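Again the slide's equations are missing; the standard hidden-neuron update at this step, with j indexing a hidden neuron of layer l-1, k the neurons of layer l that it feeds, and m (an index introduced here) the neurons of the layer below it, is:

```latex
% Local gradient of a hidden neuron: backpropagated sum of the output-layer gradients
\delta_j(i) = f'\big(v_j(i)\big)\sum_{k} \delta_k(i)\, w_{kj}(i)
\Delta w_{jm}(i) = \eta\, \delta_j(i)\, y_m^{(l-2)}(i)
```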

7. Neural Networks, Multi Layer Perceptrons. Back Propagation Learning Algorithm. Both the output-neuron update and the hidden-neuron update contain a factor that depends on the activation function, namely its derivative f'(v).
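For the logistic sigmoid with slope parameter a, which is the activation the homework slides appear to assume (they set "all a = 1"), this derivative takes a convenient form:

```latex
f(v) = \frac{1}{1 + e^{-a v}}, \qquad f'(v) = a\, f(v)\,\big(1 - f(v)\big)
```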

8. Neural Networks, Multi Layer Perceptrons. Back Propagation Learning Algorithm. [Diagram: signal flow through the activation functions f(.).] Forward propagation: • set the weights • calculate the output. Backward propagation: • calculate the error • calculate the gradient vector • update the weights.
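Collecting the preceding slides, one pass of the algorithm over training example i can be summarized as follows (standard backpropagation, in the notation used above):

```latex
\text{Forward: } v_j^{(l)}(i) = \sum_{m} w_{jm}^{(l)}\, y_m^{(l-1)}(i), \qquad y_j^{(l)}(i) = f\big(v_j^{(l)}(i)\big)
\text{Backward: } \delta_k = e_k\, f'(v_k) \ \text{(output neurons)}, \qquad \delta_j = f'(v_j)\sum_k \delta_k\, w_{kj} \ \text{(hidden neurons)}
\text{Update: } w_{jm}^{(l)} \leftarrow w_{jm}^{(l)} + \eta\, \delta_j\, y_m^{(l-1)}(i)
```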

9. Neural Networks, Multi Layer Perceptrons. Influential Factors in Learning:
• Initial weights and biases
• Cost function / performance index
• Training data and generalization
• Network structure: number of layers, number of neurons, interconnections
• Learning method: weight modification rule, variable or fixed learning rate (η)

10. Neural Networks, Multi Layer Perceptrons. Homework 4:
• Write an m-file that performs the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 2 neurons, and 1 output layer of 1 neuron, with no bias at all (all a = 1).
• Be sure to obtain decreasing errors.
• Submit the hardcopy and softcopy of the m-file.
• Hint: the number of parameters to be trained is six.
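A minimal sketch of such an m-file is given below. It assumes logistic sigmoid activations with slope a = 1 and trains on a small toy dataset (XOR patterns) chosen purely for illustration, since the assignment's own network drawing and training data are not reproduced in this transcript; treat it as a starting point, not the official solution.

```matlab
% backprop_2_2_1.m  (illustrative sketch, not the official homework solution)
% 2 inputs -> hidden layer of 2 sigmoid neurons -> 1 sigmoid output, no biases.
% Six trainable parameters: W1 (2x2) and W2 (1x2).

X = [0 0 1 1;          % training inputs, one column per pattern (toy data,
     0 1 0 1];         % chosen only for illustration)
D = [0 1 1 0];         % desired outputs for the toy data

f   = @(v) 1./(1 + exp(-v));   % logistic sigmoid, slope a = 1
eta = 0.5;                     % learning rate
rng(0);
W1  = 0.5*randn(2,2);          % W1(j,i): input i -> hidden neuron j
W2  = 0.5*randn(1,2);          % W2(1,j): hidden neuron j -> output neuron

nEpochs = 2000;
mse = zeros(1, nEpochs);
for epoch = 1:nEpochs
    Esum = 0;
    for i = 1:size(X,2)
        x = X(:,i);  d = D(i);
        % --- forward propagation ---
        h = f(W1*x);                        % hidden outputs (2x1)
        y = f(W2*h);                        % network output (scalar)
        e = d - y;                          % error signal
        Esum = Esum + 0.5*e^2;
        % --- backward propagation ---
        delta_out = e .* y .* (1 - y);                 % output local gradient
        delta_hid = (W2' * delta_out) .* h .* (1 - h); % hidden local gradients (2x1)
        % --- weight updates (gradient descent) ---
        W2 = W2 + eta * delta_out * h';
        W1 = W1 + eta * delta_hid * x';
    end
    mse(epoch) = Esum / size(X,2);
end
plot(mse); xlabel('epoch'); ylabel('average squared error');   % should be decreasing
```

Because the network has no biases, the toy XOR targets cannot be fit exactly, but the plotted error still decreases as required; substitute the assignment's actual training data for X and D.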

11. Neural Networks, Multi Layer Perceptrons. Homework 4A:
• Write an m-file that performs the backpropagation learning algorithm for the following neural network with 2 inputs, 1 hidden layer of 3 neurons, and 1 output layer of 1 neuron, with no bias at all (all a = 1).
• Be sure to obtain decreasing errors (convergence).
• Submit the hardcopy and softcopy of the m-file.
• Hint: the number of parameters to be trained is seven.
