
Pattern Recognition: Statistical and Neural




Presentation Transcript


  1. Nanjing University of Science & Technology. Pattern Recognition: Statistical and Neural. Lonnie C. Ludeman. Lecture 21, Oct 28, 2005

  2. Lecture 21 Topics
  1. Example: Analysis of a simple neural network
  2. Example: Synthesis of special forms of artificial neural networks
  3. General concepts of training an artificial neural network: supervised and unsupervised; training sets
  4. Neural network nomenclature and notation
  5. Derivation and description of the backpropagation algorithm for feedforward neural networks

  3. Example: Analyze the following neural network. [Figure: two-layer network of ANEs; the values -1, 0, 1, 1, 1, 0, 0, -1, 1 shown on the slide are the network's weights and biases, whose assignment to individual connections is not recoverable from this transcript.]

  4. Solution: Outputs of layer 1 ANEs

  5. Thus from layer 1 we have the ANE outputs. The output of the layer 2 ANE is obtained by thresholding its net activation (which contains a -2 term on the slide): the output is 1 if the net is ≥ 0 and 0 if it is < 0.

  6. Final Solution: Output Function for Given Neural Network
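
To make the layer-by-layer analysis of slides 3 through 6 concrete, here is a minimal Python sketch of evaluating a two-layer network of threshold ANEs. The weight and bias values below are placeholders, since the slide's diagram (and the assignment of its values to connections) does not survive in the transcript; only the evaluation pattern is the point.

```python
import numpy as np

def mu(net):
    """Unit-step nonlinearity: 1 where net >= 0, else 0 (cf. slide 5)."""
    return np.where(net >= 0, 1.0, 0.0)

# Placeholder parameters for a 2-input, 2-ANE first layer and a single
# ANE second layer; the slide's actual values are hypothetical here.
W1 = np.array([[-1.0, 0.0],
               [ 1.0, 1.0]])
b1 = np.array([ 1.0, 0.0])
w2 = np.array([ 1.0, 1.0])
b2 = -2.0   # slide 5 thresholds the layer-2 net, which contains a -2 term

def evaluate(x):
    y1 = mu(W1 @ x + b1)      # slide 4: outputs of the layer-1 ANEs
    return mu(w2 @ y1 + b2)   # slide 5: output of the layer-2 ANE

print(evaluate(np.array([2.0, 1.0])))   # 1.0 or 0.0 depending on region
```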

  7. Example: Synthesize a neural network. Given the following decision regions, build a neural network to perform the classification process. Solution: Use the Hyperplane-AND-OR structure.

  8. Each gk(x) specifies a hyperplane boundary
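
In the notation used throughout these lectures, each such discriminant is affine (a standard reconstruction; the slide's own expression is not in the transcript):

gk(x) = wkT x + wk0

where gk(x) = 0 defines the hyperplane and the sign of gk(x) indicates on which side of it x lies.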

  9. Solution: Hyperplane layer, AND layer, OR layer, with all f(·) = μ(·) (the unit step).
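
A minimal sketch of the Hyperplane-AND-OR structure of slides 7 through 9, assuming unit-step nonlinearities throughout. The hyperplanes here are illustrative only, since the slide's decision regions are not recoverable from the transcript.

```python
import numpy as np

def mu(net):
    """Unit step: 1 where net >= 0, else 0 (all f(.) = mu(.) per slide 9)."""
    return np.where(net >= 0, 1.0, 0.0)

# Hypothetical hyperplanes gk(x) = wk . x + wk0; illustrative values only.
W = np.array([[ 1.0,  0.0],    # g1(x) = x1 - 1
              [ 0.0,  1.0],    # g2(x) = x2 - 1
              [-1.0,  0.0]])   # g3(x) = 3 - x1
b = np.array([-1.0, -1.0, 3.0])

def classify(x):
    h = mu(W @ x + b)             # hyperplane layer (slide 8)
    and1 = mu(h[0] + h[1] - 2.0)  # AND layer: inside half-spaces 1 and 2
    and2 = mu(h[1] + h[2] - 2.0)  # AND layer: inside half-spaces 2 and 3
    return mu(and1 + and2 - 1.0)  # OR layer: union of the two AND regions

print(classify(np.array([2.0, 2.0])))  # 1.0: inside the composite region
print(classify(np.array([0.0, 0.0])))  # 0.0: outside
```

The AND units fire only when all of their inputs are 1 (the net reaches the count of required half-spaces), and the OR unit fires when at least one AND unit does; this is how a union of convex regions is realized.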

  10. Training a Neural Network: "without a teacher" (unsupervised) and "with a teacher" (supervised)

  11. Training Set: the xj are the training samples, and dj is the class assigned to training sample xj.

  12. Example of a training set:
  { (x1 = [0, 1, 2]T, d1 = C1), (x2 = [0, 1, 0]T, d2 = C1), (x3 = [0, 1, 1]T, d3 = C1),
  (x4 = [1, 0, 2]T, d4 = C2), (x5 = [1, 0, 3]T, d5 = C2),
  (x6 = [0, 0, 1]T, d6 = C3), (x7 = [0, 0, 2]T, d7 = C3), (x8 = [0, 0, 3]T, d8 = C3), (x9 = [0, 0, 3]T, d9 = C3),
  (x10 = [1, 1, 0]T, d10 = C4), (x11 = [2, 2, 0]T, d11 = C4),
  (x12 = [2, 2, 2]T, d12 = C5),
  (x13 = [3, 2, 2]T, d13 = C6) }
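
The same training set written as data, transcribed directly from the slide:

```python
# Training pairs (xj, dj) from slide 12; note that x8 and x9 are
# identical as printed on the slide.
training_set = [
    ([0, 1, 2], "C1"), ([0, 1, 0], "C1"), ([0, 1, 1], "C1"),
    ([1, 0, 2], "C2"), ([1, 0, 3], "C2"),
    ([0, 0, 1], "C3"), ([0, 0, 2], "C3"), ([0, 0, 3], "C3"), ([0, 0, 3], "C3"),
    ([1, 1, 0], "C4"), ([2, 2, 0], "C4"),
    ([2, 2, 2], "C5"),
    ([3, 2, 2], "C6"),
]
```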

  13. General Weight Update Algorithm: x(k) is the training sample for the k-th iteration, d(k) is the class assigned to training sample x(k), and y(k) is the output vector for the k-th training sample.

  14. Training with a Teacher (Supervised)
  1. Given a set of N ordered samples with their known class assignments.
  2. Randomly select all weights in the neural network.
  3. For each successive sample in the total set of samples, evaluate the output.
  4. Use these outputs and the input sample to update the weights.
  5. Stop at some predetermined number of iterations or when a given performance measure is satisfied; if not stopped, go to step 3.
  (A code skeleton of this loop is sketched below.)
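
A hedged skeleton of steps 1 through 5: `evaluate` and `update` stand in for the network's output computation and whatever weight-update rule is in use (for example, the backpropagation rules derived later in this lecture), and `performance_ok` is the unspecified stopping test.

```python
import numpy as np

def train_supervised(samples, labels, evaluate, update, n_weights,
                     performance_ok, max_iters=1000):
    """Skeleton of steps 1-5 on slide 14. `evaluate(x, w)` computes the
    network output; `update(w, x, d, y)` is the weight-update rule."""
    rng = np.random.default_rng(0)
    w = rng.normal(size=n_weights)         # step 2: random initial weights
    for _ in range(max_iters):             # step 5: iteration cap
        for x, d in zip(samples, labels):  # step 3: visit each sample
            y = evaluate(x, w)             #          and evaluate the output
            w = update(w, x, d, y)         # step 4: update the weights
        if performance_ok(w):              # step 5: performance test
            return w
    return w
```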

  15. Training without a Teacher (Unsupervised)
  1. Given a set of N ordered samples with unknown class assignments.
  2. Randomly select all weights in the neural network.
  3. For each successive sample in the total set of samples, evaluate the outputs.
  4. Using these outputs and the inputs, update the weights.
  5. If the weights do not change significantly, stop with that result; if they do change, return to step 3. (A sketch of this stopping test follows.)
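
The structural difference from supervised training is step 5: iterate until the weights stabilize rather than until a performance measure is met. A minimal sketch of that stopping test, where the threshold is an illustrative choice, not from the slide:

```python
import numpy as np

def weights_converged(w_new, w_old, eps=1e-4):
    """Step 5 of slide 15: stop when no weight changes significantly.
    The threshold eps is an illustrative choice, not from the slide."""
    return np.max(np.abs(np.asarray(w_new) - np.asarray(w_old))) < eps
```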

  16. Supervised Training of a Feedforward Neural Network Nomenclature

  17. [Figure: notation diagram showing the output vector of layer m and the output vector of layer L, with the nodes numbered within layer m and within layer L.]

  18. Weight Matrix for layer m. [Figure: the matrix has one row of weights per node, Node 1 through Node Nm.]

  19. Layers, Nets, Outputs, Nonlinearities

  20. Define the performance Ep for sample x(p) as the squared output error. We wish to select the weights so that Ep is minimized: use the gradient algorithm. (The slide's formulas, which do not survive in the transcript, are reconstructed after slide 21 below.)

  21. Gradient Algorithm for Updating the Weights, in terms of w(p) and x(p). [The slide's equations are reconstructed below.]
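
The formulas on slides 20 and 21 do not survive in the transcript; in standard notation consistent with the total-error expression on slide 32, the per-sample performance is

Ep = ½ Σj ( dj[x(p)] - yj(L)(p) )²

and the gradient algorithm updates every weight w in the direction of steepest descent,

w(p+1) = w(p) - η ∂Ep/∂w

with η > 0 the step size (learning rate).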

  22. Derivation of the weight update equation for the last layer (Rule #1), Backpropagation Algorithm. The derivation takes the partial derivative of ym(L) with respect to wkj(L).

  23. General Rule #1 for the weight update therefore follows; a reconstruction is given below.
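
In the usual backpropagation notation (a reconstruction; the slide's equations are not in the transcript), Rule #1 reads: define the output-layer delta

δk(L) = ( dk - yk(L) ) f'( netk(L) )

and correct each last-layer weight in proportion to that delta and the input it multiplies,

Δwkj(L) = η δk(L) yj(L-1)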

  24. Derivation of the weight update equation for the next-to-last layer (L-1), Backpropagation Algorithm

  25. General Rule #2 for the weight update, Layer L-1, Backpropagation Algorithm; therefore the weight correction is as follows.

  26. where the weight correction (general Rule #2) applies to the layer L-1 weights w(L-1); a reconstruction and code sketch follow.
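
Rule #2 backpropagates the last-layer deltas through the layer-L weights (again a standard reconstruction):

δj(L-1) = f'( netj(L-1) ) Σk δk(L) wkj(L),   Δwji(L-1) = η δj(L-1) yi(L-2)

The sketch below applies both rules to one sample for a net with a single hidden layer, assuming sigmoidal nonlinearities (for which f'(net) = f(net)(1 - f(net))) and omitting bias terms for brevity; it is an illustration of the derived updates, not the lecture's exact formulation.

```python
import numpy as np

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

def backprop_step(x, d, W1, W2, eta=0.1):
    """One sample's weight update for a single-hidden-layer net using
    Rule #1 (last layer) and Rule #2 (next-to-last layer)."""
    # Forward pass
    y1 = sigmoid(W1 @ x)    # hidden-layer outputs, y^(L-1)
    y2 = sigmoid(W2 @ y1)   # last-layer outputs,   y^(L)
    # Deltas (both computed from the pre-update weights)
    delta2 = (d - y2) * y2 * (1.0 - y2)          # Rule #1 delta
    delta1 = (W2.T @ delta2) * y1 * (1.0 - y1)   # Rule #2 delta
    # Weight corrections
    W2 = W2 + eta * np.outer(delta2, y1)         # last layer
    W1 = W1 + eta * np.outer(delta1, x)          # layer L-1
    return W1, W2

# Example dimensions: 3 inputs, 4 hidden ANEs, 2 outputs
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
W1, W2 = backprop_step(np.array([0., 1., 2.]), np.array([1., 0.]), W1, W2)
```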

  27. Backpropagation Training Algorithm for Feedforward Neural Networks

  28. Input pattern sample xk

  29. Calculate Outputs First Layer

  30. Calculate Outputs Second Layer

  31. Calculate Outputs Last Layer

  32. Check Performance. Single-sample error Ep; over-all-samples error:
  ETOTAL(p) = Σ from i = 0 to Ns - 1 of ½ ( d[x(p-i)] - f( wT(p-i) x(p-i) ) )²
  This can be computed recursively:
  ETOTAL(p+1) = ETOTAL(p) + Ep+1(p+1) - Ep-Ns(p-Ns)
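
The recursion makes the over-all-samples error cheap to maintain: add the newest single-sample error and drop the one leaving the Ns-sample window. A minimal sketch, where the buffer bookkeeping is an assumption:

```python
from collections import deque

class RunningTotalError:
    """Sliding-window total error over the last Ns samples (slide 32):
    ETOTAL(p+1) = ETOTAL(p) + E_{p+1} - E_{p-Ns}."""
    def __init__(self, n_s):
        self.errors = deque(maxlen=n_s)
        self.total = 0.0

    def update(self, e_new):
        if len(self.errors) == self.errors.maxlen:
            self.total -= self.errors[0]   # drop E_{p-Ns} leaving the window
        self.errors.append(e_new)          # deque discards the old entry
        self.total += e_new                # add the newest single-sample error
        return self.total
```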

  33. Change Weights Last Layer using Rule #1

  34. Change Weights previous Layer using Rule #2

  35. Change Weights previous Layer using Modified Rule #2

  36. Input pattern sample xk+1 and continue the iterations until the stopping condition of the next slide is met.

  37. Repeat the process until the performance measure is satisfied or the maximum number of iterations is reached. If the performance is not satisfied at the maximum number of iterations, the algorithm stops and NO design is obtained. If the performance is satisfied, the current weights and structure provide the required design.

  38. Freeze Weights to get Acceptable Neural Net Design

  39. Backpropagation Algorithm for Training Feedforward Artificial Neural Networks

  40. Summary, Lecture 21
  1. Example: Analysis of a simple neural network
  2. Example: Synthesis of special forms of artificial neural networks
  3. General concepts of training an artificial neural network: supervised and unsupervised; description of training sets
  4. Neural network nomenclature and notation
  5. Derivation and description of the backpropagation algorithm for feedforward neural networks

  41. End of Lecture 21
