
Neural Networks



  1. Neural Networks

  2. In Today's Lecture ... • Pattern Classification • Characteristics in Pattern Recognition • Perceptron • Simple Perceptron for Pattern Classification • Perceptron: Algorithm • Perceptron: Example • Perceptron: Matlab code • Assignment 1

  3. Pattern Recognition • What is a Pattern? • Patterns can be representations of scenes, objects, temporal sequences, and descriptions of anything physical or abstract. • Visual patterns can be pixels of various colors and intensities. • Speech patterns can be measurements of acoustical waveforms. • Patterns in medical diagnosis may be strings of symptoms detected from patients. • Regardless of the origin of these patterns, they can all be represented as vectors of various dimensions. • These vectors are the input patterns to the pattern recognition systems.

  4. Pattern Recognition • Classes and Classification • The goal of the pattern recognition process is to identify, or classify, input patterns. • Character recognition may involve the task of identifying an image scanned from a written page of text as one of 26 possible characters of the alphabet. • In speech recognition, pattern recognition may involve the discrimination of vectors representing spoken words. • The pattern being recognized is classified as belonging to one of a small number of classes. • For example, the 26 characters may be the classes in character recognition. • The scanned input image is classified as one of the alphabet characters.

  5. Characteristics in Pattern Recognition • Data Space Partitioning • Pattern recognition results in a partitioning and labeling of regions in the input space. • If the input to a pattern recognition system is an N-dimensional vector, the input space is the space spanned by all input vectors (each input vector can be viewed as a point, or pattern, in the N-dimensional input space). • Let X be the set of all input patterns, and C be the set of classes. • Formally, a pattern recognition system P performs a mapping from the input space to the set of possible classes: P : X → C

  6. Characteristics in Pattern Recognition • Data Space Partitioning • [Figure: input data space partitioning for two-dimensional data, showing regions labeled A, B and C, the estimated decision boundary for class C, points correctly classified as class C, and points misclassified as class B]

  7. Characteristics in Pattern Recognition • Adaptivity • Adaptation occurs during training, when examples of various classes from a training set are presented to the pattern recognition system along with their correct classification. • The system "learns" from the training process by associating each input example in the training set with its correct class. • After being trained on a labeled pattern, the system can often later classify the same pattern correctly and avoid similar mistakes. • Thus the performance of a pattern recognition system improves as more training data is presented, and it becomes more and more capable of correctly classifying a wider range of inputs.

  8. Characteristics in Pattern Recognition • Generalization • If an adaptive system is trained using a set of patterns T and can correctly classify patterns not in T, the process behind this behavior is called generalization. • Generalization based on adaptation is recognized as being the essence of intelligent systems. • Usually, the training set T available to a pattern recognition system is very small compared to the total number of examples in the input space. • By using a well-selected set that captures most of the variation expected in the inputs, a well-designed pattern recognition system should be able to "learn" the properties important for distinguishing patterns of different classes.

  9. Characteristics in Pattern Recognition • Input Space Complexity • Minimum-error boundaries between classes in pattern recognition can be extremely complex and difficult to characterize. • Since the unknown boundaries can be extremely irregular, adaptive systems are unlikely to find the exact solution through "learning" based merely on a small set of examples.

  10. Perceptron • Perceptrons had perhaps the most far-reaching impact of any of the early neural nets. • A number of different types of Perceptrons have been used and described by various workers. • The original perceptrons had three layers of neurons – sensory units, associator units and a response unit – forming an approximate model of a retina. • Under suitable assumptions, the perceptron's iterative learning procedure can be proved to converge to the correct weights, i.e., the weights that allow the net to produce the correct output value for each of the training input patterns.

  11. Simple Perceptron for Pattern Classification • The architecture of a simple perceptron for performing a single classification is shown in the figure. • The goal of the net is to classify each input pattern as belonging, or not belonging, to a particular class. • Belonging is signified by the output unit giving a response of +1; not belonging is indicated by a response of -1. • A zero output means the net is undecided. • [Figure: inputs X1, X2, ..., Xn connected to a single output unit Y through weights w1, w2, ..., wn, plus a bias b]

  12. Perceptron: Algorithm • Step 1: Initialize the weights, the bias and the threshold θ. Also set the learning rate α such that 0 < α <= 1. • Step 2: While the stopping condition is false, do steps 3 to 7. • Step 3: For each training pair s:t, do steps 4 to 7. • Step 4: Set the activations of the input units: xi = si. • Step 5: Compute the response of the output unit: yin = b + Σi xi*wi; then y = 1 if yin > θ, y = 0 if -θ <= yin <= θ, and y = -1 if yin < -θ. • Step 6: Update the weights and bias if an error occurred for this pattern: if y differs from t (by more than some tolerance), then wi(new) = wi(old) + α*t*xi for i = 1 to n, and b(new) = b(old) + α*t. • Step 7: Test the stopping condition: if no weights changed in step 3's pass over the training pairs, stop; else continue.
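As a compact sketch of these steps (ours, not the classroom code that follows on slides 20-22; the function name trainPerceptron is hypothetical), the whole loop fits in one Matlab function:

% trainPerceptron: a minimal sketch of the algorithm above (hypothetical helper).
% s: training patterns as rows, with the bias input 1 as the first column.
% t: bipolar targets (+1/-1); alpha: learning rate; theta: threshold.
function [w, epoch] = trainPerceptron(s, t, alpha, theta)
    w = zeros(1, size(s,2));   % Step 1: weights (incl. bias weight) start at 0
    epoch = 0;
    stop = false;
    while ~stop                % Step 2: repeat until a clean pass over the data
        stop = true;
        for i = 1:size(s,1)    % Step 3: for each training pair s:t
            x = s(i,:);        % Step 4: input activations
            yin = x*w';        % Step 5: net input ...
            y = (yin > theta) - (yin < -theta);  % ... and response in {-1,0,+1}
            if y ~= t(i)       % Step 6: update weights only when an error occurs
                w = w + alpha*t(i)*x;
                stop = false;
            end
        end
        epoch = epoch + 1;     % Step 7: stop once no weight changed in an epoch
    end
end

Calling [w, epochs] = trainPerceptron(s, t, 1, 0.2) with the training data of the next slide should reproduce the weight sequence traced in the tables on slides 14 and 15.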

  13. Perceptron: Example • Let's consider the following training data:

Inputs (x1 x2) | Target t
1 1 |  1
1 0 | -1
0 1 | -1
0 0 | -1

• We initialize the weights to w1 = 0, w2 = 0 and b = 0. We also set α = 1 and θ = 0.2. • The following table shows the sequence in which the net is presented with the inputs one by one and checked against the required target.
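As a worked check of the first row of that table: with w1 = w2 = b = 0 and input (x1, x2) = (1, 1), the net input is yin = b + x1*w1 + x2*w2 = 0. Since -0.2 <= 0 <= 0.2, the output is y = 0, which differs from the target t = 1, so the update rule gives w1 = 0 + 1*1*1 = 1, w2 = 1 and b = 0 + 1*1 = 1.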

  14. Perceptron: Example • Epochs 1 and 2 (initial weights w1 = 0, w2 = 0, b = 0):

Epoch 1
Input (x1 x2) | Bias | Net input yin | Output y | Target t | Weights (w1 w2 b)
1 1 | 1 |  0 |  0 |  1 | 1 1 1
1 0 | 1 |  2 |  1 | -1 | 0 1 0
0 1 | 1 |  1 |  1 | -1 | 0 0 -1
0 0 | 1 | -1 | -1 | -1 | 0 0 -1

Epoch 2
Input (x1 x2) | Bias | Net input yin | Output y | Target t | Weights (w1 w2 b)
1 1 | 1 | -1 | -1 |  1 | 1 1 0
1 0 | 1 |  1 |  1 | -1 | 0 1 -1
0 1 | 1 |  0 |  0 | -1 | 0 0 -2
0 0 | 1 | -2 | -1 | -1 | 0 0 -2

  15. Perceptron: Example • Epoch 10 (no weights change, so training stops with w1 = 2, w2 = 3, b = -4):

Input (x1 x2) | Bias | Net input yin | Output y | Target t | Weights (w1 w2 b)
1 1 | 1 |  1 |  1 |  1 | 2 3 -4
1 0 | 1 | -2 | -1 | -1 | 2 3 -4
0 1 | 1 | -1 | -1 | -1 | 2 3 -4
0 0 | 1 | -4 | -1 | -1 | 2 3 -4

Thus the positive response is given by all the points such that 2*x1 + 3*x2 - 4 > 0.2, and the negative response is given by all the points such that 2*x1 + 3*x2 - 4 < -0.2.

  16. Decision Boundaries • We note that the output of the Perceptron is 1 (positive) if the net input yin is greater than θ. • Also, the output is -1 (negative) if the net input yin is less than -θ. • We also know that, in this case, yin = b + x1*w1 + x2*w2. • Thus we have 2*x1 + 3*x2 - 4 > 0.2 (+ve output) and 2*x1 + 3*x2 - 4 < -0.2 (-ve output). • Replacing the inequalities with equalities and simplifying gives us the following two equations: 2*x1 + 3*x2 = 4.2 and 2*x1 + 3*x2 = 3.8
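As a quick numeric check of these boundaries against the training data: for (1,1), 2*1 + 3*1 - 4 = 1 > 0.2, a positive response; for (1,0), 2 - 4 = -2 < -0.2; for (0,1), 3 - 4 = -1 < -0.2; and for (0,0), -4 < -0.2, all negative responses. Every training point is thus classified correctly, and none falls between the two lines.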

  17. Decision Boundaries • These two equations are equations of straight lines in the x1,x2-plane. • These lines geometrically separate the input training data into two classes. • Hence they act as decision boundaries. • The two lines have the same slope, but their intercepts are different; so essentially we have two parallel straight lines acting as decision boundaries for our input data. • The parameter θ determines the separation between the two lines (and hence the width of the indecisive region).

  18. Perceptron: Example • Graphically: [Figure: the four training points (0,0), (1,0), (0,1) and (1,1) in the x1,x2-plane, with the two parallel decision boundary lines and the undecided region between them]
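A short Matlab sketch (ours, not part of the classroom code; it assumes the converged values w1 = 2, w2 = 3, b = -4 and theta = 0.2 from the example) reproduces this picture:

% Plot the training points and the two parallel decision boundaries
% w1*x1 + w2*x2 + b = +theta and w1*x1 + w2*x2 + b = -theta
w1 = 2; w2 = 3; b = -4; theta = 0.2;
x1 = linspace(-0.5, 1.5, 100);
hold on
plot(1, 1, 'b+')                           % the positive pattern (target +1)
plot([1 0 0], [0 1 0], 'ro')               % the negative patterns (target -1)
plot(x1, ( theta - b - w1*x1)/w2, 'k-')    % upper line: yin = +theta
plot(x1, (-theta - b - w1*x1)/w2, 'k--')   % lower line: yin = -theta
xlabel('x1'), ylabel('x2')
title('Decision boundaries; the undecided region lies between the two lines')
hold off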

  19. Perceptron: Matlab Code and Assignment 1 • The following slides contain the Matlab source code for the simple Perceptron studied in class. • The first assignment is also given in these slides.

  20. %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Purpose:
% This Matlab code is based on the Perceptron learning rule
% Part 1: Training data and various control parameters are defined
% Part 2: Perceptron learning rule is used for training of the net
% Part 3: The resulting network is tested on the input data
% Part 4: Some exercises
%%%%%%%%%%%%%%%%%%%%%%%%% Part 1 %%%%%%%%%%%%%%%%%%
clear all

% Initialize various parameters
w(1) = 0;        % Initialize weights, w(1) is the bias weight
w(2) = 0;
w(3) = 0;
alpha = 1;       % Learning rate
theta = 0.2;     % Half-width of the non-decisive region
stopFlag = -1;   % Stopping flag, -1 = do not stop, +1 = stop
wtChange = 1;    % Weight-change flag, -1 = no change, +1 = change
epsilon = 0.001; % Termination criterion
epoch = 0;       % Epoch counter

% Define the training patterns; the leading 1 in each s(i,:) is the bias input
noOfTrainingPatterns = 4;
s(1,:) = [1 1 1]; t(1,1) =  1;
s(2,:) = [1 1 0]; t(2,1) = -1;
s(3,:) = [1 0 1]; t(3,1) = -1;
s(4,:) = [1 0 0]; t(4,1) = -1;

  21. %%%%%%%%%%%%%%%%%% Part 2 %%%%%%%%%%%%%%%%%%
% Step 1: While the stopping condition is false, do steps 2 to 6
while stopFlag < 0
    % Step 2: For each training pair s:t, do steps 3 to 5
    wtChange = -1;
    for i = 1:noOfTrainingPatterns
        % Step 3: Set the activations of the input units
        x = s(i,:);
        outDesired = t(i,:);
        % Step 4: Compute the activation of the output unit
        yin = x*w';   % calculate the net input
        if yin > theta
            y = 1;
        elseif yin < -theta
            y = -1;
        else
            y = 0;
        end

  22.
        % Step 5: Update the bias and weights if an error occurred
        if abs(outDesired - y) > epsilon
            w = w + (alpha*outDesired).*x;
            wtChange = 1;
        end
        epoch, x, yin, y, outDesired, w   % echo training progress
    end
    % Step 6: Test the stopping condition (no weight change in a full epoch)
    if wtChange < 0
        stopFlag = 1;
    end
    epoch = epoch + 1;
end % of while loop

  23. %%%%%%%%%% Part 3 %%%%%%%%%%%%
% Let's pick one pattern at random from the training data
p = randi(noOfTrainingPatterns)
x = s(p,:)
outDesired = t(p,:)
% For testing, we simply give this pattern as input to the trained net
% and find out the output
yin = x*w';
if yin > theta
    y = 1;
elseif yin < -theta
    y = -1;
else
    y = 0;
end
% Print y and the desired output
y, outDesired
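To test the trained net on every training pattern rather than on a single random pick, a small loop (our extension, reusing the variable names from the code above) suffices:

% Test the trained net on all training patterns and report each response
for p = 1:noOfTrainingPatterns
    x = s(p,:);
    yin = x*w';
    y = (yin > theta) - (yin < -theta);   % response in {-1, 0, +1}
    fprintf('pattern %d: output %2d, target %2d\n', p, y, t(p,1));
end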

  24. Assignment 1
Note: Please submit your assignment (both in hard- and soft-copy form) by 4:00 pm, Thursday, January 1st, 2013. Please carry out the following exercises by making appropriate modifications to the above code.

Exercise 1: Consider the following training data for the Perceptron used above (i.e., there is one more input neuron now):

x1 x2 x3 bias | target t
 1  1  1  1   |  1
 1  1  0  1   | -1
 1  0  1  1   | -1
 0  1  1  1   | -1

Verify that the iterations converge after 26 epochs for a learning rate equal to 1.0 and theta equal to 0.1, and that the converged set of weights and bias is w1 = 2, w2 = 3, w3 = 4 and b = -8.

Exercise 2: Plot the changes in the separating lines as they occur in Exercise 1.

Exercise 3: Using the above code, find the weights required to perform the following classification: vectors (1,1,1,1) and (-1,1,-1,-1) are members of the class (and therefore have target value 1); vectors (1,1,1,-1) and (1,-1,-1,1) are not members of the class (and have target value -1). Use a learning rate of 1 and starting weights of 0. Using each of the training vectors as input, test the response of the net. Display the number of epochs taken to reach convergence, and plot the maximum error in each epoch.
