
Artificial Neural Networks 0909.560.01/0909.454.01 Fall 2004


Presentation Transcript


  1. Artificial Neural Networks 0909.560.01/0909.454.01, Fall 2004. Lecture 6, October 18, 2004. Shreekanth Mandayam, ECE Department, Rowan University. http://engineering.rowan.edu/~shreek/spring04/ann/

  2. Plan
  • Radial Basis Function Networks
  • RBF Formulation
  • Network Implementation
  • Matlab Implementation
  • Design Issues
  • Center Selection: K-means Clustering Algorithm
  • Input data processing
  • Selection of training and test data - cross-validation
  • Pre-processing: Feature Extraction
  • Lab Project 3

  3. RBF Principle: Transform the input to a "higher"-dimensional vector space, so that classes that are non-linearly separable in the input space become linearly separable in the transformed space.

  4. Example: X-OR Problem. [Figure: the XOR patterns plotted in the original (x1, x2) input space and in the transformed (φ1(x), φ2(x)) space, with the resulting linear decision boundary.]
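
  Not part of the slide, but a minimal MATLAB sketch of this idea for the XOR problem, assuming two Gaussian basis functions; the centers [1 1] and [0 0] are an illustrative choice:

  % Map the four XOR inputs through two Gaussian basis functions;
  % in the (phi1, phi2) plane the two classes become linearly separable.
  X  = [0 0; 0 1; 1 0; 1 1];                 % XOR input patterns (rows)
  d  = [0; 1; 1; 0];                         % XOR targets
  c1 = [1 1];  c2 = [0 0];                   % assumed basis-function centers
  phi1 = exp(-sum((X - repmat(c1, 4, 1)).^2, 2));
  phi2 = exp(-sum((X - repmat(c2, 4, 1)).^2, 2));
  disp([phi1 phi2 d]);                       % inspect the transformed patterns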

  5. RBF Formulation Problem Statement • Given a set of N distinct real data vectors (xj; j=1,2,…,N) and a set of N real numbers (dj; j=1,2,…,N), find a function that satisfies the interpolating condition F(xj) = dj; j=1,2,…,N
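
  The slide states only the problem; the standard RBF construction (not shown on the slide) expands F over one basis function per data point,

  F(x) = sum over i = 1, …, N of  wi φ(||x − xi||),

  so the interpolating condition becomes the linear system Φw = d, where Φ is the N×N interpolation matrix with entries Φji = φ(||xj − xi||). For distinct data points and a Gaussian φ, Φ is non-singular (Micchelli's theorem), giving w = Φ⁻¹d.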

  6. RBF Network. [Figure: network diagram with an input layer (x1, x2, x3), a hidden layer of radial basis units, and an output layer (y1, y2), connected by weights wij; inset plot of a basis function φ(t) that peaks at 1 for t = 0 and decays toward 0 over roughly −5 ≤ t ≤ 5.]
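
  Not on the slide, but a minimal sketch of the forward pass the diagram represents; the centers, spread, and weights below are illustrative assumptions:

  % One forward pass through a small RBF network (all values illustrative).
  x     = [0.5; 1.0; -0.2];                  % one input vector (x1, x2, x3)
  C     = [0 0 0; 1 1 1];                    % centers of two hidden radial-basis units (rows)
  sigma = 1;                                 % Gaussian spread
  W     = [0.7 -0.3; 0.2 0.9];               % output weights wij (2 outputs x 2 hidden units)
  phi   = exp(-sum((C - repmat(x', 2, 1)).^2, 2) / (2*sigma^2));  % hidden-layer activations
  y     = W * phi                            % linear output layer: [y1; y2]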

  7. Matlab Implementation

  %Radial Basis Function Network
  %S. Mandayam/ECE Dept./Rowan University
  %Neural Nets/Fall 04
  clear; close all;
  %generate training data (input and target)
  p = [0:0.25:4];
  t = sin(p*pi);
  %Define and train RBF Network
  net = newrb(p,t);
  plot(p,t,'*r'); hold;
  %generate test data
  p1 = [0:0.1:4];
  %test network
  y = sim(net,p1);
  plot(p1,y,'ob');
  legend('Training','Test');
  xlabel('input, p');
  ylabel('target, t')

  Matlab Demos
  » demorb1
  » demorb3
  » demorb4

  8. RBF - Center Selection. [Figure: data points and candidate centers scattered in the (x1, x2) plane.]

  9. K-means Clustering Algorithm
  • N data points, xi; i = 1, 2, …, N
  • At time-index n, define K clusters with cluster centers cj(n); j = 1, 2, …, K
  • Initialization: at n = 0, let cj(0) = xj; j = 1, 2, …, K (i.e. choose the first K data points as the cluster centers)
  • Compute the Euclidean distance of each data point from each cluster center, d(xi, cj(n)) = dij
  • Assign xi to cluster cj(n) if dij = minj {dij}; j = 1, 2, …, K
  • For each cluster j = 1, 2, …, K, update the cluster center to the mean of the data points currently assigned to it: cj(n+1) = mean{xi assigned to cluster j}
  • Repeat until ||cj(n+1) − cj(n)|| < ε
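
  A minimal MATLAB sketch of these steps; the data, number of clusters, and tolerance are illustrative, not from the lecture:

  % K-means center selection, following the steps above (illustrative data).
  X   = rand(100, 2);                        % N data points, one per row
  K   = 4;                                   % number of clusters / RBF centers
  tol = 1e-6;                                % stopping tolerance (the "e" on the slide)
  C   = X(1:K, :);                           % initialization: first K points as centers
  converged = 0;
  while ~converged
      D = zeros(size(X,1), K);
      for j = 1:K                            % Euclidean distance of every point to every center
          D(:,j) = sqrt(sum((X - repmat(C(j,:), size(X,1), 1)).^2, 2));
      end
      [dmin, idx] = min(D, [], 2);           % assign each point to its nearest center
      Cnew = C;
      for j = 1:K                            % update each center as the mean of its cluster
          if any(idx == j)
              Cnew(j,:) = mean(X(idx == j, :), 1);
          end
      end
      converged = norm(Cnew - C) < tol;      % stop when the centers no longer move
      C = Cnew;
  end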

  10. Selection of Training and Test Data: Method of Cross-Validation
  Partition the data into four folds and run four trials, holding out a different fold for testing in each trial:
  Trial 1: Train | Train | Train | Test
  Trial 2: Train | Train | Test | Train
  Trial 3: Train | Test | Train | Train
  Trial 4: Test | Train | Train | Train
  • Vary the network parameters until the total mean squared error is minimum over all trials
  • Find the network with the least mean squared output error
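
  A minimal MATLAB sketch of this 4-fold procedure, reusing the sine-wave data from the earlier example; the fold layout (every fourth sample) and the error goal and spread passed to newrb are illustrative choices:

  % 4-fold cross-validation of an RBF network (goal/spread values illustrative).
  p = 0:0.25:4;   t = sin(p*pi);             % data from the earlier example
  K = 4;          N = length(p);
  mse = zeros(1, K);
  for k = 1:K
      testIdx  = k:K:N;                      % hold out every K-th sample as the test fold
      trainIdx = setdiff(1:N, testIdx);
      net = newrb(p(trainIdx), t(trainIdx), 0.01, 1);   % train on the remaining folds
      y   = sim(net, p(testIdx));
      mse(k) = mean((y - t(testIdx)).^2);    % output error on the held-out fold
  end
  avgMSE = mean(mse)                         % compare across candidate network settings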

  11. Feature Extraction
  Objective:
  • Increase information content
  • Decrease vector length
  • Parametric invariance
    • Invariance by structure
    • Invariance by training
    • Invariance by transformation

  12. Lab Project 3: Radial Basis Function Neural Networks http://engineering.rowan.edu/~shreek/fall04/ann/lab3.html

  13. Summary
