
OPTIMIZED INTERCONNECTIONS IN PROBABILISTIC SELF-ORGANIZING LEARNING


Presentation Transcript


  1. OPTIMIZED INTERCONNECTIONS IN PROBABILISTIC SELF-ORGANIZING LEARNING Janusz Starzyk, Mingwei Ding, Haibo He, School of EECS, Ohio University, Athens, OH. February 14-16, 2005, Innsbruck, Austria

  2. OUTLINE • Introduction • Self-organizing neural network structure • Optimal and fixed input weights • Optimal weights • Binary weights • Simulation results • Financial data analysis • Power quality classification • Hardware platform development • Conclusion

  3. Self-Organizing Learning Array (SOLAR) • SOLAR CHARACTERISTICS • Entropy-based self-organization • Dynamic reconfiguration • Local and sparse interconnections • Online input selection • Feature neurons and merging neurons

  4. SOLAR Hardware Structure

  5. Neuron Structure

  6. Self-organization • Each neuron has the ability to self-organize according to the information it receives • Functionality – choose internal arithmetic and logic functions • Input selection – choose input connections

  7. Input selection • Merging neurons receive inputs from previous layers • The probability that received information is correct varies from input to input • Two input selection strategies (sketched below) • Random selection • Greedy selection
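The slides give no pseudocode for the two strategies; the following is a minimal sketch, assuming the greedy criterion ranks candidate inputs by how far their reported accuracy p sits from chance (0.5). The function names and the ranking criterion are illustrative, not from the original.

import random

def select_random(accuracies, k):
    # Random strategy: pick k candidate inputs uniformly at random.
    return random.sample(range(len(accuracies)), k)

def select_greedy(accuracies, k):
    # Greedy strategy: rank candidates by information content, here
    # measured as the distance of the reported accuracy p from chance;
    # p near 0 or 1 is most informative, p = 0.5 is least.
    ranked = sorted(range(len(accuracies)),
                    key=lambda i: abs(accuracies[i] - 0.5),
                    reverse=True)
    return ranked[:k]

# Example: choose 2 of 4 candidate inputs with the given accuracies.
p = [0.55, 0.90, 0.50, 0.72]
print(select_greedy(p, 2))   # -> [1, 3]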

  8. Input Weighting • Weighted signal merging: each input carries a signal S_i together with noise n_0; the inputs are scaled by weights W_1, W_2, ..., W_n and merged at neuron N [diagram omitted in transcript] • The signal energy and noise energy of the merged output determine the quality of the combination
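The slide's expressions for signal and noise energy were images and did not survive extraction; a standard formulation consistent with the setup, assuming independent input noises with energies n_i^2 (the diagram suggests a common value n_0), is

y = \sum_{i=1}^{n} w_i S_i, \qquad
E_{\mathrm{signal}} = \Big( \sum_{i=1}^{n} w_i s_i \Big)^2, \qquad
E_{\mathrm{noise}} = \sum_{i=1}^{n} w_i^2 n_i^2 .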

  9. Input Weighting (cont’d) • Objective function: maximize the energy/noise ratio • Set the gradient of the objective function to 0 for i = 1, 2, ..., n to obtain the optimum weights w_i* (derivation sketched below)
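Under the energies sketched above, this is the classical maximum-ratio-combining result (an assumption; the slide's own derivation was an image):

\mathrm{SNR}(w) = \frac{\big(\sum_j w_j s_j\big)^2}{\sum_j w_j^2 n_j^2},
\qquad
\frac{\partial\, \mathrm{SNR}}{\partial w_i} = 0
\;\;\Rightarrow\;\;
w_i^* \propto \frac{s_i}{n_i^2},

and at the optimum the per-input ratios simply add: \mathrm{SNR}^* = \sum_i s_i^2 / n_i^2.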

  10. Optimum Input weighting • 2-class classification problem • Each neuron receives a recognition rate p from the previous layer • When p = 0.5: least information, the data can be either class • When p = 0 or 1: most information, the class is known for sure • Define the signal/noise ratio of an input from its recognition rate p
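The defining formula was an image; one consistent definition, treating input i as a \pm 1 vote that is correct with probability p_i (mean 2p_i - 1, variance 4p_i(1 - p_i)), is

s_i = 2p_i - 1, \qquad n_i^2 = 4 p_i (1 - p_i), \qquad
\mathrm{SNR}_i = \frac{(2p_i - 1)^2}{4 p_i (1 - p_i)},

which matches the slide's endpoints: \mathrm{SNR}_i = 0 at p_i = 0.5 and \mathrm{SNR}_i \to \infty as p_i \to 0 or 1.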

  11. Optimum Input weighting (cont’d) • Using the optimization result, the optimum weight and the weighted output follow directly [formulas omitted in transcript] • Solving for the equivalent output probability p_out gives our belief that the result belongs to class 1
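Reading the missing formulas through the maximum-ratio-combining sketch above (a reconstruction, not the slide's verbatim equations): the optimum weight is w_i^* \propto (2p_i - 1) / (4 p_i (1 - p_i)), the per-input ratios add, \mathrm{SNR}_{\mathrm{comb}} = \sum_i \mathrm{SNR}_i, and inverting the single-input relation at the combined ratio gives

p_{\mathrm{out}} = \frac{1}{2}\left(1 + \sqrt{\frac{\mathrm{SNR}_{\mathrm{comb}}}{1 + \mathrm{SNR}_{\mathrm{comb}}}}\right).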

  12. Optimum Input weighting (cont’d) • Example: consider 3 inputs to a neuron with correct-classification probabilities p_i • The estimated output probability p_out for various input probabilities is as follows [table omitted in transcript]
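The slide's table is not in the transcript; the following sketch computes p_out for hypothetical input probabilities (not the slide's values) under the combination rule assumed above.

import math

def snr(p):
    # Per-input signal/noise ratio for recognition rate p (assumed form).
    return (2 * p - 1) ** 2 / (4 * p * (1 - p))

def p_out(ps):
    # Under optimum weighting the per-input SNRs add; invert the
    # single-input relation to get an equivalent output probability.
    r = sum(snr(p) for p in ps)
    return (1 + math.sqrt(r / (1 + r))) / 2

# Three hypothetical inputs: combining 0.6, 0.7, 0.8 beats the best alone.
print(round(p_out([0.6, 0.7, 0.8]), 3))  # -> 0.833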

  13. Binary weighting • A simplified selection algorithm is desired for hardware implementation • Choose 0 or 1 as the weight for every connected input • The resulting energy/noise expression can be used to study the effect of adding or removing connections of different signal strength (see the sketch below)
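With binary weights the combined ratio reduces to summed signals over summed noise energies; a sketch under the same assumed per-input definitions as above:

import math

def p_comb_binary(ps):
    # Combined probability when every listed input is connected with
    # weight 1: SNR = (sum of signals)^2 / (sum of noise energies),
    # then invert to an equivalent probability (assumed reconstruction).
    s = sum(2 * p - 1 for p in ps)
    n2 = sum(4 * p * (1 - p) for p in ps)
    r = s * s / n2
    return (1 + math.sqrt(r / (1 + r))) / 2

# Adding a connection helps only if it raises the combined probability.
print(p_comb_binary([0.69]))        # -> 0.69, the stronger input alone
print(p_comb_binary([0.69, 0.62]))  # -> ~0.710, the weaker input helps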

  14. Binary weighting (cont’d) • A stronger connection P_max, a weaker connection P_mix • Criterion for adding the weaker connection: the combined probability P_comb must exceed P_max [Plots omitted: gain of information for different P_max and P_mix; threshold for adding a new connection]

  15. Binary weighting (cont’d) • From the previous results, a selection criterion for binary weighting can be established • Threshold for adding a weaker connection: with P_max = 0.69, adding an input with P_mix > 0.60 raises P_comb, while P_mix < 0.60 lowers it [plots omitted in transcript]
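The threshold can also be located numerically with the sketch from slide 13; since the combination rule there is a reconstruction, the computed crossover only approximates the slide's reported 0.60.

import math

def p_comb_binary(ps):
    # Same assumed reconstruction as the slide-13 sketch.
    s = sum(2 * p - 1 for p in ps)
    n2 = sum(4 * p * (1 - p) for p in ps)
    r = s * s / n2
    return (1 + math.sqrt(r / (1 + r))) / 2

# Scan P_mix to find where adding the weaker input stops hurting P_max = 0.69.
baseline = p_comb_binary([0.69])
threshold = next(p / 1000 for p in range(501, 690)
                 if p_comb_binary([0.69, p / 1000]) > baseline)
print(threshold)  # -> 0.588 under these assumed formulas (slide reports ~0.60)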

  16. Simulation results • Case I: prediction of financial performance • Based on the S&P Research Insight database • More than 10,000 companies included • Training and testing over a 3-year period • 192 features extracted • Kernel PCA used to reduce the 192 features to 13-15. Fig. from http://goldennumber.net/stocks.htm

  17. Simulation results (cont’d) • Training and testing data structure [table omitted in transcript] • Test results [table omitted in transcript]

  18. Simulation results (cont’d) • Case II: power quality disturbance classification problem [Photos omitted: people spill out onto Madison Avenue in New York after the blackout hit (4:00 pm, August 14, 2003, CNN report); cars stopped about three-quarters of the way up the first hill of the Magnum XL200 ride at Cedar Point Amusement Park in Sandusky, Ohio (August 15, 2003, CNN report)] • The cost: according to the North American Electric Reliability Council (NERC) [figure omitted in transcript]

  19. Simulation results (cont’d) • Formulation of the problem: • Wavelet Multiresolution Analysis (MRA) is used for feature vector construction (see the sketch below) • 7-class classification problem: undisturbed sinusoid (normal), swell, sag, harmonics, outage, sag with harmonic, swell with harmonic • 200 cases of each class were generated for training and another 200 cases for testing
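The slides do not show the feature construction itself; a minimal sketch, assuming PyWavelets and per-level energies as the feature vector (a common choice for wavelet-MRA features, not confirmed by the slides):

import numpy as np
import pywt

def mra_features(signal, wavelet="db4", levels=7):
    # Decompose the waveform into detail coefficients at each level
    # plus the final approximation; use the energy of each band as
    # one feature, giving a (levels + 1)-dimensional feature vector.
    coeffs = pywt.wavedec(signal, wavelet, level=levels)
    return np.array([np.sum(c ** 2) for c in coeffs])

# Example: a clean 60 Hz cycle vs. a crude voltage-sag disturbance.
t = np.linspace(0.0, 0.2, 1024, endpoint=False)
clean = np.sin(2 * np.pi * 60 * t)
sag = clean.copy()
sag[400:700] *= 0.5
print(mra_features(clean).round(2))
print(mra_features(sag).round(2))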

  20. Simulation results (cont’d) Reference [16]: T. K. Abdel-Galil et al., “Power Quality Disturbance Classification Using the Inductive Inference Approach,” IEEE Transactions on Power Delivery, vol. 19, no. 4, October 2004

  21. Hardware Development • Xilinx Virtex XCV1000

  22. Conclusion • Input selection strategy • Theory of the optimum weighting scheme • Simple binary weighting for practical use • Search criteria for useful connections • Application studies • Hardware platform design

  23. Questions?
