
Soft Competitive Learning without Fixed Network Dimensionality


Presentation Transcript


  1. Soft Competitive Learning without Fixed Network Dimensionality Jacob Chakareski and Sergey Makarov Rice University, Worcester Polytechnic Institute

  2. Algorithms • Neural Gas • Competitive Hebbian Learning • Neural Gas + Competitive Hebbian Learning • Growing Neural Gas

  3. Neural Gas • Sorts the network units based on their distance from the input signal • Adapts a certain number of units, based on this “rank order” • The number of adapted units and the adaptation strength are decreased according to a fixed schedule

  4. The algorithm • Initialize a set A with N units cᵢ • Sort the network units • Adapt the network units
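The rank-order adaptation described above can be sketched in Python. The exponential decay schedules and all parameter values here are illustrative, not taken from the slides:

```python
import numpy as np

# Minimal Neural Gas sketch: every unit is adapted toward the input,
# weighted by its distance rank (illustrative parameter values).
rng = np.random.default_rng(0)
N, dim, t_max = 20, 2, 5000
units = rng.random((N, dim))              # reference vectors, one row per unit

for t in range(t_max):
    x = rng.random(dim)                   # input signal from a uniform distribution
    # neighborhood range and step size decay on a fixed exponential schedule
    lam = 10.0 * (0.5 / 10.0) ** (t / t_max)
    eps = 0.5 * (0.005 / 0.5) ** (t / t_max)
    # rank order: k = 0 for the closest unit, k = N-1 for the farthest
    ranks = np.argsort(np.argsort(np.linalg.norm(units - x, axis=1)))
    # adapt all units, with strength exp(-rank / lambda)
    units += eps * np.exp(-ranks / lam)[:, None] * (x - units)
```

Since every update is a convex combination of a unit and an input from [0, 1)², the units stay inside the input region while spreading out over it.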

  5. Simulation Results

  6. Competitive Hebbian Learning • Usually not used on its own, but in conjunction with other methods • It does not change the reference vectors wⱼ at all • It only generates neighborhood edges between the units of the network

  7. The algorithm • Initialize a set A with N units cᵢ and the connection set C • Determine the units s1 and s2 closest to the input signal • Create a connection between s1 and s2
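A minimal sketch of this procedure, assuming the standard CHL rule that s1 and s2 are the nearest and second-nearest units to the input (unit count and iteration count are illustrative):

```python
import numpy as np

# Competitive Hebbian Learning sketch: reference vectors stay fixed;
# only edges between the two closest units are created.
rng = np.random.default_rng(0)
N, dim = 20, 2
units = rng.random((N, dim))      # fixed reference vectors (never adapted)
edges = set()                     # connection set C, undirected edges (i, j) with i < j

for _ in range(1000):
    x = rng.random(dim)
    d = np.linalg.norm(units - x, axis=1)
    s1, s2 = np.argsort(d)[:2]    # winner and second-closest unit
    edges.add((min(s1, s2), max(s1, s2)))
```

Storing each edge as a sorted pair makes the connection set undirected, so connecting s1 to s2 and s2 to s1 yields the same edge.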

  8. Simulation Results

  9. Neural Gas + CHL • A superposition of NG and CHL • Sometimes denoted as “topology-representing networks” • A local edge-aging mechanism is implemented to remove edges that are no longer valid

  10. The algorithm • Set the age of the connection between s1 and s2 to zero (“refresh” the edge) • Increment the age of all edges emanating from s1 • Remove edges with an age larger than the current age T(t)
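The three edge-aging steps above can be sketched as a single function. The dict-based edge representation and the name `update_edges` are illustrative choices, not from the slides:

```python
def update_edges(ages, s1, s2, T):
    """Edge-aging step of NG + CHL.

    ages: dict mapping an undirected edge (i, j), i < j, to its current age.
    """
    key = (min(s1, s2), max(s1, s2))
    ages[key] = 0                                  # "refresh" the edge s1-s2
    for (i, j) in list(ages):                      # snapshot: we delete while iterating
        if (i == s1 or j == s1) and (i, j) != key:
            ages[(i, j)] += 1                      # age all other edges emanating from s1
            if ages[(i, j)] > T:
                del ages[(i, j)]                   # drop edges older than the threshold T(t)
    return ages
```

Note that only edges incident to the winner s1 are aged; edges elsewhere in the network keep their age until their own region wins again.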

  11. Simulation Results

  12. Growing Neural Gas • The number of units changes (mostly increases) during the self-organization process • Starting with very few units, new units are added successively • Local error measures are gathered to determine where to insert new units • Each new unit is inserted near the unit with the largest accumulated error

  13. The algorithm • Add the squared distance between the input signal and the winner to a local error variable • Adapt the winner and its neighbors • If the number of input signals generated so far is an integer multiple of a parameter, insert a new unit:

  14. Determine the unit q with the maximum accumulated error • Determine the neighbor f of q with the maximum accumulated error • Add a new unit r to the network • Insert edges connecting r with q and f, and remove the original edge between q and f • Decrease the error variables of q and f

  15. Interpolate the error variable of r from q and f • Decrease the error variables of all units
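The insertion procedure of slides 13–15 can be sketched as one function. The function name, the set-based edge representation, the decrease factor alpha = 0.5, and the choice to set r's error to q's decreased error (one common interpolation) are assumptions for illustration:

```python
import numpy as np

def insert_unit(units, error, edges, alpha=0.5):
    """GNG unit insertion (assumes the max-error unit q has at least one neighbor)."""
    q = int(np.argmax(error))                 # unit with the largest accumulated error
    # neighbor f of q with the largest error
    nbrs = [j for (i, j) in edges if i == q] + [i for (i, j) in edges if j == q]
    f = max(nbrs, key=lambda n: error[n])
    # new unit r, placed halfway between q and f
    r = len(units)
    units = np.vstack([units, 0.5 * (units[q] + units[f])])
    # insert edges q-r and r-f, remove the original edge q-f
    edges.discard((min(q, f), max(q, f)))
    edges.add((min(q, r), max(q, r)))
    edges.add((min(r, f), max(r, f)))
    # decrease the errors of q and f, then interpolate r's error from them
    error[q] *= alpha
    error[f] *= alpha
    error = np.append(error, error[q])        # assumed interpolation rule
    return units, error, edges
```

Because q and f keep only a fraction of their error, the next insertion is steered toward whichever region of the network accumulates error fastest afterwards.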

  16. Simulation Results

  17. Applications: Web/Database Maps
