
### A Hybrid Self-Organizing Neural Gas Network

James Graham and Janusz Starzyk

School of EECS, Ohio University

Stocker Center, Athens, OH 45701 USA

IEEE World Conference on Computational Intelligence (WCCI’08)

June 1-6, 2008

Hong Kong

Introduction

- Self Organizing Networks
- Useful for representation building in unsupervised learning
- Useful for clustering, visualization and feature maps
- Numerous applications in surveillance, traffic monitoring, flight control, rescue mission, reinforcement learning, etc.

- Some Types of Self Organizing Networks
- Traditional Self-Organizing Map
- Parameterless SOM
- Neural Gas Network
- Growing Neural Gas
- Self-Organizing Neural Gas (SONG)

Description of the Approach: Fritzke’s GNG Network Algorithm Highlights

- GNG starts with a set A of two units a and b at random positions wa and wb in R^n.
- In the set A, find the two nearest neighbors s1 and s2 to the input signal x.
- Connect s1 and s2 with an edge and set the edge age to zero.
- Adjust the positions of s1 and its neighborhood by a constant times (x − w): εb for s1 and εn for the neighborhood.
- Remove edges in the neighborhood that are older than amax.
- Every λ cycles, place a new node between the node with the greatest error and its nearest neighbor.
- Reduce the error of the maximum-error node and of its nearest neighbor by a fraction α, and assign the removed error to the new node.
- Reduce the error of all nodes by a constant (β) times their current error.
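The steps above can be sketched in Python. This is a minimal, illustrative rendering of Fritzke's GNG loop, not the authors' code: the parameter names (eps_b, eps_n, a_max, lam, alpha, beta) mirror the constants on these slides, while the data handling, stopping rule, and omission of isolated-node removal are simplifying assumptions.

```python
import numpy as np

def gng(data, max_nodes=20, eps_b=0.05, eps_n=0.0006,
        a_max=88, lam=200, alpha=0.5, beta=0.0005, n_iter=5000, seed=0):
    """Minimal Growing Neural Gas sketch (after Fritzke)."""
    rng = np.random.default_rng(seed)
    # Start with two units at random positions in the input space.
    W = [rng.random(data.shape[1]), rng.random(data.shape[1])]
    E = [0.0, 0.0]                        # accumulated node errors
    edges = {}                            # (i, j) -> age, with i < j

    def key(i, j):
        return (min(i, j), max(i, j))

    for t in range(1, n_iter + 1):
        x = data[rng.integers(len(data))]
        # Find the two nearest units s1, s2 to the input x.
        d = [np.linalg.norm(w - x) for w in W]
        s1, s2 = np.argsort(d)[:2]
        E[s1] += d[s1] ** 2
        # Move s1 and its topological neighbors toward x; age the
        # edges emanating from s1.
        W[s1] = W[s1] + eps_b * (x - W[s1])
        for (i, j) in list(edges):
            if s1 in (i, j):
                n = j if i == s1 else i
                W[n] = W[n] + eps_n * (x - W[n])
                edges[(i, j)] += 1
        edges[key(s1, s2)] = 0            # connect s1-s2, reset age
        # Prune edges older than a_max (node removal omitted here).
        edges = {e: a for e, a in edges.items() if a <= a_max}
        # Every lam steps, insert a node between the max-error node q
        # and its worst neighbor f.
        if t % lam == 0 and len(W) < max_nodes:
            q = int(np.argmax(E))
            nbrs = [j if i == q else i for (i, j) in edges if q in (i, j)]
            if nbrs:
                f = max(nbrs, key=lambda n: E[n])
                W.append(0.5 * (W[q] + W[f]))
                E[q] *= alpha
                E[f] *= alpha
                E.append(E[q])
                r = len(W) - 1
                edges.pop(key(q, f), None)
                edges[key(q, r)] = 0
                edges[key(f, r)] = 0
        E = [e * (1 - beta) for e in E]   # global error decay
    return np.array(W), edges
```

Run on 2-D data, the node count grows from 2 toward the cap as insertions occur every λ steps.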

Example

Example of Fritzke’s network results for 40,000 iterations with the following constants: εb = 0.05, εn = 0.0006, amax = 88, λ = 200, α = 0.5, β = 0.0005.

Description of the Approach: Proposed Hybrid SONG Network Algorithm Highlights

- SONG starts with a randomly pre-generated network of a fixed size.
- Connections get “stiffer” with age, making their weights harder to change.
- Error is calculated after the node position updates rather than before.
- Weight adjustment and error distribution are functions of distance rather than of arbitrary, hard-to-set constants.
- Edge connections are removed only under the following conditions:
- When a connection is added and the node has a long connection more than twice its average connection length, the long edge is removed.
- When a node is moved and has at least two connections (after attaching to its destination node), its longest connection is removed.
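Two of the points above, age-based stiffness and the twice-the-average edge-removal rule, can be illustrated in Python. The slides do not give the exact stiffness function, so the 1/(1 + age) step size below is an assumption; only the 2× pruning threshold is taken directly from the text.

```python
import numpy as np

def stiff_update(w, x, age):
    """Move weight w toward input x. Older connections are 'stiffer':
    the step size shrinks with age (illustrative rule, not the
    paper's exact formula)."""
    rate = 1.0 / (1.0 + age)
    return w + rate * (x - w)

def prune_long_edges(node, neighbors, W):
    """Drop edges more than twice the node's mean edge length,
    as in the SONG edge-removal condition."""
    lengths = {n: np.linalg.norm(W[node] - W[n]) for n in neighbors}
    mean_len = np.mean(list(lengths.values()))
    return [n for n in neighbors if lengths[n] <= 2.0 * mean_len]
```

A fresh connection (age 0) jumps all the way to the input, while an old one barely moves, which is the "stiffening" behavior described above.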

Description of the Approach: Modification of the New Data Neighborhood

[Figure: a new data point x is connected to its nearest neighbor s. The distances |w1 − x|, |w2 − x|, …, |wN − x| and |ws − x| drive the “force” calculations that determine the weight adjustment and error increase, and the edge age increases by 1. A connection to a distant neighbor (more than 2× the mean connection length) is removed, and a node is removed if it becomes orphaned.]

Description of the Approach: Node Replacement

- Select the node sk with the minimum error Esk.
- Spread Esk to the neighborhood of sk, then move sk.
- Insert sk into the neighborhood of sq, the maximum-error node, using the neighborhood weights.
- Remove the longest connection of sq.
- Spread half of the sq neighborhood error to sk.

[Figure: two diagrams show the minimum-error node sk being detached from its neighborhood and inserted next to the maximum-error node sq, with the longest connection removed.]
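The node-replacement step can be sketched in Python. The slides do not specify the exact weighting used when inserting sk into the neighborhood of sq, so the plain neighborhood mean below, and the interpretation of "half of the sq neighborhood error" as half of E[sq], are assumptions.

```python
import numpy as np

def replace_node(W, E, adj):
    """Move the minimum-error node into the neighborhood of the
    maximum-error node (sketch of the SONG replacement step)."""
    sk = min(E, key=E.get)               # minimum-error node
    sq = max(E, key=E.get)               # maximum-error node
    if sk == sq or not adj[sk]:
        return W, E, adj
    # Spread E[sk] over sk's old neighborhood, then detach sk.
    share = E[sk] / len(adj[sk])
    for n in adj[sk]:
        E[n] += share
        adj[n].discard(sk)
    E[sk] = 0.0
    adj[sk] = set()
    # Insert sk near sq: place it at the mean of sq and its
    # neighbors (assumed weighting) and connect it to them.
    nbrs = adj[sq] | {sq}
    W[sk] = np.mean([W[n] for n in nbrs], axis=0)
    for n in nbrs:
        adj[sk].add(n)
        adj[n].add(sk)
    # Remove sq's longest remaining connection (other than sk).
    cand = [n for n in adj[sq] if n != sk]
    if cand:
        far = max(cand, key=lambda n: np.linalg.norm(W[sq] - W[n]))
        adj[sq].discard(far)
        adj[far].discard(sq)
    # Spread half of the sq-neighborhood error to sk.
    E[sk] = 0.5 * E[sq]
    E[sq] *= 0.5
    return W, E, adj
```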

Results

- Initial network structure with 1 random connection per node (for 200 nodes)

Results (cont.)

- Structure resulting from 1 initial random connection.

Results (cont.)

- Connection equilibrium reached for 1 initial connection.

Results (cont.)

- Structure resulting from 16 initial random connections.

Results (cont.)

- Connection equilibrium for 16 initial connections.

2-D comparison with the SOM network

Salient features of the SOM algorithm:

The SOM network starts as a predefined grid and is adjusted over many iterations.

Connections are fixed, and nodes are not inserted, moved, or relocated outside their preexisting grid.

Weight adjustments occur over the entire grid and are controlled by the weighted distance to the data point.
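The SOM behavior described above can be sketched as a single update step. The Gaussian neighborhood function and the learning rate below are illustrative choices, not taken from the presentation; what they demonstrate is the fixed-grid, whole-map adjustment that contrasts with SONG's local moves.

```python
import numpy as np

def som_step(W, grid, x, lr=0.5, sigma=1.0):
    """One SOM update: every unit moves toward the data point x,
    scaled by a Gaussian of its grid distance to the best-matching
    unit. The grid topology itself never changes."""
    bmu = np.argmin(np.linalg.norm(W - x, axis=1))
    gdist = np.linalg.norm(grid - grid[bmu], axis=1)
    h = np.exp(-gdist ** 2 / (2 * sigma ** 2))   # neighborhood function
    return W + lr * h[:, None] * (x - W)
```

Every unit moves on each step, with the best-matching unit moving the most, which is the "entire grid" adjustment noted above.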

Results (cont.): Growing SONG Network

- The number of nodes in SONG can be obtained automatically.
- The SONG network starts with a few randomly placed nodes and builds itself up until an equilibrium is reached between the network size and the error.
- A node is added every λ cycles if
MaxError > AveError + Constant

- Equilibrium appears to be reached at ~200 nodes.
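The node-addition test can be written directly from the rule above; the values of λ and the constant margin below are placeholders, not the ones used in the experiments.

```python
import numpy as np

def should_grow(errors, t, lam=200, c=0.1):
    """Growing-SONG node-addition test: every lam cycles, add a node
    when the maximum node error exceeds the average error by more
    than a constant margin c (lam and c are illustrative values)."""
    return t % lam == 0 and max(errors) > np.mean(errors) + c
```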

Growing SONG Network (cont.)

- Error handling in the growing SONG network was modified.
- The error is “reset” and recomputed after equilibrium is reached.
- The network continues to learn, reaching a new equilibrium.
- Approximation accuracy varies from run to run.

Growing SONG Network (cont.)

- The results of a growing SONG network run (on the right) compared to the simpler static approach (on the left).

Other Applications: Sparsely Connected Hierarchical Sensory Network

- The major features of the SONG algorithm, such as the weight adjustment, error calculation, and neighborhood selection, are utilized in building self-organizing, sparsely connected hierarchical networks.
- The sparse hierarchical network is locally connected based on the neurons’ firing correlation.
- Feedback and time-based correlation are used for invariant object recognition.

Correlation/PDF Example

Other Applications: Sparsely Connected Hierarchical Sensory Network (cont.)

- Correlation-based wiring
- Declining neurons’ activations
- Sparse hierarchical representations

Conclusions

- The SONG algorithm is more biologically plausible than Fritzke’s GNG algorithm. Specifically:
- Weight and error adjustments are not parameter based.
- Connections become stiffer with age rather than being removed at a maximum age, as in Fritzke’s method.
- The network has all of its neurons from the beginning.

- SONG approximates the data distribution faster than the other methods tested.
- Connectivity between neurons is obtained automatically and depends on the network size and the parameter that controls edge removal.
- The number of neurons can be obtained automatically in growing SONG to achieve the desired accuracy.

Future Work

- Adapt the SONG algorithm to large input spaces (high dimensionality, e.g., images).
- Adapt the SONG algorithm to a hierarchical network.
- Possible applications in feature extraction, representation building, and shape recognition.

- Insert new nodes as needed to reduce error.
- Optimize the network design.
