
Looking inside self-organizing map ensembles with resampling and negative correlation learning


Presentation Transcript


  1. Looking inside self-organizing map ensembles with resampling and negative correlation learning. Alexandra Scherbart, Tim W. Nattkemper. Neural Networks, Vol. 24, 2011, pp. 130–141. Presenter: Wei-Shen Tai, 2010/12/22

  2. Outline • Introduction • Methods • Results • Discussion • Conclusions • Comments

  3. Motivation • Dilemma of ensemble learning methods • Balancing diversity against single-learner accuracy • Maximizing the diversity may worsen the prediction performance of every single learner. • Minimizing the prediction error of each ensemble member leads to very similar nets. [Slide figure: input x feeds ensemble members f1, f2, f3, which are combined into output y]

  4. Objective • Negative correlation learning (NCL) in ensemble learning (EL) • Allows a balance between single-network accuracy and diversity, controlled by the co-operation of the neural networks. [Slide figure: ensemble error of the EL vs. error of an individual network; input x feeds members f1, f2, f3, combined into output y]

  5. Negative correlation learning (NCL) • Additional penalty term • Balances the accuracy of the individual networks against the quantified ambiguity (diversity). • γ is a parameter controlling the penalty for a high correlation of the individual networks' errors (low diversity).
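The equation image for this slide did not survive the transcript. As a hedged reconstruction: in standard NCL (Liu & Yao, 1999), which the paper builds on, each member f_i is trained on a per-sample error that combines its own squared error with a γ-weighted penalty:

```latex
e_i(x) = \frac{1}{2}\bigl(f_i(x) - y\bigr)^2 + \gamma\, p_i(x)
```

With γ = 0 the members train independently; raising γ trades individual accuracy for decorrelated (more diverse) errors.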

  6. Penalty functions • Two penalty functions derived from NCL • Each ensemble member is trained by minimizing its penalty function.
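The two penalty functions are likewise lost images. Assuming they take the standard NCL form, the usual penalty and its simplified equivalent (the deviations from the ensemble mean sum to zero) read:

```latex
p_i(x) = \bigl(f_i(x) - \bar f(x)\bigr)\sum_{j \neq i}\bigl(f_j(x) - \bar f(x)\bigr)
       = -\bigl(f_i(x) - \bar f(x)\bigr)^2,
\qquad
\bar f(x) = \frac{1}{M}\sum_{j=1}^{M} f_j(x)
```

Minimizing p_i pushes each member's prediction away from the ensemble mean, which is the diversity-forcing mechanism.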

  7. Proposed architecture • Random initialization of each node • Applies two resampling methods, bagging (bootstrap aggregating) and the random subspace method (RSM), to enforce differences between the ensemble members (see the resampling sketch after this slide) • RSM draws a random subset of the features for each member • A single SOM is itself regarded as an EL • The intra-member (i.e., intra-SOM) diversity is an intrinsic property of the SOM, as a single network can be seen as an ensemble itself. • In the case of NCL, the ensemble members interact and are forced to follow different trajectories in hypothesis space. [Slide figure: input x feeds ensemble members f1, f2, f3, combined into output y]
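The resampling step lends itself to a short illustration. Below is a minimal Python sketch, not the authors' code; the names (resample_views, the default subspace size k) are assumptions for illustration only.

```python
# Hedged sketch: how bagging and the random subspace method (RSM) can
# give every ensemble member its own training view, so the members
# differ even before NCL is applied.
import numpy as np

rng = np.random.default_rng(0)

def resample_views(X, y, n_members=3, k=None):
    """Yield one (X_view, y_view, feature_idx) triple per ensemble member.

    Bagging: draw n samples with replacement for each member.
    RSM: draw a random subset of k features for each member.
    """
    n, d = X.shape
    if k is None:
        k = max(1, d // 2)  # assumed default subspace size
    for _ in range(n_members):
        rows = rng.integers(0, n, size=n)            # bootstrap sample
        cols = rng.choice(d, size=k, replace=False)  # random subspace
        yield X[rows][:, cols], y[rows], cols

# Usage: each member (here, each SOM) would be trained only on its own
# view, so members see different samples AND different features.
X = rng.normal(size=(100, 8))
y = X[:, 0] + 0.1 * rng.normal(size=100)
for X_view, y_view, cols in resample_views(X, y):
    print(X_view.shape, cols)  # e.g. (100, 4) with varying feature sets
```

Because both the rows and the columns differ per member, the two resampling methods enforce inter-member diversity implicitly, complementing the explicit diversity forcing of NCL.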

  8. Results

  9. Discussion • A low number of nodes per network • Increasing the size of the networks leads to a loss in generalization (ensemble) performance, since every net becomes too specialized on its presented subtask. • Small neighborhood size σ • Individual nodes are forced to specialize locally. • Single-member accuracy is improved by forcing diversity along each individual predictor. • A small k (the number of features drawn by RSM) • Predictors are no longer capable of extracting the relevant information for modeling the particular subtask at hand.
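For context on σ: the slide does not spell out the neighborhood function, but the standard SOM choice (the one named in the conclusions) is a Gaussian over the map distance between the winning node c and node i:

```latex
h_{ci}(t) = \exp\!\left(-\frac{\lVert r_c - r_i \rVert^2}{2\,\sigma^2(t)}\right)
```

A small σ confines each update to a few map nodes, so individual nodes specialize locally; this is the intra-SOM diversity mechanism discussed above.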

  10. Conclusions • A novel SOM-based EL • Introduces the concept of negative correlation learning (NCL) into the field of SOM ensemble learning. • Diversity at the intra-SOM and inter-SOM levels • NCL has an explicit diversity-forcing impact on the inter-SOM level. • The combination of the two resampling methods, bagging and RSM, enforces the diversity between SOMs implicitly. • The implicit diversity inside the SOMs is controlled by the width σ of the Gaussian neighborhood function.

  11. Comments • Advantage • The proposed model overcomes the conflict between the accuracy of single ensemble members and diversity in ensemble learning methods. • Drawback • The intra-SOM diversity is increased by decreasing the neighborhood size σ, but the authors do not mention how to increase the inter-SOM diversity during training. • Only the error between the ensemble output and the prediction of a single member is considered in the two penalty functions; the above-mentioned inter-SOM diversity is ignored in both of them. • A small map size and neighborhood size will lower the accuracy of each SOM. • Application • SOM-based ensemble learning.
