
Presenter: Ma Yandong


Presentation Transcript


  1. The Improved Localized Generalization Error Model and Its Applications to Feature Selection for RBFNN. Presenter: Ma Yandong. Advisor: Prof. Wang Xizhao

  2. Contents • Wing's L-GEM • The norm-based model • Old model vs. new model • Algorithm for RBFNN feature selection • Simulation • Conclusion • Future work

  3. Wing's L-GEM Wing et al. ignore the unseen samples located far from the training samples and compute the generalization error only within the Q-neighborhood of the training samples.
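For reference, in the L-GEM literature the Q-neighborhood of a training sample is the hypercube of side 2Q centered on it, and the localized generalization error is the expected squared error restricted to the union of these neighborhoods. The notation below is reconstructed from that literature, not captured from the slide:

```latex
S_Q(\mathbf{x}_b) = \left\{\, \mathbf{x} \;\middle|\; \mathbf{x} = \mathbf{x}_b + \Delta\mathbf{x},\; |\Delta x_i| \le Q,\; i = 1,\dots,n \,\right\}
\qquad
R_{SM}(Q) = \int_{S_Q} \bigl(f_\theta(\mathbf{x}) - F(\mathbf{x})\bigr)^2\, p(\mathbf{x})\, d\mathbf{x}
```

Here $S_Q$ denotes the union of the $S_Q(\mathbf{x}_b)$ over all training samples, $f_\theta$ the trained classifier, $F$ the target function, and $p$ the input density.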

  4. Here, (the slide's equation is an image and was not captured in the transcript)

  5. NL-GEM

  6. Old model vs. new model

  7. Algorithm for RBFNN feature selection based on the localized generalization error bound • Step 1. Initialize the included-feature set IFS to the full set of features; • Step 2. Train the classifier on the dataset restricted to the features in IFS; • Step 3. For each feature i in IFS, compute the generalization error bound for the classifier trained in Step 2 with the i-th feature removed; • Step 4. Delete the z-th feature from IFS if removing it yields the smallest bound among all choices; • Step 5. If the stopping criterion is not satisfied, go to Step 2.
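The backward-elimination loop above can be sketched as follows. This is a minimal illustration, not the authors' implementation: `train_rbfnn` and `gem_bound` are hypothetical stand-ins for training an RBFNN and evaluating the (N)L-GEM bound; any classifier/bound pair with these signatures would fit.

```python
def select_features(X, y, train_rbfnn, gem_bound, min_features=1):
    """Backward feature elimination driven by a generalization-error bound.

    X, y          -- training data (X is a 2-D array, columns = features)
    train_rbfnn   -- callable (X, y) -> model        (hypothetical)
    gem_bound     -- callable (model, X, y) -> float (hypothetical)
    min_features  -- stop once this many features remain (stopping criterion)
    """
    ifs = list(range(X.shape[1]))          # Step 1: IFS = full feature set
    while len(ifs) > min_features:         # Step 5: stopping criterion
        model = train_rbfnn(X[:, ifs], y)  # Step 2: train on current IFS
        # Step 3: bound for the classifier with each feature removed in turn
        bounds = []
        for i in range(len(ifs)):
            reduced = ifs[:i] + ifs[i + 1:]
            bounds.append(gem_bound(model, X[:, reduced], y))
        # Step 4: drop the feature whose removal gives the smallest bound
        z = min(range(len(bounds)), key=bounds.__getitem__)
        del ifs[z]
    return ifs
```

Each pass trains once and evaluates the bound once per remaining feature, which is where the high time complexity noted in the conclusion comes from.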

  8. Simulation

  9. Conclusion • NL-GEM is simpler and easier to derive, understand and implement, while its performance remains satisfactory. • However, its time complexity is still high.

  10. Future work • Run more experiments to test the validity of this model • Use MSE to implement the model

  11. Open problems • Improve the time complexity • Approach 1: start from scratch and re-derive the model • Approach 2: compute the quantity by other methods • (1) Compute it by the norm: the two most common methods, Monte Carlo and the grid method, both have poor time complexity; • (2) Statistical methods: the … in … is hard to eliminate (symbols not captured in the transcript).

  12. Thanks!

  13. Hölder's inequality
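The slide's body was not captured in the transcript. For reference, Hölder's inequality, which underlies norm-based bounds of the kind used in the norm-based model, states that for conjugate exponents $p, q$:

```latex
\frac{1}{p} + \frac{1}{q} = 1, \quad p, q \ge 1
\;\Longrightarrow\;
\int |f g| \, d\mu \;\le\; \left( \int |f|^p \, d\mu \right)^{1/p} \left( \int |g|^q \, d\mu \right)^{1/q}
```

The case $p = q = 2$ is the Cauchy-Schwarz inequality.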
