IEEE International Joint Conference on Neural Networks

Selective Negative Correlation Learning Algorithm for Incremental Learning


Abstract

Negative correlation learning (NCL) is a successful scheme for constructing neural network ensembles. In batch learning mode, NCL outperforms many other ensemble learning approaches. Recently, NCL has also been shown to be a potentially powerful approach to incremental learning, although its advantages have not yet been fully exploited. In this paper, we propose a selective NCL approach for incremental learning. In the proposed approach, the previously trained ensemble is cloned when a new data set arrives, and the cloned ensemble is trained on the new data set. The new ensemble is then combined with the previous ensemble, and a selection process is applied to prune the whole ensemble back to a fixed size. Simulation results on several benchmark datasets show that the proposed algorithm outperforms two recent incremental learning algorithms based on NCL.
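The clone-combine-prune cycle described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's exact design: the linear base learners, the greedy error-minimizing selection criterion, and all hyperparameters (penalty strength `lam`, learning rate, epoch count) are assumptions made here for the sake of a runnable example.

```python
import numpy as np

def train_ncl(ensemble, X, y, lam=0.5, lr=0.05, epochs=300):
    """Train linear base learners f_i(x) = x . w_i with the standard NCL
    gradient signal (f_i - y) - lam * (f_i - f_bar), which penalizes
    members that agree with the ensemble mean f_bar."""
    for _ in range(epochs):
        preds = np.stack([X @ w for w in ensemble])   # shape (M, n)
        fbar = preds.mean(axis=0)
        for i, w in enumerate(ensemble):
            err = (preds[i] - y) - lam * (preds[i] - fbar)
            ensemble[i] = w - lr * X.T @ err / len(y)
    return ensemble

def ensemble_predict(ensemble, X):
    # simple average combination of the members
    return np.mean([X @ w for w in ensemble], axis=0)

def greedy_prune(ensemble, X, y, target_size):
    """Illustrative selection step: greedily keep the member whose addition
    most reduces ensemble MSE on (X, y), until target_size members remain."""
    selected, remaining = [], list(range(len(ensemble)))
    while len(selected) < target_size:
        best, best_mse = None, np.inf
        for i in remaining:
            cand = [ensemble[j] for j in selected + [i]]
            mse = np.mean((ensemble_predict(cand, X) - y) ** 2)
            if mse < best_mse:
                best, best_mse = i, mse
        selected.append(best)
        remaining.remove(best)
    return [ensemble[i] for i in selected]

def selective_ncl_step(ensemble, X_new, y_new, lam=0.5, size=5):
    """One incremental-learning step: clone the previous ensemble, train
    the clones on the new data, combine old and new, prune to fixed size."""
    clones = train_ncl([w.copy() for w in ensemble], X_new, y_new, lam=lam)
    return greedy_prune(ensemble + clones, X_new, y_new, size)
```

In this sketch each new data set doubles the pool (old members plus freshly trained clones) before the selection step restores the fixed ensemble size, so memory stays bounded across incremental steps.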
