International Conference on Neural Information Processing

Analytical Incremental Learning: Fast Constructive Learning Method for Neural Network



Abstract

The extreme learning machine (ELM) is a fast learning algorithm for single-hidden-layer feed-forward neural networks (SLFNs) that relies on random input weights and therefore usually requires a large number of hidden nodes. Recently, the constructive and destructive parsimonious variants (CP-ELM and DP-ELM), which provide effective generalization with a compact set of hidden nodes, have been proposed. However, performance can still be unstable because of the randomization present in ordinary ELM as well as in CP-ELM and DP-ELM. In this study, an analytical incremental learning (AIL) algorithm is proposed in which all network weights are computed analytically, without any randomization. The hidden nodes of AIL are generated incrementally from the residual error using the least-squares (LS) method. Evaluation on seven benchmark data sets shows that AIL not only uses the smallest number of hidden nodes and is more stable, but also generalizes better than ELM, CP-ELM, and DP-ELM.
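The abstract only outlines how AIL grows the network, so the following is a minimal, illustrative Python sketch of a residual-driven constructive loop for an SLFN: at each step one hidden node is added and all output weights are re-solved by least squares against the targets, stopping once the residual error is small. The abstract does not specify AIL's analytical rule for computing the hidden-node input weights, so the `add_hidden_node` helper below is a hypothetical placeholder that fits them by regressing the current residual onto the inputs; it is not the authors' derivation.

```python
import numpy as np

def add_hidden_node(X, residual):
    """Choose input weights for a new hidden node by regressing the current
    residual onto the inputs. Hypothetical placeholder for AIL's analytical
    weight computation, which the abstract does not detail."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])      # append a bias column
    w, *_ = np.linalg.lstsq(Xb, residual, rcond=None)  # least-squares fit
    return w[:-1], w[-1]                               # (input weights, bias)

def incremental_slfn(X, y, max_nodes=20, tol=1e-3):
    """Grow an SLFN one hidden node at a time, driven by the residual error."""
    n_samples = X.shape[0]
    H = np.empty((n_samples, 0))      # hidden-layer output matrix, one column per node
    beta = np.empty(0)                # output weights
    residual = y.astype(float).copy()
    for _ in range(max_nodes):
        w, b = add_hidden_node(X, residual)
        h = np.tanh(X @ w + b)                         # output of the new hidden node
        H = np.column_stack([H, h])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # re-solve all output weights (LS)
        residual = y - H @ beta
        if np.sqrt(np.mean(residual ** 2)) < tol:      # stop once the RMSE is small enough
            break
    return H, beta

# Toy 1-D regression example
if __name__ == "__main__":
    X = np.linspace(-3, 3, 200).reshape(-1, 1)
    y = np.sin(X[:, 0])
    H, beta = incremental_slfn(X, y)
    print("hidden nodes used:", H.shape[1])
```

Note that, unlike ordinary ELM, no weight in this loop is drawn at random; every quantity is obtained from a least-squares solve, which mirrors the stability argument made in the abstract.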
