IEEE Transactions on Neural Networks

Error Minimized Extreme Learning Machine With Growth of Hidden Nodes and Incremental Learning


Abstract

One of the open problems in neural network research is how to automatically determine network architectures for given applications. In this brief, we propose a simple and efficient approach to automatically determine the number of hidden nodes in generalized single-hidden-layer feedforward networks (SLFNs), whose hidden nodes need not be neuron-like. This approach, referred to as the error-minimized extreme learning machine (EM-ELM), can add random hidden nodes to SLFNs one by one or group by group (with varying group size). As the network grows, the output weights are updated incrementally. The convergence of this approach is also proved in this brief. Simulation results demonstrate and verify that our new approach is much faster than other sequential/incremental/growing algorithms while achieving good generalization performance.
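The abstract describes the growth-and-update procedure only at a high level. The following Python/NumPy fragment is a minimal illustrative sketch of that loop under simple assumptions (additive sigmoid hidden nodes, random input weights and biases drawn uniformly from [-1, 1], a plain RMSE stopping rule); the function and parameter names (em_elm, n_init, n_step, n_max, target_rmse) are hypothetical, and this is not the authors' implementation.

```python
import numpy as np

def em_elm(X, T, n_init=5, n_step=5, n_max=100, target_rmse=0.01, rng=None):
    """Illustrative EM-ELM sketch: grow random sigmoid hidden nodes group by
    group and refresh the output weights beta through an incremental (block)
    pseudoinverse update instead of refactorizing the full hidden-layer matrix.
    Predictions for new inputs Xnew are sigmoid(Xnew @ W + b) @ beta."""
    rng = np.random.default_rng() if rng is None else rng
    N, d = X.shape

    def random_nodes(k):
        # Random input weights and biases for k additive hidden nodes.
        W = rng.uniform(-1.0, 1.0, size=(d, k))
        b = rng.uniform(-1.0, 1.0, size=(1, k))
        return W, b

    def hidden_output(W, b):
        # Hidden-layer output matrix H for the sigmoid activation.
        return 1.0 / (1.0 + np.exp(-(X @ W + b)))

    # Start with a small random hidden layer and its least-squares output weights.
    W, b = random_nodes(n_init)
    H = hidden_output(W, b)
    H_pinv = np.linalg.pinv(H)
    beta = H_pinv @ T

    while H.shape[1] < n_max:
        rmse = np.sqrt(np.mean((H @ beta - T) ** 2))
        if rmse <= target_rmse:
            break
        # Add a group of n_step new random hidden nodes.
        dW, db = random_nodes(n_step)
        dH = hidden_output(dW, db)
        # Incremental pseudoinverse update for the augmented matrix [H, dH]:
        #   D = ((I - H H^+) dH)^+,  U = H^+ - H^+ dH D,  [H, dH]^+ = [U; D].
        D = np.linalg.pinv(dH - H @ (H_pinv @ dH))
        U = H_pinv - (H_pinv @ dH) @ D
        H = np.hstack([H, dH])
        H_pinv = np.vstack([U, D])
        beta = H_pinv @ T
        W, b = np.hstack([W, dW]), np.hstack([b, db])

    return W, b, beta
```

For a regression task one would call em_elm(X, T) with inputs X of shape (N, d) and targets T of shape (N, m). Because each step only computes the pseudoinverse of the appended columns, the output weights are refreshed without re-solving the full least-squares problem, which is the source of the speed advantage over growth schemes that retrain from scratch.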
