...
Neurocomputing

Alleviating the problem of local minima in Backpropagation through competitive learning


Abstract

The backpropagation (BP) algorithm is widely recognized as a powerful tool for training feedforward neural networks (FNNs). However, because the algorithm uses steepest descent to adjust the network weights, it converges slowly and often produces suboptimal solutions; these are the two major drawbacks of BP. This paper proposes a modified BP algorithm that remarkably alleviates the local-minima problem faced by standard BP (SBP). As one output of the modified training procedure, a bucket of candidate weight-matrix solutions found during training is collected, from which the best solution is chosen competitively according to performance on a validation dataset. Simulations are conducted on four benchmark classification tasks to compare and evaluate the classification performance and generalization capability of the proposed modified BP against SBP.
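The bucket-and-select idea from the abstract can be sketched in a few lines: train a network by ordinary backpropagation, snapshot the weight matrices periodically into a bucket, then competitively pick the snapshot that scores best on a held-out validation set. This is a minimal illustrative sketch on a toy task, not the authors' implementation; the network size, snapshot schedule, and all function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy linearly separable binary classification data (2 features).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)
X_train, y_train = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]

# Single-hidden-layer feedforward network.
W1 = rng.normal(scale=0.5, size=(2, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))
lr = 0.5
bucket = []  # candidate (W1, W2) solutions collected during training

for epoch in range(200):
    # Forward pass.
    h = sigmoid(X_train @ W1)
    out = sigmoid(h @ W2)
    # Backward pass: steepest descent on the squared error.
    d_out = (out - y_train) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X_train)
    W1 -= lr * X_train.T @ d_h / len(X_train)
    # Snapshot the weights into the bucket every 20 epochs.
    if epoch % 20 == 0:
        bucket.append((W1.copy(), W2.copy()))

def val_accuracy(w1, w2):
    pred = sigmoid(sigmoid(X_val @ w1) @ w2) > 0.5
    return float(np.mean(pred == y_val))

# Competitive selection: keep the snapshot with the best validation score.
best_W1, best_W2 = max(bucket, key=lambda w: val_accuracy(*w))
```

The point of selecting on a validation set rather than training loss is that a snapshot stuck in a poor local minimum (or one that overfits late in training) loses the competition to an earlier, better-generalizing solution.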
