A new fast learning algorithm with promising global convergence capability for feed-forward neural networks

Abstract

The backpropagation (BP) learning algorithm is the most widely used supervised learning technique for training multi-layer feed-forward neural networks. Although many modifications of BP have been proposed to speed up learning, they seldom address the local-minimum and flat-spot problems. This paper proposes a new algorithm, the Local-minimum and Flat-spot Problem Solver (LFPS), to solve these two problems. It uses a systematic approach to check whether a learning process is trapped in a local minimum or a flat-spot area, and then escapes from it. A learning process using LFPS can therefore keep finding a path toward the global minimum. The performance investigation shows that the proposed algorithm converges across different learning problems (applications), whereas other popular fast learning algorithms sometimes exhibit very poor global convergence.
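The abstract does not give the internals of LFPS, but the general idea it describes can be illustrated with a minimal sketch: train a feed-forward network with standard BP, monitor the loss for stalls that suggest a flat spot or local minimum, and apply an escape step when one is detected. The network, the XOR task, the stall thresholds, and the weight-jitter escape heuristic below are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

# Illustrative sketch only: LFPS details are not in the abstract.
# We detect a stall (loss barely changing, as at a flat spot or local
# minimum) and escape by perturbing the weights; thresholds are arbitrary.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(epochs=5000, lr=0.5, stall_window=200, stall_tol=1e-5):
    # XOR: a classic task where plain BP can stall in flat regions.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)
    prev_loss, stall = np.inf, 0
    for _ in range(epochs):
        # forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        loss = float(np.mean((out - y) ** 2))
        # standard BP gradients for MSE with sigmoid units
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
        W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)
        # stall check: a near-constant loss suggests a flat spot or minimum
        stall = stall + 1 if abs(prev_loss - loss) < stall_tol else 0
        prev_loss = loss
        if stall >= stall_window:
            # escape heuristic (assumed, not LFPS): jitter weights, continue
            W1 += rng.normal(scale=0.2, size=W1.shape)
            W2 += rng.normal(scale=0.2, size=W2.shape)
            stall = 0
    return loss

final_loss = train_xor()
print(f"final MSE: {final_loss:.4f}")
```

The stall detector stands in for the paper's "systematic approach to check whether a learning process is trapped"; the actual LFPS escape mechanism would replace the random jitter used here.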
