International Joint Conference on Neural Networks

Enhanced Two-Phase Method in Fast Learning Algorithms

Abstract

The backpropagation (BP) learning algorithm is the most widely used supervised learning technique and is extensively applied in the training of multi-layer feed-forward neural networks. Many modifications of BP have been proposed to speed up its learning, but their performance is still limited by the local minimum problem and the error overshooting problem. This paper proposes an Enhanced Two-Phase method that addresses these two problems to improve the performance of existing fast learning algorithms. The proposed method effectively detects the occurrence of these problems and assigns appropriate fast learning algorithms to resolve them. In our investigation, the proposed method significantly improves the performance of different fast learning algorithms, in terms of both convergence rate and global convergence capability, across different problems. The convergence rate can be increased by up to 100 times compared with existing fast learning algorithms.
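The abstract does not specify the paper's detection rules or which fast learning algorithms are assigned to each problem. The sketch below only illustrates the general two-phase idea on a plain BP trainer: monitor the training error, treat a sharp error increase as overshooting, treat a long flat stretch at high error as a possible local minimum, and react to each. The XOR task, the plateau/overshoot thresholds, and the learning-rate adjustments are all illustrative placeholders, not the authors' method.

```python
# Minimal sketch of a two-phase training loop, assuming placeholder
# detection heuristics and learning-rate reactions (not the paper's method).
import numpy as np

rng = np.random.default_rng(0)

# XOR: a classic benchmark where plain BP can stall in flat regions.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with sigmoid activations.
W1 = rng.normal(0, 0.5, (2, 4))
W2 = rng.normal(0, 0.5, (4, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
prev_err = np.inf
errs = []

for epoch in range(20000):
    H = sigmoid(X @ W1)          # hidden activations
    Y = sigmoid(H @ W2)          # network output
    err = np.mean((T - Y) ** 2)
    errs.append(err)

    # --- phase 1: detect which problem, if any, is occurring ---
    if err > prev_err * 1.5:
        # Error jumped sharply: treat as overshooting, damp the step size.
        lr *= 0.5
    elif len(errs) > 100 and abs(errs[-100] - err) < 1e-6 and err > 1e-3:
        # Error flat for ~100 epochs while still high: possible local
        # minimum / flat region, so take a more aggressive step.
        lr = min(lr * 2.0, 4.0)
    prev_err = err

    # --- phase 2: standard BP gradient step with the chosen setting ---
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY
    W1 -= lr * X.T @ dH

    if err < 1e-4:
        print(f"converged at epoch {epoch}, mse={err:.2e}, lr={lr:.3f}")
        break
```

In the paper, each detected condition would instead trigger an appropriate existing fast learning algorithm; here a simple learning-rate adjustment stands in for that assignment.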
