IEEE Transactions on Neural Networks

An accelerated learning algorithm for multilayer perceptron networks



Abstract

An accelerated learning algorithm, ABP (adaptive back propagation), is proposed for the supervised training of multilayer perceptron networks. The learning algorithm is inspired by the principle of "forced dynamics" for the total error functional. The algorithm updates the weights in the direction of steepest descent, but with a learning rate that is a specific function of the error and of the error gradient norm; this form of the function is chosen so as to accelerate convergence. Furthermore, ABP introduces none of the additional "tuning" parameters found in variants of the backpropagation algorithm. Simulation results indicate superior convergence speed, compared with other competing methods, for analog problems only, as well as reduced sensitivity to variations of the algorithm's step-size parameter.
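The abstract states only that the learning rate is a function of the error and of the error gradient norm, without giving its exact form. A minimal sketch of such an adaptive update is below, assuming the rate eta = mu * E / ||g||^2, which follows from imposing the "forced dynamics" dE/dt = -mu * E on a steepest-descent weight flow; this reconstruction, the toy regression task, and all variable names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy analog (regression) problem: fit y = sin(pi * x) with a 1-8-1 MLP.
X = rng.uniform(-1, 1, (64, 1))
y = np.sin(np.pi * X)

W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

mu = 0.2  # single step-size parameter; no extra tuning knobs
losses = []
for _ in range(300):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y
    E = 0.5 * np.mean(err ** 2)  # total error functional

    # Standard backpropagation gradients of E.
    d_out = err / len(X)
    gW2 = h.T @ d_out
    gb2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ d_h
    gb1 = d_h.sum(0)
    gnorm2 = sum(float((g ** 2).sum()) for g in (gW1, gb1, gW2, gb2))

    # Adaptive learning rate (assumed form): with dw/dt = -eta * grad E,
    # choosing eta = mu * E / ||g||^2 gives dE/dt ~= -mu * E to first order.
    eta = mu * E / (gnorm2 + 1e-12)

    # Steepest-descent step with the adaptive rate.
    W1 -= eta * gW1; b1 -= eta * gb1
    W2 -= eta * gW2; b2 -= eta * gb2
    losses.append(E)
```

Because eta scales with E, the step shrinks automatically as the error decays, which is one way such a scheme can reduce sensitivity to the step-size parameter mu.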


