International Conference on Intelligent Systems Design and Applications

An Improved Learning Algorithm Based on The Broyden-Fletcher-Goldfarb-Shanno (BFGS) Method For Back Propagation Neural Networks



Abstract

The Broyden-Fletcher-Goldfarb-Shanno (BFGS) optimization algorithm, usually applied to nonlinear least-squares problems, is presented and combined with a modified back propagation algorithm, yielding a new fast training algorithm for multilayer perceptrons (MLP), denoted BFGS/AG. The approach presented in the paper consists of three steps: (1) modifying the standard back propagation algorithm by introducing a "gain variation" term into the activation function; (2) calculating the gradient descent of the error with respect to both the weight and gain values; and (3) determining the new search direction by exploiting the information calculated by gradient descent in step (2) together with the previous search direction. The new approach improves the training efficiency of the back propagation algorithm by adaptively modifying the initial search direction. The performance of the proposed method is demonstrated by comparison with the Broyden-Fletcher-Goldfarb-Shanno algorithm from the neural network toolbox on the chosen benchmark. The results show that the number of iterations this algorithm requires to converge is less than 15% of that required by the standard BFGS and neural network toolbox algorithms. It converges considerably faster because of its new, more efficient search direction.
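The three steps the abstract describes can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the gained sigmoid f(z) = 1/(1 + exp(-c·z)), the network size, the backtracking line search, and the XOR benchmark are all assumptions made for the sketch.

```python
import numpy as np

def sigmoid(z, c):
    # Step 1: activation with an adjustable "gain" c, f(z) = 1 / (1 + exp(-c*z)).
    return 1.0 / (1.0 + np.exp(-c * z))

def unpack(theta, n_in, n_hid):
    # Split the flat parameter vector into weights, biases, and gains.
    i = 0
    W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid]; i += n_hid
    W2 = theta[i:i + n_hid]; i += n_hid
    b2 = theta[i]; i += 1
    c1 = theta[i:i + n_hid]; i += n_hid   # hidden-layer gains
    c2 = theta[i]                          # output gain
    return W1, b1, W2, b2, c1, c2

def loss_and_grad(theta, X, t, n_in, n_hid):
    # Step 2: gradient of the error w.r.t. both the weights AND the gains.
    W1, b1, W2, b2, c1, c2 = unpack(theta, n_in, n_hid)
    a1 = X @ W1 + b1
    h = sigmoid(a1, c1)
    a2 = h @ W2 + b2
    y = sigmoid(a2, c2)
    e = y - t
    n = X.shape[0]
    loss = 0.5 * np.mean(e ** 2)
    # d/dz sigmoid(z, c) = c*f*(1-f);  d/dc sigmoid(z, c) = z*f*(1-f)
    d2 = e * y * (1.0 - y) / n
    gW2 = h.T @ (d2 * c2)
    gb2 = np.sum(d2) * c2
    gc2 = np.sum(d2 * a2)
    dh = np.outer(d2 * c2, W2)
    d1 = dh * h * (1.0 - h)
    gW1 = X.T @ (d1 * c1)
    gb1 = np.sum(d1 * c1, axis=0)
    gc1 = np.sum(d1 * a1, axis=0)
    return loss, np.concatenate([gW1.ravel(), gb1, gW2, [gb2], gc1, [gc2]])

def train_bfgs(X, t, n_in=2, n_hid=3, iters=60, seed=0):
    rng = np.random.default_rng(seed)
    n_par = n_in * n_hid + 3 * n_hid + 2
    theta = rng.normal(scale=0.5, size=n_par)
    theta[-(n_hid + 1):] = 1.0           # start all gains at 1 (plain sigmoid)
    H = np.eye(n_par)                    # inverse-Hessian approximation
    loss, g = loss_and_grad(theta, X, t, n_in, n_hid)
    losses = [loss]
    for _ in range(iters):
        d = -H @ g                       # Step 3: quasi-Newton search direction
        alpha, new_loss, new_g = 1.0, loss, g
        while alpha > 1e-8:              # backtracking line search
            cand_loss, cand_g = loss_and_grad(theta + alpha * d, X, t, n_in, n_hid)
            if cand_loss < loss:
                new_loss, new_g = cand_loss, cand_g
                break
            alpha *= 0.5
        if new_loss >= loss:
            break                        # no descent step found; stop
        s, yv = alpha * d, new_g - g
        if s @ yv > 1e-12:               # curvature condition keeps H positive definite
            rho = 1.0 / (s @ yv)
            I = np.eye(n_par)
            H = (I - rho * np.outer(s, yv)) @ H @ (I - rho * np.outer(yv, s)) \
                + rho * np.outer(s, s)
        theta = theta + s
        loss, g = new_loss, new_g
        losses.append(loss)
    return theta, losses

# Toy benchmark (an assumption for the sketch): learn XOR.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 0.])
theta, losses = train_bfgs(X, t)
```

When the curvature condition s·y > 0 fails, the rank-two update is skipped so H stays positive definite; a production implementation would instead use a Wolfe-condition line search, which guarantees that condition holds at every accepted step.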
