Applied Mathematics and Computation

A new conjugate gradient algorithm for training neural networks based on a modified secant equation



Abstract

Conjugate gradient methods have been established as excellent neural network training methods, owing to the simplicity of their iteration, their numerical efficiency, and their low memory requirements. In this work, we propose a conjugate gradient neural network training algorithm which guarantees sufficient descent using any line search, thereby avoiding the usually inefficient restarts. Moreover, it approximates the second-order curvature information of the error surface with high-order accuracy by utilizing a new modified secant condition. Under mild conditions, we establish the global convergence of our proposed method. Experimental results provide evidence that our proposed method is in general superior to classical conjugate gradient training methods and has the potential to significantly enhance the computational efficiency and robustness of the training process.
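The abstract names the method's ingredients without giving its formulas. As a point of reference, the sketch below shows a minimal NumPy training loop in this general family, assuming a Zhang-Deng-Chen-style modified secant vector (a representative choice from the literature this line of work builds on), a Hestenes-Stiefel-type beta, and an Armijo backtracking line search. The toy objective and all identifiers are illustrative assumptions, not the authors' exact algorithm or experimental setup.

```python
import numpy as np

# Toy differentiable objective standing in for a network's training error
# E(w); the CG machinery below is generic and applies to any smooth E.
# (Hypothetical setup for illustration, not the paper's test problems.)
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
lam = 1e-3

def E(w):
    r = A @ w - b
    return 0.5 * (r @ r) + 0.5 * lam * (w @ w)

def grad(w):
    return A.T @ (A @ w - b) + lam * w

def armijo(w, d, g, f, c1=1e-4, rho=0.5, max_halvings=60):
    """Plain Armijo backtracking; a stand-in for 'any line search', since
    the abstract states the sufficient-descent property is line-search
    independent."""
    t = 1.0
    for _ in range(max_halvings):
        if E(w + t * d) <= f + c1 * t * (g @ d):
            break
        t *= rho
    return t

def cg_train(w, max_iters=200, tol=1e-8):
    g = grad(w)
    d = -g                                  # first direction: steepest descent
    for k in range(max_iters):
        if np.linalg.norm(g) < tol:
            break
        f = E(w)
        t = armijo(w, d, g, f)
        s = t * d                           # s_k = w_{k+1} - w_k
        w_new = w + s
        f_new, g_new = E(w_new), grad(w_new)
        y = g_new - g                       # y_k = g_{k+1} - g_k

        # Modified secant vector in the Zhang-Deng-Chen style:
        #   y~_k = y_k + (theta_k / s_k^T s_k) s_k,
        #   theta_k = 6(E_k - E_{k+1}) + 3(g_k + g_{k+1})^T s_k,
        # which matches the curvature of E along s_k to higher order than
        # y_k alone. The paper's exact formulation may differ.
        theta = 6.0 * (f - f_new) + 3.0 * ((g + g_new) @ s)
        y_mod = y + (theta / (s @ s)) * s

        # Hestenes-Stiefel-type beta built from the modified secant vector.
        denom = d @ y_mod
        beta = (g_new @ y_mod) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d

        # Fallback to steepest descent if the sufficient-descent test fails.
        # (The paper's method guarantees sufficient descent by construction,
        # so this explicit safeguard is a simplification.)
        if g_new @ d > -1e-10 * (g_new @ g_new):
            d = -g_new

        w, g = w_new, g_new
    return w, k

w_star, n_iters = cg_train(np.zeros(10))
print(f"stopped after {n_iters} iterations, "
      f"||g|| = {np.linalg.norm(grad(w_star)):.2e}")
```

On this toy quadratic the loop converges in a handful of iterations; the point of the sketch is only to make concrete how a modified secant vector enters the beta computation, which is the mechanism the abstract describes.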

