A novel quasi-Newton-based optimization for neural network training incorporating Nesterov's accelerated gradient

Journal: Nonlinear Theory and Its Applications

Abstract
This paper describes a novel quasi-Newton (QN)-based acceleration technique for training neural networks. Recently, Nesterov's accelerated gradient method has been used to accelerate gradient-based training. In this paper, acceleration of the QN training algorithm is achieved through a quadratic approximation of the error function that incorporates a momentum term, as in Nesterov's method. The proposed algorithm is shown to have convergence properties similar to those of the QN method. Neural network training on function approximation and microwave circuit modeling problems demonstrates the proposed algorithm, which drastically improves the convergence speed of the conventional QN algorithm.
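The abstract does not give the update equations, but the idea it describes — a quasi-Newton step whose gradient and secant pair are evaluated at a Nesterov-style look-ahead point `w + mu*v` — can be sketched as follows. This is a minimal illustration on a convex quadratic stand-in for the error function, not the authors' implementation; the fixed step size `alpha`, momentum coefficient `mu`, and BFGS inverse-Hessian update are assumptions chosen for simplicity.

```python
import numpy as np

def nesterov_qn(grad, w0, mu=0.5, alpha=0.1, iters=500, tol=1e-10):
    """Quasi-Newton (BFGS-style) minimization with a Nesterov momentum term.

    The gradient is evaluated at the look-ahead point w + mu*v, and the
    secant pair (s, y) for the inverse-Hessian update is formed from
    successive look-ahead points so the curvature estimate stays consistent.
    """
    n = len(w0)
    w = np.asarray(w0, dtype=float)
    H = np.eye(n)              # inverse-Hessian approximation
    v = np.zeros(n)            # accumulated update (momentum term)
    g = grad(w + mu * v)       # gradient at the look-ahead point
    for _ in range(iters):
        d = -H @ g                         # quasi-Newton search direction
        v_new = mu * v + alpha * d         # momentum-augmented update
        w_new = w + v_new
        g_new = grad(w_new + mu * v_new)
        # Secant pair between successive look-ahead points.
        s = (w_new + mu * v_new) - (w + mu * v)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                     # skip update if curvature fails
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        w, v, g = w_new, v_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return w

# Demo: minimize f(w) = 0.5 w^T A w - b^T w, whose minimizer is A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
w_opt = nesterov_qn(lambda w: A @ w - b, np.zeros(2))
```

In practice the step size would be chosen by a line search rather than fixed, and the error function would be the network's training loss with gradients from backpropagation; the quadratic objective here only makes the convergence behavior easy to verify.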