IEEE Data Driven Control and Learning Systems Conference

New Stochastic Gradient Descent Algorithm via Lagrange-type 1-step-ahead Numerical Differentiation



Abstract

The update rule that the stochastic gradient descent (SGD) algorithm uses for the undetermined parameters in the iterative process can be viewed, from the perspective of numerical differentiation, as a rudimentary forward Euler method. To overcome the inherent imperfection of the forward differentiation rule and the computational error of the SGD algorithm, a new algorithm is obtained by replacing the original update rule of SGD with the Lagrange-type 1-step-ahead numerical differentiation rule. In addition, extensive experiments comparing the original SGD algorithm and the modified algorithm are conducted to analyze convergence. Empirical results demonstrate that the Lagrange-type 1-step-ahead rule cannot be applied to the SGD algorithm: the new algorithm does not converge. A theoretical analysis is given to explain this result.
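The contrast described in the abstract can be illustrated with a minimal sketch on a toy quadratic loss. The exact Lagrange-type formula used in the paper is not reproduced here; as a hypothetical stand-in, the sketch uses the 3-point central-difference rule, one instance of a Lagrange-interpolation-based 1-step-ahead formula, which turns the update into theta_{k+1} = theta_{k-1} - 2*eta*grad(theta_k). The loss, step size, and iteration count are all illustrative assumptions.

```python
def grad(theta):
    # Gradient of the toy loss f(theta) = 0.5 * theta**2 (an assumption
    # for illustration; the paper's experiments use real learning tasks).
    return theta

eta = 0.1  # illustrative step size

# Standard SGD: a forward-Euler discretization of d(theta)/dt = -grad(theta).
theta = 1.0
for _ in range(100):
    theta = theta - eta * grad(theta)
sgd_final = theta

# Hypothetical 1-step-ahead variant: the central-difference (3-point
# Lagrange) rule gives theta_{k+1} = theta_{k-1} - 2*eta*grad(theta_k).
prev, cur = 1.0, 1.0 - eta * grad(1.0)  # bootstrap with one Euler step
for _ in range(100):
    prev, cur = cur, prev - 2 * eta * grad(cur)
lagrange_final = cur

print(f"forward-Euler SGD final: {sgd_final:.2e}")
print(f"1-step-ahead variant final: {lagrange_final:.2e}")
```

Running this shows the forward-Euler iterate contracting toward zero while the two-term recursion blows up: its characteristic equation lambda**2 + 2*eta*lambda - 1 = 0 always has a root of magnitude greater than one, which is consistent with the paper's finding that the modified algorithm does not converge.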

