Journal: Chinese Annals of Mathematics, Series B (《数学年刊B辑(英文版)》)

Convergence of Gradient Algorithms for Nonconvex C^(1+α) Cost Functions


Abstract

This paper is concerned with the convergence of stochastic gradient algorithms with momentum terms in the nonconvex setting. A class of stochastic momentum methods, including stochastic gradient descent, heavy ball and Nesterov's accelerated gradient, is analyzed in a general framework under mild assumptions. Based on the convergence result for expected gradients, the authors prove almost sure convergence through a detailed discussion of the effects of momentum and the number of upcrossings. It is worth noting that no additional restrictions are imposed on the objective function or the stepsize. Another improvement over previous results is that the existing Lipschitz condition on the gradient is relaxed to Hölder continuity. As a byproduct, the authors apply a localization procedure to extend the results to stochastic stepsizes.
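The abstract does not state the update rule itself; purely as an illustration, a generic stochastic momentum iteration that contains the three named methods as special cases can be written as follows (the notation x_n, v_n, g_n, beta_n, gamma_n, s is chosen here for exposition and is not taken from the paper):

% Illustrative sketch only: a generic stochastic momentum scheme.
% g_n is a stochastic gradient of the cost f, \beta_n the momentum
% parameter, \gamma_n the stepsize, and s \in \{0,1\} a switch.
\begin{aligned}
  v_{n+1} &= \beta_n v_n - \gamma_n\, g_n\bigl(x_n + s\,\beta_n v_n\bigr), \\
  x_{n+1} &= x_n + v_{n+1}.
\end{aligned}
% s = 0 gives the heavy-ball update, s = 1 gives Nesterov's accelerated
% gradient, and \beta_n = 0 reduces the scheme to plain stochastic
% gradient descent.

Under this reading, the paper's relaxation means that g_n need only come from a cost whose gradient is Hölder continuous (f of class C^(1+α)) rather than Lipschitz continuous.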
