49th IEEE Conference on Decision and Control

Convergence and convergence rate of stochastic gradient search in the case of multiple and non-isolated extrema



Abstract

The asymptotic behavior of stochastic gradient algorithms is studied. Relying on results from differential geometry (the Lojasiewicz gradient inequality), almost sure point-convergence is demonstrated and relatively tight almost sure bounds on the convergence rate are derived. In sharp contrast to all existing results of this kind, the asymptotic results obtained here do not require the objective function (associated with the stochastic gradient search) to have an isolated minimum at which the Hessian of the objective function is strictly positive definite. Using the obtained results, the asymptotic behavior of recursive prediction error identification methods is analyzed.
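The setting the abstract describes can be illustrated with a minimal sketch (not from the paper): stochastic gradient descent on f(x, y) = (x² + y² − 1)², whose minimum set is the entire unit circle, so the minima are non-isolated and the Hessian is degenerate along the circle. The objective, noise level, and step-size schedule below are illustrative choices, not the paper's assumptions.

```python
import math
import random

def noisy_grad(x, y, sigma=0.1):
    # Gradient of f(x, y) = (x^2 + y^2 - 1)^2 plus zero-mean Gaussian noise,
    # i.e. a stochastic gradient oracle. The minimum set is the unit circle.
    g = 4.0 * (x * x + y * y - 1.0)
    return (g * x + random.gauss(0.0, sigma),
            g * y + random.gauss(0.0, sigma))

def sgd(x, y, n_steps=20000):
    # Decreasing step sizes a_n = 1/(n + 100) satisfy the standard
    # conditions: sum a_n = infinity, sum a_n^2 < infinity.
    for n in range(1, n_steps + 1):
        gx, gy = noisy_grad(x, y)
        step = 1.0 / (n + 100)
        x -= step * gx
        y -= step * gy
    return x, y

random.seed(0)
x, y = sgd(1.5, 0.5)
# Distance from the final iterate to the minimum set (the unit circle);
# the iterates settle near some point of the circle, not a unique minimizer.
print(abs(math.hypot(x, y) - 1.0))
```

Note that the iterate converges to *some* point on the circle of minima; which point depends on the initialization and the noise realization, which is exactly the situation where classical isolated-minimum analyses do not apply.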


