Journal: Stochastic Processes and Their Applications: An Official Journal of the Bernoulli Society for Mathematical Statistics and Probability

Convergence and convergence rate of stochastic gradient search in the case of multiple and non-isolated extrema

Abstract

The asymptotic behavior of stochastic gradient algorithms is studied. Relying on results from differential geometry (the Lojasiewicz gradient inequality), the single limit-point convergence of the algorithm iterates is demonstrated and relatively tight bounds on the convergence rate are derived. In sharp contrast to the existing asymptotic results, the new results presented here allow the objective function to have multiple and non-isolated minima. The new results also offer new insights into the asymptotic properties of several classes of recursive algorithms which are routinely used in engineering, statistics, machine learning and operations research. (C) 2014 Elsevier B.V. All rights reserved.
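To illustrate the setting the abstract describes (not the paper's own analysis), the sketch below runs a plain stochastic gradient recursion on an objective whose minimizers are non-isolated: f(x, y) = (x² + y² − 1)², whose minimum set is the entire unit circle. The step-size schedule, noise level, and starting point are illustrative assumptions; empirically, the noisy iterates still settle toward a single point on the circle of minima.

```python
import math
import random

def grad_f(x, y):
    # Gradient of f(x, y) = (x^2 + y^2 - 1)^2.
    # The minimum set {f = 0} is the whole unit circle,
    # so every minimizer is non-isolated.
    c = 4.0 * (x * x + y * y - 1.0)
    return c * x, c * y

def sgd(x, y, steps=20000, seed=0):
    rng = random.Random(seed)
    for n in range(steps):
        gx, gy = grad_f(x, y)
        # Decreasing steps a_n ~ n^(-0.6), so that
        # sum a_n = infinity while sum a_n^2 < infinity.
        a = 0.05 / (n + 1) ** 0.6
        # Noisy gradient: zero-mean Gaussian perturbation.
        x -= a * (gx + rng.gauss(0.0, 0.1))
        y -= a * (gy + rng.gauss(0.0, 0.1))
    return x, y

x, y = sgd(2.0, 0.5)
r = math.hypot(x, y)  # distance of the limit point from the origin
```

The final radius r lands close to 1, i.e. the iterates approach the circle of minima and stop wandering along it as the step sizes shrink, which is the single limit-point behavior the paper establishes rigorously via the Lojasiewicz gradient inequality.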
