Proceedings of the 13th World Congress

LEAST-SQUARES AND STOCHASTIC GRADIENT ALGORITHMS HAVE THE SAME CONVERGENCE RATE ORDER


Abstract

This paper presents novel results on the almost sure convergence of the stochastic-gradient-based self-tuning controller. The main focus of the paper is to evaluate the convergence rate of the parameter estimates. It is proved that, surprisingly, this rate has the same order as the best one established for the least-squares algorithm, and that it matches the rate appearing in laws of the iterated logarithm. Although we consider the case of self-tuning controllers, the results presented can easily be extended to some other adaptive processes.
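To make the comparison concrete, the following is a minimal, hypothetical sketch (not the controller analyzed in the paper) contrasting a normalized stochastic-gradient recursion with recursive least squares on a plain linear regression; the model, noise level, initializations, and gain rule are illustrative assumptions only, chosen to show the two estimators whose convergence rates the abstract compares.

```python
import numpy as np

# Sketch only: compare a stochastic-gradient (SG) estimator with recursive
# least squares (RLS) on y_k = phi_k^T theta + noise. All numbers are assumed.

rng = np.random.default_rng(0)
d, n = 3, 20000
theta_true = np.array([1.0, -0.5, 0.25])

theta_sg = np.zeros(d)        # SG estimate
r = 1.0                       # SG normalizer: r_k = r_{k-1} + ||phi_k||^2
theta_rls = np.zeros(d)       # RLS estimate
P = np.eye(d) * 100.0         # RLS covariance-like matrix

for k in range(n):
    phi = rng.normal(size=d)                    # regressor vector
    y = phi @ theta_true + 0.1 * rng.normal()   # noisy observation

    # Stochastic gradient update with decreasing normalized gain
    r += phi @ phi
    theta_sg = theta_sg + (phi / r) * (y - phi @ theta_sg)

    # Standard recursive least-squares update
    K = P @ phi / (1.0 + phi @ P @ phi)
    theta_rls = theta_rls + K * (y - phi @ theta_rls)
    P = P - np.outer(K, phi @ P)

print("SG  estimation error:", np.linalg.norm(theta_sg - theta_true))
print("RLS estimation error:", np.linalg.norm(theta_rls - theta_true))
```

In this toy setting both estimation errors shrink as the sample size grows; the paper's contribution is the theoretical result that, for the self-tuning control problem it studies, the stochastic-gradient rate is of the same order as the best known least-squares rate.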
