International Conference on Learning and Intelligent Optimization

Accelerated Randomized Coordinate Descent Algorithms for Stochastic Optimization and Online Learning



Abstract

We propose accelerated randomized coordinate descent algorithms for stochastic optimization and online learning. Our algorithms have significantly lower per-iteration complexity than the known accelerated gradient algorithms. The proposed algorithms for online learning achieve better regret performance than the known randomized online coordinate descent algorithms. Furthermore, the proposed algorithms for stochastic optimization exhibit convergence rates as good as those of the best known randomized coordinate descent algorithms. We also present simulation results demonstrating the performance of the proposed algorithms.
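The abstract does not spell out the proposed update rules, so the following is only a minimal sketch of plain (non-accelerated) randomized coordinate descent on a least-squares objective, meant to illustrate why coordinate updates have much lower per-iteration cost than full-gradient steps. The function and variable names are illustrative, not from the paper.

```python
# Minimal sketch of randomized coordinate descent on
# f(x) = 0.5 * ||A x - b||^2. This is NOT the accelerated
# stochastic/online method of the paper; it only shows the
# cheap per-iteration cost of single-coordinate updates.
import numpy as np

def randomized_coordinate_descent(A, b, num_iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    residual = A @ x - b                 # maintained incrementally
    col_norms_sq = (A ** 2).sum(axis=0)  # coordinate-wise Lipschitz constants

    for _ in range(num_iters):
        i = rng.integers(n)              # sample a coordinate uniformly at random
        grad_i = A[:, i] @ residual      # partial derivative w.r.t. x_i, O(m) work
        step = grad_i / col_norms_sq[i]  # exact minimization along coordinate i
        x[i] -= step
        residual -= step * A[:, i]       # O(m) residual update; no full gradient needed
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    b = rng.standard_normal(200)
    x = randomized_coordinate_descent(A, b)
    print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

Each iteration touches a single column of A, so its cost is O(m) rather than the O(mn) of a full gradient step, which is the kind of per-iteration saving the abstract refers to; the paper's contribution is to add acceleration while keeping this property in the stochastic and online settings.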

