IEEE Transactions on Automatic Control

Accelerated Distributed Nesterov Gradient Descent



Abstract

This paper considers the distributed optimization problem over a network, where the objective is to optimize a global function formed by a sum of local functions, using only local computation and communication. We develop an accelerated distributed Nesterov gradient descent method. When the objective function is convex and $L$-smooth, we show that it achieves an $O(1/t^{1.4-\epsilon})$ convergence rate for all $\epsilon \in (0, 1.4)$. We also show that the convergence rate can be improved to $O(1/t^2)$ if the objective function is a composition of a linear map with a strongly convex and smooth function. When the objective function is $\mu$-strongly convex and $L$-smooth, we show that it achieves a linear convergence rate of $O\big([1 - C(\mu/L)^{5/7}]^t\big)$, where $L/\mu$ is the condition number of the objective, and $C > 0$ is a constant that does not depend on $L/\mu$.
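For intuition, here is a minimal, self-contained sketch of the kind of update the method combines: a Nesterov-style momentum sequence paired with gradient tracking, run over a ring network. It is a toy illustration under assumptions (scalar quadratic local objectives, Metropolis-style mixing weights, and hand-picked step size `eta` and momentum parameter `alpha`), not the paper's exact algorithm or its prescribed step-size regime.

```python
import numpy as np

# Toy setup: n agents on a ring, agent i holds f_i(x) = 0.5*a_i*x^2 + b_i*x.
# The quadratic objectives, ring topology, and all parameter choices below are
# illustrative assumptions for this sketch, not taken from the paper.
n = 5
rng = np.random.default_rng(0)
a = rng.uniform(1.0, 4.0, n)       # local curvatures
b = rng.uniform(-1.0, 1.0, n)      # local linear terms
x_star = -b.sum() / a.sum()        # minimizer of the global sum f = sum_i f_i

# Symmetric, doubly stochastic mixing matrix for the ring.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

mu, L = a.min(), a.max()           # strong-convexity / smoothness surrogates
eta = 0.1 / L                      # conservative step size (assumption)
alpha = np.sqrt(mu * eta)          # Nesterov-style momentum parameter

grad = lambda y: a * y + b         # entry i is agent i's local gradient at y_i

x = np.zeros(n)                    # primal iterates, one scalar per agent
v = np.zeros(n)                    # momentum sequence
y = np.zeros(n)                    # points where gradients are evaluated
s = grad(y)                        # gradient tracker, initialized to local gradients

for t in range(1000):
    g_old = grad(y)
    x = W @ y - eta * s            # consensus step plus tracked-gradient step
    v = (1 - alpha) * (W @ v) + alpha * (W @ y) - (eta / alpha) * s
    y = (x + alpha * v) / (1 + alpha)   # Nesterov extrapolation
    s = W @ s + grad(y) - g_old    # gradient tracking update

print("max agent error:", np.abs(x - x_star).max())
```

Each agent mixes only with its ring neighbors through `W` and evaluates only its own local gradient, matching the "local computation and communication" constraint in the abstract; the tracker `s` maintains a running estimate of the average gradient, so the momentum step uses an estimate of the global descent direction rather than a purely local one.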
