IEEE Transactions on Automatic Control

Distributed Heavy-Ball: A Generalization and Acceleration of First-Order Methods With Gradient Tracking


Abstract

We study distributed optimization to minimize a sum of smooth and strongly-convex functions. Recent work on this problem uses gradient tracking to achieve linear convergence to the exact global minimizer. However, a connection among the different approaches has been unclear. In this paper, we first show that many of the existing first-order algorithms are related through a simple state transformation, at the heart of which lies a recently introduced algorithm known as AB. We then present distributed heavy-ball, denoted as ABm, which combines AB with a momentum term and uses nonidentical local step-sizes. By simultaneously employing both row- and column-stochastic weights, ABm removes the conservatism in related work due to doubly stochastic weights or eigenvector estimation. ABm thus naturally leads to optimization and average consensus over both undirected and directed graphs. We show that ABm has a global R-linear rate when the largest step-size and the momentum parameter are positive and sufficiently small. We numerically show that ABm achieves acceleration, particularly when the objective functions are ill-conditioned.
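The combination described in the abstract (a row-stochastic consensus step with heavy-ball momentum, plus a column-stochastic gradient tracker) can be sketched numerically. The toy problem below, with scalar quadratic local objectives f_i(x) = 0.5(a_i x − b_i)² on a small directed graph, is illustrative and not taken from the paper; the step-size and momentum values are ad hoc choices small enough to converge on this instance.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                                        # number of agents

# Directed ring with self-loops (agent i receives from i+1 mod n),
# plus one extra edge so row and column sums differ.
adj = np.eye(n) + np.roll(np.eye(n), 1, axis=1)
adj[0, 2] = 1.0
A = adj / adj.sum(axis=1, keepdims=True)     # row-stochastic weights
B = adj / adj.sum(axis=0, keepdims=True)     # column-stochastic weights

# Local quadratics f_i(x) = 0.5 * (a_i * x - b_i)^2
a = rng.uniform(1.0, 3.0, n)
b = rng.uniform(-1.0, 1.0, n)
x_star = (a * b).sum() / (a * a).sum()       # minimizer of sum_i f_i

def grad(x):
    """Stacked local gradients, grad_i = a_i * (a_i * x_i - b_i)."""
    return a * (a * x - b)

alpha, beta = 0.02, 0.3                      # small step-size and momentum
x = np.zeros(n)
x_prev = x.copy()
y = grad(x)                                  # tracker initialized at local gradients

for _ in range(5000):
    # Heavy-ball consensus step with a gradient-tracking correction.
    x_new = A @ x - alpha * y + beta * (x - x_prev)
    # Column-stochastic tracking: 1^T y stays equal to the sum of gradients.
    y = B @ y + grad(x_new) - grad(x)
    x_prev, x = x, x_new

print(np.max(np.abs(x - x_star)))            # every agent near the global minimizer
```

Because A is only row-stochastic and B only column-stochastic, no doubly stochastic weights or Perron-eigenvector estimates are needed, which is the point the abstract makes about removing conservatism on directed graphs.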
