IEEE Transactions on Signal Processing

Network Newton Distributed Optimization Methods


Abstract

We study the problem of minimizing a sum of convex objective functions, where the components of the objective are available at different nodes of a network and nodes are allowed to only communicate with their neighbors. The use of distributed gradient methods is a common approach to solve this problem. Their popularity notwithstanding, these methods exhibit slow convergence and a consequent large number of communications between nodes to approach the optimal argument because they rely on first-order information only. This paper proposes the network Newton (NN) method as a distributed algorithm that incorporates second-order information. This is done via distributed implementation of approximations of a suitably chosen Newton step. The approximations are obtained by truncation of the Newton step's Taylor expansion. This leads to a family of methods defined by the number K of Taylor series terms kept in the approximation. When keeping K terms of the Taylor series, the method is called NN-K and can be implemented through the aggregation of information in K-hop neighborhoods. Convergence to a point close to the optimal argument at a rate that is at least linear is proven and the existence of a tradeoff between convergence time and the distance to the optimal argument is shown. The numerical experiments corroborate reductions in the number of iterations and the communication cost that are necessary to achieve convergence relative to first-order alternatives.
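The core idea described above can be sketched numerically. The following is an illustrative sketch, not the paper's exact formulation: it assumes the Hessian splits as H = D - B, with D a (here diagonal, for simplicity) locally available part and B a sparse part coupling neighboring nodes, so that the K-term truncation of the Taylor (Neumann) series for H^{-1} = D^{-1/2}(I - E)^{-1}D^{-1/2}, with E = D^{-1/2} B D^{-1/2}, only involves powers E^k computable from K-hop neighborhood information. All variable names are hypothetical.

```python
import numpy as np

def nn_k_step(D, B, g, K):
    """Approximate the Newton step -H^{-1} g for H = D - B using the
    K-term truncated series H^{-1} ~= D^{-1/2} (sum_{k=0}^K E^k) D^{-1/2},
    where E = D^{-1/2} B D^{-1/2}. Assumes D is diagonal and positive,
    and that the series converges (spectral radius of E below 1)."""
    Dis = np.diag(1.0 / np.sqrt(np.diag(D)))  # D^{-1/2}
    E = Dis @ B @ Dis
    S = np.zeros_like(E)          # accumulates sum_{k=0}^K E^k
    Ek = np.eye(E.shape[0])       # current power E^k, starting at E^0 = I
    for _ in range(K + 1):
        S += Ek
        Ek = Ek @ E               # each extra term reaches one more hop
    return -(Dis @ S @ Dis) @ g

# Toy example: a well-conditioned splitting where the series converges.
n = 4
D = 2.0 * np.eye(n)
B = 0.1 * (np.ones((n, n)) - np.eye(n))
g = np.arange(1.0, n + 1)
exact = -np.linalg.solve(D - B, g)
# Larger K gives a step closer to the exact Newton direction.
err0 = np.linalg.norm(nn_k_step(D, B, g, 0) - exact)
err5 = np.linalg.norm(nn_k_step(D, B, g, 5) - exact)
```

In this sketch, keeping more terms (larger K) shrinks the truncation error geometrically but requires aggregating information from wider neighborhoods, mirroring the convergence-time versus communication tradeoff the abstract describes.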
