
An Effective Distributed Nesterov Gradient and Heavy-ball Double Acceleration Strategy for Convex Optimization Problem


Abstract

With the advent of the big-data era, traditional centralized algorithms suffer from high computing cost and low system operating efficiency, whereas distributed algorithms can divide a complex task into several simple, tractable subtasks at little computational cost. In view of this, the patent proposes an effective distributed Nesterov gradient and heavy-ball double acceleration algorithm for convex optimization problems over random, strongly connected, unbalanced multi-agent networks. The algorithm comprises five parts: choosing appropriate parameters, initializing variables, local heavy-ball accelerated descent, the Nesterov fast gradient method, and dynamic consensus. It builds on a gradient-tracking mechanism and adds Nesterov and heavy-ball momentum to accelerate the computed variables, so that each variable reaches the optimal solution quickly. When each local objective function is strongly convex with a Lipschitz-continuous gradient, the proposed algorithm converges linearly to the global optimal solution under a sufficiently small step size and a proper momentum parameter. The invention lays a theoretical foundation for applications of distributed optimization, promotes the study of the algorithm's acceleration mechanism, and expands its application scope.

Fig. 1 (flowchart): Start; select the global cost function according to the application problem; system initialization: each agent sets k = 0 and the maximum number of iterations k_max; the network adjacency matrix, the smoothness coefficient, and the strong-convexity coefficient of the cost function are given; select the appropriate step size and momentum parameters from the analytic relationship between the system parameters; each agent receives/sends information from/to its in-neighbors/out-neighbors; each agent updates its local variable and computes the gradient; each agent sets k = k + 1; if k > k_max, end, otherwise repeat the communication and update steps.
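The double-acceleration idea described in the abstract can be sketched numerically. The fragment below is a minimal illustration, not the patented procedure: the quadratic local costs, the three-agent directed ring, and the parameter values `alpha`, `beta_hb`, `beta_nes` are all assumptions. It follows the general gradient-tracking template for unbalanced digraphs (a row-stochastic matrix A for mixing the decision variables and a column-stochastic matrix B for tracking the average gradient), with both a heavy-ball momentum term and a Nesterov look-ahead added to the update.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3                                        # number of agents
# Directed, strongly connected ring 1 -> 2 -> 3 -> 1 with self-loops;
# adj[i, j] = 1 means agent i receives from agent j.
adj = np.array([[1, 0, 1],
                [1, 1, 0],
                [0, 1, 1]], dtype=float)
A = adj / adj.sum(axis=1, keepdims=True)     # row-stochastic (decision mixing)
B = adj / adj.sum(axis=0, keepdims=True)     # column-stochastic (gradient tracking)

# Illustrative strongly convex local costs f_i(x) = 0.5 * a_i * (x - b_i)^2
a = np.array([1.0, 2.0, 3.0])
b = np.array([3.0, -1.0, 2.0])
grad = lambda x: a * (x - b)                 # stacked local gradients
x_star = (a * b).sum() / a.sum()             # global minimizer, here 7/6

alpha, beta_hb, beta_nes = 0.05, 0.1, 0.1    # step size and momenta (assumed)
x_prev = x = rng.standard_normal(n)
y = grad(x)                                  # gradient-tracker initialization
for k in range(2000):
    s = x + beta_nes * (x - x_prev)          # Nesterov look-ahead point
    x_new = A @ s - alpha * y + beta_hb * (x - x_prev)   # heavy-ball step
    y = B @ y + grad(x_new) - grad(x)        # dynamic consensus on gradients
    x_prev, x = x, x_new
```

After the loop, all agents agree on a common value close to the global minimizer `x_star`. The row-stochastic/column-stochastic split is what lets the scheme run over unbalanced directed networks, where a single doubly stochastic weight matrix generally cannot be constructed.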


