An Effective Distributed Nesterov Gradient and Heavy-ball Double Acceleration Strategy for Convex Optimization Problem
Abstract
AU2020101338A4 (2020-08-20)

With the advent of the big data era, traditional centralized algorithms suffer from high computing cost and low system operating efficiency. Distributed algorithms, by contrast, can divide a complex task into several simple subtasks at little computational cost. In view of this, the patent proposes an effective distributed Nesterov gradient and heavy-ball double-acceleration algorithm for convex optimization problems over random, strongly connected, unbalanced multi-agent networks. The algorithm comprises five parts: choosing appropriate parameters, initializing variables, local heavy-ball accelerated descent, the Nesterov fast gradient method, and dynamic consensus. The algorithm set forth in the invention employs a gradient-tracking mechanism and then adds the Nesterov gradient method and the heavy-ball method to accelerate the computation of the variables, so that they reach the optimal solution quickly. When each local objective function is strongly convex and has a Lipschitz-continuous gradient, the proposed algorithm converges linearly to the global optimal solution given a sufficiently small step size and a proper momentum parameter. The invention lays a theoretical foundation for applications of distributed optimization, promotes the study of the algorithm's acceleration mechanism, and expands its application scope.

Fig. 1 (flowchart): Start → select the global cost function according to the application problem → system initialization: each agent sets k = 0 and the maximum number of iterations k_max; the network adjacency matrix, the cost function's smoothness coefficient, and its strong-convexity coefficient are given → select the appropriate step size and momentum parameters from the analytic relationship between the system parameters → each agent receives information from its in-neighbors and sends information to its out-neighbors → each agent updates its local variable and computes the gradient → each agent sets k = k + 1 → if k ≥ k_max, End; otherwise repeat from the communication step.
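The iteration described above (gradient tracking combined with Nesterov and heavy-ball momentum) can be sketched as follows. The concrete update rules, the quadratic local costs f_i(x) = ½(x − a_i)², the parameter values, and the topology are illustrative assumptions, not the patent's exact method; in particular, the patent targets unbalanced directed networks, which in practice require a push-sum-style weight correction that this simplified undirected sketch omits.

```python
import numpy as np

# Illustrative setup (assumption, not from the patent): each agent i holds
# the strongly convex local cost f_i(x) = 0.5 * (x - a_i)^2, so the
# minimizer of the global cost (1/n) * sum_i f_i is mean(a).
n = 5
rng = np.random.default_rng(0)
a = rng.normal(size=n)

def grad(xi, i):
    """Gradient of the local cost f_i at the point xi."""
    return xi - a[i]

# Doubly stochastic mixing matrix for an undirected ring (a simplification:
# the unbalanced directed graphs in the patent generally need an additional
# push-sum-style correction, omitted here).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

alpha = 0.1   # step size (small enough for linear convergence)
beta = 0.2    # heavy-ball momentum parameter
gamma = 0.2   # Nesterov momentum parameter
x = np.zeros(n)                                   # local estimates
x_prev = x.copy()
y = np.array([grad(x[i], i) for i in range(n)])   # gradient trackers

for k in range(200):
    # Nesterov look-ahead point plus a heavy-ball correction term.
    s = x + gamma * (x - x_prev)
    x_new = W @ s - alpha * y + beta * (x - x_prev)
    # Dynamic consensus: y tracks the network-average gradient.
    y = W @ y + np.array([grad(x_new[i], i) - grad(x[i], i) for i in range(n)])
    x_prev, x = x, x_new

print(float(np.abs(x - a.mean()).max()))  # near zero: consensus at the optimum
```

The `y` variables implement the dynamic-consensus part: because the mixing matrix preserves averages, the mean of `y` always equals the mean of the current local gradients, so each agent descends along an estimate of the global gradient rather than only its own.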