IMA Journal of Numerical Analysis

Optimal proximal augmented Lagrangian method and its application to full Jacobian splitting for multi-block separable convex minimization problems

Abstract

The augmented Lagrangian method (ALM) is fundamental for solving convex programming problems with linear constraints. The proximal version of the ALM, which regularizes the ALM's subproblem over the primal variable at each iteration by an additional positive-definite quadratic proximal term, has been well studied in the literature. In this paper we show that a positive-definite quadratic proximal term is not necessary for the proximal ALM: convergence can still be ensured when the positive definiteness is relaxed to indefiniteness by reducing the proximal parameter. An indefinite proximal version of the ALM is thus proposed for the generic setting of convex programming problems with linear constraints. We show that our relaxation is optimal in the sense that the proximal parameter cannot be reduced further. Allowing indefinite proximal regularization is particularly meaningful for generating larger step sizes when solving the ALM's primal subproblems. When the model under discussion is separable, in the sense that its objective function consists of finitely many additive function components without coupled variables, it is desirable to decompose each of the ALM's subproblems over the primal variable in a Jacobian manner, replacing the original subproblem by a collection of easier and smaller decomposed subproblems so that parallel computation can be applied. This full Jacobian splitting version of the ALM is known to be not necessarily convergent, and it has been shown in the literature that its convergence can be ensured if all the decomposed subproblems are further regularized by sufficiently large proximal terms. How small the proximal parameter can be, however, remains open. The other purpose of this paper is to identify the smallest proximal parameter for the full Jacobian splitting version of the ALM for solving multi-block separable convex minimization models.
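
For orientation, the display below is a minimal sketch of the setting the abstract refers to; the notation (f, A, b, beta, D) is assumed for this summary and is not taken from the paper itself. D denotes the quadratic proximal weighting that is classically required to be positive definite and that the paper allows to become indefinite.

```latex
% Sketch of the abstract's setting (notation assumed here, not taken from the paper).
% Linearly constrained convex program and its augmented Lagrangian:
\[
  \min_{x}\; f(x) \quad \text{s.t.}\quad Ax = b,
  \qquad
  \mathcal{L}_{\beta}(x,\lambda) = f(x) - \lambda^{\top}(Ax-b) + \tfrac{\beta}{2}\,\|Ax-b\|^{2}.
\]
% Proximal ALM with proximal weighting matrix D (classical analyses require D \succ 0;
% the paper relaxes this to an indefinite D by reducing the proximal parameter):
\[
  x^{k+1} = \operatorname*{arg\,min}_{x}\Big\{ \mathcal{L}_{\beta}(x,\lambda^{k})
            + \tfrac{1}{2}\,\|x-x^{k}\|_{D}^{2} \Big\},
  \qquad
  \lambda^{k+1} = \lambda^{k} - \beta\,(Ax^{k+1}-b),
\]
% where \|x\|_{D}^{2} = x^{\top} D x.
```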
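The second half of the abstract concerns the full Jacobian splitting of the ALM for multi-block separable models. The Python sketch below illustrates that kind of scheme on a toy quadratic instance; all names (A_blocks, c_blocks, beta, tau) are assumptions made for illustration, and the conservative choice tau = (m - 1) * beta is used only to keep the toy run well behaved. The paper's actual result, the smallest admissible proximal parameter, is not reproduced here.

```python
# Illustrative sketch (not the paper's algorithm verbatim): full Jacobian splitting of the
# proximal ALM for the toy separable problem
#     min_x  sum_i 0.5*||x_i - c_i||^2   s.t.   sum_i A_i x_i = b.
import numpy as np

rng = np.random.default_rng(0)
m, n_i, p = 3, 4, 5                       # blocks, block dimension, number of constraints
A_blocks = [rng.standard_normal((p, n_i)) for _ in range(m)]
c_blocks = [rng.standard_normal(n_i) for _ in range(m)]
b = rng.standard_normal(p)

beta = 1.0                                # augmented Lagrangian penalty
tau = (m - 1) * beta                      # conservative proximal parameter (illustration only)

x = [np.zeros(n_i) for _ in range(m)]     # primal blocks
lam = np.zeros(p)                         # Lagrange multiplier

for k in range(500):
    Ax = [A_blocks[i] @ x[i] for i in range(m)]   # all blocks use the k-th iterate (Jacobian)
    x_new = []
    for i in range(m):
        A_i, c_i = A_blocks[i], c_blocks[i]
        s_minus_i = sum(Ax[j] for j in range(m) if j != i) - b
        # Block subproblem, solved in closed form for this quadratic objective:
        #   min_xi 0.5*||xi - c_i||^2 - lam^T A_i xi
        #          + 0.5*beta*||A_i xi + s_minus_i||^2 + 0.5*tau*||A_i (xi - x_i^k)||^2
        H = np.eye(n_i) + (beta + tau) * A_i.T @ A_i
        rhs = c_i + A_i.T @ (lam - beta * s_minus_i) + tau * A_i.T @ (A_i @ x[i])
        x_new.append(np.linalg.solve(H, rhs))
    x = x_new                              # simultaneous (parallelizable) block update
    residual = sum(A_blocks[i] @ x[i] for i in range(m)) - b
    lam = lam - beta * residual            # dual update

print("constraint violation:", np.linalg.norm(residual))
```

The closed-form block solve is specific to the quadratic f_i chosen for the toy instance; with general convex f_i, each block update would be an independent proximal subproblem, which is exactly what makes the Jacobian scheme attractive for parallel computation.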
