First-order methods are widely used for solving large-scale problems. However, many existing works consider only unconstrained problems or those with simple constraints. In this paper, we develop two first-order methods for constrained convex programs whose constraint set is described by affine equations and smooth nonlinear inequalities. Both methods are based on the classic augmented Lagrangian function. They update the multipliers in the same way as the augmented Lagrangian method (ALM) but employ different primal-variable updates. At each iteration, the first method performs a single proximal gradient step on the primal variable, and the second method is a block-update version of the first. For the first method, we establish global iterate convergence as well as global sublinear and local linear convergence rates; for the second method, we show a global sublinear convergence result in expectation. Numerical experiments on basis pursuit denoising and a convex quadratically constrained quadratic program demonstrate the empirical performance of the proposed methods; their numerical behavior closely matches the established theoretical results.
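To make the scheme concrete, the following is a minimal sketch of one iteration, restricted for illustration to the affine-equality case $\min_x f(x) + g(x)$ subject to $Ax = b$, with $f$ smooth convex and $g$ convex with an easy proximal operator; the splitting $f + g$, step size $\eta$, and penalty $\beta$ are illustrative notation and not necessarily the paper's. The classic augmented Lagrangian is
\[
  \mathcal{L}_\beta(x, y) \;=\; f(x) + g(x) + y^\top (Ax - b) + \frac{\beta}{2}\,\|Ax - b\|_2^2,
\]
and one iteration applies a single proximal gradient step on $x$ followed by the standard ALM multiplier update:
\begin{align*}
  x^{k+1} &= \operatorname{prox}_{\eta g}\!\Bigl( x^k - \eta\,\bigl(\nabla f(x^k) + A^\top y^k + \beta A^\top (Ax^k - b)\bigr) \Bigr),\\
  y^{k+1} &= y^k + \beta\,\bigl(Ax^{k+1} - b\bigr).
\end{align*}
Unlike the classical ALM, which (approximately) minimizes $\mathcal{L}_\beta(\cdot, y^k)$ over $x$ before each multiplier update, this scheme replaces the inner solve with a single cheap proximal gradient step.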