IEEE International Conference on Acoustics, Speech and Signal Processing

Rapid: Rapidly accelerated proximal gradient algorithms for convex minimization


Abstract

In this paper, we propose a new algorithm to speed up the convergence of accelerated proximal gradient (APG) methods. In order to minimize a convex function f(x), our algorithm introduces a simple line search step after each proximal gradient step in APG, so that the biconvex function f(θx) is minimized over the scalar variable θ > 0 while the variable x is held fixed. We propose two new ways of constructing the auxiliary variables in APG based on the intermediate solutions of the proximal gradient and line search steps. We prove that at any iteration t (t ≥ 1), our algorithm achieves a smaller upper bound on the gap between the current and optimal objective values than traditional APG methods such as FISTA [1], making it converge faster in practice. We apply our algorithm to several important convex optimization problems, such as sparse linear regression. Our experimental results demonstrate that our algorithm converges faster than APG and is even comparable to some sophisticated solvers.
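The abstract describes the line search step only at a high level, so the following is a minimal sketch of the idea rather than the paper's exact method. It assumes the sparse linear regression (Lasso) objective f(x) = 0.5‖Ax − b‖² + λ‖x‖₁, for which the scalar subproblem min over θ > 0 of f(θx) has a closed-form solution; the auxiliary-variable update shown is plain FISTA momentum, not either of the paper's two proposed constructions, and the function names are illustrative.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def rapid_lasso(A, b, lam, n_iter=200):
    # FISTA-style APG loop with one extra scalar line search per
    # iteration, in the spirit of the abstract: minimize f(theta * z)
    # over theta > 0 with z fixed.
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of grad 0.5*||Ax-b||^2
    x = np.zeros(A.shape[1])
    y = x.copy()                    # auxiliary (momentum) variable
    t = 1.0
    for _ in range(n_iter):
        # Proximal gradient step at the auxiliary point y.
        grad = A.T @ (A @ y - b)
        z = soft_threshold(y - grad / L, lam / L)
        # Line search: for theta > 0, f(theta*z) = 0.5*theta^2*||Az||^2
        # - theta*<Az, b> + 0.5*||b||^2 + lam*theta*||z||_1 is a quadratic
        # in theta, so its constrained minimizer is available in closed form
        # (clipped away from zero for numerical safety).
        Az = A @ z
        denom = Az @ Az
        if denom > 0:
            theta = max((Az @ b - lam * np.abs(z).sum()) / denom, 1e-12)
        else:
            theta = 1.0
        x_new = theta * z
        # FISTA momentum on the line-searched iterate (an assumption here;
        # the paper constructs the auxiliary variable differently).
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

Since θ = 1 is always feasible, the line search can only decrease the objective relative to the plain proximal gradient iterate, at the cost of one extra matrix-vector product per iteration.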
