Venue: Annual Conference on Neural Information Processing Systems

HONOR: Hybrid Optimization for NOn-convex Regularized problems



Abstract

Recent years have witnessed the superiority of non-convex sparse learning formulations over their convex counterparts in both theory and practice. However, due to the non-convexity and non-smoothness of the regularizer, efficiently solving the resulting optimization problem for large-scale data remains quite challenging. In this paper, we propose an efficient Hybrid Optimization algorithm for NOn-convex Regularized problems (HONOR). Specifically, we develop a hybrid scheme that effectively integrates a Quasi-Newton (QN) step and a Gradient Descent (GD) step. Our contributions are as follows: (1) HONOR incorporates second-order information to greatly speed up convergence, yet it avoids solving a regularized quadratic program and involves only matrix-vector multiplications, without explicitly forming the inverse Hessian matrix. (2) We establish a rigorous convergence analysis for HONOR, which shows that convergence is guaranteed even for non-convex problems, a setting in which convergence analysis is typically challenging. (3) We conduct empirical studies on large-scale data sets; the results demonstrate that HONOR converges significantly faster than state-of-the-art algorithms.
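The abstract describes a hybrid rule that prefers a quasi-Newton step but keeps a gradient-descent step as a fallback. The paper's actual construction is more involved; the sketch below is only a toy illustration of that hybrid idea on a one-dimensional least-squares problem with a capped-L1 (non-convex) penalty. The function names, the fixed step size, and the decrease-based acceptance rule are illustrative assumptions, not the HONOR algorithm itself.

```python
# Toy sketch (NOT the HONOR algorithm): hybrid QN/GD minimization of
#   f(x) = 0.5*(x - a)^2 + lam * min(|x|, theta)   (capped-L1 penalty)
# The "QN" step here is simply a Newton step on the smooth part, whose
# curvature is 1; it is accepted only if it decreases the objective,
# otherwise a small gradient-descent step is taken instead.

def objective(x, a, lam, theta):
    return 0.5 * (x - a) ** 2 + lam * min(abs(x), theta)

def subgrad(x, a, lam, theta):
    # a (sub)gradient of the full objective away from the kinks
    g = x - a  # gradient of the smooth part
    if abs(x) < theta:
        g += lam * (1 if x > 0 else -1 if x < 0 else 0)
    return g

def hybrid_minimize(a, lam, theta, x0=0.0, iters=100, lr=0.1):
    x = x0
    for _ in range(iters):
        g = subgrad(x, a, lam, theta)
        if abs(g) < 1e-10:
            break
        # "QN" step: Newton on the smooth part (Hessian = 1), i.e. x - g
        x_qn = x - g
        # GD step with a small fixed rate as a safe fallback
        x_gd = x - lr * g
        # hybrid rule: accept the QN step only if it decreases the objective
        if objective(x_qn, a, lam, theta) < objective(x, a, lam, theta):
            x = x_qn
        else:
            x = x_gd
    return x
```

For example, `hybrid_minimize(3.0, 1.0, 0.5)` returns the global minimizer `3.0`: the first QN step jumps directly there and is accepted because it decreases the objective. In higher dimensions the same decrease test would gate an L-BFGS-style direction built purely from matrix-vector products, which is the spirit of the "no explicit inverse Hessian" claim in the abstract.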

