Journal: Automatica

Analytical convergence regions of accelerated gradient descent in nonconvex optimization under Regularity Condition


Abstract

There is a growing interest in using robust control theory to analyze and design optimization and machine learning algorithms. This paper studies a class of nonconvex optimization problems whose cost functions satisfy the so-called Regularity Condition (RC). Empirical studies show that accelerated gradient descent (AGD) algorithms (e.g. Nesterov's acceleration and Heavy-ball) with proper initializations often work well in practice. However, the convergence of such AGD algorithms is largely unknown in the literature. The main contribution of this paper is the analytical characterization of the convergence regions of AGD under RC via robust control tools. Since such optimization problems arise frequently in many applications such as phase retrieval, training of neural networks and matrix sensing, our result shows promise of robust control theory in these areas. (C) 2019 Elsevier Ltd. All rights reserved.
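To make the two accelerated gradient descent variants named in the abstract concrete, the following is a minimal sketch of the Heavy-ball and Nesterov iterations applied to a simple nonconvex cost. The cost function, step size, and momentum values here are illustrative assumptions, not taken from the paper; characterizing which initializations actually converge under the Regularity Condition is precisely the paper's contribution.

```python
import numpy as np

def grad(x):
    # Gradient of the illustrative nonconvex cost f(x) = 0.25 * (||x||^2 - 1)^2,
    # whose global minimizers form the unit sphere ||x|| = 1.
    return (np.dot(x, x) - 1.0) * x

def heavy_ball(x0, step=0.1, momentum=0.5, iters=500):
    # Heavy-ball: gradient step plus a momentum term on the previous displacement.
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        x_new = x - step * grad(x) + momentum * (x - x_prev)
        x_prev, x = x, x_new
    return x

def nesterov(x0, step=0.1, momentum=0.5, iters=500):
    # Nesterov's acceleration: evaluate the gradient at a look-ahead point y.
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + momentum * (x - x_prev)
        x_prev, x = x, y - step * grad(y)
    return x

# An initialization assumed (for illustration) to lie in the convergence region.
x0 = np.array([0.8, 0.3])
print(np.linalg.norm(heavy_ball(x0)))  # converges toward a minimizer, ||x|| near 1
print(np.linalg.norm(nesterov(x0)))
```

Both iterations differ only in where the gradient is evaluated: Heavy-ball uses the current iterate, while Nesterov uses the extrapolated look-ahead point. The paper's analysis asks for which initializations such momentum methods converge when the cost satisfies the Regularity Condition rather than convexity.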
