IEEE International Conference on Acoustics, Speech and Signal Processing

Perturbed Projected Gradient Descent Converges to Approximate Second-order Points for Bound Constrained Nonconvex Problems



Abstract

In this paper, a gradient-based method for bound-constrained nonconvex problems is proposed. By leveraging both projected gradient descent and perturbed gradient descent, the proposed algorithm, named perturbed projected gradient descent (PP-GD), converges to approximate second-order stationary (SS2) points (which satisfy certain approximate second-order necessary conditions) with provable convergence-rate guarantees. The proposed algorithm is suitable for large-scale problems since it uses only gradient information about the objective function. It also seamlessly incorporates variable constraints such as nonnegativity, which are common in many practical machine learning problems. We provide a concrete theoretical analysis showing that PP-GD obtains approximate second-order solutions by extracting the negative curvature of the objective function around strict saddle points. Numerical results demonstrate that PP-GD indeed converges faster than other first-order methods in the presence of strict saddle points.
