International Conference on Machine Learning

Quadratically Regularized Subgradient Methods for Weakly Convex Optimization with Weakly Convex Constraints

Abstract

Optimization models with non-convex constraints arise in many machine learning tasks, e.g., learning with fairness constraints or Neyman-Pearson classification with a non-convex loss. Although many efficient methods with theoretical convergence guarantees have been developed for unconstrained non-convex problems, it remains a challenge to design provably efficient algorithms for problems with non-convex functional constraints. This paper proposes a class of subgradient methods for constrained optimization in which the objective function and the constraint functions are weakly convex and nonsmooth. Our methods solve a sequence of strongly convex subproblems, in which a quadratic regularization term is added to both the objective function and each constraint function. Each subproblem can be solved by various algorithms for strongly convex optimization. Under a uniform Slater's condition, we establish the computational complexity of our methods for finding a nearly stationary point.
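The scheme in the abstract can be made concrete with a short sketch. The Python code below is a minimal illustration, not the authors' implementation: it assumes the regularization parameter rho exceeds the weak-convexity modulus of the objective and constraint (so each subproblem is strongly convex), and it uses a Polyak-style switching subgradient method as the inner solver, which is only one of the "various algorithms for strongly convex optimization" the abstract allows. All names (quad_reg_subgradient, rho, tol, the toy instance) are illustrative assumptions.

```python
import numpy as np

def quad_reg_subgradient(subgrad_f, g, subgrad_g, x0,
                         rho=2.0, outer_iters=50, inner_iters=200, tol=1e-3):
    """Sketch of the quadratically regularized subgradient scheme:
    at each outer iteration, around the current center x_t, solve the
    strongly convex subproblem
        min  f(x) + (rho/2) * ||x - x_t||^2
        s.t. g(x) + (rho/2) * ||x - x_t||^2 <= 0
    approximately with a switching subgradient method, then move the
    center to the (approximate) subproblem solution."""
    x_t = np.asarray(x0, dtype=float).copy()
    for _ in range(outer_iters):
        x = x_t.copy()
        x_bar = np.zeros_like(x)
        weight = 0.0
        for k in range(1, inner_iters + 1):
            step = 2.0 / (rho * (k + 1))   # step size for rho-strong convexity
            shift = x - x_t
            # Switching rule: step on the regularized constraint when it is
            # violated beyond tol, otherwise on the regularized objective.
            if g(x) + 0.5 * rho * (shift @ shift) <= tol:
                d = subgrad_f(x) + rho * shift
            else:
                d = subgrad_g(x) + rho * shift
            x = x - step * d
            x_bar += k * x                 # weighted averaging of iterates
            weight += k
        x_t = x_bar / weight               # next proximal center
    return x_t
```

A toy run on a convex (hence weakly convex) instance, minimizing ||x - c||_1 over the unit ball, shows the intended usage; the inner tolerance and iteration counts would need tuning in practice:

```python
c = np.array([2.0, -1.0])
sf = lambda x: np.sign(x - c)        # subgradient of ||x - c||_1
g  = lambda x: x @ x - 1.0           # feasible set: unit Euclidean ball
sg = lambda x: 2.0 * x               # gradient of the constraint
x_hat = quad_reg_subgradient(sf, g, sg, x0=np.zeros(2))
print(x_hat, g(x_hat))               # approximately feasible near-stationary point
```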