International Conference on Machine Learning

Natasha: Faster Non-Convex Stochastic Optimization via Strongly Non-Convex Parameter

Abstract

Given a non-convex function f(x) that is an average of n smooth functions, we design stochastic first-order methods to find its approximate stationary points. The performance of our new methods depends on the smallest (negative) eigenvalue -σ of the Hessian. This parameter σ captures how strongly non-convex f(x) is, and is analogous to the strong convexity parameter for convex optimization. At least in theory, our methods outperform known results for a range of the parameter σ, and can also be used to find approximate local minima. Our result implies an interesting dichotomy: there exists a threshold σ_0 so that the (currently) fastest methods for σ > σ_0 and for σ < σ_0 have different behaviors: the former scales with n^(2/3) and the latter scales with n^(3/4).
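For intuition on the setting described in the abstract, below is a minimal sketch (not the paper's Natasha algorithm): a finite-sum objective f(x) that averages n smooth components, where the strongly non-convex parameter σ is read off as the magnitude of the most negative Hessian eigenvalue. The quadratic components A_i and the evaluation point x are made-up illustrative data, not taken from the paper.

```python
# Minimal sketch, assuming a finite-sum objective f(x) = (1/n) * sum_i f_i(x):
# estimate the strongly non-convex parameter sigma as the magnitude of the most
# negative Hessian eigenvalue at a point x. All data here is hypothetical.
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 20

# Hypothetical smooth components f_i(x) = 0.5 * x^T A_i x with symmetric,
# possibly indefinite A_i, so the average f can be non-convex.
A = [(M + M.T) / 2 for M in rng.normal(size=(n, d, d))]

def hessian_f(x):
    # For these quadratic components the Hessian is constant: (1/n) * sum_i A_i.
    return sum(A) / n

x = rng.normal(size=d)
eigvals = np.linalg.eigvalsh(hessian_f(x))
# sigma = -lambda_min when the Hessian has a negative eigenvalue, else 0 (convex case).
sigma = max(0.0, -eigvals.min())
print(f"smallest Hessian eigenvalue: {eigvals.min():.4f}, sigma: {sigma:.4f}")
```

In this sketch σ plays the role described above: larger σ means f is "more non-convex" at that point, mirroring how the strong convexity parameter quantifies curvature in convex optimization.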
