JMLR: Workshop and Conference Proceedings
Katyusha X: Simple Momentum Method for Stochastic Sum-of-Nonconvex Optimization


Abstract

The problem of minimizing sum-of-nonconvex functions (i.e., convex functions that are averages of non-convex ones) is becoming increasingly important in machine learning, and is the core machinery for PCA, SVD, regularized Newton's method, accelerated non-convex optimization, and more. We show how to provably obtain an accelerated stochastic algorithm for minimizing sum-of-nonconvex functions by adding one additional line to the well-known SVRG method. This line corresponds to momentum, and shows how to apply momentum directly to the finite-sum stochastic minimization of sum-of-nonconvex functions. As a side result, our method enjoys a linear parallel speed-up when using mini-batches.
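The structure described in the abstract, SVRG plus one extra momentum line between epochs, can be sketched as follows. This is an illustrative Python sketch, not the paper's exact algorithm: the objective (a least-squares finite sum), the step size `eta`, and the momentum coefficient `tau` are assumptions chosen for demonstration, whereas the paper derives the appropriate momentum parameter for sum-of-nonconvex objectives.

```python
import numpy as np

def svrg_with_momentum(A, b, epochs=30, inner=100, eta=0.02, tau=0.5, seed=0):
    """Minimize f(x) = (1/2n) ||Ax - b||^2, written as a finite sum of
    per-sample losses, with SVRG epochs plus an extrapolation (momentum)
    step across epoch snapshots. All hyperparameters are illustrative."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    grad_i = lambda x, i: A[i] * (A[i] @ x - b[i])   # gradient of the i-th component
    full_grad = lambda x: A.T @ (A @ x - b) / n      # full finite-sum gradient

    x = np.zeros(d)        # current snapshot point
    y_prev = x.copy()      # previous epoch's output (for the momentum line)
    for _ in range(epochs):
        mu = full_grad(x)  # full gradient at the snapshot (standard SVRG)
        w = x.copy()
        for _ in range(inner):
            i = rng.integers(n)
            # variance-reduced stochastic gradient (standard SVRG update)
            g = grad_i(w, i) - grad_i(x, i) + mu
            w = w - eta * g
        # the "one additional line": momentum/extrapolation across snapshots
        x, y_prev = w + tau * (w - y_prev), w
    return x
```

Without the extrapolation line (`tau = 0`), this reduces to plain SVRG; the point of the paper is that this single change yields provable acceleration for the sum-of-nonconvex setting.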


