Conference on Neural Information Processing Systems

Efficient Smooth Non-Convex Stochastic Compositional Optimization via Stochastic Recursive Gradient Descent



Abstract

Stochastic compositional optimization arises in many important machine learning applications. The objective function is the composition of two expectations of stochastic functions and is more challenging to optimize than vanilla stochastic optimization problems. In this paper, we investigate stochastic compositional optimization in the general smooth non-convex setting. We employ the recently developed idea of Stochastic Recursive Gradient Descent to design a novel algorithm named SARAH-Compositional, and prove a sharp Incremental First-order Oracle (IFO) complexity upper bound for stochastic compositional optimization: $O((n+m)^{1/2}\varepsilon^{-2})$ in the finite-sum case and $O(\varepsilon^{-3})$ in the online case. This complexity is the best known among IFO complexity results for non-convex stochastic compositional optimization. Numerical experiments on risk-averse portfolio management validate the superiority of SARAH-Compositional over a few rival algorithms.
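
For concreteness, the two-level objective the abstract refers to can be written in the standard form below; the notation ($g_i$ for inner samples, $f_j$ for outer samples, with $n$ and $m$ terms at the two levels in the finite-sum case) is assumed here rather than quoted from the paper:

$$ F(x) = f\bigl(g(x)\bigr), \qquad g(x) = \mathbb{E}_i\bigl[g_i(x)\bigr], \qquad f(y) = \mathbb{E}_j\bigl[f_j(y)\bigr], \qquad \nabla F(x) = \bigl(\partial g(x)\bigr)^{\top} \nabla f\bigl(g(x)\bigr). $$

By the chain rule, evaluating $\nabla F$ requires estimating both the inner value $g(x)$ and its Jacobian $\partial g(x)$ from samples, which is what makes compositional problems harder than vanilla stochastic optimization.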
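Below is a minimal sketch of the SARAH-style recursive estimator that such an algorithm builds on, applied to a toy smooth non-convex finite-sum instance. The problem data, step size, epoch length, and single-sample updates are illustrative assumptions, not the paper's pseudocode or parameter choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy smooth non-convex compositional instance (illustrative, not from the paper):
#   minimize F(x) = f(g(x)),  g(x) = (1/n) sum_i tanh(A_i x + b_i),
#   f(y) = 0.5 * ||y||^2.
n, d, p = 50, 10, 5
A = rng.normal(size=(n, p, d))
b = rng.normal(size=(n, p))

def g_i(i, x):
    # Inner sample value g_i(x), a p-vector.
    return np.tanh(A[i] @ x + b[i])

def jac_g_i(i, x):
    # Inner sample Jacobian dg_i/dx, a (p, d) matrix.
    s = np.tanh(A[i] @ x + b[i])
    return (1.0 - s**2)[:, None] * A[i]

def grad_f(y):
    # Outer gradient for f(y) = 0.5 * ||y||^2.
    return y

def sarah_compositional(x0, eta=0.05, epochs=20, epoch_len=50):
    """SARAH-style recursion for the inner value g(x), its Jacobian, and the
    composed gradient (dg)^T grad_f(g); a sketch, not the paper's exact tuning."""
    x = x0.copy()
    for _ in range(epochs):
        # Epoch start: full-batch reference estimates (the recursion's checkpoint).
        g_est = np.mean([g_i(i, x) for i in range(n)], axis=0)
        J_est = np.mean([jac_g_i(i, x) for i in range(n)], axis=0)
        v = J_est.T @ grad_f(g_est)
        for _ in range(epoch_len):
            x_prev, x = x, x - eta * v
            i = rng.integers(n)
            # SARAH updates: sampled difference between the new and previous
            # iterate, added to the running estimate.
            g_new = g_i(i, x) - g_i(i, x_prev) + g_est
            J_new = jac_g_i(i, x) - jac_g_i(i, x_prev) + J_est
            v = J_new.T @ grad_f(g_new) - J_est.T @ grad_f(g_est) + v
            g_est, J_est = g_new, J_new
    return x

x_hat = sarah_compositional(np.zeros(d))
g_full = np.mean([g_i(i, x_hat) for i in range(n)], axis=0)
J_full = np.mean([jac_g_i(i, x_hat) for i in range(n)], axis=0)
print("||grad F(x_hat)|| =", np.linalg.norm(J_full.T @ grad_f(g_full)))
```

The variance reduction comes from the difference terms: when consecutive iterates are close, the sampled differences are small, so the recursive estimates of $g$, its Jacobian, and the gradient stay accurate between the periodic full-batch checkpoints.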
