Published in: JMLR: Workshop and Conference Proceedings

High probability guarantees for stochastic convex optimization


Abstract

Standard results in stochastic convex optimization bound the number of samples that an algorithm needs to generate a point with small function value in expectation. More nuanced high probability guarantees are rare, and typically either rely on “light-tail” noise assumptions or exhibit worse sample complexity. In this work, we show that a wide class of stochastic optimization algorithms for strongly convex problems can be augmented with high confidence bounds at an overhead cost that is only logarithmic in the confidence level and polylogarithmic in the condition number. The procedure we propose, called proxBoost, is elementary and builds on two well-known ingredients: robust distance estimation and the proximal point method. We discuss consequences for both streaming (online) algorithms and offline algorithms based on empirical risk minimization.
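To make the "robust distance estimation" ingredient concrete, here is a minimal sketch (not the authors' proxBoost, and all function names and parameters below are illustrative): run a base stochastic method m = O(log(1/δ)) times independently, so that each run is close to the minimizer in expectation, then select the candidate whose median distance to the other candidates is smallest. Since a majority of runs land near the minimizer with constant probability each, the selected point is near the minimizer except with probability exponentially small in m.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgd(x0, grad_oracle, steps, lr):
    """Base method: plain SGD with a 1/t step-size schedule,
    suitable for a strongly convex objective."""
    x = np.asarray(x0, dtype=float)
    for t in range(1, steps + 1):
        x = x - (lr / t) * grad_oracle(x)
    return x

# Toy strongly convex problem: f(x) = 0.5 * ||x - x_star||^2,
# observed only through noisy gradients.
x_star = np.array([1.0, -2.0])
def grad_oracle(x):
    return (x - x_star) + rng.normal(scale=2.0, size=x.shape)

# Independent trials of the base method; each succeeds (is close to
# x_star) with constant probability by Markov's inequality.
m = 9
candidates = [sgd(np.zeros(2), grad_oracle, steps=500, lr=1.0)
              for _ in range(m)]

def robust_select(cands):
    """Robust distance estimation: return the candidate whose median
    distance to the other candidates is smallest. A majority of good
    runs certifies the winner with high probability."""
    dists = [np.median([np.linalg.norm(c - d) for d in cands])
             for c in cands]
    return cands[int(np.argmin(dists))]

x_hat = robust_select(candidates)
print("distance to minimizer:", np.linalg.norm(x_hat - x_star))
```

The overhead relative to a single run is the factor m, logarithmic in the confidence level, which matches the flavor of the guarantee stated in the abstract; the paper's procedure additionally interleaves this selection with proximal point steps to control the condition-number dependence.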
