Source: SIAM Journal on Optimization: A Publication of the Society for Industrial and Applied Mathematics

A LINEARLY CONVERGENT VARIANT OF THE CONDITIONAL GRADIENT ALGORITHM UNDER STRONG CONVEXITY, WITH APPLICATIONS TO ONLINE AND STOCHASTIC OPTIMIZATION


Abstract

Linear optimization is often algorithmically simpler than nonlinear convex optimization. Linear optimization over matroid polytopes, matching polytopes, and path polytopes is an example of a problem class for which we have simple and efficient combinatorial algorithms, but whose nonlinear convex counterpart is harder and admits significantly less efficient algorithms. This motivates the computational model of convex optimization, including the offline, online, and stochastic settings, using a linear optimization oracle. In this computational model we give several new results that improve on the previous state of the art. Our main result is a novel conditional gradient algorithm for smooth and strongly convex optimization over polyhedral sets that performs only a single linear optimization step over the domain on each iteration and enjoys a linear convergence rate. This gives an exponential improvement in convergence rate over previous results. Based on this new conditional gradient algorithm, we give the first algorithms for online convex optimization over polyhedral sets that perform only a single linear optimization step over the domain while having optimal regret guarantees, answering an open question of Kalai and Vempala and of Hazan and Kale. Our online algorithms also imply conditional gradient algorithms for nonsmooth and stochastic convex optimization with the same convergence rates as projected (sub)gradient methods.
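To make the linear-optimization-oracle model concrete, the sketch below shows the classic conditional gradient (Frank-Wolfe) method, where the only operation on the feasible set is one call per iteration to a linear oracle. This is the baseline scheme the abstract builds on, not the paper's linearly convergent variant; the example uses the probability simplex, whose linear oracle is trivial (pick the vertex minimizing the gradient coordinate), and the step-size rule and problem instance are illustrative choices.

```python
import numpy as np

def linear_oracle_simplex(grad):
    # Linear optimization over the probability simplex:
    # argmin_{v in simplex} <grad, v> is the vertex e_i with i = argmin_i grad_i.
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def conditional_gradient(grad_f, x0, n_iters=2000):
    # Classic Frank-Wolfe with the standard 2/(t+2) step size.
    # Each iteration makes exactly one linear-oracle call and no projection.
    # The paper's variant modifies the oracle step to obtain linear
    # convergence under strong convexity; this sketch only achieves O(1/t).
    x = x0.copy()
    for t in range(n_iters):
        v = linear_oracle_simplex(grad_f(x))
        gamma = 2.0 / (t + 2.0)
        x = (1.0 - gamma) * x + gamma * v
    return x

# Illustrative instance: project y onto the simplex, i.e. minimize ||x - y||^2,
# a smooth and strongly convex objective over a polyhedral set.
y = np.array([0.9, 0.3, -0.2])
grad = lambda x: 2.0 * (x - y)
x_star = conditional_gradient(grad, np.ones(3) / 3.0)
```

Note the contrast with projected gradient descent: the iterate stays feasible by construction (a convex combination of vertices), so no projection onto the polytope is ever computed.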
