JMLR: Workshop and Conference Proceedings

Finding Global Optima in Nonconvex Stochastic Semidefinite Optimization with Variance Reduction

Abstract

There has been a recent surge of interest in nonconvex reformulations, via low-rank factorization, of stochastic convex semidefinite optimization problems, for the sake of efficiency and scalability. Compared with the original convex formulations, the nonconvex ones typically involve far fewer variables, allowing them to scale to scenarios with millions of variables. However, this raises a new challenge: despite their empirical success in applications, under what conditions can nonconvex stochastic algorithms effectively find the global optima? In this paper, we provide an answer: a stochastic gradient descent method with variance reduction can be adapted to solve the nonconvex reformulation of the original convex problem with global linear convergence, i.e., it converges to a global optimum exponentially fast given a proper initial point, in the restricted strongly convex case. Experimental studies on both simulated data and real-world applications to ordinal embedding demonstrate the effectiveness of the proposed algorithms.
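
To illustrate the flavor of the approach, below is a minimal sketch, not the authors' released code, of how a variance-reduced stochastic gradient method (in the style of SVRG) can be run on the low-rank factorized reformulation X = UUᵀ of a stochastic convex semidefinite problem. The matrix-sensing loss, problem sizes, step size, and near-optimum initialization are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

# A minimal sketch, assuming a hypothetical matrix-sensing loss, of SVRG-style
# variance-reduced SGD on the low-rank factorized reformulation of a stochastic
# convex semidefinite program:
#     min_{X >= 0}  f(X) = (1/n) * sum_i f_i(X),   reparametrized as X = U U^T,
# where f_i(X) = 0.5 * (<A_i, X> - b_i)^2. Problem sizes, the step size, and the
# initialization below are illustrative choices.

rng = np.random.default_rng(0)
d, r, n = 10, 2, 100                        # ambient dimension, rank, sample count

U_star = rng.normal(size=(d, r))            # ground-truth factor of X* = U* U*^T
X_star = U_star @ U_star.T
A = rng.normal(size=(n, d, d))
A = (A + A.transpose(0, 2, 1)) / 2          # symmetric sensing matrices
b = np.einsum('nij,ij->n', A, X_star)       # noiseless linear measurements

def grad_i(U, i):
    """Stochastic gradient of f_i(U U^T) with respect to the factor U."""
    resid = np.sum(A[i] * (U @ U.T)) - b[i]
    return 2.0 * resid * (A[i] @ U)         # chain rule through X = U U^T

def full_grad(U):
    return sum(grad_i(U, i) for i in range(n)) / n

eta, epochs, m = 2e-3, 50, n                # step size, outer epochs, inner loop length
U = U_star + 0.1 * rng.normal(size=(d, r))  # a "proper" initial point near the optimum

for epoch in range(epochs):
    U_snap = U.copy()
    mu = full_grad(U_snap)                  # full gradient at the snapshot point
    for _ in range(m):
        i = rng.integers(n)
        # SVRG update: stochastic gradient corrected by a snapshot control variate
        v = grad_i(U, i) - grad_i(U_snap, i) + mu
        U -= eta * v
    # relative error typically decays geometrically across epochs
    err = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)

print(f"relative error after {epochs} epochs: {err:.2e}")
```

The snapshot-based control variate is what removes the noise floor of plain SGD, which is why a constant step size can yield the linear (geometric) convergence established in the paper under restricted strong convexity and a proper initialization.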
