
An Optimal Stochastic Algorithm for Solving Strongly Convex Optimization Problems
(Journal of Software, 《软件学报》)

Abstract

Stochastic gradient descent (SGD) is one of the most efficient methods for dealing with large-scale data. Recent research shows that the black-box SGD method can reach an O(1/T) convergence rate for strongly convex problems. However, for solving regularized problems with L1 plus L2 terms, structural optimization methods such as COMID (composite objective mirror descent) only attain an O(ln T/T) convergence rate. This paper presents a weighted algorithm based on COMID that preserves the sparsity imposed by the L1 regularization term, and a proof is provided showing that it achieves an O(1/T) convergence rate. Furthermore, the proposed scheme can be computed on the fly, which reduces the computational cost. Experimental results demonstrate the correctness of the theoretical analysis and the effectiveness of the proposed algorithm.
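To make the idea concrete, here is a minimal sketch, not the paper's exact algorithm, of COMID applied to an L1 + L2 regularized least-squares objective, combined with a polynomially weighted iterate average that is updated on the fly. Under lam2-strong convexity, this style of weighted averaging is known to recover an O(1/T) rate, while the final COMID iterate itself keeps the sparsity induced by the L1 proximal (soft-thresholding) step. The least-squares loss, the step-size schedule, and all function names are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1; sets small coordinates
    # exactly to zero, which is what preserves sparsity.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def weighted_comid(X, y, lam1=1e-3, lam2=1e-2, T=10000, seed=0):
    """Sketch of COMID with on-the-fly weighted iterate averaging for
        min_w (1/2n) * sum_i (x_i^T w - y_i)^2
              + lam1 * ||w||_1 + (lam2/2) * ||w||^2.
    The weight of iterate t in the average is 2t / (T(T+1)),
    maintained recursively so no iterates need to be stored.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)        # current (sparse) COMID iterate
    w_bar = np.zeros(d)    # weighted running average
    for t in range(1, T + 1):
        i = rng.integers(n)
        # Stochastic gradient of the smooth part (loss + L2 term).
        g = (X[i] @ w - y[i]) * X[i] + lam2 * w
        eta = 1.0 / (lam2 * t)  # step size from lam2-strong convexity
        # Composite mirror-descent step: gradient step, then L1 prox.
        w = soft_threshold(w - eta * g, eta * lam1)
        # On-the-fly update of the polynomially weighted average:
        # w_bar_t = (1 - 2/(t+1)) * w_bar_{t-1} + (2/(t+1)) * w_t.
        w_bar += (2.0 / (t + 1)) * (w - w_bar)
    # w_bar carries the O(1/T) guarantee; w is the sparse last iterate.
    return w_bar, w
```

On synthetic data (e.g. X = rng.standard_normal((n, d)) with a sparse ground-truth weight vector), the soft-thresholding step keeps most coordinates of the final iterate exactly zero, while the weighted average converges faster in objective value than the plain uniform average that underlies the O(ln T/T) bound.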
