...
IEEE Transactions on Neural Networks and Learning Systems

Stochastic Conjugate Gradient Algorithm With Variance Reduction

Abstract

Conjugate gradient (CG) methods are an important class of methods for solving linear equations and nonlinear optimization problems. In this paper, we propose a new stochastic CG algorithm with variance reduction, and we prove its linear convergence with the Fletcher-Reeves method for strongly convex and smooth functions. We experimentally demonstrate that the CG with variance reduction algorithm converges faster than its counterparts on four learning models, which may be convex, nonconvex, or nonsmooth. In addition, its area-under-the-curve performance on six large-scale data sets is comparable to that of the LIBLINEAR solver for the L2-regularized L2-loss, but with a significant improvement in computational efficiency.
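The abstract combines two standard ingredients: an SVRG-style variance-reduced gradient estimate and a Fletcher-Reeves conjugate-direction update. The following is a minimal sketch of how those pieces fit together on a strongly convex toy problem; the function names, the fixed step size, and the clipping of the Fletcher-Reeves coefficient are assumptions made for this illustration, not the paper's actual algorithmic details (the paper's method may use line search or other safeguards).

```python
import numpy as np

def cgvr(grad_i, w0, n, alpha=0.05, beta_max=0.9, epochs=20, seed=0):
    """Sketch of a stochastic CG step with SVRG-style variance reduction.

    grad_i(w, i): gradient of the i-th component function at w.
    The Fletcher-Reeves coefficient is clipped at beta_max for
    stability in this toy version (an assumption, not the paper's rule).
    """
    rng = np.random.default_rng(seed)
    w = w0.astype(float).copy()
    for _ in range(epochs):
        w_snap = w.copy()
        # Full gradient at the snapshot: the SVRG control variate.
        mu = sum(grad_i(w_snap, i) for i in range(n)) / n
        d, v_prev = -mu, mu            # reset the search direction each epoch
        for _ in range(n):             # one pass of stochastic inner steps
            i = rng.integers(n)
            # Variance-reduced gradient: unbiased, and its variance
            # shrinks as w and w_snap both approach the optimum.
            v = grad_i(w, i) - grad_i(w_snap, i) + mu
            beta = min((v @ v) / (v_prev @ v_prev), beta_max)  # Fletcher-Reeves
            d = -v + beta * d          # conjugate search direction
            w = w + alpha * d          # fixed step size in this sketch
            v_prev = v
    return w

# Toy problem: L2-regularized least squares (strongly convex and smooth).
rng = np.random.default_rng(1)
n, dim, lam = 50, 5, 0.1
A = rng.standard_normal((n, dim))
b = A @ rng.standard_normal(dim) + 0.01 * rng.standard_normal(n)
grad_i = lambda w, i: A[i] * (A[i] @ w - b[i]) + lam * w
w = cgvr(grad_i, np.zeros(dim), n)
```

Compared with plain SGD, the variance-reduced estimate `v` lets the step size stay constant rather than decay, which is what makes a linear convergence rate attainable on strongly convex smooth objectives.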