Journal: Neural computing & applications
Accelerating SGD using flexible variance reduction on large-scale datasets

Abstract

Stochastic gradient descent (SGD) is a popular optimization method widely used in machine learning, but the variance of its gradient estimates leads to slow convergence. Many variance reduction methods have been proposed to accelerate convergence. However, most of these methods require additional memory or impose the computational burden of full-gradient evaluations, which makes them inefficient or even impractical for real-world applications with large-scale datasets. To address this issue, we propose a new flexible variance reduction method for SGD, named FVR-SGD, which reduces memory overhead and speeds up convergence by using flexible subsets, without extra computation. We analyze the convergence properties of the method in detail and show that a linear convergence rate is guaranteed under flexible variance reduction. Efficient variants for distributed environments and deep neural networks are also proposed in this paper. Numerical experiments are conducted on a range of real-world large-scale datasets. The results demonstrate that FVR-SGD outperforms currently popular algorithms; in particular, the proposed method reduces training time by up to 40% when solving the optimization problems of logistic regression, SVMs, and neural networks.
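The abstract does not give the algorithm itself, but the core idea it describes, replacing the full-gradient anchor of classical variance-reduced SGD with a gradient computed on a flexible subset of the data, can be sketched as follows. This is a minimal illustrative sketch of SVRG-style variance reduction with a subset anchor on a synthetic least-squares problem, not the authors' exact FVR-SGD algorithm; the function names, step size, and subset size are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: min_w (1/2n) * ||Xw - y||^2.
n, d = 500, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.01 * rng.normal(size=n)

def loss(w):
    r = X @ w - y
    return 0.5 * np.mean(r ** 2)

def grad_i(w, i):
    # Gradient of the i-th per-sample loss 0.5 * (x_i w - y_i)^2.
    return (X[i] @ w - y[i]) * X[i]

def subset_vr_sgd(epochs=20, subset_size=100, lr=0.05):
    """Hypothetical subset-based variance-reduced SGD (illustration only)."""
    w = np.zeros(d)
    for _ in range(epochs):
        # Anchor gradient over a random subset, instead of the full
        # dataset as in classical SVRG -- this is the memory/compute
        # saving the abstract alludes to.
        S = rng.choice(n, size=subset_size, replace=False)
        w_anchor = w.copy()
        g_anchor = np.mean([grad_i(w_anchor, i) for i in S], axis=0)
        for _ in range(n):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: the correction term
            # cancels much of the noise of the single-sample gradient.
            g = grad_i(w, i) - grad_i(w_anchor, i) + g_anchor
            w -= lr * g
    return w

w_hat = subset_vr_sgd()
print(loss(np.zeros(d)), loss(w_hat))
```

The design trade-off the abstract points at is visible here: a larger subset makes the anchor gradient closer to the full gradient (stronger variance reduction) at higher per-epoch cost, while a smaller subset is cheaper but reduces variance less.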
