Neurocomputing

Distributed and asynchronous Stochastic Gradient Descent with variance reduction


Abstract

Stochastic Gradient Descent (SGD) with variance reduction techniques has proved powerful for training the parameters of a wide range of machine learning models. However, its intrinsic design does not extend trivially to distributed systems. Although prior systems such as PetuumSGD perform well on distributed machine learning tasks, they focus mainly on optimizing the communication protocol and do not exploit the potential benefits of a specific learning algorithm. In this paper, we analyze the asynchronous communication protocol in PetuumSGD and propose a distributed version of variance-reduced SGD named DisSVRG. DisSVRG adopts the variance reduction technique to update the parameters of a model; the newly learned parameters are then shared across nodes using the asynchronous communication protocol. In addition, we accelerate DisSVRG with an adaptive learning rate driven by an acceleration factor, and we propose an adaptive sampling strategy. These techniques greatly reduce waiting time during iterations and significantly accelerate the convergence of DisSVRG. Extensive empirical studies verify that DisSVRG converges faster than state-of-the-art variants of SGD and achieves almost linear speedup on a cluster. (c) 2017 Elsevier B.V. All rights reserved.
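The page carries only the abstract, not the paper's algorithm listing. As a rough orientation, the SVRG-style variance-reduced update that DisSVRG builds on can be sketched in Python as below. The function svrg_epoch, grad_fn, the toy least-squares problem, and all hyperparameters are illustrative assumptions rather than the authors' implementation; the distributed asynchronous parameter sharing, the adaptive learning rate with an acceleration factor, and the adaptive sampling described in the abstract are deliberately omitted.

import numpy as np

def svrg_epoch(w, data, grad_fn, lr=0.01, inner_iters=100, rng=None):
    # One outer epoch of an SVRG-style variance-reduced update (single machine).
    # data: list of training examples; grad_fn(w, example) -> per-example gradient.
    rng = rng if rng is not None else np.random.default_rng()
    w_snapshot = w.copy()
    # Full gradient at the snapshot, computed once per epoch.
    full_grad = np.mean([grad_fn(w_snapshot, ex) for ex in data], axis=0)
    for _ in range(inner_iters):
        ex = data[rng.integers(len(data))]  # uniform sampling; DisSVRG adapts this
        # Variance-reduced estimate: stochastic gradient at w, corrected by the
        # same example's gradient at the snapshot.
        g = grad_fn(w, ex) - grad_fn(w_snapshot, ex) + full_grad
        w = w - lr * g  # DisSVRG additionally adapts the step size
    return w

# Toy usage on a synthetic least-squares problem (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true
data = list(zip(X, y))
grad_fn = lambda w, ex: (ex[0] @ w - ex[1]) * ex[0]  # gradient of 0.5*(x.w - y)^2
w = np.zeros(5)
for _ in range(10):
    w = svrg_epoch(w, data, grad_fn, lr=0.05, inner_iters=200)

In the distributed setting the abstract describes, each worker would run inner iterations like these locally and share the newly learned parameters across nodes through the asynchronous communication protocol.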
