AAAI Conference on Artificial Intelligence

Fast Asynchronous Parallel Stochastic Gradient Descent: A Lock-Free Approach with Convergence Guarantee

Abstract

Stochastic gradient descent (SGD) and its variants have become increasingly popular in machine learning due to their efficiency and effectiveness. To handle large-scale problems, researchers have recently proposed several parallel SGD methods for multicore systems. However, existing parallel SGD methods cannot achieve satisfactory performance in real applications. In this paper, we propose a fast asynchronous parallel SGD method, called AsySVRG, which parallelizes the recently proposed SGD variant stochastic variance reduced gradient (SVRG) with an asynchronous strategy. AsySVRG adopts a lock-free strategy, which is more efficient than lock-based alternatives. Furthermore, we theoretically prove that AsySVRG converges at a linear rate. Both theoretical and empirical results show that AsySVRG can outperform existing state-of-the-art parallel SGD methods such as Hogwild! in terms of convergence rate and computation cost.
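To make the lock-free, variance-reduced update concrete, the following is a minimal Python sketch of asynchronous SVRG in the spirit of AsySVRG, not the authors' implementation. The toy least-squares objective, step size, thread count, and per-thread step counts are illustrative assumptions, and Python's GIL means the threads largely serialize, so the sketch only shows the structure of the method (snapshot, full gradient at the snapshot, lock-free inner updates by several workers) rather than its parallel speedup.

```python
import numpy as np
from threading import Thread

# Toy problem: least-squares regression on synthetic data (illustrative only).
rng = np.random.default_rng(0)
n, d = 1000, 20
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

def grad_i(w, i):
    """Stochastic gradient of the i-th sample's squared loss."""
    return (X[i] @ w - y[i]) * X[i]

def full_grad(w):
    """Full gradient over all n samples."""
    return X.T @ (X @ w - y) / n

def worker(w, w_snap, mu, steps, eta, seed):
    """Inner loop of one thread: lock-free variance-reduced updates on the shared w."""
    local_rng = np.random.default_rng(seed)
    for _ in range(steps):
        i = local_rng.integers(n)
        # SVRG update direction: grad_i(w) - grad_i(w_snap) + full_grad(w_snap).
        g = grad_i(w, i) - grad_i(w_snap, i) + mu
        w -= eta * g  # in-place NumPy update on the shared vector; no lock is taken

num_threads, outer_epochs, inner_steps, eta = 4, 20, 250, 0.05
w = np.zeros(d)  # shared parameter vector, read and written by all threads
for epoch in range(outer_epochs):
    w_snap = w.copy()        # snapshot of the current iterate
    mu = full_grad(w_snap)   # full gradient at the snapshot
    threads = [Thread(target=worker, args=(w, w_snap, mu, inner_steps, eta, k))
               for k in range(num_threads)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    print(f"epoch {epoch:2d}  objective {0.5 * np.mean((X @ w - y) ** 2):.6f}")
```

Each inner update combines a fresh stochastic gradient at the current iterate with a correction computed at the snapshot, which is what removes the need for a decaying step size; the shared vector is updated by all workers without any mutex, in the Hogwild!-style lock-free fashion the abstract contrasts against lock-based strategies.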
