International Journal of Data Science and Analytics

Large-scale asynchronous distributed learning based on parameter exchanges

Abstract

In many distributed learning problems, heterogeneous loading of the computing machines can harm the overall performance of synchronous strategies: each machine begins a new computation only after receiving aggregated information from a master, so any delay in sending local information to the master can become a bottleneck. In this paper, we propose an effective asynchronous distributed framework for minimizing a sum of smooth functions, in which each machine performs iterations in parallel on its local function and updates a shared parameter asynchronously. In this way, all machines can work continuously even when they do not hold the latest version of the shared parameter. We prove the consistency and convergence of this general asynchronous distributed method for gradient iterations, and then demonstrate its efficiency on the matrix factorization problem for recommender systems and on binary classification.
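The abstract describes the framework only at a high level: the machines minimize a sum of smooth functions f(x) = f_1(x) + ... + f_M(x) by running gradient iterations on their local f_i and updating a shared parameter without a synchronization barrier. The sketch below is a minimal single-process illustration of that pattern using threads, not the authors' implementation; the least-squares local functions, the step size, and all identifiers (n_workers, step, worker) are hypothetical choices made for this example.

```python
import threading
import numpy as np

# Illustrative problem sizes and step size (hypothetical, not from the paper).
dim, n_workers, n_steps, step = 10, 4, 1000, 0.001
rng = np.random.default_rng(0)

# Each worker i owns a smooth local function f_i(x) = 0.5 * ||A_i x - b_i||^2.
A = [rng.normal(size=(20, dim)) for _ in range(n_workers)]
b = [rng.normal(size=20) for _ in range(n_workers)]

x = np.zeros(dim)          # the shared parameter
lock = threading.Lock()    # guards only the reads/writes of x

def worker(i):
    for _ in range(n_steps):
        with lock:
            x_snapshot = x.copy()              # possibly stale copy of the shared parameter
        g = A[i].T @ (A[i] @ x_snapshot - b[i])  # gradient of the local f_i
        with lock:
            x[:] -= step * g                   # asynchronous update, no barrier

threads = [threading.Thread(target=worker, args=(i,)) for i in range(n_workers)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The full gradient of f = sum_i f_i should be near zero at the limit point.
full_grad = sum(A[i].T @ (A[i] @ x - b[i]) for i in range(n_workers))
print("norm of the full gradient:", np.linalg.norm(full_grad))
```

Because each worker snapshots the shared parameter before computing its gradient, it may act on a stale value; this is precisely the situation the abstract refers to when it says machines keep working even without the latest version of the shared parameter.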