
Asymptotic Network Independence in Distributed Stochastic Optimization for Machine Learning: Examining Distributed and Centralized Stochastic Gradient Descent



Abstract

We provide a discussion of several recent results which, in certain scenarios, are able to overcome a barrier in distributed stochastic optimization for machine learning (ML). Our focus is the so-called asymptotic network independence property, which is achieved whenever a distributed method executed over a network of n nodes asymptotically converges to the optimal solution at a comparable rate to a centralized method with the same computational power as the entire network. We explain this property through an example involving the training of ML models and sketch a short mathematical analysis for comparing the performance of distributed stochastic gradient descent (DSGD) with centralized SGD.
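To make the comparison concrete, below is a minimal, self-contained sketch of DSGD against a centralized SGD baseline with the same total computational power, on a toy least-squares problem. The local losses, the ring-topology mixing matrix W, the 1/t step size, and all problem sizes are illustrative assumptions, not details taken from the article.

    import numpy as np

    # Toy setup (illustrative, not from the article): n nodes, each holding
    # a local least-squares loss f_i(x) = 0.5 * ||A_i x - b_i||^2.
    rng = np.random.default_rng(0)
    n, d, T = 8, 5, 2000          # nodes, dimension, iterations

    A = rng.normal(size=(n, 20, d))
    b = rng.normal(size=(n, 20))

    def local_grad(i, x, batch=4):
        """Stochastic gradient of f_i at x from a random minibatch of rows."""
        idx = rng.integers(0, A.shape[1], size=batch)
        Ai, bi = A[i, idx], b[i, idx]
        return Ai.T @ (Ai @ x - bi) / batch

    # Doubly stochastic mixing matrix for a ring: average with both neighbors.
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0

    # DSGD: each node mixes its iterate with its neighbors', then takes a
    # local stochastic gradient step with a decaying step size.
    X = np.zeros((n, d))
    for t in range(1, T + 1):
        alpha = 1.0 / t
        G = np.array([local_grad(i, X[i]) for i in range(n)])
        X = W @ X - alpha * G

    # Centralized SGD with the same total computational power: one iterate,
    # updated with the average of n stochastic gradients per step.
    y = np.zeros(d)
    for t in range(1, T + 1):
        alpha = 1.0 / t
        y -= alpha * np.mean([local_grad(i, y) for i in range(n)], axis=0)

    print("DSGD average iterate:   ", X.mean(axis=0).round(3))
    print("Centralized SGD iterate:", y.round(3))

Because W is doubly stochastic, averaging the DSGD iterates over nodes yields, roughly, a centralized-style update driven by the mean of the n local stochastic gradients (evaluated at the individual node iterates). Asymptotic network independence says the consensus error between the node iterates and their average shrinks fast enough that it does not affect the leading-order convergence rate.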

Bibliographic Information

  • Source
    IEEE Signal Processing Magazine | 2020, Issue 3 | pp. 114-122 | 9 pages
  • Author Affiliations

    Chinese Univ Hong Kong Inst Data & Decis Analyt Shenzhen Peoples R China|Univ Florida Gainesville FL USA|Arizona State Univ Tempe AZ USA|Boston Univ Boston MA 02215 USA;

    Boston Univ Dept Elect & Comp Engn Boston MA 02215 USA;

    Boston Univ Boston MA 02215 USA|Ctr Informat & Syst Engn Boston MA USA;

  • Indexing Information
  • Original format: PDF
  • Language: English
  • CLC Classification
  • Keywords
