Information Theory and Applications Workshop

Communication-Efficient and Byzantine-Robust Distributed Learning


Abstract

We develop a communication-efficient distributed learning algorithm that is robust against Byzantine worker machines. We propose and analyze a distributed gradient-descent algorithm that performs simple thresholding based on gradient norms to mitigate Byzantine failures. We show that the (statistical) error rate of our algorithm matches that of [YCKB18], which uses more complicated schemes (such as coordinate-wise median or trimmed mean), and is thus optimal. Furthermore, for communication efficiency, we consider a generic class of δ-approximate compressors from [KRSJ19] that encompasses sign-based compressors and top-k sparsification. Our algorithm uses compressed gradients for aggregation and gradient norms for Byzantine removal. We establish the statistical error rate of the algorithm for arbitrary (convex or non-convex) smooth loss functions. We show that, in the regime where the compression factor δ is constant and the dimension of the parameter space is fixed, the rate of convergence is not affected by the compression operation, and hence we effectively get the compression for free. Moreover, we extend the compressed gradient descent algorithm with error feedback proposed in [KRSJ19] to the distributed setting. We have experimentally validated our results, showing good convergence for both convex (least-squares regression) and non-convex (neural-network training) problems.
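To make the abstract's ingredients concrete, here is a minimal illustrative sketch (not the paper's code) of the three pieces it describes: a top-k sparsifier as a δ-approximate compressor, error feedback on the worker side as in [KRSJ19], and server-side aggregation that discards the largest-norm gradients as suspected Byzantine. All function names, the trim fraction, and the worker interface are assumptions for illustration.

```python
# Illustrative sketch only; names, trim fraction, and interfaces are
# assumptions, not the authors' implementation.
import numpy as np

def topk_compress(g, k):
    """A delta-approximate compressor: keep the k largest-magnitude
    entries of g and zero out the rest."""
    out = np.zeros_like(g)
    idx = np.argsort(np.abs(g))[-k:]  # indices of the k largest |g_i|
    out[idx] = g[idx]
    return out

def worker_step(g, memory, k):
    """Error feedback in the style of [KRSJ19]: compress the gradient
    plus the accumulated residual, and remember what was dropped."""
    corrected = g + memory
    p = topk_compress(corrected, k)
    new_memory = corrected - p  # residual carried to the next round
    return p, new_memory

def robust_aggregate(gradients, trim_frac=0.2):
    """Norm-based thresholding at the server: drop the trim_frac
    fraction of received gradients with the largest norms (suspected
    Byzantine), then average the survivors."""
    norms = np.array([np.linalg.norm(g) for g in gradients])
    n = len(gradients)
    keep = np.argsort(norms)[: n - int(trim_frac * n)]
    return np.mean([gradients[i] for i in keep], axis=0)
```

A Byzantine worker that sends a gradient with an inflated norm is filtered out by `robust_aggregate`, while honest workers' compressed updates are averaged; the per-worker `memory` ensures that entries zeroed by `topk_compress` are not lost but retransmitted in later rounds.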
