BYZANTINE TOLERANT GRADIENT DESCENT FOR DISTRIBUTED MACHINE LEARNING WITH ADVERSARIES

Abstract

The present application concerns a computer-implemented method for training a machine learning model in a distributed fashion using Stochastic Gradient Descent (SGD). The method is performed by a first computer in a distributed computing environment and comprises performing a learning round, which comprises: broadcasting a parameter vector to a plurality of worker computers in the distributed computing environment; receiving an estimate update vector (a gradient) from all or a subset of the worker computers, wherein each received estimate vector is either an estimate of a gradient of a cost function or an erroneous vector; and determining an updated parameter vector, for use in the next learning round, based only on a subset of the received estimate vectors. The method aggregates the gradients while guaranteeing resilience even when up to half of the workers are compromised (malfunctioning, erroneous, or modified by attackers).
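The abstract does not name the aggregation rule used to select among the received estimate vectors. A well-known rule matching this description is Krum, from the similarly titled paper "Machine Learning with Adversaries: Byzantine Tolerant Gradient Descent" (Blanchard et al., NeurIPS 2017): the server keeps the one received gradient whose summed squared distance to its n − f − 2 nearest neighbours is smallest, under the assumption that at most f of the n workers are Byzantine. The sketch below is illustrative only; the function name `krum`, the bound `f`, and the toy data are assumptions, not taken from the patent text.

```python
import numpy as np

def krum(gradients, f):
    """Return the single worker gradient with the smallest Krum score,
    i.e. the smallest summed squared distance to its n - f - 2 nearest
    neighbours (Blanchard et al., NeurIPS 2017).

    gradients : list of n 1-D numpy arrays, one per worker
    f         : assumed upper bound on the number of Byzantine workers
    """
    n = len(gradients)
    assert n >= 2 * f + 3, "Krum needs n >= 2f + 3 workers"
    # Pairwise squared Euclidean distances between all received gradients.
    dists = np.array([[np.sum((gi - gj) ** 2) for gj in gradients]
                      for gi in gradients])
    scores = []
    for i in range(n):
        # Distances from worker i to the others, dropping the zero
        # self-distance, sorted ascending.
        d = np.sort(np.delete(dists[i], i))
        # Krum score: sum over the n - f - 2 closest neighbours.
        scores.append(d[: n - f - 2].sum())
    return gradients[int(np.argmin(scores))]

# Toy learning round: 5 honest workers, 2 Byzantine workers (f = 2).
rng = np.random.default_rng(0)
theta = np.zeros(10)                               # current parameter vector
honest = [rng.normal(1.0, 0.1, size=10) for _ in range(5)]
byzantine = [np.full(10, -1e6) for _ in range(2)]  # adversarial gradients
update = krum(honest + byzantine, f=2)             # robust aggregate
theta -= 0.01 * update                             # SGD step for the next round
```

Krum requires n ≥ 2f + 3 workers, which is consistent with the abstract's claim of resilience to just under half of the workers being compromised.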
