BYZANTINE TOLERANT GRADIENT DESCENT FOR DISTRIBUTED MACHINE LEARNING WITH ADVERSARIES
Abstract
The present application concerns a computer-implemented method for training a machine learning model in a distributed fashion using Stochastic Gradient Descent (SGD). The method is performed by a first computer in a distributed computing environment and comprises performing a learning round in which the first computer broadcasts a parameter vector to a plurality of worker computers in the distributed computing environment, receives an estimate update vector (gradient) from all or a subset of the worker computers, where each received estimate vector is either an estimate of a gradient of a cost function or an erroneous vector, and determines an updated parameter vector for use in the next learning round based only on a subset of the received estimate vectors. The method aggregates the gradients while guaranteeing resilience to up to half of the workers being compromised (malfunctioning, erroneous, or modified by attackers).
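The abstract does not name the specific aggregation rule used to select the trusted subset of gradients. One well-known Byzantine-tolerant rule matching this description is Krum-style selection, which picks the received gradient whose summed squared distance to its closest n − f − 2 neighbours is smallest; the sketch below illustrates that idea under the assumption of n workers of which at most f are adversarial (the function name `krum` and its signature are illustrative, not taken from the patent):

```python
import numpy as np

def krum(gradients, f):
    """Return the received gradient with the smallest Krum score,
    i.e. the smallest sum of squared distances to its n - f - 2
    nearest neighbours. A sketch of one known Byzantine-tolerant
    aggregation rule, not necessarily the rule claimed in the patent."""
    n = len(gradients)
    grads = np.stack(gradients)
    # Pairwise squared Euclidean distances between all received gradients.
    diffs = grads[:, None, :] - grads[None, :, :]
    dists = np.sum(diffs ** 2, axis=-1)
    scores = []
    for i in range(n):
        # Distances from gradient i to every other gradient, sorted.
        d = np.sort(np.delete(dists[i], i))
        # Score = sum over the n - f - 2 closest neighbours.
        scores.append(d[: n - f - 2].sum())
    return grads[int(np.argmin(scores))]
```

With five workers and one attacker submitting an arbitrarily large bogus vector, the attacker's gradient is far from every honest gradient, so its score is large and an honest gradient is selected for the parameter update.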