
Gossip Learning as a Decentralized Alternative to Federated Learning

Abstract

Federated learning is a distributed machine learning approach for computing models over data collected by edge devices. Most importantly, the data itself is not collected centrally; instead, a master-worker architecture is applied in which a master node performs aggregation and the edge devices are the workers, not unlike the parameter server approach. Gossip learning also assumes that the data remains at the edge devices, but it requires no aggregation server or any other central component. In this empirical study, we present a thorough comparison of the two approaches. We examine the aggregated cost of machine learning in both cases, also considering a compression technique applicable in both approaches. We apply a real churn trace collected over mobile phones as well, and we also experiment with different distributions of the training data over the devices. Surprisingly, gossip learning actually outperforms federated learning in all the scenarios where the training data are distributed uniformly over the nodes, and it performs comparably to federated learning overall.
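Since the abstract contrasts the two aggregation styles only at a high level, the following minimal Python sketch illustrates the structural difference. The function names, the plain parameter-averaging merge rule, and the toy models are illustrative assumptions, not the exact procedures evaluated in the paper.

    import random

    def federated_aggregate(worker_models):
        """Federated learning: a master node averages the model parameters
        uploaded by the participating edge devices (FedAvg-style round)."""
        n = len(worker_models)
        dim = len(worker_models[0])
        return [sum(m[i] for m in worker_models) / n for i in range(dim)]

    def gossip_merge(local_model, received_model):
        """Gossip learning: a node averages its local model with one received
        from a randomly chosen peer; no aggregation server is involved."""
        return [(a + b) / 2.0 for a, b in zip(local_model, received_model)]

    # Toy usage: three devices, each holding a 2-parameter model.
    models = [[0.2, 1.0], [0.4, 0.8], [0.6, 0.9]]

    global_model = federated_aggregate(models)   # master-worker aggregation
    peer_model = random.choice(models[1:])       # node 0 contacts a random peer
    node0_model = gossip_merge(models[0], peer_model)  # decentralized merge

In both cases each device would continue training the resulting model on its local data; the key difference highlighted above is where the averaging happens (at a master versus pairwise between peers).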
