IEEE Communications Letters

Coded Decentralized Learning With Gradient Descent for Big Data Analytics



Abstract

Machine learning is an effective technique for big data analytics. We study big data analytics with decentralized learning in large-scale networks. Fountain codes are applied to the decentralized learning process to reduce the communication load of exchanging intermediate learning parameters among fog nodes. Two scenarios, disjoint datasets and overlapping datasets, are analyzed. Comparison results show that the Fountain-based scheme reduces communication load significantly in large-scale networks, especially when the quality of the communication links is relatively poor and/or the number of fog nodes is large.
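
The abstract does not give implementation details, but the core idea it names (rateless Fountain coding of the gradient exchange between fog nodes over lossy links) can be illustrated with a minimal sketch. The sketch below assumes an LT-style code with a simplified degree distribution and a peeling decoder; all names (split_gradient, lt_encode, peel_decode, CHUNKS, LOSS_PROB) and parameter choices are illustrative and not taken from the paper.

import numpy as np

CHUNKS = 16        # k source chunks per gradient vector (illustrative choice)
LOSS_PROB = 0.3    # packet erasure probability of the fog-to-fog link (assumed)
rng = np.random.default_rng(0)

def split_gradient(grad, k=CHUNKS):
    # Serialize the gradient to bytes and split it into k equal chunks (zero-padded).
    raw = np.frombuffer(np.asarray(grad, dtype=np.float64).tobytes(), dtype=np.uint8)
    pad = (-len(raw)) % k
    raw = np.concatenate([raw, np.zeros(pad, dtype=np.uint8)])
    return raw.reshape(k, -1), pad

def lt_encode(chunks, n_packets):
    # Each coded packet is the XOR of a random subset of chunks (rateless, LT-style).
    k = len(chunks)
    packets = []
    for _ in range(n_packets):
        deg = int(min(k, rng.geometric(0.3)))            # stand-in degree distribution
        idx = rng.choice(k, size=deg, replace=False)
        packets.append((set(idx.tolist()), np.bitwise_xor.reduce(chunks[idx], axis=0)))
    return packets

def peel_decode(packets, k):
    # Peeling decoder: release degree-1 packets, XOR decoded chunks out of the rest.
    decoded, pkts = {}, [(set(s), p.copy()) for s, p in packets]
    changed = True
    while changed and len(decoded) < k:
        changed = False
        for s, p in pkts:
            for i in list(s & decoded.keys()):
                p ^= decoded[i]
                s.discard(i)
            if len(s) == 1:
                decoded[s.pop()] = p.copy()
                changed = True
    return decoded if len(decoded) == k else None

# Demo: one fog node broadcasts its local gradient over an erasure channel.
grad = rng.standard_normal(1000)
chunks, pad = split_gradient(grad)
coded = lt_encode(chunks, n_packets=60)                   # rateless: send until peers ACK
arrived = [p for p in coded if rng.random() > LOSS_PROB]  # lossy link drops some packets
result = peel_decode(arrived, CHUNKS)
if result is not None:
    raw = np.concatenate([result[i] for i in range(CHUNKS)])
    raw = raw[:len(raw) - pad] if pad else raw
    recovered = np.frombuffer(raw.tobytes(), dtype=np.float64)
    print("exact recovery:", np.allclose(recovered, grad))
else:
    print("not enough coded packets survived; keep transmitting")

Because such a code is rateless, a sender simply keeps emitting coded packets until its neighbors signal successful decoding, so no per-packet retransmission protocol is needed on erasure-prone links; this is the property the abstract credits with cutting communication load when link quality is poor or the number of fog nodes is large.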


