International Conference on Cloud Computing, Data Science and Engineering

Heralding the Future of Federated Learning Framework: Architecture, Tools and Future Directions



Abstract

In today’s era, the exponential growth of data and its management is a matter of concern. Machine learning has shown its efficacy in multiple application areas, but machine learning on decentralized data has remained a challenging task over the last decade. A novel technology, federated learning, has recently gained much importance: it deals with training on decentralized and distributed data while preserving its privacy. Privacy-sensitive smartphone data is used to train a global model locally; the local updates are then aggregated to generate an updated global model, which is again distributed among the multiple clients. This paper presents the efficacy of federated learning by outlining an architecture that shows the working mechanism of the technology. Further, this paper frames federated learning as the intersection of on-device machine learning, privacy-preservation technology and edge computing. We also use TensorFlow Federated, an open-source platform, to simulate federated learning tasks for the MNIST and extended MNIST (E-MNIST) datasets. The results contain the loss and accuracy parameters for ten iterations repeated for six optimizer states (Optst) for each dataset. Using the federated averaging algorithm, the peak accuracies achieved for the MNIST and E-MNIST datasets are 0.843 and 0.853, respectively, and the minimum loss values obtained are 0.652 and 0.646, respectively. The execution time of the algorithm on each dataset is presented graphically. Finally, certain application areas where federated learning technology has aided are scrutinized.
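The abstract describes a federated averaging simulation built with TensorFlow Federated, but it does not give the model architecture, client count, preprocessing or optimizer settings. The sketch below shows what such a simulation on the federated EMNIST data typically looks like, following the publicly documented TensorFlow Federated image-classification workflow; every concrete choice here (number of clients, batching, the softmax classifier, the learning rates) is an illustrative assumption rather than the paper's configuration, and the API names (e.g. tff.learning.build_federated_averaging_process) follow older TFF releases, with newer releases exposing the same algorithm under tff.learning.algorithms.

    import collections
    import tensorflow as tf
    import tensorflow_federated as tff

    NUM_CLIENTS = 10   # assumption: the abstract does not state how many clients were simulated
    NUM_ROUNDS = 10    # the abstract reports ten iterations per optimizer state

    # Federated EMNIST: digit images partitioned by writer, one partition per client.
    emnist_train, emnist_test = tff.simulation.datasets.emnist.load_data()

    def preprocess(dataset):
        # Flatten the 28x28 images and batch them; these settings are illustrative.
        def batch_format_fn(element):
            return collections.OrderedDict(
                x=tf.reshape(element['pixels'], [-1, 784]),
                y=tf.reshape(element['label'], [-1, 1]))
        return dataset.repeat(5).shuffle(100).batch(20).map(batch_format_fn)

    sample_clients = emnist_train.client_ids[:NUM_CLIENTS]
    federated_train_data = [
        preprocess(emnist_train.create_tf_dataset_for_client(c))
        for c in sample_clients
    ]

    def model_fn():
        # A simple softmax classifier; the paper's actual model is not specified.
        keras_model = tf.keras.models.Sequential([
            tf.keras.layers.InputLayer(input_shape=(784,)),
            tf.keras.layers.Dense(10, kernel_initializer='zeros'),
            tf.keras.layers.Softmax(),
        ])
        return tff.learning.from_keras_model(
            keras_model,
            input_spec=federated_train_data[0].element_spec,
            loss=tf.keras.losses.SparseCategoricalCrossentropy(),
            metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])

    # Federated averaging: each client trains locally, and the server averages the
    # client updates (weighted by example counts) into a new global model each round.
    iterative_process = tff.learning.build_federated_averaging_process(
        model_fn,
        client_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=0.02),
        server_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=1.0))

    state = iterative_process.initialize()
    for round_num in range(1, NUM_ROUNDS + 1):
        state, metrics = iterative_process.next(state, federated_train_data)
        print(f'round {round_num:2d}: {metrics}')

Varying the client and server optimizers in the calls above is roughly what the paper's six optimizer states correspond to, and the per-round metrics printed by the loop are the kind of loss and accuracy values the paper reports over its ten iterations.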
