European Signal Processing Conference

Learning without Forgetting for Decentralized Neural Nets with Low Communication Overhead


Abstract

We consider the problem of training a neural net in a decentralized scenario with low communication overhead. The problem is addressed by adapting a recently proposed incremental learning approach called 'learning without forgetting'. Whereas an incremental learning approach assumes the data arrive in a sequence, the nodes of the decentralized scenario cannot share data with one another, and there is no master node. Nodes can, however, communicate information about model parameters to their neighbors. This communication of model parameters is the key to adapting the 'learning without forgetting' approach to the decentralized scenario. We use random-walk-based communication to handle a highly limited communication resource.
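To make the idea concrete, here is a minimal PyTorch-style sketch, not the authors' implementation, of how a 'learning without forgetting' objective could be combined with random-walk communication: a single set of model parameters walks the graph, and each visited node fine-tunes it on its local data while distilling from the model it just received. The names lwf_loss, random_walk_training, nodes, and neighbors, along with all hyperparameter values, are hypothetical.

```python
# Illustrative sketch only; assumes PyTorch and hypothetical data structures.
import copy
import random
import torch
import torch.nn.functional as F

def lwf_loss(model, old_model, x, y, temperature=2.0, alpha=1.0):
    """Cross-entropy on local data plus a distillation term that keeps the
    model's outputs close to those of the previously received model."""
    logits = model(x)
    ce = F.cross_entropy(logits, y)
    with torch.no_grad():
        old_logits = old_model(x)
    distill = F.kl_div(
        F.log_softmax(logits / temperature, dim=1),
        F.softmax(old_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return ce + alpha * distill

def random_walk_training(model, nodes, neighbors, steps, lr=0.01):
    """One parameter vector walks the graph; each visited node fine-tunes it.
    nodes: dict mapping node id -> (x, y) local data batch (hypothetical).
    neighbors: dict mapping node id -> list of neighbor ids (hypothetical)."""
    node = random.choice(list(nodes))
    for _ in range(steps):
        x, y = nodes[node]                      # data held only by this node
        old_model = copy.deepcopy(model)        # frozen snapshot for distillation
        old_model.eval()
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        opt.zero_grad()
        lwf_loss(model, old_model, x, y).backward()
        opt.step()
        node = random.choice(neighbors[node])   # pass parameters to a random neighbor
    return model
```

Only the model parameters move along the walk, so per-step communication cost is one parameter exchange between neighbors, which is what keeps the overhead low in this scheme.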
