International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI)
Keep and Learn: Continual Learning by Constraining the Latent Space for Knowledge Preservation in Neural Networks

Abstract

Data is one of the most important factors in machine learning. However, even with high-quality data, there are situations in which access to the data is restricted. For example, access to medical data from outside an institution is strictly limited due to privacy concerns. In such cases, a model must be learned sequentially, using only the data accessible at each stage. In this work, we propose a new method for preserving learned knowledge by modeling the high-level feature space and the output space to be mutually informative, and by constraining feature vectors to lie in the modeled space during training. The proposed method is easy to implement, as it can be applied by simply adding a reconstruction loss to the objective function. We evaluate the proposed method on CIFAR-10/100 and a chest X-ray dataset, and show benefits in terms of knowledge preservation compared to previous approaches.
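The core idea above — add a decoder that reconstructs the high-level feature vector from the network's output, and penalize the reconstruction error alongside the classification loss — can be sketched as follows. This is a hypothetical simplification, not the authors' exact architecture: the layer sizes, the single ReLU encoder, and the weighting term `lam` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical tiny network: an encoder producing high-level features h,
# a classifier head producing logits, and a decoder that reconstructs h
# from the logits, so the feature space and output space are encouraged
# to be mutually informative.
W_enc = rng.normal(size=(32, 16)) * 0.1   # input  -> features
W_cls = rng.normal(size=(16, 10)) * 0.1   # features -> logits
W_dec = rng.normal(size=(10, 16)) * 0.1   # logits -> reconstructed features

x = rng.normal(size=(8, 32))              # a batch of 8 inputs
y = rng.integers(0, 10, size=8)           # class labels

h = np.maximum(x @ W_enc, 0.0)            # ReLU features (the latent space)
logits = h @ W_cls
h_rec = logits @ W_dec                    # reconstruction from the output space

p = softmax(logits)
ce_loss = -np.log(p[np.arange(8), y]).mean()   # classification loss
rec_loss = ((h - h_rec) ** 2).mean()           # reconstruction loss on features
lam = 1.0                                      # assumed weighting term
total_loss = ce_loss + lam * rec_loss          # combined objective
```

In a continual-learning setting, minimizing `rec_loss` on later stages constrains the feature vectors to stay within the space modeled during earlier training, which is the mechanism the abstract credits for knowledge preservation.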