Importance Driven Continual Learning for Segmentation Across Domains

Abstract

The ability of neural networks to continuously learn and adapt to new tasks while retaining prior knowledge is crucial for many applications. However, current neural networks tend to forget previously learned tasks when trained on new ones, i.e., they suffer from Catastrophic Forgetting (CF). The objective of Continual Learning (CL) is to alleviate this problem, which is particularly relevant for medical applications, where it may not be feasible to store and access previously used sensitive patient data. In this work, we propose a Continual Learning approach for brain segmentation, in which a single network is trained consecutively on samples from different domains. We build upon an importance-driven approach and adapt it for medical image segmentation. In particular, we introduce a learning-rate regularization that prevents the loss of the network's prior knowledge. Our results demonstrate that directly restricting the adaptation of important network parameters clearly reduces Catastrophic Forgetting in segmentation across domains.
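To make the mechanism concrete, below is a minimal PyTorch sketch of importance-driven learning-rate regularization, assuming a MAS-style (Memory Aware Synapses) importance estimate. The function names (compute_importance, importance_scaled_step) and the scaling rule lr / (1 + lambda * importance) are illustrative assumptions, not necessarily the authors' exact formulation.

    import torch

    def compute_importance(model, data_loader, device="cpu"):
        # MAS-style importance: average absolute gradient of the squared
        # L2 norm of the network output with respect to each parameter,
        # accumulated over the previous domain's data.
        importance = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
        model.eval()
        n_batches = 0
        for x, _ in data_loader:  # assumes batches of (input, target)
            x = x.to(device)
            model.zero_grad()
            out = model(x)
            out.pow(2).sum().backward()
            for n, p in model.named_parameters():
                if p.grad is not None:
                    importance[n] += p.grad.abs()
            n_batches += 1
        return {n: imp / max(n_batches, 1) for n, imp in importance.items()}

    def importance_scaled_step(model, importance, base_lr=0.01, lambda_reg=1.0):
        # Plain SGD update with a per-parameter learning rate that shrinks
        # for parameters deemed important on previous domains, so that they
        # adapt slowly and the network retains its prior knowledge.
        with torch.no_grad():
            for n, p in model.named_parameters():
                if p.grad is not None:
                    lr = base_lr / (1.0 + lambda_reg * importance[n])
                    p -= lr * p.grad

In use, one would estimate the importance values on a domain after training on it, and then, while training on the next domain, call importance_scaled_step after each backward pass in place of a standard optimizer step.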