IEEE Winter Conference on Applications of Computer Vision

Do not Forget to Attend to Uncertainty while Mitigating Catastrophic Forgetting

Abstract

One of the major limitations of deep learning models is that they suffer from catastrophic forgetting in incremental learning scenarios. Several approaches have been proposed to tackle the incremental learning problem. Most of these methods are based on knowledge distillation and do not adequately utilize the information provided by older task models, such as the uncertainty estimates in their predictions. Predictive uncertainty provides distributional information that can be applied to mitigate catastrophic forgetting in a deep learning framework. In the proposed work, we consider a Bayesian formulation to obtain the data (aleatoric) and model (epistemic) uncertainties, and we incorporate a self-attention framework to address the incremental learning problem. We define distillation losses in terms of aleatoric uncertainty and self-attention, and we investigate these losses through ablation analyses. Furthermore, we obtain better accuracy on standard benchmarks.
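The abstract does not spell out the exact form of these losses. As a rough illustration only, the PyTorch sketch below shows two standard building blocks consistent with the description: Monte Carlo dropout as a Bayesian approximation for the model (epistemic) uncertainty, and a distillation term attenuated by the teacher's predicted aleatoric (data) uncertainty in the style of Kendall and Gal's loss attenuation. The function names, and the assumption that the old-task teacher has a head predicting a per-sample log-variance, are hypothetical and not taken from the paper.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def mc_dropout_uncertainty(model, x, num_samples=20):
    """Model (epistemic) uncertainty via Monte Carlo dropout: keep
    dropout active at inference and measure the spread of the sampled
    predictive distributions."""
    model.train()  # keeps dropout layers stochastic
    probs = torch.stack(
        [F.softmax(model(x), dim=1) for _ in range(num_samples)]
    )
    return probs.mean(dim=0), probs.var(dim=0)

def aleatoric_distillation_loss(student_logits, teacher_logits,
                                teacher_log_var, temperature=2.0):
    """Knowledge-distillation term down-weighted where the old-task
    teacher reports high aleatoric (data) uncertainty.

    student_logits, teacher_logits: (batch, num_classes)
    teacher_log_var: (batch,) per-sample log-variance from a
        hypothetical variance head on the teacher.
    """
    t = temperature
    p_teacher = F.softmax(teacher_logits / t, dim=1)
    log_p_student = F.log_softmax(student_logits / t, dim=1)

    # Standard soft-target cross-entropy, scaled by t^2 (Hinton et al.).
    per_sample_kd = -(p_teacher * log_p_student).sum(dim=1) * (t * t)

    # Loss attenuation (Kendall & Gal): uncertain samples contribute
    # less; the log-variance term penalizes predicting huge variance.
    precision = torch.exp(-teacher_log_var)
    return (precision * per_sample_kd + teacher_log_var).mean()
```

In an incremental step, such a term would typically be added to the cross-entropy loss on the new task's data, so that the student is not forced to match the old model on examples the old model was itself unsure about.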