IEEE International Conference on Computer Vision

Incremental Learning of Object Detectors without Catastrophic Forgetting



Abstract

Despite their success for object detection, convolutional neural networks are ill-equipped for incremental learning, i.e., adapting the original model trained on a set of classes to additionally detect objects of new classes, in the absence of the initial training data. They suffer from "catastrophic forgetting" - an abrupt degradation of performance on the original set of classes, when the training objective is adapted to the new classes. We present a method to address this issue, and learn object detectors incrementally, when neither the original training data nor annotations for the original classes in the new training set are available. The core of our proposed solution is a loss function to balance the interplay between predictions on the new classes and a new distillation loss which minimizes the discrepancy between responses for old classes from the original and the updated networks. This incremental learning can be performed multiple times, for a new set of classes in each step, with a moderate drop in performance compared to the baseline network trained on the ensemble of data. We present object detection results on the PASCAL VOC 2007 and COCO datasets, along with a detailed empirical analysis of the approach.
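The distillation idea in the abstract - penalizing the updated network when its responses for the old classes drift away from those of the frozen original network - can be illustrated with a minimal sketch. This is a hypothetical simplification for illustration only, not the authors' exact formulation (the paper also distills bounding-box regression outputs and uses a specific weighting); the function names and the plain squared-error form are assumptions.

```python
def distillation_loss(old_logits, new_logits, num_old_classes):
    """Mean squared discrepancy between the frozen original network's
    class responses and the updated network's responses, restricted to
    the old classes. Each argument is a list of per-proposal logit rows;
    the new network's rows may contain extra entries for new classes,
    which are deliberately ignored here.
    """
    total, count = 0.0, 0
    for old_row, new_row in zip(old_logits, new_logits):
        for c in range(num_old_classes):
            total += (old_row[c] - new_row[c]) ** 2
            count += 1
    return total / count


def total_loss(detection_loss_new, old_logits, new_logits,
               num_old_classes, lam=1.0):
    """Balance the standard detection loss on the new classes against
    the distillation term, as the abstract describes. `lam` is a
    hypothetical trade-off weight."""
    return detection_loss_new + lam * distillation_loss(
        old_logits, new_logits, num_old_classes)
```

Identical responses on the old classes give a distillation loss of zero, so a network that preserves its old-class behavior is only penalized by the new-class detection term.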


