International Conference on Machine Vision and Image Processing

Class Attention Map Distillation for Efficient Semantic Segmentation



Abstract

In this paper, a novel method is proposed for capturing the knowledge of a powerful, trained deep convolutional neural network and distilling it into a smaller network during training. To our knowledge, this is the first time a saliency-map method has been employed to extract useful knowledge from a convolutional neural network for distillation. Unlike many methods that operate only on the final layers, this approach extracts information suitable for distillation from the intermediate layers of a network by constructing class-specific attention maps and then forcing the student network to mimic those attention maps. This knowledge-distillation training scheme is implemented on the state-of-the-art DeepLab and PSPNet segmentation networks, and its effectiveness is demonstrated by experiments on the standard PASCAL VOC 2012 dataset.
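The core idea in the abstract can be illustrated with a minimal NumPy sketch of one plausible reading: CAM-style class attention maps are computed from intermediate feature maps, and a mean-squared-error loss forces the student's maps to mimic the teacher's. The function names, the linear class-weighting, and the per-map normalization are all assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def class_attention_maps(features, class_weights):
    """Compute per-class attention maps from intermediate features.
    features: (C, H, W) feature tensor; class_weights: (K, C) weights
    projecting channels onto K classes (a CAM-style simplification)."""
    C, H, W = features.shape
    K = class_weights.shape[0]
    maps = (class_weights @ features.reshape(C, H * W)).reshape(K, H, W)
    # Normalize each class map to [0, 1] so teacher and student
    # attentions are compared on a common scale.
    mins = maps.min(axis=(1, 2), keepdims=True)
    maxs = maps.max(axis=(1, 2), keepdims=True)
    return (maps - mins) / (maxs - mins + 1e-8)

def attention_distillation_loss(teacher_maps, student_maps):
    """MSE loss forcing the student's class attention maps to
    mimic the teacher's (added to the usual segmentation loss)."""
    return float(np.mean((teacher_maps - student_maps) ** 2))

# Toy example: 8-channel intermediate features on a 4x4 grid, 3 classes.
rng = np.random.default_rng(0)
t_feat = rng.standard_normal((8, 4, 4))   # teacher intermediate features
s_feat = rng.standard_normal((8, 4, 4))   # student intermediate features
w = rng.standard_normal((3, 8))           # hypothetical class weights

t_maps = class_attention_maps(t_feat, w)
s_maps = class_attention_maps(s_feat, w)
loss = attention_distillation_loss(t_maps, s_maps)
```

During training, this loss would be minimized jointly with the student's segmentation loss, so the student learns to reproduce the teacher's class-specific spatial attention at intermediate layers.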
