Asian Conference on Computer Vision

Gated Transfer Network for Transfer Learning

Abstract

Deep neural networks have led to a series of breakthroughs in computer vision given sufficient annotated training datasets. For novel tasks with limited labeled data, the prevalent approach is to transfer the knowledge learned in pre-trained models to the new tasks by fine-tuning. Classic model fine-tuning exploits the fact that well-trained neural networks appear to learn cross-domain features, which are treated equally during transfer learning. In this paper, we explore the impact of feature selection in model fine-tuning by introducing a transfer module that assigns weights to the features extracted from a pre-trained model. The proposed transfer module demonstrates the importance of feature selection when transferring models from source to target domains, and it significantly improves upon standard fine-tuning at only marginal extra computational cost. We also incorporate an auxiliary classifier as an extra regularizer to avoid over-fitting. Finally, we build a Gated Transfer Network (GTN) based on our transfer module and achieve state-of-the-art results on six different tasks.
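The abstract only outlines the idea, so the sketch below is a minimal PyTorch illustration of such a gated transfer module, not the authors' implementation: pooled features from a pre-trained ResNet-50 backbone are re-weighted by learned sigmoid gates before the target-domain classifier, and an auxiliary classifier on the ungated features provides an extra regularization loss. The backbone choice, gate dimensions, and the 0.3 auxiliary-loss weight are assumptions made for illustration.

# Illustrative sketch only: the abstract does not specify the exact architecture,
# so the gating design (sigmoid gates over pooled backbone features) and the
# auxiliary-classifier placement here are assumptions.
import torch
import torch.nn as nn
import torchvision.models as models


class GatedTransferNet(nn.Module):
    """Gates features from a pre-trained backbone before target-domain classification."""

    def __init__(self, num_classes, hidden_dim=256):
        super().__init__()
        backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
        feat_dim = backbone.fc.in_features          # 2048 for ResNet-50
        backbone.fc = nn.Identity()                 # keep only the pooled features
        self.backbone = backbone

        # Transfer module: assigns each feature a weight in (0, 1).
        self.gate = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, feat_dim),
            nn.Sigmoid(),
        )

        self.classifier = nn.Linear(feat_dim, num_classes)
        # Auxiliary classifier on the ungated features acts as a regularizer.
        self.aux_classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        feats = self.backbone(x)                    # (B, feat_dim)
        gated = feats * self.gate(feats)            # feature-wise re-weighting
        return self.classifier(gated), self.aux_classifier(feats)


# Usage: combine the main and auxiliary losses during fine-tuning.
if __name__ == "__main__":
    model = GatedTransferNet(num_classes=10)
    images = torch.randn(4, 3, 224, 224)
    labels = torch.randint(0, 10, (4,))
    logits, aux_logits = model(images)
    criterion = nn.CrossEntropyLoss()
    loss = criterion(logits, labels) + 0.3 * criterion(aux_logits, labels)
    loss.backward()

In this sketch the gate outputs a per-feature weight in (0, 1), so fine-tuning can softly select which pre-trained features to transfer instead of treating them all equally, which is the behavior the abstract attributes to classic fine-tuning.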