
Ensemble Learning of Lightweight Deep Learning Models Using Knowledge Distillation for Image Classification



Abstract

In recent years, deep learning models have been used successfully in almost every field, in both industry and academia, especially for computer vision tasks. However, these models are huge, with millions (or even billions) of parameters, and thus cannot be deployed on systems and devices with limited resources (e.g., embedded systems and mobile phones). To tackle this, several techniques for model compression and acceleration have been proposed. As a representative of these techniques, knowledge distillation offers a way to effectively learn a small student model from one or more large teacher models, and it has attracted increasing attention owing to its promising performance. In this work, we propose an ensemble model that combines feature-based, response-based, and relation-based lightweight knowledge distillation models for simple image classification tasks. In our knowledge distillation framework, we use ResNet-20 as the student network and ResNet-110 as the teacher network. Experimental results demonstrate that our proposed ensemble model outperforms other knowledge distillation models as well as the large teacher model on image classification tasks, while requiring less computational power than the teacher model.
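For context, below is a minimal PyTorch sketch of response-based distillation (the standard soft-target loss of Hinton et al.) together with a simple probability-averaging ensemble over distilled students. The temperature T, the blending weight alpha, and the averaging rule are illustrative assumptions; the paper's exact loss formulation and ensemble strategy may differ.

```python
import torch
import torch.nn.functional as F

def response_based_kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Response-based KD: KL divergence between temperature-softened
    teacher and student outputs, blended with cross-entropy on the
    ground-truth labels. T and alpha are illustrative defaults."""
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so soft-target gradients match the hard-label term
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

def ensemble_predict(students, x):
    """One plausible ensembling rule: average the class probabilities
    of the distilled student models and take the argmax."""
    probs = torch.stack([F.softmax(model(x), dim=1) for model in students])
    return probs.mean(dim=0).argmax(dim=1)
```

In such a setup, each student (e.g., a ResNet-20 trained under a feature-based, response-based, or relation-based objective) is distilled separately from the ResNet-110 teacher, and only the lightweight students are combined at inference time.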

Bibliographic details

  • Authors: Jaeyong Kang; Jeonghwan Gwak
  • Year: 2020
  • Format: PDF
  • Language: English

