PeerJ Computer Science

Knowledge distillation in deep learning and its applications

Abstract

Deep learning-based models are relatively large, and it is hard to deploy such models on resource-limited devices such as mobile phones and embedded devices. One possible solution is knowledge distillation, whereby a smaller model (student model) is trained by utilizing the information from a larger model (teacher model). In this paper, we present an outlook of knowledge distillation techniques applied to deep learning models. To compare the performance of different techniques, we propose a new metric called the distillation metric, which compares different knowledge distillation solutions based on models' sizes and accuracy scores. Based on the survey, some interesting conclusions are drawn and presented in this paper, including the current challenges and possible research directions.
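As a minimal sketch of the teacher-student setup the abstract describes, the snippet below shows classic soft-label distillation in the style of Hinton et al., where the student is trained on a weighted sum of a hard-label cross-entropy loss and a KL-divergence loss against the teacher's temperature-softened outputs. This illustrates the general technique only, not any particular method surveyed in the paper; the temperature `T`, weight `alpha`, and the `student`/`teacher` models are assumptions for illustration.

```python
# Minimal knowledge-distillation training step (PyTorch).
# Hinton-style soft-label distillation; T and alpha are illustrative choices,
# not values taken from the surveyed papers.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable to the hard loss.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy with the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def train_step(student, teacher, optimizer, inputs, labels):
    teacher.eval()
    with torch.no_grad():              # the teacher is frozen during distillation
        teacher_logits = teacher(inputs)
    student_logits = student(inputs)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In practice the student is a much smaller network than the teacher, which is what makes the distilled model deployable on mobile and embedded devices as motivated in the abstract.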
