Conference on Multimedia Information Processing and Retrieval

Efficient Incremental Training for Deep Convolutional Neural Networks



Abstract

While deep convolutional neural networks (DCNNs) have shown excellent performance in various applications, such as image classification, training a DCNN model from scratch is computationally expensive and time consuming. In recent years, many studies have sought to accelerate the training of DCNNs, but most of them treat training as a one-time process. Considering the learning patterns of human beings, people typically feel more comfortable learning things incrementally and may be overwhelmed when absorbing a large amount of new information at once. Therefore, we demonstrate a new training schema that splits the whole training process into several sub-training steps. In this study, we propose an efficient DCNN training framework in which new classes of concepts are learned incrementally. The experiments are conducted on CIFAR-100 with VGG-19 as the backbone network. Our proposed framework achieves accuracy comparable to that of a model trained from scratch while delivering a 1.42x training speedup.
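The training schema described in the abstract — splitting one training run into several sub-training steps, each of which introduces a new group of classes — can be sketched as follows. The equal-sized class groups, the function names, and the `train_step` hook are illustrative assumptions for CIFAR-100's 100 classes, not details taken from the paper:

```python
# Sketch of an incremental class schedule: partition the label space into
# groups, then run one sub-training step per group, training on the classes
# seen so far. The real framework uses a VGG-19 backbone; the trainer hook
# here is a placeholder supplied by the caller.

def class_schedule(num_classes=100, num_steps=5):
    """Partition class labels into equal-sized incremental groups."""
    per_step = num_classes // num_steps
    return [list(range(i * per_step, (i + 1) * per_step))
            for i in range(num_steps)]

def incremental_training(train_step, num_classes=100, num_steps=5):
    """At step t, expose the model to the new classes plus all seen classes."""
    seen = []
    for step, new_classes in enumerate(class_schedule(num_classes, num_steps)):
        seen.extend(new_classes)                    # grow the label set
        train_step(step, new_classes, list(seen))   # user-supplied sub-training
    return seen
```

In this sketch, each sub-training step only has to fit the classes introduced so far, which is where the reported speedup over a single from-scratch run on all 100 classes would come from.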
