Venue: IEEE Conference on Multimedia Information Processing and Retrieval

Efficient Incremental Training for Deep Convolutional Neural Networks



Abstract

While deep convolutional neural networks (DCNNs) have shown excellent performance in various applications, such as image classification, training a DCNN model from scratch is computationally expensive and time-consuming. In recent years, many studies have sought to accelerate the training of DCNNs, but most of them train the model in a single pass over all classes. Considering human learning patterns, people typically find it easier to learn things incrementally and may be overwhelmed when absorbing a large amount of new information at once. Motivated by this, we demonstrate a new training schema that splits the whole training process into several sub-training steps. In this study, we propose an efficient DCNN training framework in which new classes of concepts are learned incrementally. Experiments are conducted on CIFAR-100 with VGG-19 as the backbone network. Our proposed framework achieves accuracy comparable to a model trained from scratch while training 1.42x faster.
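The class-incremental schedule described in the abstract, where training is split into sub-training steps that each introduce new classes, can be sketched as follows. This is a minimal illustration only: the function name, the four-step split, and the cumulative accumulation of earlier classes are assumptions for demonstration, not the paper's exact protocol.

```python
def incremental_class_schedule(num_classes, num_steps):
    """Partition class IDs into disjoint groups, one group per sub-training
    step, and return both the per-step groups and the cumulative set of
    classes the model has seen after each step."""
    per_step = num_classes // num_steps
    groups = [list(range(i * per_step, (i + 1) * per_step))
              for i in range(num_steps)]
    # Any remainder classes are folded into the final step.
    groups[-1].extend(range(num_steps * per_step, num_classes))

    cumulative, seen = [], []
    for g in groups:
        seen = seen + g
        cumulative.append(list(seen))
    return groups, cumulative

# Example: CIFAR-100's 100 classes split into 4 sub-training steps.
groups, cumulative = incremental_class_schedule(100, 4)
```

In a full training loop, each sub-training step would fine-tune the backbone (e.g. VGG-19) on the samples of its new class group, so that later steps reuse features learned earlier instead of restarting from scratch.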


