IEEE International Conference on Data Mining

Regularizing Deep Convolutional Neural Networks with a Structured Decorrelation Constraint



Abstract

Deep convolutional networks have achieved strong performance in the data mining field. However, training large networks remains a challenge: the training data may be insufficient, and the model can easily overfit. Hence the training process is usually combined with model regularization; typical regularizers include weight decay, Dropout, etc. In this paper, we propose a novel regularizer, named the Structured Decorrelation Constraint (SDC), which is applied to the activations of the hidden layers to prevent overfitting and achieve better generalization. SDC impels the network to learn structured representations by grouping the hidden units and encouraging units within the same group to form strong connections during training. Meanwhile, it forces units in different groups to learn non-redundant representations by minimizing the cross-covariance between them. Compared with Dropout, SDC reduces the co-adaptations between hidden units in an explicit way. In addition, we propose a novel approach called Reg-Conv that helps SDC regularize the complex convolutional layers. Experiments on extensive datasets show that SDC significantly reduces overfitting and yields very meaningful improvements in classification performance (a 6.22% accuracy improvement on CIFAR-10 and a 9.63% improvement on CIFAR-100).
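As a rough illustration of the cross-covariance idea in the abstract, the sketch below (not the paper's implementation; the function name, the consecutive-grouping scheme, and the NumPy formulation are all assumptions) penalizes the squared cross-covariance between groups of hidden activations:

```python
import numpy as np

def sdc_penalty(acts, group_size):
    """Hypothetical sketch of a structured decorrelation penalty.

    acts: (batch, units) array of hidden-layer activations.
    Units are split into consecutive groups of `group_size`; the
    penalty sums the squared cross-covariance entries between every
    pair of distinct groups (within-group covariance is left alone).
    """
    batch, units = acts.shape
    centered = acts - acts.mean(axis=0, keepdims=True)
    cov = centered.T @ centered / batch  # (units, units) covariance estimate
    n_groups = units // group_size
    penalty = 0.0
    for i in range(n_groups):
        for j in range(i + 1, n_groups):
            # Off-diagonal block: covariance between group i and group j.
            block = cov[i * group_size:(i + 1) * group_size,
                        j * group_size:(j + 1) * group_size]
            penalty += np.sum(block ** 2)
    return penalty
```

In training, such a term would be scaled by a coefficient and added to the task loss, so that gradient descent simultaneously fits the data and drives between-group covariances toward zero.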

