IEEE International Conference on Data Mining

Regularizing Deep Convolutional Neural Networks with a Structured Decorrelation Constraint



Abstract

Deep convolutional networks have achieved strong performance in the field of data mining. However, training large networks remains a challenge, as the training data may be insufficient and the model can easily overfit. Hence the training process is usually combined with model regularization. Typical regularizers include weight decay, Dropout, etc. In this paper, we propose a novel regularizer, named Structured Decorrelation Constraint (SDC), which is applied to the activations of the hidden layers to prevent overfitting and achieve better generalization. SDC impels the network to learn structured representations by grouping the hidden units and encouraging the units within the same group to have strong connections during the training procedure. Meanwhile, it forces the units in different groups to learn non-redundant representations by minimizing the cross-covariance between them. Compared with Dropout, SDC reduces the co-adaptations between the hidden units in an explicit way. In addition, we propose a novel approach called Reg-Conv that helps SDC regularize the complex convolutional layers. Experiments on extensive datasets show that SDC significantly reduces overfitting and yields meaningful improvements in classification performance (a 6.22% accuracy improvement on CIFAR-10 and a 9.63% improvement on CIFAR-100).
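The abstract describes SDC as a penalty on hidden-layer activations that partitions the units into groups and penalizes the cross-covariance between units belonging to different groups. Below is a minimal sketch of one way such a group-wise cross-covariance penalty could be computed on a mini-batch of activations; the function name, the uniform grouping of consecutive units, and the squared-Frobenius aggregation are illustrative assumptions rather than the paper's exact formulation, and the within-group term and Reg-Conv for convolutional layers are not shown.

```python
import torch

def sdc_penalty(acts: torch.Tensor, num_groups: int) -> torch.Tensor:
    """Illustrative group-wise decorrelation penalty (not the paper's exact SDC).

    acts:       (batch, units) activations of one hidden layer
    num_groups: number of groups the units are split into (assumed to divide `units`)
    Returns the sum of squared cross-covariances between units in different groups.
    """
    batch, units = acts.shape
    centered = acts - acts.mean(dim=0, keepdim=True)      # zero-mean per unit
    cov = centered.t() @ centered / (batch - 1)           # (units, units) covariance
    group_size = units // num_groups
    # Mask that is 1 for unit pairs lying in different groups, 0 within a group.
    group_id = torch.arange(units, device=acts.device) // group_size
    cross_mask = (group_id.unsqueeze(0) != group_id.unsqueeze(1)).float()
    # Penalize only cross-group covariance, leaving within-group correlations free.
    return ((cov * cross_mask) ** 2).sum()

# Usage sketch: add the penalty to the task loss with a hypothetical weight `lam`.
# loss = criterion(logits, targets) + lam * sdc_penalty(hidden_acts, num_groups=8)
```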

