International Joint Conference on Artificial Intelligence

Training Group Orthogonal Neural Networks with Privileged Information


Abstract

Learning rich and diverse representations is critical for the performance of deep convolutional neural networks (CNNs). In this paper, we consider how to use privileged information to promote the inherent diversity of a single CNN model, so that the model can learn better representations and offer stronger generalization ability. To this end, we propose a novel group orthogonal convolutional neural network (GoCNN) that learns untangled representations within each layer by exploiting the provided privileged information, effectively enhancing representation diversity. We take image classification as an example, where image segmentation annotations are used as privileged information during the training process. Experiments on two benchmark datasets - ImageNet and PASCAL VOC - clearly demonstrate the strong generalization ability of our proposed GoCNN model. On the ImageNet dataset, GoCNN improves the performance of the state-of-the-art ResNet-152 model by an absolute 1.2% while using privileged information for only 10% of the training images, confirming the effectiveness of GoCNN in utilizing available privileged knowledge to train better CNNs.
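To make the idea of using segmentation annotations as privileged training-time supervision more concrete, below is a minimal PyTorch-style sketch. It assumes one possible realization: the backbone features are split into a "foreground" and a "background" channel group, and the privileged mask suppresses the opposite region in each group so the two groups learn complementary representations. The class name `GroupOrthogonalHead`, the two-way channel split, and the masking scheme are illustrative assumptions, not the authors' published implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GroupOrthogonalHead(nn.Module):
    """Illustrative head splitting backbone features into two groups.

    During training, the privileged segmentation mask suppresses the
    background region for one group and the foreground region for the
    other, pushing the groups toward complementary representations.
    At test time no mask is needed, so inference requires no segmentation.
    """
    def __init__(self, in_channels: int, num_classes: int):
        super().__init__()
        self.half = in_channels // 2
        self.fc = nn.Linear(in_channels, num_classes)

    def forward(self, feats: torch.Tensor, mask: torch.Tensor = None) -> torch.Tensor:
        # feats: (N, C, H, W) backbone features; mask: (N, 1, h, w) in [0, 1]
        fg_feats, bg_feats = feats[:, :self.half], feats[:, self.half:]
        if mask is not None:
            # Resize the privileged mask to the feature-map resolution.
            mask = F.interpolate(mask, size=feats.shape[-2:], mode="nearest")
            fg_feats = fg_feats * mask          # foreground group sees only foreground
            bg_feats = bg_feats * (1.0 - mask)  # background group sees only background
        pooled = F.adaptive_avg_pool2d(torch.cat([fg_feats, bg_feats], dim=1), 1)
        return self.fc(pooled.flatten(1))

if __name__ == "__main__":
    # Hypothetical usage: ResNet-152-like conv5 features and a random mask.
    head = GroupOrthogonalHead(in_channels=2048, num_classes=1000)
    feats = torch.randn(2, 2048, 7, 7)
    mask = (torch.rand(2, 1, 224, 224) > 0.5).float()
    logits = head(feats, mask)
    loss = F.cross_entropy(logits, torch.tensor([3, 7]))
    loss.backward()
```

Because the mask enters only the training-time forward pass, this style of design is consistent with the abstract's setting where segmentation annotations are available for just a fraction (e.g. 10%) of the training images and are absent at inference.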
