
Modulated Convolutional Networks

Abstract

Despite the great effectiveness of very deep and wide Convolutional Neural Networks (CNNs) in various computer vision tasks, the significant storage cost of such networks impedes their deployment on computationally limited devices. In this paper, we propose new modulated convolutional networks (MCNs) to improve the portability of CNNs via binarized filters. In MCNs, we propose a new loss function that combines the filter loss, center loss, and softmax loss in an end-to-end framework. We first introduce modulation filters (M-Filters) to recover the unbinarized filters, which leads to a new architecture for computing the network model. The convolution operation is further approximated by considering intra-class compactness in the loss function. As a result, our MCNs reduce the storage required for convolutional filters by a factor of 32 relative to the full-precision model, while achieving much better performance than state-of-the-art binarized models. Most importantly, MCNs achieve performance comparable to full-precision ResNets and WideResNets. The code will be made publicly available soon.
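The 32x storage reduction quoted above follows directly from replacing each 32-bit floating-point weight with a single bit (a sign). As a minimal sketch of that arithmetic (not the paper's actual MCN training procedure, and the `binarize` helper is hypothetical), binarizing filters with the sign function gives:

```python
import numpy as np

def binarize(filters):
    """Map full-precision weights to {-1.0, +1.0} via sign (0 mapped to +1)."""
    return np.where(filters >= 0, 1.0, -1.0)

rng = np.random.default_rng(0)
# 64 filters of shape 3x3 over 3 input channels, stored as float32
w = rng.standard_normal((64, 3, 3, 3)).astype(np.float32)
b = binarize(w)

full_bits = w.size * 32    # 32 bits per float32 weight
binary_bits = w.size * 1   # 1 bit per binarized weight
reduction = full_bits // binary_bits
print(reduction)           # 32
```

In MCNs the binarized filters are combined with the learned M-Filters to approximate the full-precision filters, so only the binary weights (plus the comparatively small modulation filters) need to be stored.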