
Wide Compression: Tensor Ring Nets

Abstract

Deep neural networks have demonstrated state-of-the-art performance in a variety of real-world applications. In order to obtain performance gains, these networks have grown larger and deeper, containing millions or even billions of parameters and over a thousand layers. The tradeoff is that these large architectures require an enormous amount of memory, storage, and computation, thus limiting their usability. Inspired by the recent tensor ring factorization, we introduce Tensor Ring Networks (TR-Nets), which significantly compress both the fully connected layers and the convolutional layers of deep neural networks. Our results show that TR-Nets compress LeNet-5 by 11× without losing accuracy, and compress the state-of-the-art Wide ResNet by 243× with only 2.3% accuracy degradation on CIFAR-10 image classification. Overall, this compression scheme shows promise in scientific computing and deep learning, especially for emerging resource-constrained devices such as smartphones, wearables, and IoT devices.
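
To make the compression mechanism concrete, below is a minimal NumPy sketch of a tensor-ring factorized fully connected weight. This is not the authors' implementation: the mode factorization (784 = 28 × 28 inputs, 256 = 16 × 16 outputs), the uniform TR-rank r = 5, and the one-mode-per-core layout are illustrative assumptions, and the paper additionally handles convolutional layers.

```python
import numpy as np

def tr_reconstruct(cores):
    """Rebuild a full tensor from tensor-ring (TR) cores.

    Core k has shape (r_k, n_k, r_{k+1}) with r_{d+1} == r_1, and
    T[i1, ..., id] = trace(G1[:, i1, :] @ G2[:, i2, :] @ ... @ Gd[:, id, :]).
    """
    result = cores[0]                              # (r1, n1, r2)
    for core in cores[1:]:
        # (r1, N, rk) x (rk, nk, rk+1) -> (r1, N, nk, rk+1)
        result = np.einsum('anb,bmc->anmc', result, core)
        r1, n, m, rk = result.shape
        result = result.reshape(r1, n * m, rk)     # merge the new mode
    # Close the ring: trace over the matching first and last ranks.
    return np.einsum('aia->i', result)

# Hypothetical sizes: a 784x256 dense weight viewed as a
# (28, 28, 16, 16) tensor, with TR-rank r = 5 on every edge.
rng = np.random.default_rng(0)
shape, r = (28, 28, 16, 16), 5
cores = [rng.standard_normal((r, n, r)) * 0.1 for n in shape]

W = tr_reconstruct(cores).reshape(784, 256)        # full weight matrix
full_params = 784 * 256                            # 200,704
tr_params = sum(c.size for c in cores)             # 2,200
print(f'compression: {full_params / tr_params:.1f}x')  # ~91x
```

In a TR layer the cores themselves are the trainable parameters; the dense W above is materialized only for illustration, and in practice the cores can be contracted against the layer input directly, which is where the memory savings come from.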
