International Conference on Image and Graphics

Accelerating Deep Convnets via Sparse Subspace Clustering



Abstract

While the research on convolutional neural networks (CNNs) is progressing quickly, the real-world deployment of these models is often limited by computing resources and memory constraints. In this paper, we address this issue by proposing a novel filter pruning method to compress and accelerate CNNs. Our method reduces the redundancy in one convolutional layer by applying sparse subspace clustering to its output feature maps. In this way, most of the representative information in the network can be retained in each cluster. Therefore, our method provides an effective solution to filter pruning for which most existing methods directly remove filters based on simple heuristics. The proposed method is independent of the network structure, and thus it can be adopted by any off-the-shelf deep learning libraries. Evaluated on VGG-16 and ResNet-50 using ImageNet, our method outperforms existing techniques before fine-tuning, and achieves state-of-the-art results after fine-tuning.
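The pipeline the abstract describes — apply sparse subspace clustering to a layer's output feature maps, then retain a representative filter from each cluster — can be sketched minimally as below. The function name, the Lasso-based self-expressive step, and the within-cluster-affinity selection heuristic are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.cluster import SpectralClustering

def sparse_subspace_cluster_filters(feature_maps, n_clusters, alpha=0.001):
    """Cluster filters by their output feature maps; pick one per cluster.

    feature_maps : (n_filters, d) array, each row a flattened output map.
    Returns (kept_filter_indices, cluster_labels).
    """
    n = feature_maps.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        # Self-expressiveness: write map i as a sparse combination of the
        # other maps (diagonal of C is forced to zero).
        others = np.delete(feature_maps, i, axis=0)        # (n-1, d)
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        lasso.fit(others.T, feature_maps[i])
        C[i] = np.insert(lasso.coef_, i, 0.0)

    # Symmetric nonnegative affinity, then spectral clustering on it.
    W = np.abs(C) + np.abs(C).T
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="precomputed",
                                random_state=0).fit_predict(W)

    # Keep the filter with the largest total affinity inside its cluster,
    # i.e. the most "representative" member.
    keep = []
    for k in range(n_clusters):
        idx = np.where(labels == k)[0]
        scores = W[np.ix_(idx, idx)].sum(axis=1)
        keep.append(int(idx[np.argmax(scores)]))
    return sorted(keep), labels
```

Because the method only inspects a layer's outputs, the kept-index list can be used to slice the filter tensor of that layer (and the corresponding input channels of the next layer) in any off-the-shelf framework, which matches the abstract's claim of structure independence.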
