Neural Processing Letters

Information Bottleneck Theory on Convolutional Neural Networks


Abstract

In recent years, many studies have attempted to open the black box of deep neural networks, proposing a variety of theories to understand them. Among these, information bottleneck (IB) theory claims that training proceeds in two distinct phases: a fitting phase followed by a compression phase. This claim has attracted considerable attention owing to its success in explaining the internal behavior of feedforward neural networks. In this paper, we employ IB theory to study the training dynamics of convolutional neural networks (CNNs) and investigate how fundamental architectural features such as convolutional layer width, kernel size, network depth, pooling layers, and multiple fully connected layers affect CNN performance. In particular, through a series of experiments on the MNIST and Fashion-MNIST benchmarks, we demonstrate that the compression phase is not observed in all of these cases. This suggests that CNNs exhibit more complicated behavior than feedforward neural networks.
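The IB analysis the abstract refers to tracks the mutual information between a hidden representation T and the labels Y (and inputs X) over training; the compression phase is a late-training decrease in I(T;X). A minimal sketch of the binning-based estimator commonly used for this purpose follows; all function names and the synthetic data are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: binning-based mutual information estimation for a hidden
# layer, as typically used in information-bottleneck analyses of networks.
# Names, bin counts, and the toy data below are illustrative assumptions.
from collections import Counter
import numpy as np

def discretize(activations, n_bins=30):
    """Quantize each activation into n_bins equal-width bins, then collapse
    every sample's activation vector into a single hashable key."""
    lo, hi = activations.min(), activations.max()
    edges = np.linspace(lo, hi, n_bins + 1)
    binned = np.digitize(activations, edges[1:-1])  # interior edges only
    return [tuple(row) for row in binned]

def entropy(keys):
    """Shannon entropy (in bits) of the empirical distribution over keys."""
    counts = np.array(list(Counter(keys).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mutual_information(t_keys, y):
    """I(T;Y) = H(T) - H(T|Y), with H(T|Y) averaged over label groups."""
    y = np.asarray(y)
    h_cond = 0.0
    for label in np.unique(y):
        idx = np.where(y == label)[0]
        h_cond += len(idx) / len(y) * entropy([t_keys[i] for i in idx])
    return entropy(t_keys) - h_cond

# Toy check: a layer whose binned output is fully determined by the label
# carries exactly H(Y) = 1 bit about it.
rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)
acts = y[:, None] + 0.05 * rng.standard_normal((100, 4))  # 4 hidden units
t_keys = discretize(acts, n_bins=10)
print(round(mutual_information(t_keys, y), 3))  # → 1.0
```

In a real IB experiment one would record activations for every layer at many training checkpoints and plot I(T;X) against I(T;Y) per layer; for a deterministic network, I(T;X) reduces to H(T) of the binned representation, which is why the result is sensitive to the choice of bin count.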
