Understanding Convolutional Neural Networks With Information Theory: An Initial Exploration
Published in: IEEE Transactions on Neural Networks and Learning Systems
Abstract

A novel functional estimator for Rényi's $\alpha$-entropy and its multivariate extension was recently proposed in terms of the normalized eigenspectrum of a Hermitian matrix of the projected data in a reproducing kernel Hilbert space (RKHS). However, the utility and possible applications of these new estimators are rather new and mostly unknown to practitioners. In this brief, we first show that this estimator enables straightforward measurement of information flow in realistic convolutional neural networks (CNNs) without any approximation. Then, we introduce the partial information decomposition (PID) framework and develop three quantities to analyze the synergy and redundancy in convolutional layer representations. Our results validate two fundamental data processing inequalities and reveal more inner properties concerning CNN training.
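The matrix-based estimator referenced in the abstract computes Rényi's $\alpha$-entropy directly from the eigenspectrum of a trace-normalized Gram matrix, with no density estimation. A minimal NumPy sketch of that construction follows; the Gaussian kernel bandwidth `sigma` and the helper names are illustrative choices, not the paper's exact configuration:

```python
import numpy as np

def gram_matrix(x, sigma=1.0):
    # Gaussian (RBF) kernel Gram matrix for samples x of shape (n, d);
    # this projects the data into an RKHS implicitly via the kernel.
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def renyi_entropy(x, alpha=2.0, sigma=1.0):
    # Matrix-based Renyi alpha-entropy: normalize the Gram matrix by its
    # trace so its eigenvalues form a probability-like spectrum summing
    # to 1, then apply the alpha-entropy functional to that spectrum.
    k = gram_matrix(x, sigma)
    a = k / np.trace(k)
    eigvals = np.linalg.eigvalsh(a)       # Hermitian eigendecomposition
    eigvals = np.clip(eigvals, 0.0, None)  # guard tiny negative round-off
    return (1.0 / (1.0 - alpha)) * np.log2(np.sum(eigvals ** alpha))
```

With identical samples the normalized Gram matrix has a single unit eigenvalue, so the entropy is 0; for `n` samples the value is bounded above by `log2(n)`, mirroring the behavior of a discrete Rényi entropy over `n` outcomes.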
