...

Cost-conscious mutual information maximization for improving collective interpretation of multi-layered neural networks



Abstract

The present paper aims to improve the collective interpretation realized by compressing multi-layered neural networks, and to make that interpretation as natural and stable as possible. We collectively interpret the final representations by maximizing mutual information between inputs and neurons, expecting that mutual information maximization can disentangle complex features into simpler ones. However, we have had difficulty in increasing mutual information and in obtaining interpretable features for several data sets. By examining the process of information maximization closely, we found that, in addition to the information itself, we need to consider the cost associated with maximizing it. Thus, we try to maximize not simply mutual information but the ratio of mutual information to the cost; this method can be called "cost-conscious mutual information maximization." The cost-conscious method aims to extend Linsker's maximum information preservation principle to a variety of data sets by more directly taking into account the cost associated with the process of information maximization. The method was applied to two data sets: an artificial, symmetric data set and a credit-default data set. First, on the symmetric data set injected with random noise, the cost-conscious information maximization method could extract the symmetric property almost perfectly in spite of the noise. In the experimental results on the credit-default data set, the present method made it possible to interpret the final results most naturally, showing why and how credit default can occur. The experimental results show that neural networks can be used to interpret data sets more naturally than conventional methods such as logistic regression analysis. (C) 2020 Elsevier B.V. All rights reserved.
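The abstract only states that the objective is the ratio of mutual information to a cost, without giving the paper's exact formulation. A minimal sketch of such a ratio objective, assuming mutual information is computed between inputs and hidden neurons from normalized firing probabilities, and assuming (hypothetically) that the cost is the total activation level — the function names and the cost definition are illustrative, not the paper's:

```python
import numpy as np

def mutual_information(p_js):
    """Mutual information between inputs s and neurons j.

    p_js[s, j] is p(j|s), the probability that neuron j fires for
    input s; each row sums to 1. Inputs are assumed equiprobable.
    """
    n_inputs = p_js.shape[0]
    p_s = 1.0 / n_inputs               # uniform prior over inputs
    p_j = p_js.mean(axis=0)            # marginal firing probability p(j)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(p_js > 0, p_js / p_j, 1.0)  # log(1) = 0 masks zeros
    return p_s * np.sum(p_js * np.log(ratio))

def cost(activations):
    # Hypothetical cost: total raw activation level across all neurons.
    return activations.sum()

def cost_conscious_objective(activations):
    """Ratio of mutual information to cost, to be maximized."""
    # Normalize raw activations into conditional probabilities p(j|s).
    p_js = activations / activations.sum(axis=1, keepdims=True)
    return mutual_information(p_js) / cost(activations)
```

With perfectly selective neurons (each input fires exactly one of n neurons) the mutual information reaches its maximum log(n), while uniform firing yields zero; the ratio then rewards representations that achieve selectivity with a small total activation cost.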

Bibliographic details

  • Source
    《Neurocomputing》 | 2020, Issue 7 | pp. 259-274 | 16 pages
  • Author

    Kamimura Ryotaro;

  • Affiliation

    Kumamoto Drone Technol & Dev Fdn Techno Res Pk Techno Lab 203 1155-12 Tabaru Kumamoto 8612202 Japan | Tokai Univ IT Educ Ctr 4-1-1 Kitakaname Hiratsuka Kanagawa 2591292 Japan;

  • Indexed in: Science Citation Index (SCI); Engineering Index (EI)
  • Format: PDF
  • Language: eng
  • CLC classification:
  • Keywords:
