Source: JMLR: Workshop and Conference Proceedings

Mutual Information Neural Estimation


Abstract

We argue that the estimation of mutual information between high dimensional continuous random variables can be achieved by gradient descent over neural networks. We present a Mutual Information Neural Estimator (MINE) that is linearly scalable in dimensionality as well as in sample size, trainable through back-prop, and strongly consistent. We present a handful of applications on which MINE can be used to minimize or maximize mutual information. We apply MINE to improve adversarially trained generative models. We also use MINE to implement the Information Bottleneck, applying it to supervised classification; our results demonstrate substantial improvement in flexibility and performance in these settings.
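The estimator described above maximizes the Donsker-Varadhan lower bound on mutual information over a parametric critic. A minimal numpy sketch of that idea, using a toy one-parameter critic T(x, y) = a·x·y in place of MINE's neural network (an illustrative simplification, not the paper's implementation; MINE trains a full network by backprop and resamples the marginal batch each step):

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated Gaussian pair with known ground truth:
# I(X; Y) = -0.5 * log(1 - rho^2)
rho, n = 0.8, 5000
x = rng.normal(size=n)
y = rho * x + np.sqrt(1 - rho**2) * rng.normal(size=n)
true_mi = -0.5 * np.log(1 - rho**2)

# Samples from the product of marginals, obtained by shuffling y.
y_shuf = rng.permutation(y)

def dv_bound(a):
    """Donsker-Varadhan lower bound with the toy critic T(x, y) = a*x*y."""
    t_joint = a * x * y
    t_marg = a * x * y_shuf
    return t_joint.mean() - np.log(np.mean(np.exp(t_marg)))

# Maximize the bound over the single critic parameter by gradient
# ascent (numerical gradient here; MINE uses backprop instead).
a, lr, eps = 0.0, 0.05, 1e-4
for _ in range(500):
    grad = (dv_bound(a + eps) - dv_bound(a - eps)) / (2 * eps)
    a += lr * grad

est = dv_bound(a)
print(f"true MI = {true_mi:.3f} nats, DV lower-bound estimate = {est:.3f} nats")
```

Because the critic family here is so restricted, the estimate stays well below the true MI; replacing it with a neural network trained by backprop, as in the paper, tightens the bound.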

