Conference on Uncertainty in Artificial Intelligence

Estimating Mutual Information by Local Gaussian Approximation



Abstract

Estimating mutual information (MI) from samples is a fundamental problem in statistics, machine learning, and data analysis. Recently it was shown that a popular class of non-parametric MI estimators performs very poorly for strongly dependent variables and has sample complexity that scales exponentially with the true MI. This undesired behavior was attributed to those estimators' reliance on local uniformity of the underlying (and unknown) probability density function. Here we present a novel semi-parametric estimator of mutual information, in which the density at each sample point is locally approximated by a Gaussian distribution. We demonstrate that the estimator is asymptotically unbiased. We also show that the proposed estimator outperforms several baselines and accurately measures relationship strengths over many orders of magnitude.

