International Conference on Quantitative Sciences and Its Applications

The derivation of Mutual Information and covariance function using centered random variables.



Abstract

Information-theoretic measures such as Mutual Information are often said to capture nonlinear dependencies, whereas covariance (and correlation) capture only linear dependencies. We aim to illustrate this claim using centered random variables. The set of centered random variables F_c = {-(q-1)/2, -(q-1)/2+1, ..., (q-1)/2-1, (q-1)/2} is mapped from F = {1, 2, ..., q-1, q}. For q=2, we derive the relationship between the Mutual Information function, I, and the covariance function, Γ, and show that Γ=0 → I=0. Furthermore, we show that for q=3 the nonlinearities are captured by Mutual Information, by highlighting a case where Γ=0 but I≠0.


