ICPR 2012 (International Conference on Pattern Recognition)

Closed-form information-theoretic divergences for statistical mixtures



Abstract

Statistical mixtures such as Rayleigh, Wishart, or Gaussian mixture models are commonly used in pattern recognition and signal processing tasks. Since the Kullback-Leibler divergence between any two such mixture models does not admit an analytical expression, the relative entropy can only be approximated numerically using time-consuming Monte-Carlo stochastic sampling. This drawback has motivated the quest for alternative information-theoretic divergences, such as the recent Jensen-Rényi, Cauchy-Schwarz, or total square loss divergences, which bypass numerical approximation by providing exact analytic expressions. In this paper, we state sufficient conditions on the mixture distribution family under which these non-KL statistical divergences between any two such mixtures admit generic closed-form formulas.
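The contrast the abstract draws can be illustrated with a small numerical sketch, not taken from the paper: for one-dimensional Gaussian mixture models (a hypothetical example), the Kullback-Leibler divergence must be estimated by Monte-Carlo sampling, whereas the Cauchy-Schwarz divergence has an exact closed form because the integral of a product of two Gaussian densities is itself a Gaussian evaluated at the difference of means with summed variances.

```python
# A minimal sketch, assuming 1-D Gaussian mixtures: Monte-Carlo estimate of the
# KL divergence versus the closed-form Cauchy-Schwarz divergence between mixtures.
import numpy as np
from scipy.stats import norm

def gmm_pdf(x, weights, means, stds):
    """Density of a 1-D Gaussian mixture evaluated at points x."""
    x = np.asarray(x)[:, None]
    return np.sum(weights * norm.pdf(x, loc=means, scale=stds), axis=1)

def gmm_sample(n, weights, means, stds, rng):
    """Draw n samples by first picking a mixture component, then sampling it."""
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comp], stds[comp])

def kl_monte_carlo(p, q, n=100_000, rng=None):
    """Stochastic estimate of KL(p || q): mean of log p(x) - log q(x) over x ~ p."""
    rng = rng or np.random.default_rng(0)
    x = gmm_sample(n, *p, rng)
    return np.mean(np.log(gmm_pdf(x, *p)) - np.log(gmm_pdf(x, *q)))

def gaussian_overlap(m1, s1, m2, s2):
    """Closed-form integral of the product of two Gaussian densities."""
    return norm.pdf(m1, loc=m2, scale=np.sqrt(s1**2 + s2**2))

def cauchy_schwarz(p, q):
    """Closed-form Cauchy-Schwarz divergence between two Gaussian mixtures:
    CS(p, q) = -log( <p, q> / sqrt(<p, p> <q, q>) ), where each inner product
    reduces to a double sum of pairwise Gaussian overlaps."""
    def inner(a, b):
        wa, ma, sa = a
        wb, mb, sb = b
        return sum(wi * wj * gaussian_overlap(mi, si, mj, sj)
                   for wi, mi, si in zip(wa, ma, sa)
                   for wj, mj, sj in zip(wb, mb, sb))
    return -np.log(inner(p, q) / np.sqrt(inner(p, p) * inner(q, q)))

if __name__ == "__main__":
    # Two arbitrary 2-component mixtures (weights, means, standard deviations).
    p = (np.array([0.4, 0.6]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
    q = (np.array([0.7, 0.3]), np.array([0.0, 3.0]), np.array([1.0, 0.8]))
    print("KL (Monte-Carlo estimate):", kl_monte_carlo(p, q))
    print("Cauchy-Schwarz (exact):   ", cauchy_schwarz(p, q))
```

The Monte-Carlo estimate fluctuates with the sample size and random seed, while the Cauchy-Schwarz value is computed exactly from the mixture parameters, which is the computational advantage the paper generalizes to other mixture families.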
