
Differential Metrics in Probability Spaces Based on Entropy and Divergence Measures



Abstract

This paper discusses some general methods of metrizing probability spaces through the introduction of a quadratic differential metric in the parameter manifold of a set of probability distributions. These methods extend the investigation made in Rao (1945), where the Fisher information matrix was used to construct the metric and the geodesic distance was suggested as a measure of dissimilarity between probability distributions. The basic approach in this paper is first to construct a divergence or dissimilarity measure between any two probability distributions, and then to derive a differential metric from it by considering two distributions whose characterizing parameters are close to each other. One measure of divergence considered is the Jensen difference based on an entropy functional, as defined in Rao (1982). Another is the f-divergence measure studied by Csiszár. The latter class leads to the differential metric based on the Fisher information matrix. The geodesic distances based on this metric, as computed by various authors, are listed.
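
As a brief sketch of the construction the abstract describes (standard results stated here for orientation, not taken verbatim from the report; the symbols below are generic): for a concave entropy functional H, the Jensen difference between two distributions p_1 and p_2 is

\[ J_H(p_1, p_2) = H\!\left(\frac{p_1 + p_2}{2}\right) - \frac{H(p_1) + H(p_2)}{2}, \]

and for Csiszár's f-divergence, defined for a convex f with f(1) = 0 by

\[ D_f(p \,\|\, q) = \int q\, f\!\left(\frac{p}{q}\right) d\mu, \]

a second-order Taylor expansion between two nearby members p_\theta and p_{\theta + d\theta} of a parametric family yields a quadratic differential metric proportional to the Fisher information matrix,

\[ D_f\!\left(p_{\theta + d\theta} \,\|\, p_\theta\right) \approx \frac{f''(1)}{2} \sum_{i,j} g_{ij}(\theta)\, d\theta_i\, d\theta_j, \qquad g_{ij}(\theta) = \mathbb{E}_\theta\!\left[\partial_i \log p_\theta \,\partial_j \log p_\theta\right]. \]

As one standard example of a geodesic distance under this metric (for the multinomial family on the probability simplex, going back to Rao (1945)), the distance between p and q is \( 2\cos^{-1}\!\big(\sum_i \sqrt{p_i q_i}\big) \).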
