
Dilation of Chisini-Jensen-Shannon Divergence

Abstract

Jensen-Shannon divergence (JSD) does not provide adequate separation when the difference between input distributions is subtle. A recently introduced technique, Chisini-Jensen-Shannon divergence (CJSD), increases JSD's ability to discriminate between probability distributions by reformulating it with Chisini mean operators. As a consequence, CJSDs also carry additional robustness properties. The utility of this approach was validated in the form of two SVM kernels that give superior classification performance. Our work explores why this reformulation improves on JSD. We characterize the improvement in terms of relative dilation, that is, how the Chisini mean transforms JSD's range, and we prove a number of propositions that establish the degree of this separation. Finally, we provide empirical validation on a synthetic dataset that confirms our theoretical results on relative dilation.
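To make the reformulation concrete, here is a minimal Python sketch, assuming the Chisini mean is instantiated as the geometric mean (one common member of the Chisini family); the function names, the epsilon smoothing, and the toy distributions are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def kl(a, b, eps=1e-12):
    """KL divergence with small epsilon smoothing to avoid log(0)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.sum(a * np.log((a + eps) / (b + eps)))

def jsd(p, q):
    """Standard JSD: KL of each input to the arithmetic-mean mixture."""
    m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def cjsd_geometric(p, q):
    """Chisini-style variant: the arithmetic mixture is replaced by the
    geometric mean (note: it need not sum to 1)."""
    m = np.sqrt(np.asarray(p, float) * np.asarray(q, float))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Two nearly identical distributions: the geometric-mean variant
# assigns them a strictly larger divergence than standard JSD.
p = np.array([0.50, 0.50])
q = np.array([0.51, 0.49])
print(f"JSD  = {jsd(p, q):.3e}")
print(f"CJSD = {cjsd_geometric(p, q):.3e}")
```

The dilation is visible in the construction: by the AM-GM inequality the geometric mixture is pointwise no larger than the arithmetic one, so each KL term (and hence the divergence) can only grow, widening the separation between nearby distributions.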
