IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models

Abstract

Many speech technology systems rely on Gaussian Mixture Models (GMMs). The need for a comparison between two GMMs arises in applications such as speaker verification, model selection or parameter estimation. For this purpose, the Kullback-Leibler (KL) divergence is often used. However, since there is no closed form expression to compute it, it can only be approximated. We propose lower and upper bounds for the KL divergence, which lead to a new approximation and interesting insights into previously proposed approximations. An application to the comparison of speaker models also shows how such approximations can be used to validate assumptions on the models.
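To make the quantity being approximated concrete: for GMM densities f and g, the KL divergence is D(f‖g) = ∫ f(x) log(f(x)/g(x)) dx, which has no closed form when f and g are mixtures. The sketch below is a minimal illustration, not the bounds proposed in the paper: it estimates the integral by Monte Carlo sampling from f, the usual brute-force baseline that bound-based approximations are meant to avoid. The 1-D mixtures, their parameters, and the function names are illustrative placeholders.

```python
# Minimal sketch: Monte Carlo estimate of KL(f || g) for two 1-D Gaussian
# mixtures, via E_f[log f(x) - log g(x)] with samples drawn from f.
# All mixture parameters below are made-up examples, not from the paper.
import numpy as np

def gmm_logpdf(x, weights, means, stds):
    """Log-density of a 1-D Gaussian mixture evaluated at the points x."""
    x = np.asarray(x)[:, None]
    # Per-component log of (weight * Gaussian density), then log-sum-exp.
    comp = (
        np.log(weights)
        - 0.5 * np.log(2.0 * np.pi * stds**2)
        - 0.5 * ((x - means) / stds) ** 2
    )
    return np.logaddexp.reduce(comp, axis=1)

def gmm_sample(n, weights, means, stds, rng):
    """Draw n samples from a 1-D Gaussian mixture."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def kl_monte_carlo(f, g, n=100_000, seed=0):
    """Monte Carlo estimate of KL(f || g); f, g are (weights, means, stds)."""
    rng = np.random.default_rng(seed)
    x = gmm_sample(n, *f, rng)
    return np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g))

if __name__ == "__main__":
    f = (np.array([0.4, 0.6]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
    g = (np.array([0.5, 0.5]), np.array([0.0, 2.5]), np.array([1.0, 0.8]))
    print(f"MC estimate of KL(f||g): {kl_monte_carlo(f, g):.4f}")
```

The estimator is unbiased but its cost grows with the number of samples and with the dimension of the data, which is why closed-form lower and upper bounds of the kind studied in the paper are attractive for tasks such as speaker-model comparison.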