IEEE Transactions on Signal Processing

Gaussian Mixture Modeling by Exploiting the Mahalanobis Distance

Abstract

In this paper, the expectation–maximization (EM) algorithm for Gaussian mixture modeling is improved via three statistical tests. The first test is a multivariate normality criterion based on the Mahalanobis distance of a sample measurement vector from a certain Gaussian component center; it is used to decide whether or not a component should be split into two. The second test is a central tendency criterion based on the observation that multivariate kurtosis becomes large if the component to be split is a mixture of two or more underlying Gaussian sources with common centers. If the common-center hypothesis is true, the component is split into two new components whose centers are initialized to the center of the (old) component that is the candidate for splitting. Otherwise, the splitting is accomplished by a discriminant derived from the third test, which is based on marginal cumulative distribution functions. Experimental results are presented against seven other EM variants on both artificially generated and real data sets. They demonstrate that the proposed EM variant has an increased capability to find the underlying model while maintaining a low execution time.
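To make the split logic above concrete, the following Python sketch illustrates the first two criteria under stated assumptions; it is not the authors' implementation. A Kolmogorov–Smirnov comparison of squared Mahalanobis distances against the chi-squared law stands in for the paper's multivariate normality test, and Mardia's multivariate kurtosis stands in for the central tendency check; the marginal-CDF discriminant of the third test is not reproduced. The function names and the significance level alpha are illustrative choices, not taken from the paper.

    import numpy as np
    from scipy import stats

    def mahalanobis_split_test(X, mean, cov, alpha=0.05):
        # Squared Mahalanobis distances of the samples assigned to one component.
        diff = X - mean
        d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
        d = X.shape[1]
        # Under multivariate normality d2 follows a chi-squared law with d degrees
        # of freedom; a KS test is used here as an illustrative normality check.
        _, p_value = stats.kstest(d2, 'chi2', args=(d,))
        return p_value < alpha  # True: normality rejected, component is a split candidate

    def mardia_kurtosis(X, mean, cov):
        # Mardia's multivariate kurtosis and its expected value d*(d+2) under
        # normality; values well above the expectation are consistent with a
        # mixture of Gaussian sources sharing a common center.
        diff = X - mean
        d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
        d = X.shape[1]
        return np.mean(d2 ** 2), d * (d + 2)

    # Illustrative usage on data drawn from two separated spherical Gaussians.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 1.0, size=(500, 2)),
                   rng.normal(2.0, 1.0, size=(500, 2))])
    mean, cov = X.mean(axis=0), np.cov(X, rowvar=False)
    print(mahalanobis_split_test(X, mean, cov))  # expected True for this bimodal sample
    print(mardia_kurtosis(X, mean, cov))

In the algorithm summarized above, a component flagged by the normality test is split; when the kurtosis check points to a common center, both new components are initialized at the old component's center, and otherwise the split is driven by the marginal-CDF discriminant described in the paper.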
