Conference proceedings: Structural, Syntactic, and Statistical Pattern Recognition

Entropy-Based Variational Scheme for Fast Bayes Learning of Gaussian Mixtures



Abstract

In this paper, we propose a fast entropy-based variational scheme for learning Gaussian mixtures. The key element of the proposal is an incremental learning approach that performs model selection by iterating efficiently over the Variational Bayes (VB) optimization step while minimizing the number of splits. To minimize the number of splits, we select for splitting only the worst kernel, as measured by its entropy. Recent Gaussian mixture learning proposals suggest using this mechanism when a bypass entropy estimator is available; here we exploit the recently proposed Leonenko estimator. Our experimental results, both in 2D and in higher dimensions, show the effectiveness of the approach, which reduces the computational cost of state-of-the-art incremental component learners by an order of magnitude.
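The split-selection step described above hinges on a bypass (density-estimation-free) entropy estimate per mixture component. A minimal sketch of the idea, using the classic Kozachenko-Leonenko k-nearest-neighbor entropy estimator (of which the Leonenko et al. estimator is a generalization) and a hypothetical `pick_component_to_split` helper not taken from the paper:

```python
import numpy as np
from scipy.special import digamma, gammaln
from scipy.spatial import cKDTree

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN entropy estimate (in nats) for samples x of shape (n, d).

    H ~= psi(n) - psi(k) + log(V_d) + (d/n) * sum_i log(eps_i),
    where eps_i is the distance from x_i to its k-th nearest neighbor
    and V_d is the volume of the unit d-ball.
    """
    n, d = x.shape
    tree = cKDTree(x)
    # query k+1 neighbors because the nearest neighbor of each point is itself
    eps = tree.query(x, k=k + 1)[0][:, -1]
    log_vd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

def pick_component_to_split(x, assignments, n_components, k=3):
    """Return the index of the component with the largest estimated entropy.

    `assignments` holds the hard component label of each sample; splitting the
    highest-entropy kernel is the selection rule sketched in the abstract.
    """
    entropies = []
    for j in range(n_components):
        xj = x[assignments == j]
        # components with too few points for a k-NN estimate are never split
        entropies.append(kl_entropy(xj, k) if len(xj) > k else -np.inf)
    return int(np.argmax(entropies))
```

This is only an illustrative sketch: the paper's actual scheme interleaves this selection with full VB updates, and the exact entropy-based splitting criterion may differ in its details (e.g. weighting by responsibilities rather than hard assignments).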
