
On learning statistical mixtures maximizing the complete likelihood



Abstract

Statistical mixtures are semi-parametric models ubiquitously met in data science since they can universally model smooth densities arbitrarily closely. Finite mixtures are usually inferred from data using the celebrated Expectation-Maximization framework, which locally and iteratively maximizes the incomplete likelihood by softly assigning data to mixture components. In this paper, we present a novel methodology to infer mixtures by transforming the learning problem into a sequence of geometric center-based hard clustering problems that provably and monotonically maximizes the complete likelihood. Our versatile method is fast and has a low memory footprint: the core inner steps can be implemented using various generalized k-means-type heuristics. We can thus leverage recent results on clustering for mixture learning. In particular, for mixtures of singly-parametric distributions, including for example the Rayleigh, Weibull, or Poisson distributions, we show how to use dynamic programming to solve the inner geometric clustering problems exactly. We discuss several extensions of the methodology.
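The scheme described above alternates a hard assignment of each point to the mixture component that most increases the complete likelihood with a per-cluster maximum-likelihood re-estimation of the component parameters. Below is a minimal illustrative sketch of that alternation for a mixture of Poisson distributions (one of the singly-parametric families mentioned); the function name `fit_poisson_mixture`, the random initialization, and the fixed iteration count are assumptions made for illustration, not the authors' implementation, which moreover allows the inner clustering step to be solved exactly by dynamic programming.

```python
# A minimal illustrative sketch (not the authors' reference code) of the
# hard-assignment / complete-likelihood scheme, shown for a Poisson mixture.
import numpy as np
from scipy.stats import poisson

def fit_poisson_mixture(x, k, n_iters=50, seed=None):
    """Greedily increase the complete log-likelihood of a k-component Poisson
    mixture by alternating hard assignments with per-cluster MLE updates."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    # Initialize rates from k data points (clipped away from zero) and
    # uniform mixing weights.
    rates = np.maximum(rng.choice(x, size=k, replace=False).astype(float), 0.1)
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iters):
        # Hard assignment step: each point joins the component maximizing
        # log w_j + log p(x_i | lambda_j), i.e. its complete-likelihood term.
        log_post = np.log(weights) + poisson.logpmf(x[:, None], rates[None, :])
        labels = np.argmax(log_post, axis=1)
        # Update step: per-cluster maximum likelihood (the sample mean for a
        # Poisson rate) and weights proportional to cluster sizes; both moves
        # can only increase the complete log-likelihood.
        for j in range(k):
            members = x[labels == j]
            if members.size:
                rates[j] = max(members.mean(), 1e-3)
                weights[j] = members.size / x.size
        weights /= weights.sum()  # guard against empty clusters in this sketch
    return weights, rates, labels
```

For example, `weights, rates, labels = fit_poisson_mixture(counts, k=3)` would return the mixing proportions, the per-component rates, and the hard assignments. For singly-parametric families on the real line, the abstract notes that this inner clustering step can instead be solved exactly with dynamic programming rather than a k-means-style heuristic.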
