Annual Conference on Learning Theory (COLT 2006), June 22-25, 2006, Pittsburgh, PA, USA

PAC Learning Axis-Aligned Mixtures of Gaussians with No Separation Assumption


Abstract

We propose and analyze a new vantage point for the learning of mixtures of Gaussians: namely, the PAC-style model of learning probability distributions introduced by Kearns et al. [13]. Here the task is to construct a hypothesis mixture of Gaussians that is statistically indistinguishable from the actual mixture generating the data; specifically, the KL divergence should be at most ε. In this scenario, we give a poly(n/ε) time algorithm that learns the class of mixtures of any constant number of axis-aligned Gaussians in R^n. Our algorithm makes no assumptions about the separation between the means of the Gaussians, nor does it have any dependence on the minimum mixing weight. This is in contrast to learning results known in the "clustering" model, where such assumptions are unavoidable. Our algorithm relies on the method of moments, and a subalgorithm developed in [9] for a discrete mixture-learning problem.
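The abstract notes that the algorithm relies on the method of moments. As a purely illustrative sketch (not the paper's algorithm, which handles any constant number of axis-aligned Gaussians in R^n with arbitrary mixing weights and outputs a hypothesis with KL divergence at most ε), the snippet below shows the basic moment-matching idea on a toy one-dimensional case: a 1/2-1/2 mixture of two unit-variance Gaussians, whose means can be recovered from the first two empirical moments. All names in the code are hypothetical.

```python
# Toy method-of-moments sketch: recover the means of a 0.5/0.5 mixture of
# N(mu1, 1) and N(mu2, 1) from the first two empirical moments.
# This is only an illustration of the moment-matching idea, not the paper's algorithm.
import numpy as np

def moment_estimate_two_means(samples):
    """For this mixture:
        E[X]   = (mu1 + mu2) / 2
        E[X^2] = 1 + (mu1^2 + mu2^2) / 2
    so mu1 + mu2 and mu1 * mu2 are determined by the first two moments,
    and mu1, mu2 are the roots of t^2 - (mu1 + mu2) t + mu1 * mu2 = 0."""
    m1 = np.mean(samples)               # empirical first moment
    m2 = np.mean(samples ** 2)          # empirical second moment
    s = 2.0 * m1                        # estimate of mu1 + mu2
    sum_sq = 2.0 * (m2 - 1.0)           # estimate of mu1^2 + mu2^2
    p = (s ** 2 - sum_sq) / 2.0         # estimate of mu1 * mu2
    disc = max(s ** 2 - 4.0 * p, 0.0)   # clamp small negative values caused by sampling noise
    root = np.sqrt(disc)
    return (s - root) / 2.0, (s + root) / 2.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200_000
    z = rng.random(n) < 0.5
    x = np.where(z, rng.normal(-1.0, 1.0, n), rng.normal(2.0, 1.0, n))
    print(moment_estimate_two_means(x))  # approximately (-1.0, 2.0)
```

The paper's setting is harder in several ways this sketch ignores: the dimension n is large, the number of components is any constant, the mixing weights may be arbitrarily small, and no separation between the means is assumed, which is where the subalgorithm from [9] for the discrete mixture-learning problem comes in.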
