
Memoized Online Variational Inference for Dirichlet Process Mixture Models



Abstract

Variational inference algorithms provide the most effective framework for large-scale training of Bayesian nonparametric models. Stochastic online approaches are promising, but are sensitive to the chosen learning rate and often converge to poor local optima. We present a new algorithm, memoized online variational inference, which scales to very large (yet finite) datasets while avoiding the complexities of stochastic gradient. Our algorithm maintains finite-dimensional sufficient statistics from batches of the full dataset, requiring some additional memory but still scaling to millions of examples. Exploiting nested families of variational bounds for infinite nonparametric models, we develop principled birth and merge moves allowing non-local optimization. Births adaptively add components to the model to escape local optima, while merges remove redundancy and improve speed. Using Dirichlet process mixture models for image clustering and denoising, we demonstrate major improvements in robustness and accuracy.
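The abstract's central mechanism, caching finite-dimensional sufficient statistics per batch so that full-dataset totals can be refreshed exactly with no learning rate, can be sketched in a few lines. The sketch below is a hypothetical illustration, not the authors' released code; the class and names (`MemoizedSuffStats`, `global_N`, `global_s`) are invented for this example, and the responsibilities stand in for a real local variational step on a DP mixture.

```python
import numpy as np

class MemoizedSuffStats:
    """Cache per-batch sufficient statistics and keep exact full-dataset totals.

    Hypothetical sketch of the memoization idea: visiting a batch replaces its
    cached summary (subtract the stale contribution, add the fresh one), so the
    global totals always equal an exact sum over all batches.
    """

    def __init__(self, n_components, n_features):
        self.batch_stats = {}                                 # batch_id -> (N_k, s_k)
        self.global_N = np.zeros(n_components)                # expected counts per component
        self.global_s = np.zeros((n_components, n_features))  # responsibility-weighted data sums

    def update_batch(self, batch_id, X, resp):
        # X: (n, D) data for this batch; resp: (n, K) responsibilities
        # produced by the local variational (E-like) step.
        N_k = resp.sum(axis=0)   # soft counts per component
        s_k = resp.T @ X         # weighted data sums per component
        if batch_id in self.batch_stats:
            old_N, old_s = self.batch_stats[batch_id]
            self.global_N -= old_N   # remove this batch's stale contribution...
            self.global_s -= old_s
        self.global_N += N_k         # ...and add its refreshed one
        self.global_s += s_k
        self.batch_stats[batch_id] = (N_k, s_k)

# Toy usage with placeholder responsibilities in place of a real local step:
rng = np.random.default_rng(0)
stats = MemoizedSuffStats(n_components=3, n_features=2)
X = rng.normal(size=(5, 2))
resp = rng.dirichlet(np.ones(3), size=5)
stats.update_batch(0, X, resp)
```

Because the global totals are always an exact sum of the cached batch summaries, each global update sees the same statistics full-batch inference would, which is why the method avoids the learning-rate sensitivity of stochastic gradient approaches; the paper's birth and merge moves can then operate directly on these cached statistics (a merge, for instance, amounts to adding two components' rows together).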

