
On Lifting the Gibbs Sampling Algorithm



Abstract

First-order probabilistic models combine the power of first-order logic, the de facto tool for handling relational structure, with probabilistic graphical models, the de facto tool for handling uncertainty. Lifted probabilistic inference algorithms for these models have been the subject of much recent research. The main idea in these algorithms is to improve the accuracy and scalability of existing graphical-model inference algorithms by exploiting symmetry in the first-order representation. In this paper, we consider blocked Gibbs sampling, an advanced MCMC scheme, and lift it to the first-order level. We propose to achieve this by partitioning the first-order atoms in the model into a set of disjoint clusters such that exact lifted inference is polynomial in each cluster given an assignment to all other atoms not in the cluster. We propose an approach for constructing the clusters and show how it can be used to trade off accuracy against computational complexity in a principled manner. Our experimental evaluation shows that lifted Gibbs sampling is superior to the propositional algorithm in terms of accuracy, scalability, and convergence.
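The propositional scheme the abstract builds on can be illustrated with a small sketch. This is not the authors' lifted algorithm; it is a minimal, assumed implementation of plain blocked Gibbs sampling over binary variables, where each block is resampled jointly by exact enumeration conditioned on everything outside the block (the step the paper makes polynomial per cluster via lifting). All names (`blocked_gibbs`, the potential format) are hypothetical.

```python
import itertools
import math
import random

def blocked_gibbs(potentials, variables, blocks, n_samples, seed=0):
    """Blocked Gibbs sampling over binary variables (a toy sketch).

    potentials: list of (scope, fn) pairs, where scope is a tuple of
        variable names and fn maps the full state dict to a log-weight.
    blocks: disjoint partition of `variables`; each block is resampled
        jointly, conditioned on all variables outside it. Exact
        enumeration inside a block is exponential in the block size,
        which is why blocks (clusters) must be kept tractable.
    """
    rng = random.Random(seed)
    state = {v: rng.randint(0, 1) for v in variables}
    samples = []
    for _ in range(n_samples):
        for block in blocks:
            # Only potentials touching this block affect its conditional.
            relevant = [(vs, fn) for vs, fn in potentials
                        if any(v in block for v in vs)]
            assigns = list(itertools.product([0, 1], repeat=len(block)))
            log_w = []
            for a in assigns:
                for v, val in zip(block, a):
                    state[v] = val
                log_w.append(sum(fn(state) for _, fn in relevant))
            # Sample one joint assignment in proportion to its weight
            # (log-sum-exp style normalization for numerical stability).
            m = max(log_w)
            w = [math.exp(x - m) for x in log_w]
            r = rng.random() * sum(w)
            chosen, acc = assigns[-1], 0.0
            for a, wi in zip(assigns, w):
                acc += wi
                if r <= acc:
                    chosen = a
                    break
            for v, val in zip(block, chosen):
                state[v] = val
        samples.append(dict(state))
    return samples
```

For example, with a single attractive potential that rewards `x == y` with log-weight 1.5, the sampled agreement rate should approach exp(1.5)/(exp(1.5)+1) ≈ 0.82. Grouping `x` and `y` into one block resamples them jointly, which is exactly where blocking helps mixing when variables are strongly coupled.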
