Statistics and Computing

Convergence analysis of herded-Gibbs-type sampling algorithms: effects of weight sharing


Abstract

Herded Gibbs (HG) and discretized herded Gibbs (DHG), which combine Gibbs sampling with herding, are deterministic sampling algorithms for Markov random fields with discrete random variables. In this paper, we introduce the notion of "weight sharing" to view these HG-type algorithms systematically, and we investigate their convergence both theoretically and numerically. We show that, by sharing and thereby reducing the number of weight variables, an HG-type algorithm achieves fast initial convergence at the expense of asymptotic convergence. This means that an HG-type algorithm can be more efficient in practice than conventional Markov chain Monte Carlo algorithms, although its estimate does not necessarily converge to the target asymptotically. Moreover, we decompose the numerical integration error of HG-type algorithms into several components and evaluate each of them in relation to herding and weight sharing. Using this formulation, we also propose novel variants of the HG-type algorithms that reduce the asymptotic bias.

