Conference: Annual Conference on Neural Information Processing Systems

Stochastic Gradient Richardson-Romberg Markov Chain Monte Carlo



Abstract

Stochastic Gradient Markov Chain Monte Carlo (SG-MCMC) algorithms have become increasingly popular for Bayesian inference in large-scale applications. Even though these methods have proved useful in several scenarios, their performance is often limited by their bias. In this study, we propose a novel sampling algorithm that aims to reduce the bias of SG-MCMC while keeping the variance at a reasonable level. Our approach is based on a numerical sequence acceleration method, namely the Richardson-Romberg extrapolation, which simply boils down to running almost the same SG-MCMC algorithm twice in parallel with different step sizes. We illustrate our framework on the popular Stochastic Gradient Langevin Dynamics (SGLD) algorithm and propose a novel SG-MCMC algorithm referred to as Stochastic Gradient Richardson-Romberg Langevin Dynamics (SGRRLD). We provide formal theoretical analysis and show that SGRRLD is asymptotically consistent, satisfies a central limit theorem, and has non-asymptotic bias and mean squared error that can be bounded. Our results show that SGRRLD attains higher rates of convergence than SGLD both in finite time and asymptotically, and that it achieves the theoretical accuracy of methods based on higher-order integrators. We support our findings with both synthetic and real data experiments.

