SIAM Journal on Scientific Computing

LAPLACIAN SMOOTHING STOCHASTIC GRADIENT MARKOV CHAIN MONTE CARLO

Abstract

As an important Markov chain Monte Carlo (MCMC) method, the stochastic gradient Langevin dynamics (SGLD) algorithm has achieved great success in Bayesian learning and posterior sampling. However, SGLD typically suffers from slow convergence because the stochastic gradient introduces large variance into its updates. To alleviate this drawback, we leverage the recently developed Laplacian smoothing technique and propose a Laplacian smoothing stochastic gradient Langevin dynamics (LS-SGLD) algorithm. We prove that, for sampling from both log-concave and non-log-concave densities, LS-SGLD achieves a strictly smaller discretization error in 2-Wasserstein distance, although its mixing rate can be slightly slower. Experiments on both synthetic and real datasets verify our theoretical results and demonstrate the superior performance of LS-SGLD on different machine learning tasks, including posterior sampling, Bayesian logistic regression, and training Bayesian convolutional neural networks. The code is available at https://github.com/BaoWangMath/LS-MCMC.
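
The abstract does not spell out the LS-SGLD update rule, so the following is a minimal sketch of the idea under common assumptions, not the authors' exact scheme (the reference implementation lives in the repository linked above). It assumes the usual Laplacian smoothing setup: the stochastic gradient g is replaced by A_sigma^{-1} g with A_sigma = I + sigma * L, where L is the 1D discrete Laplacian with periodic boundary conditions, and the injected Gaussian noise is smoothed by A_sigma^{-1/2}, making each step a Langevin update with a constant preconditioner. Because A_sigma is circulant, both solves cost O(d log d) via the FFT. The function names and the toy Gaussian target are illustrative.

```python
import numpy as np

def smoothing_eigenvalues(d, sigma):
    """Eigenvalues of A_sigma = I + sigma * L for the 1D discrete Laplacian
    L with periodic boundaries. A_sigma is circulant, so its eigenvalues are
    the DFT of its first column; they are all >= 1, and sigma = 0 gives the
    identity, which recovers plain SGLD."""
    c = np.zeros(d)
    c[0] = 1.0 + 2.0 * sigma
    c[1] = c[-1] = -sigma
    return np.fft.fft(c).real

def apply_inverse_power(v, eig, power):
    """Apply A_sigma^{-power} to v by dividing its Fourier coefficients,
    an O(d log d) solve instead of a dense matrix inversion."""
    return np.fft.ifft(np.fft.fft(v) / eig**power).real

def ls_sgld_step(theta, stoch_grad, eta, sigma, rng):
    """One LS-SGLD step (sketch): a Langevin update preconditioned by the
    constant matrix A_sigma^{-1}, which damps the high-frequency components
    of the gradient noise while leaving the target density invariant."""
    eig = smoothing_eigenvalues(theta.shape[0], sigma)
    g = apply_inverse_power(stoch_grad(theta), eig, power=1.0)   # A^{-1} grad
    z = apply_inverse_power(rng.standard_normal(theta.shape), eig,
                            power=0.5)                           # A^{-1/2} noise
    return theta - eta * g + np.sqrt(2.0 * eta) * z

if __name__ == "__main__":
    # Toy demo: sample a 100-dimensional standard Gaussian, where
    # f(theta) = ||theta||^2 / 2 and hence grad f(theta) = theta.
    rng = np.random.default_rng(0)
    theta = rng.standard_normal(100)
    for _ in range(5000):
        theta = ls_sgld_step(theta, lambda t: t, eta=1e-2, sigma=1.0, rng=rng)
    print("sample mean ~ 0:", theta.mean())
```

Setting sigma = 0 makes A_sigma the identity, so the step collapses to standard SGLD; this is a convenient sanity check when comparing the two samplers on the same problem.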