IEEE International Conference on Acoustics, Speech and Signal Processing
Stochastic thermodynamic integration: Efficient Bayesian model selection via stochastic gradient MCMC



Abstract

Model selection is a central topic in Bayesian machine learning, and it requires estimating the marginal likelihood of the data under each model to be compared. During the last decade, conventional model selection methods have lost their appeal because of their high computational requirements. In this study, we propose a computationally efficient model selection method by integrating ideas from the Stochastic Gradient Markov Chain Monte Carlo (SG-MCMC) literature and statistical physics. As opposed to conventional methods, the proposed method has very low computational needs and can be implemented almost without modifying existing SG-MCMC code. We provide an upper bound for the bias of the proposed method. Our experiments show that our method is 40 times faster than the baseline method at finding the optimal model order in a matrix factorization problem.
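The abstract does not spell out the estimator, but the ingredients it names (thermodynamic integration from statistical physics and SG-MCMC sampling) suggest the standard power-posterior construction. Below is a minimal sketch, assuming a power-posterior ladder p_beta(theta) proportional to p(x|theta)^beta * p(theta), SGLD as the SG-MCMC sampler, and an illustrative Gaussian mean model; the model, step size, and temperature ladder are placeholders chosen for the example, not the paper's actual setup.

```python
import numpy as np

# Illustrative sketch: thermodynamic integration with SGLD (a basic SG-MCMC sampler).
# Toy model: x_i ~ N(theta, 1) with prior theta ~ N(0, 1).  The power posterior at
# inverse temperature beta is p_beta(theta) proportional to p(x | theta)^beta * p(theta).

rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, size=1000)   # synthetic data
N = len(x)

def minibatch_log_lik(theta, batch):
    # Unbiased minibatch estimate of the full-data log-likelihood.
    return (N / len(batch)) * np.sum(-0.5 * (batch - theta) ** 2 - 0.5 * np.log(2 * np.pi))

def grad_log_power_posterior(theta, batch, beta):
    # Stochastic gradient of log p_beta: beta * grad log-likelihood + grad log-prior.
    grad_lik = (N / len(batch)) * np.sum(batch - theta)
    grad_prior = -theta
    return beta * grad_lik + grad_prior

def expected_log_lik(beta, n_iter=3000, batch_size=50, step=1e-4, burn_in=1000):
    # Run SGLD at inverse temperature beta and average the log-likelihood estimates.
    theta = 0.0
    acc = []
    for t in range(n_iter):
        batch = rng.choice(x, size=batch_size, replace=False)
        theta += 0.5 * step * grad_log_power_posterior(theta, batch, beta) \
                 + np.sqrt(step) * rng.normal()
        if t >= burn_in:
            acc.append(minibatch_log_lik(theta, batch))
    return np.mean(acc)

# Thermodynamic integration identity:
#   log p(x) = integral over beta in [0, 1] of E_{p_beta}[ log p(x | theta) ] d(beta),
# approximated here with a small temperature ladder and the trapezoidal rule.
betas = np.linspace(0.0, 1.0, 11)
values = [expected_log_lik(b) for b in betas]
log_marginal = sum(0.5 * (values[i] + values[i + 1]) * (betas[i + 1] - betas[i])
                   for i in range(len(betas) - 1))
print(f"estimated log marginal likelihood: {log_marginal:.1f}")
```

In this naive form every temperature runs its own SGLD chain; the abstract's efficiency claim suggests the paper avoids that cost, so the sketch only illustrates the underlying thermodynamic integration identity and how an existing SG-MCMC sampler slots into it.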

