Neurocomputing > Generalizing expectation propagation with mixtures of exponential family distributions and an application to Bayesian logistic regression

Generalizing expectation propagation with mixtures of exponential family distributions and an application to Bayesian logistic regression


Abstract

Expectation propagation (EP) is a widely used deterministic approximate inference algorithm in Bayesian machine learning. Traditional EP approximates an intractable posterior distribution through a set of local approximations that are updated iteratively. In this paper, we propose a generalized version of EP called generalized EP (GEP), a new approximate inference method based on the minimization of KL divergence. However, when the variance of the gradient is large, the algorithm may need a long time to converge. We therefore apply control variates and develop a variance-reduced version of the method called GEP-CV. We evaluate our approach on Bayesian logistic regression, where it provides faster convergence and better performance than other state-of-the-art approaches. (C) 2019 Elsevier B.V. All rights reserved.
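The abstract's variance-reduction step rests on the standard control-variate identity: given a statistic g(x) correlated with f(x) whose expectation is known, the estimator f(x) - c(g(x) - E[g]) is unbiased and, for a well-chosen c, has lower variance than f(x) alone. The paper's GEP-CV applies this to stochastic KL-divergence gradients; the sketch below illustrates the same identity on a toy Monte Carlo expectation (the functions f, g and the distribution are illustrative choices, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: estimate E[f(X)] for X ~ N(0, 1) with f(x) = exp(x).
# The exact value is exp(1/2). As a control variate we take the Taylor
# proxy g(x) = 1 + x + x^2/2, whose mean under N(0, 1) is known: 1.5.
f = lambda x: np.exp(x)
g = lambda x: 1.0 + x + 0.5 * x**2
g_mean = 1.5

x = rng.standard_normal(100_000)
fx, gx = f(x), g(x)

# The variance-minimizing coefficient is c* = Cov(f, g) / Var(g),
# estimated here from the same samples.
c = np.cov(fx, gx)[0, 1] / np.var(gx)

plain = fx                       # naive Monte Carlo samples
cv = fx - c * (gx - g_mean)      # control-variate samples: same mean, lower variance

print(f"naive:  mean={plain.mean():.4f}  var={plain.var():.4f}")
print(f"cv:     mean={cv.mean():.4f}  var={cv.var():.4f}")
```

Because g tracks f closely near the bulk of the Gaussian, the subtracted term cancels most of the sampling noise while leaving the expectation unchanged; in GEP-CV the same mechanism is applied per gradient component to speed convergence.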
