Neurocomputing
Generalizing expectation propagation with mixtures of exponential family distributions and an application to Bayesian logistic regression


Abstract

Expectation propagation (EP) is a widely used deterministic approximate inference algorithm in Bayesian machine learning. Traditional EP approximates an intractable posterior distribution through a set of local approximations that are updated iteratively. In this paper, we propose a generalized version of EP, called generalized EP (GEP), a new method for approximate inference based on minimizing KL divergence. However, when the variance of the gradient is large, the algorithm may take a long time to converge. We therefore use control variates to develop a variance-reduced version of the method, called GEP-CV. We evaluate our approach on Bayesian logistic regression, where it converges faster and performs better than other state-of-the-art approaches. (C) 2019 Elsevier B.V. All rights reserved.
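The variance-reduction idea the abstract refers to can be illustrated with a generic control-variates sketch (this is not the paper's GEP-CV estimator, just the underlying technique): subtract from a Monte Carlo estimator a correlated quantity whose expectation is known, which leaves the mean unchanged but shrinks the variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: estimate E[x^2] for x ~ N(1, 1) (true value = 2).
# Control variate: h(x) = x - 1, whose expectation is exactly 0.
x = rng.normal(1.0, 1.0, size=100_000)
f = x**2          # naive per-sample estimates
h = x - 1.0       # control variate with known zero mean

# Optimal coefficient c* = Cov(f, h) / Var(h), estimated from the samples.
c_star = np.cov(f, h)[0, 1] / np.var(h)

# f - c*(h - E[h]) is unbiased for any c; c* minimizes its variance.
g = f - c_star * h

print("naive:", np.mean(f), "variance:", np.var(f))
print("CV:   ", np.mean(g), "variance:", np.var(g))
```

Both estimators target the same expectation, but the control-variate version has markedly lower variance (analytically, 2 instead of 6 for this toy problem), which is the mechanism GEP-CV exploits to stabilize noisy gradient estimates.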
