Journal: IEICE Technical Report, Neurocomputing (電子情報通信学会技術研究報告. ニューロコンピューティング)

Information Divergences in Local Variational Approximation of Bayesian Posterior Distribution



Abstract

The local variational method is a technique for approximating intractable posterior distributions in Bayesian learning. In this article, we show that the objective functions in the local variational approximation decompose into the sum of the Kullback–Leibler divergence and the Bregman divergence from the approximating posterior distribution to the Bayesian posterior distribution. This provides a generic insight into the approximation method. Based on a geometric argument in the space of approximating posteriors, we propose an efficient method for evaluating an upper bound of the marginal likelihood. We demonstrate its effectiveness through an application to event rate estimation.
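The paper's derivations are not reproduced on this page, but the canonical example of a local variational approximation is the Jaakkola–Jordan quadratic bound on the logistic sigmoid, which replaces an intractable likelihood factor with a tractable exponential-family surrogate governed by a per-site variational parameter ξ. The sketch below (function names are illustrative, not from the paper) checks numerically that the surrogate is a valid bound and is tight at x = ξ:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def jj_lower_bound(x, xi):
    """Jaakkola-Jordan local variational lower bound on sigmoid(x).

    sigmoid(x) >= sigmoid(xi) * exp((x - xi)/2 - lam(xi) * (x^2 - xi^2)),
    where lam(xi) = tanh(xi/2) / (4*xi).  The bound touches sigmoid(x)
    exactly at x = +/- xi.  Assumes xi != 0.
    """
    lam = np.tanh(xi / 2.0) / (4.0 * xi)
    return sigmoid(xi) * np.exp((x - xi) / 2.0 - lam * (x**2 - xi**2))

x = np.linspace(-6.0, 6.0, 101)
xi = 2.5
bound = jj_lower_bound(x, xi)
# The surrogate never exceeds the sigmoid, and is exact at x = xi.
assert np.all(bound <= sigmoid(x) + 1e-12)
assert np.isclose(jj_lower_bound(xi, xi), sigmoid(xi))
```

Optimizing ξ per data point tightens the overall bound on the marginal likelihood; the divergence decomposition described in the abstract characterizes exactly what gap such an optimization is closing.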
