International Conference on Algorithmic Learning Theory (ALT 2005), October 8–11, 2005, Singapore

Stochastic Complexity for Mixture of Exponential Families in Variational Bayes



Abstract

Variational Bayesian learning, proposed as an approximation of Bayesian learning, provides computational tractability and good generalization performance in many applications. However, little work has investigated its theoretical properties. In this paper, we discuss Variational Bayesian learning of mixtures of exponential families and derive the asymptotic form of the stochastic complexity. We show that the stochastic complexity is smaller than that of regular statistical models, which implies that the advantage of Bayesian learning is retained in Variational Bayesian learning. The stochastic complexity, also called the marginal likelihood or the free energy, is important not only for model selection but also because it lets us assess the accuracy of the Variational Bayesian approach as an approximation of true Bayesian learning.

