International Conference on Algorithmic Learning Theory

Stochastic Complexity for Mixture of Exponential Families in Variational Bayes


Abstract

Variational Bayesian learning, proposed as an approximation to Bayesian learning, has provided computational tractability and good generalization performance in many applications. However, little has been done to investigate its theoretical properties. In this paper, we discuss the Variational Bayesian learning of mixtures of exponential families and derive the asymptotic form of the stochastic complexities. We show that the stochastic complexities become smaller than those of regular statistical models, which implies that the advantage of Bayesian learning still remains in Variational Bayesian learning. The stochastic complexity, also known as the marginal likelihood or the free energy, is not only important for addressing the model selection problem but also enables us to discuss the accuracy of the Variational Bayesian approach as an approximation of true Bayesian learning.
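As background (standard definitions, not taken from the paper itself): for i.i.d. data $X^n = (x_1, \dots, x_n)$, a model $p(x \mid w)$, and a prior $\varphi(w)$, the stochastic complexity is the negative log marginal likelihood, and the variational free energy obtained by restricting the posterior approximation to a tractable family $\mathcal{Q}$ upper-bounds it:

```latex
% Stochastic complexity (negative log marginal likelihood / free energy)
F(X^n) = -\log \int \varphi(w) \prod_{i=1}^{n} p(x_i \mid w) \, dw

% Variational free energy: minimize over a tractable family Q of posteriors
\bar{F}(X^n) = \min_{q \in \mathcal{Q}}
  \mathbb{E}_{q(w)}\!\left[ \log \frac{q(w)}{\varphi(w) \prod_{i=1}^{n} p(x_i \mid w)} \right]

% The gap is the KL divergence from q to the true posterior, hence the bound
\bar{F}(X^n) = F(X^n) + \min_{q \in \mathcal{Q}} \mathrm{KL}\big(q(w) \,\|\, p(w \mid X^n)\big)
  \;\geq\; F(X^n)
```

The asymptotic form of $\bar{F}(X^n)$ is what quantifies both the model selection behavior and the tightness of the variational approximation discussed in the abstract.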
