Neural Networks: The Official Journal of the International Neural Network Society

Stochastic complexities of general mixture models in variational Bayesian learning.



Abstract

In this paper, we focus on variational Bayesian learning of general mixture models. Variational Bayesian learning was proposed as an approximation to Bayesian learning. While it has provided computational tractability and good generalization in many applications, little has been done to investigate its theoretical properties. The main contribution of this paper is the asymptotic form of the stochastic complexity, or the free energy, in variational Bayesian learning of a mixture of exponential-family distributions. We show that the stochastic complexities are smaller than those of regular statistical models, which implies that the advantages of Bayesian learning are retained in variational Bayesian learning. Moreover, the derived bounds indicate how the hyperparameters influence the learning process, and how accurate the variational Bayesian approach is as an approximation of true Bayesian learning.
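The comparison with regular models can be stated concretely. For a regular statistical model with d parameters, the Bayesian stochastic complexity (the free energy with the empirical entropy term subtracted) grows like the BIC penalty (d/2) log n, while for singular models such as mixtures the growth coefficient can be strictly smaller. The sketch below uses standard notation that is not taken from this abstract:

```latex
% Normalized stochastic complexity (free energy) F(n) for sample size n.
% Regular model with d parameters (Schwarz / BIC):
F(n) = \frac{d}{2}\log n + O(1)
% Mixture (singular) model in (variational) Bayesian learning:
F(n) = \lambda \log n + O(1), \qquad \lambda \le \frac{d}{2}
```

A smaller coefficient λ means a smaller penalty for model complexity, which is one way the advantage of Bayesian learning over maximum likelihood persists in the variational approximation.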
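To make the object of study concrete, here is a minimal sketch of variational Bayesian learning for a mixture model, tracking the free energy (negative ELBO) that the abstract analyzes. This is not the paper's construction: it assumes a two-component 1-D Gaussian mixture with unit component variances, fixed equal mixing weights, and N(0, 1) priors on the component means, and runs coordinate-ascent variational inference, under which the free energy decreases monotonically.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data from a well-separated two-component mixture.
x = np.concatenate([rng.normal(-2.0, 1.0, 100), rng.normal(2.0, 1.0, 100)])
N, K = len(x), 2
log_pi = np.log(np.full(K, 1.0 / K))   # fixed, equal mixing weights

# Initialize the variational posteriors q(z_n) and q(mu_k).
r = rng.dirichlet(np.ones(K), size=N)  # responsibilities, shape (N, K)
m = rng.normal(0.0, 1.0, K)            # means of q(mu_k)
s2 = np.ones(K)                        # variances of q(mu_k)

def free_energy(x, r, m, s2, log_pi):
    """Negative ELBO: E_q[log q] - E_q[log p(x, z, mu)]."""
    Emu2 = m**2 + s2                                     # E[mu_k^2]
    quad = x[:, None]**2 - 2.0 * x[:, None] * m + Emu2   # (N, K)
    e_lik = np.sum(r * (log_pi - 0.5 * np.log(2 * np.pi) - 0.5 * quad))
    e_prior = np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * Emu2)
    h_z = -np.sum(r * np.log(np.clip(r, 1e-12, None)))   # entropy of q(z)
    h_mu = np.sum(0.5 * np.log(2 * np.pi * np.e * s2))   # entropy of q(mu)
    return -(e_lik + e_prior + h_z + h_mu)

F_hist = []
for _ in range(50):
    # Exact conjugate update of q(mu_k) = N(m_k, s2_k).
    prec = 1.0 + r.sum(axis=0)            # prior precision 1 + sum_n r_nk
    m = (r * x[:, None]).sum(axis=0) / prec
    s2 = 1.0 / prec
    # Exact update of q(z_n): softmax of the expected log joint.
    log_rho = log_pi - 0.5 * np.log(2 * np.pi) \
        - 0.5 * (x[:, None]**2 - 2.0 * x[:, None] * m + m**2 + s2)
    log_rho -= log_rho.max(axis=1, keepdims=True)
    r = np.exp(log_rho)
    r /= r.sum(axis=1, keepdims=True)
    F_hist.append(free_energy(x, r, m, s2, log_pi))
```

The final value of `F_hist` is the (stochastic-complexity-like) free energy at convergence for this data set; the paper's results characterize how such quantities scale asymptotically with the sample size.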


