International Conference on Machine Learning and Cybernetics

ASYMPTOTIC NORMALITY OF POSTERIOR IN CONSISTENT BAYESIAN LEARNING



Abstract

This paper studies the asymptotic normality of the posterior in Bayesian learning from the viewpoint of computational learning theory. Three reduced regularity conditions, which are more convenient to apply than Walker's and Heyde's, are presented. A theorem is proved showing that, under these regularity conditions, the posterior distribution is not only consistent but also approximately normal. Since computation with a normal distribution is comparatively simple, these results can serve as a theoretical foundation for further study, for instance, choosing effective prior distributions and simplifying computation in Bayesian learning.
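The approximate normality of the posterior can be illustrated numerically. The sketch below is a generic demonstration of this standard phenomenon (the Bernstein–von Mises effect), not the paper's own construction; the function name and all parameter choices are ours. It forms the exact Beta posterior for a Bernoulli parameter under a uniform prior and measures its L1 distance from the normal (Laplace) approximation centred at the posterior mode:

```python
import math
import random

def bernoulli_posterior_discrepancy(n, p_true=0.3, seed=0, grid=2000):
    """Draw n Bernoulli(p_true) samples, form the Beta(1+k, 1+n-k)
    posterior under a uniform prior, and numerically approximate the
    L1 distance between that posterior density and its normal
    (Laplace) approximation."""
    rng = random.Random(seed)
    k = sum(rng.random() < p_true for _ in range(n))
    a, b = 1.0 + k, 1.0 + (n - k)

    # Normal approximation: centred at the posterior mode, variance from
    # the inverse curvature of the log posterior at the mode.
    mode = (a - 1) / (a + b - 2)
    var = mode * (1 - mode) / (a + b - 2)

    log_beta_const = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

    def beta_pdf(x):
        return math.exp((a - 1) * math.log(x) + (b - 1) * math.log(1 - x)
                        - log_beta_const)

    def normal_pdf(x):
        return (math.exp(-(x - mode) ** 2 / (2 * var))
                / math.sqrt(2 * math.pi * var))

    # Midpoint Riemann sum of |posterior - normal| over (0, 1).
    h = 1.0 / grid
    return sum(abs(beta_pdf((i + 0.5) * h) - normal_pdf((i + 0.5) * h)) * h
               for i in range(grid))
```

Running this for increasing n shows the discrepancy shrinking toward zero, which is the content of the asymptotic-normality claim: as the sample grows, the posterior concentrates and its shape is increasingly well described by a Gaussian.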
