International Conference on Neural Information Processing (ICONIP'96)

Reference Priors for Neural Networks: Laplace versus Gaussian



Abstract

Motivated by the principle of maximum entropy, the Laplace prior has been introduced as the prior distribution for network weights in the Bayesian approach to training feedforward neural networks. In this paper, we examine in detail the arguments supporting the Laplace prior and argue that network weights, being continuous quantities with no obvious upper bound, complicate the application of the maximum entropy principle. On the other hand, we motivate the use of the Gaussian prior as a reference prior in neural networks, based on the assumptions of exchangeability and spherical symmetry.
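A minimal sketch (not from the paper; the scale parameters b and sigma are illustrative) of the practical difference between the two priors in maximum a posteriori (MAP) training: the negative log-density of an i.i.d. Laplace prior on the weights is an L1 penalty, while that of an i.i.d. Gaussian prior is the familiar L2 (weight-decay) penalty, which depends on the weights only through their squared norm, the spherical-symmetry property the abstract appeals to.

import numpy as np

def neg_log_laplace(w, b=1.0):
    # -log p(w) for i.i.d. Laplace(0, b) weights, up to an additive constant:
    # an L1 penalty, sum(|w_i|) / b.
    return np.sum(np.abs(w)) / b

def neg_log_gaussian(w, sigma=1.0):
    # -log p(w) for i.i.d. Gaussian(0, sigma^2) weights, up to an additive constant:
    # an L2 penalty, sum(w_i^2) / (2 sigma^2), a function of ||w||^2 only.
    return np.sum(w ** 2) / (2.0 * sigma ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(size=10)  # toy weight vector
    print("Laplace (L1) penalty :", neg_log_laplace(w))
    print("Gaussian (L2) penalty:", neg_log_gaussian(w))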
