International Conference on Machine Learning

The k-tied Normal Distribution: A Compact Parameterization of Gaussian Mean Field Posteriors in Bayesian Neural Networks


Abstract

Variational Bayesian Inference is a popular methodology for approximating posterior distributions over Bayesian neural network weights. Recent work developing this class of methods has explored ever richer parameterizations of the approximate posterior in the hope of improving performance. In contrast, here we share a curious experimental finding that suggests instead restricting the variational distribution to a more compact parameterization. For a variety of deep Bayesian neural networks trained using Gaussian mean-field variational inference, we find that the posterior standard deviations consistently exhibit strong low-rank structure after convergence. This means that by decomposing these variational parameters into a low-rank factorization, we can make our variational approximation more compact without decreasing the models' performance. Furthermore, we find that such factorized parameterizations improve the signal-to-noise ratio of stochastic gradient estimates of the variational lower bound, resulting in faster convergence.
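The low-rank factorization the abstract describes can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the paper's implementation: it assumes the posterior standard deviations of an m-by-n weight matrix are tied as a rank-k product of two small positive factors u and v, and that weights are sampled with the usual Gaussian reparameterization trick; the variable names, sizes, and positivity transform below are our own.

```python
import numpy as np

rng = np.random.default_rng(0)

m, n, k = 256, 128, 2  # weight matrix shape and tying rank (illustrative sizes)

# Full mean-field Gaussian posterior: one mean and one standard deviation
# per weight, i.e. 2*m*n free variational parameters.
mean = 0.01 * rng.standard_normal((m, n))

# k-tied parameterization: instead of an m x n matrix of free standard
# deviations, keep two small factors and form the stds as a rank-k product.
# (Sketch only; the paper's exact positivity transform may differ -- here
# the factors are exponentiated so that every sigma_ij is positive.)
u = np.exp(0.1 * rng.standard_normal((m, k)))  # m*k parameters
v = np.exp(0.1 * rng.standard_normal((n, k)))  # n*k parameters
sigma = u @ v.T                                # rank-k, m x n std matrix

# Reparameterized weight sample, as in standard Gaussian mean-field
# variational inference: w = mean + sigma * eps, eps ~ N(0, I).
eps = rng.standard_normal((m, n))
w_sample = mean + sigma * eps

# Parameter count for the standard-deviation half of the posterior:
print("untied stds:", m * n)        # 32768
print("k-tied stds:", k * (m + n))  # 768 for k = 2
```

For these illustrative sizes, the tied parameterization stores 768 rather than 32768 standard-deviation parameters, which is the compactness the abstract refers to; the abstract's claim is that this restriction costs no predictive performance and improves the signal-to-noise ratio of the stochastic gradients.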
