Journal: JMLR: Workshop and Conference Proceedings

Fast and Scalable Bayesian Deep Learning by Weight-Perturbation in Adam


Abstract

Uncertainty computation in deep learning is essential to design robust and reliable systems. Variational inference (VI) is a promising approach for such computation, but requires more effort to implement and execute compared to maximum-likelihood methods. In this paper, we propose new natural-gradient algorithms to reduce such efforts for Gaussian mean-field VI. Our algorithms can be implemented within the Adam optimizer by perturbing the network weights during gradient evaluations, and uncertainty estimates can be cheaply obtained by using the vector that adapts the learning rate. This requires lower memory, computation, and implementation effort than existing VI methods, while obtaining uncertainty estimates of comparable quality. Our empirical results confirm this and further suggest that the weight-perturbation in our algorithm could be useful for exploration in reinforcement learning and stochastic optimization.
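The abstract's core idea — running Gaussian mean-field VI inside an Adam-style loop by perturbing the weights during each gradient evaluation, then reading uncertainty off the adaptive second-moment vector — can be illustrated with a minimal NumPy sketch on a toy quadratic loss. This is an assumption-laden illustration, not the paper's exact algorithm: the variable names (`mu`, `s`, `prior_prec`), the toy loss, and the Adam constants are all choices made here for demonstration.

```python
import numpy as np

# Toy sketch (NOT the paper's exact pseudocode): mean-field Gaussian VI
# run inside an Adam-like loop by perturbing weights at each gradient call.
rng = np.random.default_rng(0)
N = 100                       # assumed dataset size
prior_prec = 1.0              # Gaussian prior precision (assumed)
lr, beta1, beta2 = 0.01, 0.9, 0.999

mu = np.zeros(2)              # variational mean (plays the role of the weights)
m = np.zeros(2)               # first moment, as in Adam
s = np.ones(2)                # second moment; also drives the uncertainty

def grad_loss(w):
    # gradient of a toy per-example loss 0.5 * ||w - target||^2
    target = np.array([1.0, -2.0])
    return w - target

for t in range(1, 2001):
    # perturb the weights during the gradient evaluation:
    sigma = 1.0 / np.sqrt(N * s + prior_prec)
    w = mu + sigma * rng.standard_normal(2)
    g = grad_loss(w)
    # Adam-like moment updates (natural-gradient VI flavour)
    m = beta1 * m + (1 - beta1) * (g + prior_prec * mu / N)
    s = beta2 * s + (1 - beta2) * g**2
    m_hat = m / (1 - beta1**t)
    s_hat = s / (1 - beta2**t)
    mu = mu - lr * m_hat / (np.sqrt(s_hat) + prior_prec / N)

# uncertainty estimate obtained cheaply from the adaptive vector:
posterior_std = 1.0 / np.sqrt(N * s + prior_prec)
```

The only change relative to plain Adam is the `w = mu + sigma * eps` perturbation before the gradient call and the prior term folded into the first moment; the per-weight posterior standard deviation then comes for free from `s`, which matches the abstract's claim of low extra memory and implementation effort.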

