Journal: Soft computing: A fusion of foundations, methodologies and applications

Improving predictive uncertainty estimation using Dropout-Hamiltonian Monte Carlo


       

Abstract

Estimating predictive uncertainty is crucial for many computer vision tasks, from image classification to autonomous driving systems. Hamiltonian Monte Carlo (HMC) is a sampling method for performing Bayesian inference. On the other hand, Dropout regularization has been proposed as an approximate model averaging technique that tends to improve generalization in large-scale models such as deep neural networks. Although HMC provides convergence guarantees for most standard Bayesian models, it does not handle the discrete parameters arising from Dropout regularization. In this paper, we present a robust methodology for improving predictive uncertainty estimation in classification problems, based on Dropout and HMC. Even though Dropout induces a non-smooth energy function with no such convergence guarantees, the resulting discretization of the Hamiltonian dynamics proves empirically successful. The proposed method allows one to effectively estimate the predictive accuracy and provides better generalization for difficult test examples.
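As a rough illustration of the idea summarized in the abstract (not the authors' implementation), the sketch below runs plain Hamiltonian Monte Carlo over a toy logistic-regression posterior while applying a Bernoulli dropout mask inside the energy function. All names (`potential_energy`, `hmc_step`, `dropout_rate`), the toy data, and the choice of drawing one mask per proposal are assumptions made for this example only.

```python
# Minimal sketch of HMC with a dropout mask in the energy function,
# on a toy Bayesian logistic-regression problem. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data.
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = (X @ true_w + 0.3 * rng.normal(size=200) > 0).astype(float)

dropout_rate = 0.5  # probability of dropping each weight in the energy evaluation


def potential_energy(w, mask):
    """Negative log posterior with a dropout mask applied to the weights."""
    wm = w * mask                                  # dropped weights do not contribute
    logits = X @ wm
    log_lik = np.sum(y * logits - np.logaddexp(0.0, logits))  # Bernoulli likelihood
    log_prior = -0.5 * np.sum(w ** 2)              # standard normal prior on w
    return -(log_lik + log_prior)


def grad_potential(w, mask):
    """Gradient of the potential energy w.r.t. w (mask held fixed)."""
    wm = w * mask
    p = 1.0 / (1.0 + np.exp(-(X @ wm)))
    grad_log_lik = (X.T @ (y - p)) * mask          # chain rule through the mask
    grad_log_prior = -w
    return -(grad_log_lik + grad_log_prior)


def hmc_step(w, step_size=0.01, n_leapfrog=20):
    """One HMC transition; a fresh dropout mask is drawn per proposal."""
    mask = (rng.random(w.shape) > dropout_rate).astype(float)
    p = rng.normal(size=w.shape)                   # resample momentum
    w_new, p_new = w.copy(), p.copy()

    # Leapfrog integration of the Hamiltonian dynamics.
    p_new -= 0.5 * step_size * grad_potential(w_new, mask)
    for _ in range(n_leapfrog - 1):
        w_new += step_size * p_new
        p_new -= step_size * grad_potential(w_new, mask)
    w_new += step_size * p_new
    p_new -= 0.5 * step_size * grad_potential(w_new, mask)

    # Metropolis correction, using the same mask for current and proposed states.
    h_old = potential_energy(w, mask) + 0.5 * np.sum(p ** 2)
    h_new = potential_energy(w_new, mask) + 0.5 * np.sum(p_new ** 2)
    return w_new if np.log(rng.random()) < h_old - h_new else w


# Collect posterior samples and use them for predictive model averaging.
w = np.zeros(5)
samples = []
for i in range(2000):
    w = hmc_step(w)
    if i > 500:                                    # discard burn-in
        samples.append(w.copy())

samples = np.asarray(samples)
probs = 1.0 / (1.0 + np.exp(-(X @ samples.T)))     # per-sample predictive probabilities
predictive_mean = probs.mean(axis=1)               # model-averaged prediction
predictive_std = probs.std(axis=1)                 # predictive uncertainty
print("mean predictive uncertainty:", predictive_std.mean())
```

Resampling the dropout mask at each proposal is what makes the effective energy function non-smooth across transitions, which is why the standard HMC convergence guarantees mentioned in the abstract no longer apply; the mean and standard deviation of the sample-wise predictions then serve as the model-averaged prediction and its uncertainty estimate.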
