JMLR: Workshop and Conference Proceedings

High-Quality Prediction Intervals for Deep Learning: A Distribution-Free, Ensembled Approach

Abstract

This paper considers the generation of prediction intervals (PIs) by neural networks for quantifying uncertainty in regression tasks. It is axiomatic that high-quality PIs should be as narrow as possible, whilst capturing a specified portion of data. We derive a loss function directly from this axiom that requires no distributional assumption. We show how its form derives from a likelihood principle, that it can be used with gradient descent, and that model uncertainty is accounted for in ensembled form. Benchmark experiments show the method outperforms current state-of-the-art uncertainty quantification methods, reducing average PI width by over 10%.
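The abstract describes a loss built directly from the stated axiom: intervals should be as narrow as possible while capturing a specified portion of the data, with no distributional assumption, and trainable by gradient descent. The sketch below is not the paper's formulation; it is a minimal NumPy illustration of that kind of quality-driven interval loss, assuming a model that outputs a lower and an upper bound per point. The function name, the penalty weight lam, the sigmoid softness, and the exact form of the coverage penalty are assumptions for illustration only.

import numpy as np

def qd_style_interval_loss(y, y_lower, y_upper, alpha=0.05, lam=15.0, softness=160.0):
    # Soft indicator that each target lies inside its interval; the sigmoid
    # smoothing keeps the loss differentiable so it can be trained by gradient
    # descent. lam and softness are illustrative hyperparameters, not values
    # taken from the paper.
    k_upper = 1.0 / (1.0 + np.exp(-softness * (y_upper - y)))
    k_lower = 1.0 / (1.0 + np.exp(-softness * (y - y_lower)))
    k_soft = k_upper * k_lower
    # Hard indicator, used only to measure the width of captured intervals.
    k_hard = ((y <= y_upper) & (y >= y_lower)).astype(float)

    # Mean width of the intervals that actually capture their target.
    mpiw_capt = np.sum((y_upper - y_lower) * k_hard) / (np.sum(k_hard) + 1e-6)
    # Soft empirical coverage (PICP).
    picp = np.mean(k_soft)
    # Penalise a coverage shortfall below the target level 1 - alpha;
    # nothing is gained by covering more than the target, so the penalty
    # is one-sided.
    coverage_penalty = lam * max(0.0, (1.0 - alpha) - picp) ** 2
    return mpiw_capt + coverage_penalty

# Toy usage: intervals around noisy targets, evaluated at 95% target coverage.
rng = np.random.default_rng(0)
y = rng.normal(size=200)
y_lower = y - rng.uniform(0.5, 1.5, size=200)
y_upper = y + rng.uniform(0.5, 1.5, size=200)
print(qd_style_interval_loss(y, y_lower, y_upper))

In the ensembled setting the abstract refers to, several such networks would be trained independently and their interval bounds aggregated to account for model uncertainty; that aggregation step is not shown in this sketch.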
