Journal: JMLR: Workshop and Conference Proceedings

Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows


Abstract

We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible. SNL trains an autoregressive flow on simulated data in order to learn a model of the likelihood in the region of high posterior density. A sequential training procedure guides simulations and reduces simulation cost by orders of magnitude. We show that SNL is more robust, more accurate and requires less tuning than related neural-based methods, and we discuss diagnostics for assessing calibration, convergence and goodness-of-fit.
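
The abstract describes SNL's sequential loop at a high level: each round simulates data at proposed parameters, retrains a conditional density model of the likelihood on all simulations so far, and draws the next round's parameters from the resulting posterior estimate. Below is a minimal sketch of that loop under toy assumptions: the simulator, uniform prior, and all names (`SurrogateLikelihood`, `snl`, `metropolis`) are hypothetical, the autoregressive flow is replaced by a simple conditional-Gaussian surrogate, and MCMC is a plain Metropolis-Hastings sampler. It illustrates the structure of the method, not the authors' implementation.

```python
# Hedged sketch of the SNL-style sequential loop (illustration only, not the paper's code).
import numpy as np

def simulator(theta, rng):
    # Toy simulator (assumption for illustration): noisy observation of theta.
    return theta + rng.normal(scale=0.5, size=theta.shape)

class SurrogateLikelihood:
    """Stand-in for the autoregressive flow: fits log q(x | theta) with a linear-Gaussian model."""
    def fit(self, thetas, xs):
        design = np.hstack([thetas, np.ones((len(thetas), 1))])
        coef, *_ = np.linalg.lstsq(design, xs, rcond=None)
        self.coef = coef
        resid = xs - design @ coef
        self.var = resid.var(axis=0) + 1e-6          # per-dimension noise estimate

    def log_prob(self, x, theta):
        mean = np.append(theta, 1.0) @ self.coef
        return float(-0.5 * np.sum((x - mean) ** 2 / self.var + np.log(2 * np.pi * self.var)))

def metropolis(log_post, theta0, n_samples, rng, step=0.5):
    # Simple random-walk Metropolis-Hastings, thinned by a factor of 10.
    samples, theta, lp = [], theta0, log_post(theta0)
    for _ in range(n_samples * 10):
        cand = theta + step * rng.normal(size=theta.shape)
        lp_cand = log_post(cand)
        if np.log(rng.uniform()) < lp_cand - lp:
            theta, lp = cand, lp_cand
        samples.append(theta)
    return np.array(samples[::10])

def snl(x_obs, n_rounds=5, n_sims_per_round=200, prior_scale=5.0, seed=0):
    rng = np.random.default_rng(seed)
    dim = x_obs.shape[0]
    thetas, xs = [], []
    model = SurrogateLikelihood()
    # Round 0 proposes from a uniform prior; later rounds propose from the current posterior estimate.
    proposal = lambda n: rng.uniform(-prior_scale, prior_scale, size=(n, dim))
    for _ in range(n_rounds):
        new_thetas = proposal(n_sims_per_round)
        new_xs = np.array([simulator(t, rng) for t in new_thetas])
        thetas.append(new_thetas); xs.append(new_xs)
        model.fit(np.vstack(thetas), np.vstack(xs))   # retrain on all simulations accumulated so far

        def log_post(theta):
            # Approximate posterior: uniform prior times learned surrogate likelihood.
            if np.any(np.abs(theta) > prior_scale):
                return -np.inf
            return model.log_prob(x_obs, theta)

        samples = metropolis(log_post, np.zeros(dim), n_samples=n_sims_per_round, rng=rng)
        proposal = lambda n, s=samples: s[rng.integers(len(s), size=n)]
    return samples

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_theta = np.array([1.5, -0.7])
    x_obs = simulator(true_theta, rng)
    posterior = snl(x_obs)
    print("posterior mean:", posterior.mean(axis=0))
```

The feature reflected here is the one the abstract emphasizes: each round retrains the surrogate on all simulations gathered so far and proposes the next round's parameters from the current posterior estimate, which concentrates simulation effort in the region of high posterior density and is what reduces the simulation cost.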
