
Bayesian Uncertainty Quantification with Multi-Fidelity Data and Gaussian Processes for Impedance Cardiography of Aortic Dissection



Abstract

In 2000, Kennedy and O’Hagan proposed a model for uncertainty quantification that combines data of several levels of sophistication, fidelity, quality, or accuracy, e.g., a coarse and a fine mesh in finite-element simulations. They assumed each level to be describable by a Gaussian process, and used low-fidelity simulations to improve inference on costly high-fidelity simulations. Departing from there, we move away from the common non-Bayesian practice of optimization and marginalize the parameters instead. Thus, we avoid the awkward logical dilemma of having to choose parameters and of neglecting that choice’s uncertainty. We propagate the parameter uncertainties by averaging the predictions and the prediction uncertainties over all possible parameters. This is done analytically for all but the nonlinear or inseparable kernel function parameters. What is left is a low-dimensional and feasible numerical integral depending on the choice of kernels, thus allowing for a fully Bayesian treatment. By quantifying the uncertainties of the parameters themselves, too, we show that “learning” or optimizing those parameters has little meaning when data is scarce and, thus, justify all our mathematical efforts. The recent hype about machine learning has long spilled over to computational engineering, but it fails to acknowledge that machine learning is a big-data problem whereas, in computational engineering, we usually face a little-data problem. We devise the fully Bayesian uncertainty quantification method in a notation following the tradition of E. T. Jaynes and find that generalization to an arbitrary number of levels of fidelity and parallelization become rather easy. We scrutinize the method with mock data and demonstrate its advantages in its natural application, where high-fidelity data is scarce but low-fidelity data is not. We then apply the method to quantify the uncertainties in finite-element simulations of impedance cardiography of aortic dissection. Aortic dissection is a cardiovascular disease that frequently requires immediate surgical treatment and, thus, a fast diagnosis beforehand. While traditional medical imaging techniques such as computed tomography, magnetic resonance tomography, or echocardiography certainly do the job, impedance cardiography, too, is a standard clinical tool that promises earlier diagnoses and the detection of patients who would otherwise stay under the radar for too long.
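To make the construction described in the abstract concrete, below is a minimal numerical sketch of the two-level Kennedy–O’Hagan model and of the grid-based marginalization of the kernel parameters. This is not the paper’s implementation: it assumes squared-exponential kernels, noise-free observations, a uniform prior over a small hypothetical hyperparameter grid (length-scales ell_l and ell_d, scale factor rho), and made-up mock data.

```python
import numpy as np

def rbf(x1, x2, ell):
    """Squared-exponential kernel with length-scale ell (unit variance)."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def ko_cov(xl, xh, ell_l, ell_d, rho, jitter=1e-8):
    """Joint covariance of [f_low(xl), f_high(xh)] under the Kennedy-O'Hagan
    model f_high(x) = rho * f_low(x) + delta(x), with independent GP priors
    on f_low (length-scale ell_l) and the discrepancy delta (ell_d)."""
    K = np.block([
        [rbf(xl, xl, ell_l),       rho * rbf(xl, xh, ell_l)],
        [rho * rbf(xh, xl, ell_l), rho**2 * rbf(xh, xh, ell_l) + rbf(xh, xh, ell_d)],
    ])
    return K + jitter * np.eye(len(xl) + len(xh))

def log_evidence(y, K):
    """Gaussian log marginal likelihood log p(y | theta)."""
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L, y)
    return -0.5 * a @ a - np.log(np.diag(L)).sum() - 0.5 * len(y) * np.log(2 * np.pi)

def predict_high(xs, xl, xh, y, theta):
    """Posterior mean and variance of f_high(xs) for fixed hyperparameters."""
    ell_l, ell_d, rho = theta
    K = ko_cov(xl, xh, *theta)
    Ks = np.hstack([rho * rbf(xs, xl, ell_l),
                    rho**2 * rbf(xs, xh, ell_l) + rbf(xs, xh, ell_d)])
    Kss = rho**2 * rbf(xs, xs, ell_l) + rbf(xs, xs, ell_d)
    mu = Ks @ np.linalg.solve(K, y)
    var = np.diag(Kss - Ks @ np.linalg.solve(K, Ks.T))
    return mu, var

# Mock data: plentiful low-fidelity points, scarce high-fidelity points.
f = lambda x: np.sin(2 * np.pi * x)
xl = np.linspace(0.0, 1.0, 15)
xh = np.array([0.1, 0.5, 0.9])
yl = f(xl) + 0.2 * np.cos(4 * np.pi * xl)    # cheap but systematically biased
yh = f(xh)
y = np.concatenate([yl, yh])

# Marginalize (ell_l, ell_d, rho) on a coarse grid: the "low-dimensional
# and feasible numerical integral" left over after the analytical steps.
grid = [(el, ed, r) for el in (0.1, 0.2, 0.4)
                    for ed in (0.1, 0.3)
                    for r in (0.8, 1.0, 1.2)]
logw = np.array([log_evidence(y, ko_cov(xl, xh, *t)) for t in grid])
w = np.exp(logw - logw.max())
w /= w.sum()                                 # posterior weights p(theta | y)

# Posterior predictive = Gaussian mixture over theta; match its two moments.
xs = np.linspace(0.0, 1.0, 101)
preds = [predict_high(xs, xl, xh, y, t) for t in grid]
mu = sum(wi * m for wi, (m, _) in zip(w, preds))
var = sum(wi * (v + m**2) for wi, (m, v) in zip(w, preds)) - mu**2
```

Because the predictive distribution at each grid point is Gaussian, the mixture’s mean and variance follow from the usual moment-matching identities. With only three high-fidelity points, the weights w typically stay spread over several hyperparameter settings, which is exactly the parameter uncertainty that point-optimizing the hyperparameters would hide.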


