Conference on Uncertainty in Artificial Intelligence

Parallel Gaussian Process Regression with Low-Rank Covariance Matrix Approximations

Abstract

Gaussian processes (GP) are Bayesian non-parametric models that are widely used for probabilistic regression. Unfortunately, they cannot scale well to large data nor perform real-time prediction due to their cubic time cost in the data size. This paper presents two parallel GP regression methods that exploit low-rank covariance matrix approximations to distribute the computational load among parallel machines and thereby achieve time efficiency and scalability. We theoretically guarantee that the predictive performance of each proposed parallel GP is equivalent to that of a corresponding centralized approximate GP regression method: the computation of these centralized counterparts can be distributed among parallel machines, hence achieving greater time efficiency and scalability. We analytically compare the properties of our parallel GPs, such as time, space, and communication complexity. Empirical evaluation on two real-world datasets in a cluster of 20 computing nodes shows that our parallel GPs are significantly more time-efficient and scalable than their centralized counterparts and the exact/full GP, while achieving predictive performance comparable to that of the full GP.
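To make the low-rank idea concrete, the sketch below shows a subset-of-regressors style approximation of GP regression. This is only an illustration, not the paper's own two algorithms: the RBF kernel, the random choice of 20 inducing points, and the simple block partition of the data are all assumptions made for this example. The point is that each machine only computes small summary statistics (of size m x m and m, where m is the number of inducing points) from its local data block, and these summaries simply add up, which is what allows the computational load to be split among parallel machines while avoiding the cubic cost in the full data size.

import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def local_summaries(X_block, y_block, Z):
    # Statistics a single machine computes from its own data block:
    # an m x m matrix and an m-vector, regardless of the block size.
    Kzx = rbf_kernel(Z, X_block)
    return Kzx @ Kzx.T, Kzx @ y_block

def lowrank_gp_predict(blocks, Z, X_test, noise_var=0.1):
    # Sum the per-block summaries (in practice, one block per machine) and
    # form the subset-of-regressors predictive mean:
    #   mu_* = K_*Z (noise_var * K_ZZ + sum_i K_ZXi K_XiZ)^{-1} sum_i K_ZXi y_i
    m = Z.shape[0]
    A = np.zeros((m, m))
    b = np.zeros(m)
    for X_block, y_block in blocks:
        A_i, b_i = local_summaries(X_block, y_block, Z)
        A += A_i
        b += b_i
    Kzz = rbf_kernel(Z, Z)
    alpha = np.linalg.solve(noise_var * Kzz + A + 1e-8 * np.eye(m), b)
    return rbf_kernel(X_test, Z) @ alpha

# Toy usage: 2000 points split across 4 hypothetical machines, 20 inducing points.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(2000, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(2000)
blocks = list(zip(np.array_split(X, 4), np.array_split(y, 4)))
Z = X[rng.choice(len(X), size=20, replace=False)]
print(lowrank_gp_predict(blocks, Z, np.array([[0.0], [1.0]])))

In this sketch each machine only has to communicate its m x m and m-sized summaries, so the communication cost does not grow with the local data size; the paper's analysis covers the time, space, and communication trade-offs of its actual methods.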
