SIAM/ASA Journal on Uncertainty Quantification

On Negative Transfer and Structure of Latent Functions in Multioutput Gaussian Processes



Abstract

The multioutput Gaussian process (MGP) is based on the assumption that outputs share commonalities; however, if this assumption does not hold, negative transfer will lead to decreased performance relative to learning outputs independently or in subsets. In this article, we first define negative transfer in the context of MGPs and then derive necessary conditions for an MGP model to avoid it. Specifically, under the convolution construction, we show that avoiding negative transfer depends mainly on having a sufficient number of latent functions Q, regardless of the flexibility of the kernel or inference procedure used. However, even a slight increase in Q leads to a large increase in the number of parameters to be estimated. To this end, we propose two latent structures that scale to arbitrarily large datasets, can avoid negative transfer, and allow any kernel or sparse approximation to be used within them. We also show that these structures admit regularization, which can provide automatic selection of related outputs.
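To make the role of the latent functions concrete, the sketch below (not the paper's model) assembles the joint covariance of a two-output GP from Q shared latent RBF kernels in plain NumPy, using the linear model of coregionalization, which is the special case of the convolution construction with instantaneous smoothing kernels. It is a minimal illustration of how Q = 1 forces both outputs onto one shared latent function while Q = 2 lets unrelated outputs decouple; all names here (rbf_kernel, mgp_covariance, A, lengthscales) are hypothetical.

    import numpy as np

    def rbf_kernel(x1, x2, lengthscale=1.0):
        # Squared-exponential kernel on 1-D inputs.
        d = x1[:, None] - x2[None, :]
        return np.exp(-0.5 * (d / lengthscale) ** 2)

    def mgp_covariance(x, A, lengthscales):
        # Joint covariance of a D-output GP built from Q shared latent functions.
        # A is a (D, Q) mixing matrix: output d is sum_q A[d, q] * u_q(x), where
        # each u_q is an independent GP with its own RBF kernel.
        D, Q = A.shape
        N = x.shape[0]
        K = np.zeros((D * N, D * N))
        for q in range(Q):
            Kq = rbf_kernel(x, x, lengthscales[q])   # latent-function kernel
            Bq = np.outer(A[:, q], A[:, q])          # rank-1 coregionalization block
            K += np.kron(Bq, Kq)
        return K

    x = np.linspace(0.0, 1.0, 25)
    # Q = 1: both outputs are scaled copies of one latent function (forced sharing).
    K_shared = mgp_covariance(x, np.array([[1.0], [0.8]]), [0.2])
    # Q = 2: each output gets its own latent function, so unrelated outputs decouple.
    K_flexible = mgp_covariance(x, np.array([[1.0, 0.0], [0.0, 1.0]]), [0.2, 0.05])
    print(K_shared.shape, K_flexible.shape)   # (50, 50) (50, 50)

With Q = 1 any mismatch between the two outputs must be absorbed by the single shared latent function, which is the setting in which negative transfer can arise; increasing Q relaxes the sharing at the cost of more mixing parameters to estimate, which is the trade-off the proposed latent structures address.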

