Home > Foreign Journals > Pattern Recognition Letters > Attention Mechanism Based Mixture of Gaussian Processes

Attention Mechanism Based Mixture of Gaussian Processes


Abstract

The mixture of Gaussian processes (MGP) is a powerful model that can characterize data generated by a general stochastic process. However, conventional MGPs assume that the input variable obeys a certain probability distribution, and thus cannot effectively handle the case where the input variable lies on a general manifold or a graph. In this paper, we first clarify the relationship between the MGP prediction strategy and the attention mechanism. Building on the attention mechanism, we then design two novel mixture models of Gaussian processes that do not rely on probabilistic assumptions on the input domain, thereby overcoming the difficulty of extending MGP models to manifolds or graphs. Experimental results on real-world datasets demonstrate the effectiveness of the proposed methods.

(c) 2022 Elsevier B.V. All rights reserved.
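The connection the abstract draws can be illustrated with a minimal sketch: an MGP's prediction is a gating-weighted sum of expert GP posterior means, and if the gating weights are computed as a softmax over query-key similarities (test inputs as queries, expert centers as keys), the prediction takes the same form as an attention layer. The kernel choice, the use of expert input means as keys, and the temperature parameter below are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0):
    # Squared-exponential (RBF) kernel matrix between two point sets.
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior_mean(X_train, y_train, X_test, noise=1e-2):
    # Standard GP regression posterior mean for one expert.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_test, X_train)
    return Ks @ np.linalg.solve(K, y_train)

def attention_mgp_predict(experts, X_test, temperature=1.0):
    """Attention-style MGP prediction (illustrative sketch):
    queries = test inputs, keys = expert input centers,
    values = expert GP posterior means; weights are a softmax
    over negative squared query-key distances."""
    # Expert predictions, shape (n_test, n_experts).
    means = np.stack(
        [gp_posterior_mean(Xk, yk, X_test) for Xk, yk in experts], axis=1)
    # Keys: mean input of each expert, shape (n_experts, d).
    centers = np.stack([Xk.mean(axis=0) for Xk, _ in experts])
    # Attention scores and numerically stable softmax over experts.
    scores = -0.5 * np.sum(
        (X_test[:, None, :] - centers[None, :, :])**2, axis=2) / temperature
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)  # rows sum to 1
    # Attention-weighted combination of expert means.
    return np.sum(w * means, axis=1)

# Toy usage: two experts trained on disjoint input regions.
rng = np.random.default_rng(0)
X1 = rng.uniform(-3, -1, (20, 1)); y1 = np.sin(X1[:, 0])
X2 = rng.uniform(1, 3, (20, 1));   y2 = np.cos(X2[:, 0])
pred = attention_mgp_predict([(X1, y1), (X2, y2)],
                             np.array([[-2.0], [2.0]]))
```

Because the gating depends only on similarities between inputs, not on a density model of the input space, this form of weighting is what lets the construction extend to inputs on a manifold or a graph, where any positive similarity score can replace the Euclidean one used here.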
