Published in: European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD)

Gaussian Process Multi-task Learning Using Joint Feature Selection



Abstract

Multi-task learning involves solving multiple related learning problems by sharing some common structure for improved generalization performance. A promising approach to multi-task learning is joint feature selection, where a sparsity pattern is shared across task-specific feature representations. In this paper, we propose a novel Gaussian Process (GP) approach to multi-task learning based on joint feature selection. The novelty of the proposed approach is that it captures task similarity by sharing a sparsity pattern over the kernel hyper-parameters associated with each task. This is achieved by considering a hierarchical model which imposes a multi-Laplacian prior over the kernel hyper-parameters. This leads to a flexible GP model which can handle a wide range of multi-task learning problems and can identify features relevant across all the tasks. The hyper-parameter estimation results in an optimization problem which is solved using a block coordinate descent algorithm. Experimental results on synthetic and real-world multi-task learning data sets demonstrate that the flexibility of the proposed model is useful in achieving better generalization performance.
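The ingredients the abstract names — per-task ARD-style kernel hyper-parameters, a prior that couples the tasks' sparsity patterns, and block coordinate descent over tasks — can be illustrated with a minimal sketch. All function and variable names here are hypothetical, and a smooth group penalty `sum_d sqrt(sum_t eta_t[d]^2)` stands in for the paper's multi-Laplacian prior; this is an illustration of the general scheme, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.optimize import minimize


def ard_kernel(X1, X2, eta):
    """RBF kernel with one relevance weight eta[d] >= 0 per feature.

    eta[d] == 0 removes feature d from the model entirely, so the
    zero pattern of eta is the task's selected-feature pattern."""
    sq = np.zeros((X1.shape[0], X2.shape[0]))
    for d in range(X1.shape[1]):
        sq += eta[d] * np.subtract.outer(X1[:, d], X2[:, d]) ** 2
    return np.exp(-0.5 * sq)


def nlml(eta, X, y, noise=0.1):
    """Negative log marginal likelihood of one GP regression task."""
    K = ard_kernel(X, X, eta) + noise ** 2 * np.eye(len(y))
    c, low = cho_factor(K, lower=True)
    alpha = cho_solve((c, low), y)
    return (0.5 * y @ alpha
            + np.sum(np.log(np.diag(c)))        # 0.5 * log det K
            + 0.5 * len(y) * np.log(2 * np.pi))


def fit_joint(tasks, lam=0.5, sweeps=4):
    """Block coordinate descent: re-optimize one task's relevance
    weights at a time, holding the other tasks fixed.

    The penalty lam * sum_d sqrt(eta_t[d]^2 + sum_{s!=t} eta_s[d]^2)
    couples the tasks: a feature kept by the other tasks is cheap for
    task t to keep as well, so the sparsity pattern is shared (a smooth
    stand-in for the multi-Laplacian prior)."""
    D = tasks[0][0].shape[1]
    etas = [np.ones(D) for _ in tasks]
    for _ in range(sweeps):
        for t, (X, y) in enumerate(tasks):
            others = sum(etas[s] ** 2 for s in range(len(tasks)) if s != t)
            obj = lambda e: nlml(e, X, y) + lam * np.sum(np.sqrt(e ** 2 + others))
            etas[t] = minimize(obj, etas[t], bounds=[(0.0, None)] * D,
                               method="L-BFGS-B").x
    return etas


# Synthetic example: two tasks that both depend only on feature 0;
# features 1 and 2 are pure distractors.
rng = np.random.default_rng(0)

def make_task(n=40):
    X = rng.uniform(-2, 2, size=(n, 3))
    y = np.sin(2.0 * X[:, 0]) + 0.05 * rng.standard_normal(n)
    return X, y

etas = fit_joint([make_task(), make_task()])
```

Because each inner step is a smooth bound-constrained problem, an off-the-shelf quasi-Newton solver suffices; on this synthetic data the distractor weights are driven toward zero in both tasks while the shared relevant feature survives.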
