European Conference on Machine Learning and Knowledge Discovery in Databases

Gaussian Process Multi-task Learning Using Joint Feature Selection



Abstract

Multi-task learning involves solving multiple related learning problems by sharing some common structure for improved generalization performance. A promising approach to multi-task learning is joint feature selection, in which a sparsity pattern is shared across task-specific feature representations. In this paper, we propose a novel Gaussian Process (GP) approach to multi-task learning based on joint feature selection. The novelty of the proposed approach is that it captures task similarity by sharing a sparsity pattern over the kernel hyper-parameters associated with each task. This is achieved through a hierarchical model that imposes a multi-Laplacian prior over the kernel hyper-parameters. The result is a flexible GP model that can handle a wide range of multi-task learning problems and can identify the features relevant across all tasks. Hyper-parameter estimation yields an optimization problem that is solved using a block coordinate descent algorithm. Experimental results on synthetic and real-world multi-task learning data sets demonstrate that the flexibility of the proposed model leads to better generalization performance.
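The setup described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: it uses an ARD-style RBF kernel whose per-feature hyper-parameters are driven to zero by an L1 penalty (the MAP counterpart of a Laplacian prior), and updates one task's hyper-parameter block at a time. The grid-search coordinate update, the penalty weight `lam`, and the fixed noise level are all simplifying assumptions made for the sketch.

```python
import numpy as np

def ard_rbf(X, theta, noise=0.1):
    # ARD RBF kernel: theta[d] >= 0 is the inverse squared lengthscale of
    # feature d; theta[d] == 0 switches that feature off entirely.
    diff = X[:, None, :] - X[None, :, :]
    K = np.exp(-0.5 * np.einsum('ijd,d->ij', diff ** 2, theta))
    return K + noise ** 2 * np.eye(len(X))

def neg_log_marginal(X, y, theta):
    # Standard GP negative log marginal likelihood (up to a constant).
    K = ard_rbf(X, theta)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum()

def fit_joint_sparse(tasks, lam=1.0, grid=(0.0, 0.5, 1.0, 2.0), n_sweeps=5):
    # Block coordinate descent: each block is one task's hyper-parameter
    # vector. The L1 term lam * |theta[t, d]| (MAP analogue of a Laplacian
    # prior) drives irrelevant features toward theta = 0 in every task,
    # producing a shared sparsity pattern.
    D = tasks[0][0].shape[1]
    theta = np.zeros((len(tasks), D))
    for _ in range(n_sweeps):
        for t, (X, y) in enumerate(tasks):
            for d in range(D):  # coordinate-wise grid update within the block
                best, best_obj = theta[t, d], np.inf
                for v in grid:
                    theta[t, d] = v
                    obj = neg_log_marginal(X, y, theta[t]) + lam * abs(v)
                    if obj < best_obj:
                        best, best_obj = v, obj
                theta[t, d] = best
    return theta

# Demo: two tasks where only feature 0 carries signal.
rng = np.random.default_rng(0)
tasks = []
for _ in range(2):
    X = rng.normal(size=(20, 2))
    y = np.sin(2 * X[:, 0]) + 0.05 * rng.normal(size=20)
    tasks.append((X, y))
theta = fit_joint_sparse(tasks)
print(theta)
```

A gradient-based optimizer over each block would replace the grid search in practice; the grid keeps the sketch short and avoids derivative code while still exhibiting the blockwise structure of the optimization.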


