
Flexible Modeling of Latent Task Structures in Multitask Learning

Abstract

Multitask learning algorithms are typically designed assuming some fixed, a priori known latent structure shared by all the tasks. However, it is usually unclear what type of latent task structure is the most appropriate for a given multitask learning problem. Ideally, the "right" latent task structure should be learned in a data-driven manner. We present a flexible, nonparametric Bayesian model that posits a mixture of factor analyzers structure on the tasks. The nonparametric aspect makes the model expressive enough to subsume many existing models of latent task structures (e.g., mean-regularized tasks, clustered tasks, low-rank or linear/non-linear subspace assumptions on tasks, etc.). Moreover, it can also learn more general task structures, addressing the shortcomings of such models. We present a variational inference algorithm for our model. Experimental results on synthetic and real-world datasets, on both regression and classification problems, demonstrate the effectiveness of the proposed method.

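The abstract's central idea, a mixture-of-factor-analyzers structure over per-task parameters, can be illustrated with a small generative sketch. The NumPy snippet below is a finite, fixed-K toy version written for illustration only: the paper's model is nonparametric and fitted with variational inference, whereas here the number of components, the dimensions, and the variable names (T, D, K, k, mu, Lam) are all hypothetical assumptions, not the authors' actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper):
# T tasks, D-dimensional task weight vectors, K mixture components,
# and a k-dimensional latent factor space per component.
T, D, K, k = 8, 10, 3, 2

pi = rng.dirichlet(np.ones(K))      # mixing proportions over task clusters
mu = rng.normal(size=(K, D))        # per-component mean weight vector
Lam = rng.normal(size=(K, D, k))    # per-component factor loadings (low-rank subspace)
sigma = 0.1                         # isotropic noise on task weights

# Generative view: each task t picks a component c[t], then its weight vector is
# w_t = mu[c[t]] + Lam[c[t]] @ s_t + noise, with per-task latent factors s_t.
c = rng.choice(K, size=T, p=pi)
W = np.empty((T, D))
for t in range(T):
    s_t = rng.normal(size=k)
    W[t] = mu[c[t]] + Lam[c[t]] @ s_t + sigma * rng.normal(size=D)

# Special cases this structure subsumes (per the abstract):
#   K = 1 and Lam = 0  -> mean-regularized tasks (all weights near one shared mean)
#   K > 1 and Lam = 0  -> clustered tasks
#   K = 1, small k     -> tasks sharing a low-rank / linear subspace
print("task weight matrix:", W.shape)   # (T, D)
```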