Journal: Neural Computing and Applications

A multi-task framework for metric learning with common subspace



Abstract

Metric learning has been widely studied in machine learning due to its capability to improve the performance of various algorithms. Meanwhile, multi-task learning usually achieves better performance by exploiting the information shared across all tasks. In this paper, we propose a novel framework that lets metric learning benefit from jointly training all tasks. Based on the assumption that discriminative information is retained in a common subspace for all tasks, our framework can readily extend many current metric learning methods. In particular, we apply our framework to the widely used Large Margin Component Analysis (LMCA) and obtain a new model called multi-task LMCA. It performs remarkably well compared to many competitive methods. Moreover, this method learns a low-rank metric directly, which acts as dimensionality reduction and enables noise compression and low storage cost. A series of experiments on both synthetic and real data demonstrate the superiority of our method over three competing algorithms.
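The sketch below is not the authors' multi-task LMCA algorithm; it is a minimal toy illustration, under assumed details (synthetic two-class data per task, a plain triplet hinge loss, vanilla gradient descent), of the general mechanism the abstract describes: parameterizing the Mahalanobis metric as M = L^T L with a single low-rank L shared by all tasks yields dimensionality reduction for free, and summing losses over tasks couples them through the common subspace.

```python
import numpy as np

def mahalanobis_sq(L, x, y):
    # Squared distance under the low-rank metric M = L.T @ L; the rows of
    # L span the learned common subspace.
    diff = L @ (x - y)
    return float(diff @ diff)

def triplet_grad(L, xa, xp, xn, margin=1.0):
    # Gradient of the large-margin hinge [d(a,p) - d(a,n) + margin]_+ w.r.t. L.
    if mahalanobis_sq(L, xa, xp) - mahalanobis_sq(L, xa, xn) + margin <= 0.0:
        return np.zeros_like(L)
    dp, dn = xa - xp, xa - xn
    return 2.0 * L @ (np.outer(dp, dp) - np.outer(dn, dn))

rng = np.random.default_rng(0)
d, d_low, n_tasks = 10, 3, 2

# One low-rank projection shared by all tasks: optimizing L directly gives a
# rank-d_low metric, i.e. built-in feature reduction and low storage.
L = rng.normal(scale=0.1, size=(d_low, d))

# Toy per-task data: two classes per task, separated along a task-specific axis.
triplets = []
for t in range(n_tasks):
    X0 = rng.normal(size=(20, d)); X0[:, t] += 3.0
    X1 = rng.normal(size=(20, d)); X1[:, t] -= 3.0
    X = np.vstack([X0, X1])
    # Fixed (anchor, positive, negative) triplets pooled across tasks, so the
    # shared L is trained jointly on all of them.
    for a, p, n in zip(range(10), range(10, 20), range(20, 30)):
        triplets.append((X[a], X[p], X[n]))

def total_loss(L, margin=1.0):
    return sum(max(0.0, mahalanobis_sq(L, xa, xp)
                        - mahalanobis_sq(L, xa, xn) + margin)
               for xa, xp, xn in triplets)

loss_before = total_loss(L)
lr = 0.002
for _ in range(500):
    g = sum(triplet_grad(L, xa, xp, xn) for xa, xp, xn in triplets)
    L -= lr * g / len(triplets)
loss_after = total_loss(L)
print(loss_before, loss_after)
```

In this toy setting the joint hinge loss over both tasks' triplets decreases as the shared subspace aligns with the discriminative directions of all tasks at once; a per-task metric of the same rank would have to learn each direction separately.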
