International Work-Conference on Artificial Neural Networks

Exploiting Multitask Learning Schemes Using Private Subnetworks



Abstract

Many problems in pattern recognition focus on learning one main task, known as Single Task Learning (STL). However, most of them can be reformulated as learning several tasks related to the main task at the same time, using a shared representation: Multitask Learning (MTL). In this paper, a new MTL architecture is proposed and its performance is compared with that of previous MTL schemes. The new scheme makes use of private subnetworks to induce a bias in the learning process. Results on artificial and real data sets show that using these private subnetworks in MTL yields better generalization capabilities and faster learning.
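The abstract does not specify the exact wiring of the private subnetworks, but the idea can be sketched as a network with one hidden layer shared by all tasks plus a small task-private hidden layer whose features are only seen by that task's output head. The layer sizes, activation choices, and concatenation of shared and private features below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

# Hypothetical sketch of an MTL network with a shared hidden layer and
# one small private subnetwork per task. Sizes and wiring are assumed.
rng = np.random.default_rng(0)

class MTLNet:
    def __init__(self, n_in, n_shared, n_private, n_tasks):
        # Shared hidden layer: all tasks train these weights.
        self.Ws = rng.normal(0.0, 0.1, (n_in, n_shared))
        # One private hidden layer per task: only that task trains it.
        self.Wp = [rng.normal(0.0, 0.1, (n_in, n_private))
                   for _ in range(n_tasks)]
        # One output head per task, reading shared + private features.
        self.V = [rng.normal(0.0, 0.1, (n_shared + n_private, 1))
                  for _ in range(n_tasks)]

    def forward(self, x):
        h = np.tanh(x @ self.Ws)          # shared representation
        outs = []
        for Wp, V in zip(self.Wp, self.V):
            p = np.tanh(x @ Wp)           # task-private features
            outs.append(np.concatenate([h, p], axis=1) @ V)
        return outs                       # one output array per task

net = MTLNet(n_in=4, n_shared=6, n_private=2, n_tasks=3)
x = rng.normal(size=(5, 4))
outs = net.forward(x)
print(len(outs), outs[0].shape)           # one (5, 1) head per task
```

In this reading, the shared layer carries the inductive bias transferred between related tasks, while each private subnetwork gives a task capacity that the others cannot interfere with.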


