Annual Conference on Neural Information Processing Systems

Learning Multiple Tasks using Shared Hypotheses



Abstract

In this work we consider a setting in which we have a very large number of related tasks, with few examples from each individual task. Rather than either learning each task individually (and incurring a large generalization error) or learning all the tasks together using a single hypothesis (and suffering a potentially large inherent error), we consider learning a small pool of shared hypotheses. Each task is then mapped to a single hypothesis in the pool (a hard association). We derive VC-dimension generalization bounds for our model, based on the number of tasks, the number of shared hypotheses, and the VC dimension of the hypothesis class. We conducted experiments on both synthetic problems and sentiment classification of reviews, which strongly support our approach.
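The abstract's scheme can be illustrated with a minimal sketch (an assumed alternating procedure, not the paper's exact algorithm): hard-assign each of T tasks to one of K shared linear hypotheses, refit each hypothesis on the pooled examples of its assigned tasks, and repeat, in the spirit of k-means over tasks. The function names and the least-squares classifier are illustrative choices, not from the paper.

```python
import numpy as np

def fit_hypothesis(X, y):
    # Illustrative shared hypothesis: a least-squares linear classifier
    # that regresses labels mapped to {-1, +1} on the features.
    w, *_ = np.linalg.lstsq(X, 2.0 * y - 1.0, rcond=None)
    return w

def accuracy(w, X, y):
    # Fraction of examples whose sign prediction matches the 0/1 label.
    return float(np.mean((X @ w > 0).astype(int) == y))

def fit_shared_pool(tasks, K, n_iters=10, seed=0):
    """tasks: list of (X, y) pairs with y in {0, 1}, one pair per task.
    Returns (pool of K hypotheses, hard task-to-hypothesis assignment)."""
    rng = np.random.default_rng(seed)
    assign = rng.integers(0, K, size=len(tasks))  # random initial association
    pool = [None] * K
    for _ in range(n_iters):
        # Refit each shared hypothesis on the pooled data of its tasks.
        for k in range(K):
            idx = [t for t in range(len(tasks)) if assign[t] == k]
            if idx:
                X = np.vstack([tasks[t][0] for t in idx])
                y = np.concatenate([tasks[t][1] for t in idx])
                pool[k] = fit_hypothesis(X, y)
        # Hard-associate each task with its best-scoring hypothesis.
        for t, (X, y) in enumerate(tasks):
            scores = [accuracy(w, X, y) if w is not None else -1.0
                      for w in pool]
            assign[t] = int(np.argmax(scores))
    return pool, assign
```

Because each task contributes its examples to exactly one hypothesis, tasks with similar labeling rules end up pooling their few examples, which is the intuition behind the generalization bounds the abstract mentions.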


