Workshop on Domain Adaptation for NLP

Cross-Lingual Transfer with MAML on Trees



Abstract

In meta-learning, knowledge learned from previous tasks is transferred to new ones, but this transfer only works if the tasks are related. Sharing information between unrelated tasks can hurt performance, and it is unclear how to transfer knowledge across tasks that have a hierarchical structure. Our research extends a meta-learning model, MAML, by exploiting hierarchical task relationships. Our algorithm, TreeMAML, adapts the model to each task with a few gradient steps, but the adaptation follows the hierarchical tree structure: at each step, gradients are pooled across task clusters, and subsequent steps descend the tree. We also implement a clustering algorithm that generates the task tree without prior knowledge of the task structure, allowing us to exploit implicit relationships between the tasks. We show that TreeMAML successfully trains natural language processing models for cross-lingual Natural Language Inference by taking advantage of the language phylogenetic tree. This result is useful, since most of the world's languages are under-resourced, and improved cross-lingual transfer enables the internationalization of NLP models.
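The tree-structured inner loop described in the abstract can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: the quadratic per-task loss, the `tree_levels` partition format, and the learning rate are all assumptions made for the sketch. The key idea it shows is that early adaptation steps pool (average) gradients across a whole task cluster, and later steps specialize down the tree until each leaf task adapts on its own.

```python
import numpy as np

def loss_grad(theta, task_target):
    # Toy task: quadratic loss |theta - target|^2, whose gradient is
    # 2 * (theta - target). Stands in for a per-task training loss.
    return 2.0 * (theta - task_target)

def tree_maml_adapt(theta, task_targets, tree_levels, lr=0.25):
    """Adapt a shared initialization `theta` down a task tree.

    tree_levels: list of partitions of task indices, ordered from the
    root (one cluster containing all tasks) to the leaves (singleton
    clusters). At each level, the gradient is pooled within each
    cluster, so related tasks share early updates and only the final
    steps are task-specific.
    """
    params = {i: theta.copy() for i in range(len(task_targets))}
    for clusters in tree_levels:
        for cluster in clusters:
            # Pool (average) gradients across all tasks in this cluster.
            g = np.mean(
                [loss_grad(params[i], task_targets[i]) for i in cluster],
                axis=0,
            )
            for i in cluster:
                params[i] = params[i] - lr * g
    return params

# Four toy "languages": two with positive targets (one family) and two
# with negative targets (another family), mimicking a phylogenetic tree.
targets = [np.array([1.0]), np.array([1.2]), np.array([-1.0]), np.array([-1.1])]
levels = [
    [[0, 1, 2, 3]],        # root: one pooled step shared by all tasks
    [[0, 1], [2, 3]],      # family-level clusters
    [[0], [1], [2], [3]],  # leaves: per-task adaptation steps
]
adapted = tree_maml_adapt(np.zeros(1), targets, levels)
```

After the three levels, each task's parameters have moved toward its own target, with the family-level step doing most of the shared work; the singleton leaf level recovers plain MAML-style per-task adaptation.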
