Annual Meeting of the Association for Computational Linguistics

A Multi-Task Architecture on Relevance-based Neural Query Translation



Abstract

We describe a multi-task learning approach that trains a Neural Machine Translation (NMT) model with a Relevance-based Auxiliary Task (RAT) for search query translation. The translation step in Cross-lingual Information Retrieval (CLIR) is usually treated as a black box and performed independently of retrieval. As a result, an NMT model trained on sentence-level parallel data is not aware of the vocabulary distribution of the retrieval corpus. We address this problem with a multi-task learning architecture that achieves a 16% improvement over a strong NMT baseline on an Italian-English query-document dataset. Using both quantitative and qualitative analysis, we show that our model generates balanced and precise translations, owing to the regularization effect of the multi-task learning paradigm.
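The abstract does not specify the joint objective, but the general multi-task pattern it describes is typically realized as a weighted sum of the translation loss and the auxiliary relevance loss over shared parameters. A minimal sketch, assuming an illustrative weighting scheme (the function name and `aux_weight` parameter are hypothetical, not from the paper):

```python
def multitask_loss(nmt_loss: float, rat_loss: float, aux_weight: float = 0.5) -> float:
    """Combine the per-batch NMT loss with the Relevance-based Auxiliary
    Task (RAT) loss. Sharing model parameters across both objectives is
    what exposes the translation model to the vocabulary distribution of
    the retrieval corpus; the weighting used here is illustrative only.
    """
    return nmt_loss + aux_weight * rat_loss


# Example: combine losses computed for one training batch.
joint = multitask_loss(nmt_loss=2.4, rat_loss=0.8, aux_weight=0.5)
```

In practice, gradients of this joint loss flow through the shared encoder, so the auxiliary task acts as a regularizer on the translation model.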

