Annual Meeting of the Association for Computational Linguistics

A Multi-Task Architecture on Relevance-based Neural Query Translation



Abstract

We describe a multi-task learning approach to train a Neural Machine Translation (NMT) model with a Relevance-based Auxiliary Task (RAT) for search query translation. The translation step in Cross-lingual Information Retrieval (CLIR) is usually treated as a black box and performed independently. However, an NMT model trained on sentence-level parallel data is not aware of the vocabulary distribution of the retrieval corpus. We address this problem with a multi-task learning architecture that achieves a 16% improvement over a strong NMT baseline on an Italian-English query-document dataset. Using both quantitative and qualitative analysis, we show that our model generates balanced and precise translations thanks to the regularization effect of the multi-task learning paradigm.
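The abstract describes jointly training the NMT objective with a relevance-based auxiliary task. A minimal sketch of such a multi-task objective is below; the function name, the linear interpolation, and the weight `alpha` are illustrative assumptions, not the authors' exact formulation.

```python
# Hypothetical sketch of a multi-task objective: interpolate the NMT
# translation loss with an auxiliary relevance-task (RAT) loss.
# `alpha` (assumed here) controls how much the auxiliary task contributes.

def joint_loss(nmt_loss: float, rat_loss: float, alpha: float = 0.5) -> float:
    """Weighted combination of the two task losses for one training step."""
    return (1.0 - alpha) * nmt_loss + alpha * rat_loss

# Example: equal weighting of translation and relevance losses.
print(joint_loss(2.0, 1.0))  # 1.5
```

In practice both losses would be computed per mini-batch and backpropagated through shared encoder parameters, which is where the regularization effect the abstract mentions would arise.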


