Fast and Accurate Neural Machine Translation with Translation Memory

Abstract

It is generally believed that a translation memory (TM) should be beneficial for machine translation tasks. Unfortunately, existing wisdom demonstrates the superiority of TM-based neural machine translation (NMT) only on the TM-specialized translation tasks rather than general tasks, with a non-negligible computational overhead. In this paper, we propose a fast and accurate approach to TM-based NMT within the Transformer framework: the model architecture is simple and employs a single bilingual sentence as its TM, leading to efficient training and inference; and its parameters are effectively optimized through a novel training criterion. Extensive experiments on six TM-specialized tasks show that the proposed approach substantially surpasses several strong baselines that use multiple TMs, in terms of BLEU and running time. In particular, the proposed approach also advances the strong baselines on two general tasks (WMT news Zh→En and En→De).
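The abstract's exact retrieval metric and input format are not specified, but the core idea of using a single bilingual sentence as the TM can be sketched as follows: fuzzy-match the input source against the TM corpus, pick the one most similar sentence pair, and concatenate its target side to the source as extra context for the Transformer. This is a minimal illustrative sketch; the similarity function, the `<sep>` token, and the concatenation scheme are assumptions, not the paper's confirmed design.

```python
def edit_distance(a, b):
    # Standard token-level Levenshtein distance, computed row by row.
    m, n = len(a), len(b)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1,            # deletion
                        dp[j - 1] + 1,        # insertion
                        prev + (a[i - 1] != b[j - 1]))  # substitution
            prev = cur
    return dp[n]

def fuzzy_score(a, b):
    # Fuzzy-match similarity in [0, 1]; 1.0 means identical token sequences.
    denom = max(len(a), len(b)) or 1
    return 1.0 - edit_distance(a, b) / denom

def retrieve_single_tm(source, tm_corpus):
    # Select the one (src, tgt) pair whose source side best matches the input,
    # mirroring the paper's use of a single bilingual sentence as the TM.
    return max(tm_corpus, key=lambda pair: fuzzy_score(source, pair[0]))

def build_input(source, tm_pair, sep="<sep>"):
    # Hypothetical input construction: append the retrieved TM target side
    # to the source, separated by a special token.
    _tm_src, tm_tgt = tm_pair
    return source + [sep] + tm_tgt
```

Using one pair instead of multiple TMs keeps retrieval and the encoder input short, which is what the abstract credits for the approach's efficient training and inference.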
