Conference paper · Annual Meeting of the Association for Computational Linguistics

Modeling Coverage for Neural Machine Translation



Abstract

The attention mechanism has enhanced state-of-the-art Neural Machine Translation (NMT) by jointly learning to align and translate. However, it tends to ignore past alignment information, which often leads to over-translation and under-translation. To address this problem, we propose coverage-based NMT. We maintain a coverage vector to keep track of the attention history; this vector is fed into the attention model to help adjust future attention, encouraging the NMT system to pay more attention to untranslated source words. Experiments show that the proposed approach significantly improves both translation quality and alignment quality over standard attention-based NMT.
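The core idea of the abstract — accumulating past attention weights into a coverage vector and feeding it back into the attention scorer — can be sketched as follows. This is a minimal NumPy illustration of coverage-augmented additive attention, not the paper's exact parameterization; the weight names `W_h`, `W_s`, `w_c`, and `v` are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def coverage_attention_step(W_h, W_s, w_c, v, enc_states, dec_state, coverage):
    """One decoding step of coverage-augmented additive attention.

    enc_states: (src_len, d) encoder annotations, one row per source word
    dec_state:  (d,) current decoder hidden state
    coverage:   (src_len,) accumulated attention mass per source word
    Returns (attention weights over source words, updated coverage).
    """
    # The alignment energy for each source position includes a coverage
    # term, so words that have already received much attention can be
    # down-weighted, discouraging over-translation.
    energies = v @ np.tanh(
        enc_states @ W_h.T          # (src_len, k): encoder contribution
        + dec_state @ W_s.T         # (k,), broadcast over source positions
        + np.outer(coverage, w_c)   # (src_len, k): coverage contribution
    ).T
    alpha = softmax(energies)
    # Coverage is simply the running sum of past attention distributions.
    return alpha, coverage + alpha
```

Because each attention distribution sums to one, the total coverage mass after `t` decoding steps is exactly `t`, spread over the source words; an untranslated word keeps a near-zero coverage entry, which the learned weight `w_c` can exploit to raise its future attention.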
