International Conference on Asian Language Processing

Recursive annotations for attention-based neural machine translation

Abstract

The last few years have witnessed the success of attention-based Neural Machine Translation (NMT), and many variant models have been proposed to improve its performance. Most attention-based NMT models encode the source sentence into a sequence of annotations that are kept fixed for all subsequent decoding steps. In this paper, we conjecture that the use of fixed annotations is a bottleneck limiting the performance of conventional attention-based NMT. To address this shortcoming, we propose a novel attention-based NMT model that updates the source annotations recursively when generating the target word at each time step. Experimental results show that the proposed approach achieves significant performance improvements on multiple test sets.
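To make the described mechanism concrete, here is a minimal PyTorch sketch of one decoding step in which the source annotations are refreshed after attention is computed, rather than kept fixed. This is an illustrative toy under our own assumptions, not the paper's architecture: the class name `RecursiveAnnotationDecoderStep`, the GRU-cell-based annotation update, and all dimensions are hypothetical, and the target-word embedding input is omitted for brevity.

```python
# Sketch: attention-based decoding where source annotations are updated
# recursively at each time step (hypothetical design, not the paper's model).
import torch
import torch.nn as nn


class RecursiveAnnotationDecoderStep(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        # Additive-style attention scorer over [annotation; decoder state].
        self.attn_score = nn.Linear(2 * hidden_size, 1)
        self.decoder_cell = nn.GRUCell(hidden_size, hidden_size)
        # Hypothetical update cell: rewrites each annotation given the context.
        self.annotation_cell = nn.GRUCell(hidden_size, hidden_size)

    def forward(self, annotations, dec_state):
        # annotations: (src_len, batch, hidden); dec_state: (batch, hidden)
        src_len, batch, hidden = annotations.shape
        # 1) Standard attention over the *current* annotations.
        query = dec_state.unsqueeze(0).expand(src_len, batch, hidden)
        scores = self.attn_score(torch.cat([annotations, query], dim=-1))
        weights = torch.softmax(scores, dim=0)            # over source positions
        context = (weights * annotations).sum(dim=0)      # (batch, hidden)
        # 2) Advance the decoder state, as in conventional attention-based NMT.
        new_state = self.decoder_cell(context, dec_state)
        # 3) The key difference: recursively update every source annotation
        #    with this step's context instead of keeping annotations fixed.
        ctx = context.unsqueeze(0).expand_as(annotations).reshape(-1, hidden)
        new_annotations = self.annotation_cell(
            ctx, annotations.reshape(-1, hidden)
        ).view(src_len, batch, hidden)
        return new_state, new_annotations


if __name__ == "__main__":
    step = RecursiveAnnotationDecoderStep(hidden_size=8)
    anns = torch.randn(5, 2, 8)       # 5 source positions, batch of 2
    state = torch.zeros(2, 8)
    for _ in range(3):                # three decoding steps
        state, anns = step(anns, state)
    print(state.shape, anns.shape)    # (2, 8) and (5, 2, 8)
```

In this sketch each step attends over the annotations as refreshed by the previous step, so the attention distribution can shift as the translation progresses; a fixed-annotation baseline would simply skip step 3.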
