International Conference on Computational Linguistics
Refining Source Representations with Relation Networks for Neural Machine Translation

Abstract

Although neural machine translation with the encoder-decoder framework has achieved great success recently, it still suffers from two drawbacks: forgetting distant information, an inherent disadvantage of the recurrent neural network structure, and disregarding the relationships between source words during the encoding step. In practice, however, such distant information and word relationships are often useful at the current step. We target these problems by introducing relation networks to learn better representations of the source. Relation networks improve the memorization capability of the recurrent neural network by associating source words with each other, which also helps retain the relationships between them. During decoding, the source representations and all the relations are fed into the attention component together, with the main encoder-decoder framework unchanged. Experiments on several datasets show that our method improves translation performance significantly over the conventional encoder-decoder model and even outperforms an approach that uses supervised syntactic knowledge.
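The refinement step described above can be sketched as follows: each encoder state is associated with every other state through a pairwise relation function, the relations are aggregated per word, and the result is fused back into the source representation before attention. This is a minimal numpy sketch of that idea under stated assumptions — the function name `relation_network_refine`, the projection matrices `Wg`/`Wr`, and the choice of `tanh` projections for the relation and fusion functions are illustrative placeholders, not the paper's exact architecture.

```python
import numpy as np

def relation_network_refine(H, Wg, Wr):
    """Refine RNN source states with pairwise relations.

    H  : (T, d) matrix of encoder hidden states, one row per source word.
    Wg : (2d, d) projection for the pairwise relation function g (assumed).
    Wr : (2d, d) projection for fusing a state with its relations (assumed).

    For each position i:
        r_i       = sum_j g([h_i; h_j])      # aggregate relations to all words
        refined_i = f([h_i; r_i])            # fuse state with its relations
    The refined states would then replace H as input to the attention component.
    """
    T, d = H.shape
    # Build all (h_i, h_j) pairs: shape (T, T, 2d).
    left = np.repeat(H[:, None, :], T, axis=1)   # h_i broadcast over j
    right = np.repeat(H[None, :, :], T, axis=0)  # h_j broadcast over i
    pairs = np.concatenate([left, right], axis=-1)

    g = np.tanh(pairs @ Wg)      # (T, T, d): relation feature for every pair
    r = g.sum(axis=1)            # (T, d): aggregated relations per source word

    # Fuse each original state with its aggregated relations.
    refined = np.tanh(np.concatenate([H, r], axis=-1) @ Wr)  # (T, d)
    return refined
```

Because the refined states have the same shape as the original encoder states, they can be dropped into the attention computation without changing the rest of the encoder-decoder framework, which matches the abstract's claim that the main framework stays unchanged.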