Workshop on South and Southeast Asian NLP

A study of attention-based Neural Machine Translation models on Indian Languages



Abstract

Neural machine translation (NMT) models have recently proven very successful at machine translation (MT). The use of LSTMs in machine translation has significantly improved translation performance for longer sentences, since their hidden layers can capture the context and long-range correlations within a sentence. Attention-based NMT systems have become the state of the art, performing as well as or better than other statistical MT approaches. In this paper, we studied the performance of an attention-based NMT system on the Indian language pair Hindi and Bengali. We analysed the types of errors that occur in morphologically rich languages when a large parallel training corpus is scarce. We then carried out certain post-processing heuristic steps to improve the quality of the translated sentences, and we suggest further measures.
