An Investigation on Statistical Machine Translation with Neural Language Models

Abstract

Recent work has shown the effectiveness of neural probabilistic language models (NPLMs) in statistical machine translation (SMT), both through reranking n-best outputs and through direct decoding. However, several issues remain in applying NPLMs. In this paper, we investigate these issues further through detailed experiments and extensions of state-of-the-art NPLMs. Our experiments on large-scale datasets show that our final setting, i.e., decoding with conventional n-gram LMs plus unnormalized feedforward NPLMs extended with word clusters, significantly improves translation performance, by up to 1.1 BLEU on average across four test datasets, while keeping decoding time acceptable. The results also show that current NPLMs, whether feedforward or RNN-based, still cannot simply replace n-gram LMs for SMT.
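The n-best reranking the abstract refers to can be illustrated with a minimal sketch: each hypothesis from the SMT decoder carries a base model score, to which a weighted neural LM log-probability is added before re-sorting. The function name, the toy hypotheses, and all scores and weights below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of n-best reranking with an added neural LM feature.
# All names, scores, and the weight are assumed for illustration only.

def rerank_nbest(nbest, lm_weight=0.5):
    """Each hypothesis is (translation, decoder_score, nplm_log_prob).

    Returns hypotheses sorted by the combined log-linear score
    decoder_score + lm_weight * nplm_log_prob, best first.
    """
    scored = [
        (trans, base + lm_weight * nplm)
        for trans, base, nplm in nbest
    ]
    return sorted(scored, key=lambda x: x[1], reverse=True)

# Toy 3-best list with made-up scores: the second hypothesis has the best
# decoder score but a poor neural LM score, so reranking demotes it.
nbest = [
    ("the house is small", -10.2, -4.1),
    ("the house small is", -9.8, -7.9),
    ("a house is small", -10.5, -4.5),
]
best_translation, best_score = rerank_nbest(nbest)[0]
print(best_translation)  # → "the house is small"
```

In practice the weight on the neural LM feature would be tuned (e.g. with MERT) alongside the other log-linear features rather than fixed by hand.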
