Conference on Empirical Methods in Natural Language Processing (EMNLP)

A Binarized Neural Network Joint Model for Machine Translation



Abstract

The neural network joint model (NNJM), which augments the neural network language model (NNLM) with an m-word source context window, has achieved large gains in machine translation accuracy, but suffers from high normalization cost when large vocabularies are used. Training the NNJM with noise-contrastive estimation (NCE) instead of standard maximum likelihood estimation (MLE) can reduce this computational cost. In this paper, we propose an alternative to NCE, the binarized NNJM (BNNJM), which learns a binary classifier that takes both the context and target words as input and can be trained efficiently using MLE. We compare the BNNJM and the NCE-trained NNJM on various translation tasks.
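The core idea summarized in the abstract (avoiding softmax normalization by scoring context–target pairs with a binary classifier trained by plain MLE, i.e. binary cross-entropy) can be sketched as follows. This is an illustrative toy, not the paper's model: the vocabulary, embeddings, training pairs, and logistic classifier here are hypothetical stand-ins for the actual neural network architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, DIM = 20, 8
emb = rng.normal(scale=0.1, size=(VOCAB, DIM))  # toy word embeddings (fixed here)
w = np.zeros(2 * DIM)                           # binary-classifier weights
b = 0.0

def features(context_id, target_id):
    # Concatenate context and target embeddings, so the classifier sees
    # both words jointly, as in the BNNJM's (context, target) input.
    return np.concatenate([emb[context_id], emb[target_id]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical training pairs (context, target, label): label 1 means the
# target really follows this context, label 0 marks a mismatched pair.
data = [(1, 2, 1), (1, 7, 0), (3, 4, 1), (3, 9, 0)] * 50

# Plain MLE for the binary classifier: SGD on binary cross-entropy.
# No per-example softmax over the whole vocabulary is needed.
lr = 0.5
for _ in range(200):
    for c, t, y in data:
        x = features(c, t)
        p = sigmoid(w @ x + b)
        g = p - y          # gradient of the cross-entropy w.r.t. the logit
        w -= lr * g * x
        b -= lr * g

# After training, a true pair should score higher than a mismatched one.
p_true = sigmoid(w @ features(1, 2) + b)
p_false = sigmoid(w @ features(1, 7) + b)
```

The point of the sketch is the cost structure: each update touches only one (context, target) pair, whereas an MLE-trained NNJM must normalize over the full output vocabulary, which is what motivates NCE and the BNNJM alternative.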


