Machine Translation Summit

Debiasing Word Embeddings Improves Multimodal Machine Translation

Abstract

In recent years, pretrained word embeddings have proved useful for multimodal neural machine translation (NMT) models as a way to address the shortage of available datasets. However, the integration of pretrained word embeddings has not yet been explored extensively. Further, pretrained word embeddings in high-dimensional spaces have been reported to suffer from the hubness problem. Although some debiasing techniques have been proposed to address this problem for other natural language processing tasks, they have seldom been studied for multimodal NMT models. In this study, we examine various kinds of word embeddings and introduce two debiasing techniques for three multimodal NMT models and two language pairs: English-German translation and English-French translation. With our optimal settings, the overall performance of multimodal models improved by up to +1.62 BLEU and +1.14 METEOR for English-German translation, and +1.40 BLEU and +1.13 METEOR for English-French translation.
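The abstract does not name the two debiasing techniques it evaluates, but a common approach to reducing hubness in pretrained embeddings is All-but-the-Top (Mu and Viswanath, 2018): center the embedding matrix, then project out its top principal components, which tend to encode frequency information rather than meaning. A minimal sketch of that idea, assuming a NumPy array of pretrained vectors (the function name and toy data here are illustrative, not from the paper):

```python
import numpy as np

def debias_embeddings(emb, n_components=3):
    """All-but-the-Top-style debiasing: center the embeddings,
    then remove the top principal components, which are often
    dominated by corpus frequency and contribute to hubness."""
    centered = emb - emb.mean(axis=0, keepdims=True)
    # Top principal directions via SVD of the centered matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    top = vt[:n_components]                      # (n_components, dim)
    # Project each vector onto the top directions and subtract.
    return centered - centered @ top.T @ top

rng = np.random.default_rng(0)
emb = rng.normal(size=(1000, 300))               # toy "pretrained" embeddings
debiased = debias_embeddings(emb, n_components=3)
```

The debiased matrix keeps the original shape and remains mean-centered; in a multimodal NMT pipeline it would simply replace the raw pretrained matrix when initializing the encoder's embedding layer.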
