
Learning Language Specific Sub-network for Multilingual Machine Translation


Abstract

Multilingual neural machine translation aims at learning a single translation model for multiple languages. These jointly trained models often suffer from performance degradation on rich-resource language pairs. We attribute this degeneration to parameter interference. In this paper, we propose LaSS to jointly train a single unified multilingual MT model. LaSS learns a Language Specific Sub-network (LaSS) for each language pair to counter parameter interference. Comprehensive experiments on IWSLT and WMT datasets with various Transformer architectures show that LaSS obtains gains of up to 1.2 BLEU on 36 language pairs. Besides, LaSS shows strong generalization performance, adapting easily to new language pairs and to zero-shot translation. LaSS boosts zero-shot translation by an average of 8.3 BLEU on 30 language pairs.
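
The idea described in the abstract is that one shared set of Transformer parameters is trained jointly, while each language pair activates only its own binary mask over those parameters, so conflicting updates are confined to each pair's sub-network. Below is a minimal sketch of such a masking mechanism for a single shared linear layer, assuming one fixed binary mask per language pair. The class MaskedLinear, the keep_prob parameter, the randomly initialized masks, and the language-pair names are illustrative placeholders; the abstract does not specify how the sub-networks are obtained, so this is not the paper's actual implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MaskedLinear(nn.Module):
        """Shared linear layer gated by a per-language-pair binary mask,
        so each pair effectively uses its own sub-network of the shared
        parameters (illustrative sketch only)."""

        def __init__(self, in_features, out_features, lang_pairs, keep_prob=0.7):
            super().__init__()
            self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)
            self.bias = nn.Parameter(torch.zeros(out_features))
            # One fixed binary mask per language pair. Random here purely for
            # illustration; the abstract does not say how masks are derived.
            self.masks = {
                pair: (torch.rand(out_features, in_features) < keep_prob).float()
                for pair in lang_pairs
            }

        def forward(self, x, lang_pair):
            # Only the weights selected by this pair's mask participate, so
            # interference is limited to the overlap between pairs' masks.
            return F.linear(x, self.weight * self.masks[lang_pair], self.bias)

    layer = MaskedLinear(16, 32, lang_pairs=["en-de", "en-fr"])
    x = torch.randn(4, 16)
    print(layer(x, "en-de").shape)  # torch.Size([4, 32]), en-de sub-network
    print(layer(x, "en-fr").shape)  # torch.Size([4, 32]), en-fr sub-network

In a full model, every weight matrix would carry such per-pair masks (registered as buffers so they follow the model across devices), while the underlying parameters stay shared across all language pairs.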
