International Conference on Computational Linguistics

Knowledge Graph Enhanced Neural Machine Translation via Multi-task Learning on Sub-entity Granularity



Abstract

Previous studies combining knowledge graphs (KG) with neural machine translation (NMT) suffer from two problems: i) knowledge under-utilization: they focus only on entities that appear in both the KG and the training sentence pairs, leaving much of the knowledge in the KG unused; ii) granularity mismatch: current KG methods use the entity as the basic unit, while NMT operates on sub-words, making the KG difficult to exploit in NMT. To alleviate these problems, we propose a multi-task learning method at sub-entity granularity. Specifically, we first split the entities in the KG and the sentence pairs into sub-entity units using joint BPE. We then use multi-task learning to combine the machine translation task with a knowledge reasoning task. Extensive experiments on various translation tasks demonstrate that our method significantly outperforms baseline models in both translation quality and entity handling.
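The "joint BPE" step above means that KG entities are segmented with the same merge table learned on the parallel text, so entity sub-units coincide with NMT sub-words. A minimal sketch of how such a shared merge table splits an entity string is shown below; the merge table here is a toy illustration, not one learned from real data, and the function mirrors the standard greedy BPE application rather than the paper's exact implementation.

```python
# Hypothetical sketch: applying a shared (joint) BPE merge table so that
# KG entities are split into the same sub-word units the NMT model uses.

def bpe_split(token, merges):
    """Greedily apply BPE merges (highest-priority first) to a character sequence."""
    symbols = list(token)
    # Lower index in the merge list = higher merge priority.
    rank = {pair: i for i, pair in enumerate(merges)}
    while len(symbols) > 1:
        # Find the adjacent symbol pair with the best (lowest) merge rank.
        pairs = [(rank.get((symbols[i], symbols[i + 1]), float("inf")), i)
                 for i in range(len(symbols) - 1)]
        best_rank, i = min(pairs)
        if best_rank == float("inf"):
            break  # no applicable merges remain
        symbols[i:i + 2] = [symbols[i] + symbols[i + 1]]
    return symbols

# Toy merge table assumed to be shared by sentences and KG entities.
merges = [("B", "e"), ("Be", "r"), ("Ber", "l"), ("Berl", "i"), ("Berli", "n")]
print(bpe_split("Berlin", merges))    # entity covered by the merges: one unit
print(bpe_split("Berliner", merges))  # unseen entity: shared sub-units
```

Because both the sentence side and the KG side pass through the same merge table, a sub-entity unit such as "Berlin" is guaranteed to exist in the NMT vocabulary, which is what closes the granularity gap between the two tasks.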


