Extracting Multiple-Relations in One-Pass with Pre-Trained Transformers

Annual Meeting of the Association for Computational Linguistics


Abstract

State-of-the-art solutions for extracting multiple entity relations from an input paragraph require multiple encoding passes over the input. This paper proposes a new solution that completes the multiple entity-relation extraction task with only one encoding pass over the input corpus and achieves new state-of-the-art accuracy, as demonstrated on the ACE 2005 benchmark. Our solution is built on top of pre-trained self-attentive models (Transformers). Since our method uses a single pass to compute all relations at once, it scales easily to larger datasets, which makes it more usable in real-world applications.
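The abstract only sketches the approach, so the snippet below is a rough illustration of the one-pass idea, not the authors' implementation: encode the paragraph once with a pretrained BERT from the Hugging Face transformers library, then classify every entity pair from that single encoding. The bilinear scorer, the NUM_RELATIONS constant, and the extract_relations helper are hypothetical stand-ins; the paper's actual prediction layer may differ.

```python
# Minimal sketch of one-pass multi-relation extraction, assuming gold entity
# mentions (as in the standard ACE 2005 setting) and a BERT-style encoder.
import itertools

import torch
from transformers import BertModel, BertTokenizerFast

NUM_RELATIONS = 7  # illustrative: some relation types plus a "no relation" class

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")
# Hypothetical pairwise scorer, a stand-in for the paper's prediction layer.
scorer = torch.nn.Bilinear(encoder.config.hidden_size,
                           encoder.config.hidden_size,
                           NUM_RELATIONS)

def extract_relations(paragraph, entity_char_spans):
    """Encode the paragraph ONCE, then classify every entity pair.

    entity_char_spans: list of (start, end) character offsets, one per
    given entity mention.
    """
    enc = tokenizer(paragraph, return_offsets_mapping=True, return_tensors="pt")
    offsets = enc.pop("offset_mapping")[0]            # (seq_len, 2) char offsets
    with torch.no_grad():
        hidden = encoder(**enc).last_hidden_state[0]  # single pass over the input

    # Mean-pool the token vectors that fall inside each entity span.
    entity_vecs = []
    for start, end in entity_char_spans:
        in_span = (offsets[:, 0] >= start) & (offsets[:, 1] <= end) \
                  & (offsets[:, 1] > offsets[:, 0])   # excludes special tokens
        entity_vecs.append(hidden[in_span].mean(dim=0))

    # Every ordered (head, tail) pair is scored from the same encoding;
    # the input is never re-encoded per pair.
    predictions = {}
    for i, j in itertools.permutations(range(len(entity_vecs)), 2):
        logits = scorer(entity_vecs[i].unsqueeze(0), entity_vecs[j].unsqueeze(0))
        predictions[(i, j)] = int(logits.argmax())
    return predictions
```

Because the encoder runs once per paragraph rather than once per entity pair, adding more pairs only costs the cheap pairwise scoring step, which is the scaling behavior the abstract claims.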
