Venue: Annual Meeting of the Association for Computational Linguistics

Extracting Multiple-Relations in One-Pass with Pre-Trained Transformers



Abstract

State-of-the-art solutions for extracting multiple entity-relations from an input paragraph require multiple encoding passes over the input. This paper proposes a new solution that completes the multiple entity-relation extraction task with only a single encoding pass over the input corpus, and achieves new state-of-the-art accuracy on the ACE 2005 benchmark. Our solution is built on top of pre-trained self-attentive models (Transformers). Because our method computes all relations in a single pass, it scales easily to larger datasets, making it more usable in real-world applications.
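The core idea can be sketched in a few lines: encode the paragraph once with self-attention, then score every ordered entity pair from the shared token encodings, so the number of encoder runs does not grow with the number of entity pairs. The following is a minimal illustrative sketch with random toy weights, not the paper's actual model (which builds on a pre-trained Transformer); all names, dimensions, and spans here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H, Wq, Wk, Wv):
    """One self-attention layer: every token attends over the whole input."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[1]))
    return A @ V

# Toy setup: 20 tokens, hidden size 16, 5 relation labels (all assumed).
seq_len, d, n_rel = 20, 16, 5
H = rng.standard_normal((seq_len, d))            # toy token embeddings
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
W_rel = rng.standard_normal((2 * d, n_rel))      # pair-wise relation scorer

# One-pass encoding of the whole paragraph.
H_enc = self_attention(H, Wq, Wk, Wv)

# Mean-pool each entity span, then score every ordered entity pair
# from the single shared encoding -- no re-encoding per pair.
spans = [(2, 4), (7, 9), (15, 17)]               # three entity mentions
ents = [H_enc[s:e].mean(axis=0) for s, e in spans]
scores = {(i, j): np.concatenate([ents[i], ents[j]]) @ W_rel
          for i in range(len(ents)) for j in range(len(ents)) if i != j}
print(len(scores))  # 6 ordered pairs, all scored after one encoder pass
```

With a multi-pass approach, each of the 6 entity pairs would trigger its own encoder run; here the quadratic pair enumeration happens only in the cheap scoring step.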

