
Enhanced prototypical network for few-shot relation extraction


Abstract

Most existing methods for relation extraction depend heavily on large-scale annotated data; they cannot learn from existing knowledge and generalize poorly. Further developing few-shot learning methods is therefore an urgent way to address these problems. Because the most commonly used CNN model is poor at sequence labeling and at capturing long-range dependencies, we propose a novel model that integrates the Transformer into a prototypical network for more powerful relation-level feature extraction. The Transformer connects tokens directly, so it adapts to long-sequence learning without catastrophic forgetting, and it obtains richer semantic information by learning from several representation subspaces in parallel for each word. We evaluate our method on three tasks: in-domain, cross-domain, and cross-sentence relation extraction. Our method achieves a trade-off between performance and computation, improving on the state-of-the-art prototypical network by approximately 8% across different settings. Our experiments also show that the approach is competitive for cross-domain transfer and cross-sentence relation extraction among few-shot learning methods.
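The core mechanism described in the abstract can be sketched compactly. The snippet below is a minimal, illustrative PyTorch sketch, not the authors' released implementation: a Transformer encoder produces sentence representations, the K support instances of each relation are averaged into a prototype, and queries are scored by their distance to the prototypes. The class name, dimensions, mean pooling, and Euclidean distance are assumptions made for illustration.

```python
import torch
import torch.nn as nn

class ProtoTransformerRE(nn.Module):
    """Prototypical network with a Transformer sentence encoder (illustrative sketch)."""

    def __init__(self, vocab_size: int, d_model: int = 256,
                 n_heads: int = 8, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Multi-head self-attention connects every token to every other token,
        # learning from several representation subspaces in parallel.
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def encode(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> sentence embeddings: (batch, d_model)
        hidden = self.encoder(self.embed(token_ids))
        return hidden.mean(dim=1)  # mean pooling over tokens (an assumption)

    def forward(self, support_ids, support_labels, query_ids, n_way: int):
        # support_ids: (N*K, L); support_labels: (N*K,); query_ids: (Q, L)
        support = self.encode(support_ids)           # (N*K, d_model)
        queries = self.encode(query_ids)             # (Q, d_model)
        # Each relation's prototype is the mean of its K support embeddings.
        prototypes = torch.stack(
            [support[support_labels == c].mean(dim=0) for c in range(n_way)])
        # Negative squared Euclidean distance serves as classification logits.
        return -torch.cdist(queries, prototypes) ** 2   # (Q, n_way)
```

In training, episodes would be sampled N-way K-shot and a cross-entropy loss applied to the query logits; the paper's actual encoder depth, pooling strategy, and distance function may differ from this sketch.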

Bibliographic Details

  • Source
    Information Processing & Management | 2021, No. 4 | pp. 102596.1-102596.17 | 17 pages
  • Author Affiliations

    School of Computer, University of South China, Hunan, China;

    School of Computer, University of South China, Hunan, China;

    School of Computer, University of South China, Hunan, China; Hunan Provincial Base for Scientific and Technological Innovation Cooperation, Hunan, China;

    School of Computer, University of South China, Hunan, China;

    Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China;

  • Indexing Information
  • Original Format: PDF
  • Language: eng
  • CLC Classification
  • Keywords

    Few-shot learning; Transformer; Relation extraction;

