IEEE International Conference on Software Engineering and Service Science

A Few-shot Learning Method Based on Bidirectional Encoder Representation from Transformers for Relation Extraction


Abstract

Relation extraction is one of the fundamental subtasks of information extraction; its purpose is to determine the implicit relation between two entities in a sentence. For this task, Convolutional Neural Networks and Feature Attention-based Prototypical Networks (CNN-Proto-FATT), a typical few-shot learning method, has been proposed and achieves competitive performance. However, convolutional neural networks suffer from the insufficiency of relation instances in real-world scenarios, which leads to undesirable results. To extract long-distance features more comprehensively, the pre-trained model Bidirectional Encoder Representation from Transformers (BERT) is incorporated into CNN-Proto-FATT. In the resulting model, named Bidirectional Encoder Representation from Transformers and Feature Attention-based Prototypical Networks (BERT-Proto-FATT), multi-head attention helps the network extract semantic features across both long and short distances, enhancing the encoded representations. Experimental results indicate that BERT-Proto-FATT achieves significant improvements on the FewRel dataset.
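
The abstract describes a prototypical-network episode built on a BERT sentence encoder. The sketch below illustrates the general scheme under stated assumptions: a HuggingFace BERT model serves as the encoder, its [CLS] vector is the instance embedding, and class prototypes are plain support-set means. The paper's feature-level attention (FATT), which reweights prototype dimensions, is omitted here, and all names and the toy episode data are illustrative rather than the authors' implementation.

```python
# Minimal sketch of one N-way K-shot relation-classification episode,
# assuming a HuggingFace BERT encoder. Plain prototype averaging stands in
# for the paper's feature attention; names and toy data are illustrative.
import torch
from transformers import BertModel, BertTokenizer

class BertProtoEncoder(torch.nn.Module):
    """Encodes sentences with BERT; the [CLS] vector is the instance embedding."""
    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        return out.last_hidden_state[:, 0]  # [batch, hidden] CLS embeddings

def prototypes(support_emb, n_way, k_shot):
    # [n_way * k_shot, hidden] -> one mean prototype per relation class
    return support_emb.view(n_way, k_shot, -1).mean(dim=1)

def classify(query_emb, protos):
    # Negative squared Euclidean distance as the logit, as in prototypical networks
    return -torch.cdist(query_emb, protos).pow(2)  # [n_query, n_way]

# Usage: a toy 5-way 1-shot episode with placeholder sentences.
tok = BertTokenizer.from_pretrained("bert-base-uncased")
encoder = BertProtoEncoder()
support = ["Paris is the capital of France."] * 5   # one instance per class
query = ["Berlin is the capital of Germany."]
s = tok(support, padding=True, return_tensors="pt")
q = tok(query, padding=True, return_tensors="pt")
with torch.no_grad():
    protos = prototypes(encoder(s.input_ids, s.attention_mask), n_way=5, k_shot=1)
    logits = classify(encoder(q.input_ids, q.attention_mask), protos)
pred = logits.argmax(dim=-1)  # predicted relation class for the query
```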