Iranian Joint Congress on Fuzzy and Intelligent Systems

Attention-based Convolutional Neural Network for Answer Selection using BERT


Abstract

Question answering is at the heart of natural language processing and comprises two subtasks: reading comprehension and answer selection. Before deep learning, natural language processing solutions, including question answering, were based on statistical methods, with researchers hand-crafting sets of features from the text input. Answer selection is a fundamental task in question answering, and a difficult one because of the complicated semantic relations between questions and answers. Attention is a mechanism that has revolutionized the deep learning community, and leveraging pretrained language models has led to breakthroughs across most natural language processing tasks. BERT is one of the leading pretrained deep language models and has achieved state-of-the-art results on a wide range of NLP tasks. In this paper we utilize an attention-based convolutional neural network. First, we employ BERT, a state-of-the-art pre-trained contextual language model, as the embedding layer. Second, we enhance the model by adding further attentive features. We evaluate the performance of our model on the WikiQA dataset. Our experiments show that our model is superior to many other answer-selection models.
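To illustrate the kind of attention the abstract describes, here is a minimal NumPy sketch of attentive pooling over question and answer token representations. It is not the paper's implementation: the function names, the random embeddings standing in for BERT outputs, and the learned interaction matrix `U` (here random) are all illustrative assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_pool(Q, A, U):
    """Attention-based pooling of a question/answer pair.

    Q: (m, d) question token embeddings (e.g. BERT outputs)
    A: (n, d) answer token embeddings
    U: (d, d) interaction matrix (learned in a real model)
    Returns pooled (d,) vectors for question and answer.
    """
    # G[i, j]: soft alignment between question token i and answer token j
    G = np.tanh(Q @ U @ A.T)           # (m, n)
    q_att = softmax(G.max(axis=1))     # importance weight per question token
    a_att = softmax(G.max(axis=0))     # importance weight per answer token
    rq = Q.T @ q_att                   # attention-weighted question vector
    ra = A.T @ a_att                   # attention-weighted answer vector
    return rq, ra

def cosine(u, v):
    """Cosine similarity used to score the candidate answer."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Usage with random stand-in embeddings (5 question tokens, 7 answer
# tokens, hidden size 8); a real pipeline would take these from BERT.
rng = np.random.default_rng(0)
Q = rng.normal(size=(5, 8))
A = rng.normal(size=(7, 8))
U = rng.normal(size=(8, 8))
rq, ra = attentive_pool(Q, A, U)
score = cosine(rq, ra)
```

Candidate answers would then be ranked by `score` for each question, which is the usual evaluation setup on WikiQA.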
