IEEE International Conference on Software Engineering and Service Science

ELMo+Gated Self-attention Network Based on BiDAF for Machine Reading Comprehension



Abstract

Machine reading comprehension (MRC) has always been a significant part of artificial intelligence and a focus of the natural language processing (NLP) field. Given a context paragraph and a query about it, a model must encode the complex interaction between the question and the context. In recent years, with the rapid progress of neural network models and attention mechanisms, MRC has made great advances, and attention mechanisms in particular have been widely applied to MRC. However, the accuracy of earlier classic baseline models leaves room for improvement, and some of them do not account for long-range context dependence or polysemy. In this paper, to resolve these problems and further improve the model, we introduce ELMo representations and add a gated self-attention layer to the Bi-Directional Attention Flow network (BiDAF). In addition, we employ feature reuse and modify the linear function of the answer layer to further improve performance. In experiments on SQuAD, this model greatly exceeds the baseline BiDAF model, and its performance approaches the average level of human tests, demonstrating the validity of the model.
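The gated self-attention layer described above can be sketched as follows. This is a minimal NumPy illustration assuming single-head dot-product self-attention over the passage representation followed by a sigmoid gate (in the style of R-Net's gated self-matching); the function name, shapes, and gate parameterization are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_self_attention(H, Wg):
    """Gated self-attention over a context representation.

    H  : (T, d) contextual encoding of the passage (e.g. BiDAF modeling output)
    Wg : (2d, 2d) gate weight matrix (illustrative; the paper's exact
         parameterization may differ)
    Returns a (T, 2d) gated, self-matched representation.
    """
    # 1. Dot-product self-attention: each position attends to every position,
    #    letting the model capture long-range context dependence.
    scores = H @ H.T / np.sqrt(H.shape[1])      # (T, T) similarity scores
    A = softmax(scores, axis=-1) @ H            # (T, d) attended summary
    # 2. Concatenate the original and attended vectors.
    fused = np.concatenate([H, A], axis=-1)     # (T, 2d)
    # 3. A sigmoid gate controls how much of the fused signal passes through.
    gate = 1.0 / (1.0 + np.exp(-(fused @ Wg)))  # (T, 2d), values in (0, 1)
    return gate * fused

# Toy usage: T=5 context positions, hidden size d=4.
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 4))
Wg = rng.standard_normal((8, 8)) * 0.1
out = gated_self_attention(H, Wg)
print(out.shape)  # (5, 8)
```

Because the gate is elementwise in (0, 1), each output component is a damped copy of the fused vector, which lets the layer suppress irrelevant self-matched evidence per dimension.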
