International Joint Conference on Natural Language Processing

YNU-HPCC at IJCNLP-2017 Task 5: Multi-choice Question Answering in Exams Using an Attention-based LSTM Model



Abstract

This shared task is a typical question answering task that aims to test how accurately participants' systems can answer the questions in exams. Typically, each question has four candidate answers, only one of which is correct. Existing methods for such a task usually implement a recurrent neural network (RNN) or long short-term memory (LSTM). However, both RNNs and LSTMs are biased models, in which the words at the tail of a sentence are more dominant than the words at the head. In this paper, we propose the use of an attention-based LSTM (AT-LSTM) model for these tasks. By adding an attention mechanism to the standard LSTM, this model can more easily capture long-range contextual information. Our submission ranked first among 35 teams in terms of accuracy in the IJCNLP-2017 multi-choice question answering in exams task, across all datasets.
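The attention mechanism the abstract describes can be illustrated with a minimal NumPy sketch: score each LSTM hidden state, normalize the scores with a softmax, and pool the states into a single context vector, so early timesteps can contribute as much as late ones. The function name `attention_pool` and the tanh scoring vector are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of scores.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(hidden_states, w):
    """Attention pooling over LSTM hidden states.

    hidden_states: (T, d) array, one hidden vector per timestep.
    w: (d,) learned scoring vector (assumed form; a sketch, not the paper's exact layer).
    Returns the (d,) context vector and the (T,) attention weights.
    """
    scores = np.tanh(hidden_states @ w)      # one scalar score per timestep
    alpha = softmax(scores)                  # attention weights sum to 1
    context = alpha @ hidden_states          # weighted sum of hidden states
    return context, alpha

# Toy usage: 5 timesteps, hidden size 8.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))
w = rng.normal(size=8)
context, alpha = attention_pool(H, w)
```

In a full model, `context` would be concatenated with or fed into the answer-scoring layer for each of the four candidates; here it simply demonstrates that the pooled representation weights all timesteps, not just the final one.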
