International Joint Conference on Natural Language Processing

YNU-HPCC at IJCNLP-2017 Task 5: Multi-choice Question Answering in Exams Using an Attention-based LSTM Model



Abstract

This shared task is a typical question answering task that tests how accurately participating systems can answer exam questions. Typically, each question has four candidate answers, of which exactly one is correct. Existing methods for such a task usually implement a recurrent neural network (RNN) or long short-term memory (LSTM). However, both RNNs and LSTMs are biased models in which the words at the tail of a sentence dominate the words at the head. In this paper, we propose an attention-based LSTM (AT-LSTM) model for these tasks. By adding an attention mechanism to the standard LSTM, this model can more easily capture long-range contextual information. Our submission ranked first among 35 teams in accuracy on all datasets of the IJCNLP-2017 multi-choice question answering in exams task.
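The abstract only sketches the architecture, so the following is a minimal illustration of the attention-over-LSTM idea it describes: rather than scoring a question-answer pair from the LSTM's final hidden state alone, attention weights are computed over all time steps and their weighted sum is used, so words at the head of the sentence are not drowned out by words at the tail. This is a hypothetical PyTorch sketch under our own assumptions, not the authors' released code; all names (ATLSTMScorer, encode, the dimensions) are illustrative.

    # Minimal sketch of an attention-based LSTM (AT-LSTM) scorer for
    # multi-choice QA. Assumption: each (question, candidate answer) pair
    # is encoded as one token-id sequence and scored independently.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ATLSTMScorer(nn.Module):
        def __init__(self, vocab_size, embed_dim=100, hidden_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            # Attention: score each time step, then softmax-normalize.
            self.attn = nn.Linear(hidden_dim, 1)
            self.out = nn.Linear(hidden_dim, 1)

        def forward(self, token_ids):
            # token_ids: (batch, seq_len) for question + candidate answer
            h, _ = self.lstm(self.embed(token_ids))           # (batch, seq, hidden)
            weights = F.softmax(self.attn(h).squeeze(-1), 1)  # (batch, seq)
            # Weighted sum over all time steps, instead of only the last
            # hidden state, so early words keep their influence.
            context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)
            return self.out(context).squeeze(-1)              # plausibility score

    # Usage sketch: score all four candidates and pick the argmax.
    # scores = torch.stack([model(encode(question, a)) for a in candidates])
    # prediction = scores.argmax()

At prediction time the model scores each of the four candidates and selects the highest-scoring one, which matches the task's single-correct-answer setup.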
