Two layers LSTM with attention for multi-choice question answering in exams

International Conference on Functional Materials and Chemical Engineering


Abstract

Question Answering in Exams is a typical question answering task that aims to test how accurately a model can answer exam questions. In this paper, we use a general deep learning model to solve the multi-choice question answering task. Our approach builds distributed word embeddings of questions and answers instead of relying on manually extracted features or linguistic tools; to improve accuracy, an external corpus is also introduced. The framework uses a two-layer LSTM with attention, which achieves a significant result. For comparison, we introduce the simple long short-term memory model (QA-LSTM), the QA-LSTM-CNN model, and the QA-LSTM with attention model as references. Experiments demonstrate the superior performance of the two-layer LSTM with attention compared to the other models on the question answering task.
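
The abstract only names the architecture, so the following is a minimal sketch of how a two-layer LSTM with attention could score a candidate answer against a question, in the spirit of the QA-LSTM family mentioned above. The framework choice (PyTorch), the dimensions, the max-pooled question representation, and the cosine-similarity scoring rule are illustrative assumptions, not the authors' exact implementation.

```python
# Hypothetical sketch of a two-layer LSTM with attention for multi-choice
# answer selection. Dimensions and the scoring rule are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoLayerLSTMAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=150):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)           # distributed word embeddings
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers=2,   # two stacked LSTM layers
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(4 * hidden_dim, 1)                   # scores answer tokens w.r.t. the question

    def encode(self, tokens):
        out, _ = self.lstm(self.embed(tokens))                     # (batch, seq, 2*hidden)
        return out

    def forward(self, question, answer):
        q = self.encode(question).max(dim=1).values                # max-pooled question vector
        a = self.encode(answer)                                    # answer token states
        q_exp = q.unsqueeze(1).expand(-1, a.size(1), -1)
        weights = F.softmax(
            self.attn(torch.cat([a, q_exp], dim=-1)).squeeze(-1), dim=1)
        a_vec = (weights.unsqueeze(-1) * a).sum(dim=1)             # attention-weighted answer vector
        return F.cosine_similarity(q, a_vec, dim=-1)               # relevance score for this candidate

# Usage: score each candidate answer and predict the argmax.
# model = TwoLayerLSTMAttention(vocab_size=50000)
# scores = torch.stack([model(q_ids, a_ids) for a_ids in candidates], dim=1)
# prediction = scores.argmax(dim=1)
```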