China National Conference on Computational Linguistics

Recognizing Textual Entailment via Multi-task Knowledge Assisted LSTM


Abstract

Recognizing Textual Entailment (RTE) plays an important role in NLP applications such as question answering and information retrieval. Most previous works either use classifiers over elaborately designed features and lexical similarity, or bring distant supervision and reasoning techniques into the RTE task. However, these approaches are hard to generalize due to the complexity of feature engineering, and they are prone to cascading errors and data sparsity problems. To alleviate these problems, some works use an LSTM-based recurrent neural network with word-by-word attention to recognize textual entailment. Nevertheless, these works do not make full use of a knowledge base (KB) to help reasoning. In this paper, we propose a deep neural network architecture called Multi-task Knowledge Assisted LSTM (MKAL), which aims to conduct implicit inference with the assistance of a KB and uses predicate-to-predicate attention to detect entailment between predicates. In addition, our model applies a multi-task architecture to further improve performance. Experimental results show that our proposed method achieves competitive results compared to previous work.
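
To make the attention-based LSTM baseline that the abstract builds on concrete, here is a minimal PyTorch sketch of an LSTM sentence-pair encoder with word-by-word attention. This is an illustrative assumption, not the authors' MKAL implementation: the predicate-to-predicate attention, KB-assisted inference, and multi-task heads described in the paper are omitted, and all class names, layer choices, and dimensions (AttentionEntailmentLSTM, vocab_size, hidden_dim, etc.) are hypothetical.

import torch
import torch.nn as nn

class AttentionEntailmentLSTM(nn.Module):
    """Illustrative LSTM pair encoder with word-by-word attention (not the paper's code)."""
    def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=100, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.premise_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.hypothesis_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, premise_ids, hypothesis_ids):
        p, _ = self.premise_lstm(self.embed(premise_ids))        # (B, Lp, H)
        h, _ = self.hypothesis_lstm(self.embed(hypothesis_ids))  # (B, Lh, H)
        # Dot-product scores between every premise/hypothesis hidden-state pair.
        scores = torch.einsum('bik,bjk->bij', p, h)              # (B, Lp, Lh)
        alpha = scores.softmax(dim=1)                            # weights over premise words
        # For each hypothesis word, an attention-weighted summary of the premise.
        context = torch.einsum('bij,bik->bjk', alpha, p)         # (B, Lh, H)
        # Combine the matched context at the final hypothesis step with its state.
        features = torch.cat([context[:, -1], h[:, -1]], dim=-1)
        return self.classifier(features)                         # logits: entail/contradict/neutral

# Toy usage with random token ids (batch of 2 sentence pairs).
model = AttentionEntailmentLSTM()
premise = torch.randint(0, 10000, (2, 12))
hypothesis = torch.randint(0, 10000, (2, 7))
logits = model(premise, hypothesis)   # shape: (2, 3)

MKAL would extend such a baseline by attending between extracted predicates rather than raw words, injecting KB facts to support implicit inference, and sharing the encoder across auxiliary tasks.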