International Conference on Advanced Computational Intelligence

Sentence representation and classification using attention and additional language information



Abstract

Sentence representation is one of the foundational tasks in natural language processing, and long short-term memory (LSTM) networks are a widely used tool for handling variable-length sentences. In this paper, a new LSTM-based sentence representation model is proposed for the sentence classification task. By introducing a self-supervised method into the learning of the sentence's hidden representations, the proposed model automatically captures syntactic and semantic information from the context and uses it as additional language information to learn better contextual hidden representations. Moreover, instead of using the final hidden representation of the LSTM, or the max (or average) pooling of the hidden representations over all time steps, we propose to generate the global representation of the sentence by combining all contextual hidden representations in an element-wise attention manner. We evaluate our model on three sentence classification tasks: sentiment classification, question type classification, and subjectivity classification. Experimental results show that the proposed model improves classification accuracy over other sentence representation methods on all three tasks.
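The element-wise attention pooling described in the abstract can be sketched as follows: each hidden dimension gets its own attention distribution over the time steps, rather than a single scalar weight per step. This is a minimal NumPy sketch, not the paper's exact formulation; the score parameters `W` and `b` are illustrative assumptions standing in for whatever learned scoring function the model uses.

```python
import numpy as np

rng = np.random.default_rng(0)

def elementwise_attention_pool(H, W, b):
    """Combine contextual hidden states H (T, d) into one sentence vector.

    Element-wise attention: the score matrix has the same shape as H,
    and the softmax is taken over the time axis separately for each
    of the d hidden dimensions.
    """
    scores = H @ W + b                  # (T, d) unnormalised scores
    scores = scores - scores.max(axis=0)  # stabilise the softmax
    A = np.exp(scores)
    A /= A.sum(axis=0, keepdims=True)   # per-dimension softmax over time
    return (A * H).sum(axis=0)          # (d,) global sentence representation

T, d = 5, 8                             # toy sequence length and hidden size
H = rng.standard_normal((T, d))         # stand-in for LSTM hidden states
W = rng.standard_normal((d, d))         # hypothetical score parameters
b = np.zeros(d)
s = elementwise_attention_pool(H, W, b)
print(s.shape)  # (8,)
```

With scalar (per-time-step) attention the output would instead be a weighted sum with one weight per step; the element-wise variant lets different dimensions attend to different positions in the sentence.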
