Journal of Intelligent & Fuzzy Systems: Applications in Engineering and Technology

Attention-based LSTM, GRU and CNN for short text classification



Abstract

Text classification is a fundamental task in Natural Language Processing (NLP). However, given the challenge of complex semantic information, how to extract useful features becomes a critical issue. Unlike traditional methods, we propose a new model based on a two-parallel-RNN architecture, which captures context information through an LSTM and a GRU respectively and simultaneously. Motivated by the Siamese network, the proposed architecture generates an attention matrix by calculating the similarity between the two parallel context representations, which ensures the effectiveness of the extracted features and further improves classification results. We evaluate the proposed model on six text classification tasks. The experimental results show that the ABLGCNN model proposed in this paper converges faster and achieves higher precision than competing models.
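The abstract does not give the exact similarity function used to build the attention matrix, but the core idea (re-weighting one branch's context features by their similarity to the other branch's) can be sketched as follows. This is a minimal NumPy illustration assuming dot-product similarity followed by a row-wise softmax; the function name `similarity_attention` and the use of pre-computed hidden states are illustrative, not the paper's implementation.

```python
import numpy as np


def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def similarity_attention(h_lstm, h_gru):
    """Attention matrix from the similarity of two parallel context
    representations, each of shape [T, d] (T time steps, d hidden units).

    Assumption: similarity is a plain dot product, normalized by softmax
    so that each row of the attention matrix sums to 1.
    """
    scores = h_lstm @ h_gru.T        # [T, T] pairwise similarities
    attn = softmax(scores, axis=-1)  # attention matrix, rows sum to 1
    fused = attn @ h_gru             # GRU features re-weighted by attention
    return attn, fused


# Toy example: sequence length 5, hidden size 8 (stand-ins for real RNN outputs).
rng = np.random.default_rng(0)
h_lstm = rng.standard_normal((5, 8))
h_gru = rng.standard_normal((5, 8))
attn, fused = similarity_attention(h_lstm, h_gru)
print(attn.shape, fused.shape)  # (5, 5) (5, 8)
```

In the full model the fused, attention-weighted features would then feed a CNN and classifier layer, per the ABLGCNN name.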
