IEEE International Conference on Software Engineering and Service Science

Recurrent Graph Neural Networks for Text Classification


Abstract

Text classification is an essential and classical problem in natural language processing. Traditional text classifiers often rely on many hand-crafted features. With the rise of deep learning, Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs) have been widely applied to text classification. Meanwhile, the success of Graph Neural Networks (GNNs) on structured data has attracted many researchers to apply GNNs to traditional NLP tasks. However, when these methods use a GNN, they commonly ignore the word-order information of the sentence. In this work, we propose a model that uses a recurrent structure to capture as much contextual information as possible when learning word representations, which preserves word-order information compared with GNN-based networks. We then use the idea of GNN message passing to aggregate the contextual information and update the hidden representation of each word. Analogous to a GNN readout operation, we employ a max-pooling layer that automatically judges which words play key roles in text classification, capturing the critical components of the texts. We conduct experiments on four widely used datasets, and the results show that our model achieves significant improvements over both RNN-based and GNN-based models.
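The pipeline the abstract describes (recurrent context encoding that preserves word order, GNN-style message passing to aggregate and update word representations, and a max-pooling readout) can be sketched minimally in NumPy. This is a hypothetical illustration, not the paper's implementation: the dimensions, the tanh update, and the chain-graph adjacency over adjacent words are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper does not specify these values).
seq_len, emb_dim, hid_dim, n_classes = 5, 8, 8, 2

x = rng.standard_normal((seq_len, emb_dim))       # word embeddings, in order

# 1) Recurrent pass: encode left-to-right context, preserving word order.
W_in = rng.standard_normal((emb_dim, hid_dim)) * 0.1
W_rec = rng.standard_normal((hid_dim, hid_dim)) * 0.1
h = np.zeros((seq_len, hid_dim))
prev = np.zeros(hid_dim)
for t in range(seq_len):
    prev = np.tanh(x[t] @ W_in + prev @ W_rec)
    h[t] = prev

# 2) GNN-style message passing: aggregate neighbours' hidden states over a
#    word graph (here simply adjacent words) and update each word.
A = np.eye(seq_len, k=1) + np.eye(seq_len, k=-1)  # chain-graph adjacency
W_msg = rng.standard_normal((hid_dim, hid_dim)) * 0.1
h = np.tanh(h + (A @ h) @ W_msg)                  # aggregate + update

# 3) Readout: max-pool over words, so the strongest features per dimension
#    (the "key words") decide the document representation.
doc = h.max(axis=0)

# 4) Linear classifier on the pooled document vector.
W_out = rng.standard_normal((hid_dim, n_classes)) * 0.1
logits = doc @ W_out
pred = int(np.argmax(logits))
```

The max over the word axis is what lets the model pick out which words contribute the dominant features, mirroring the readout role described in the abstract.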

