Text Level Graph Neural Network for Text Classification

Abstract

Recently, researchers have explored graph neural network (GNN) techniques for text classification, since GNNs handle complex structures well and preserve global information. However, previous GNN-based methods face two practical problems: a fixed corpus-level graph structure that does not support online testing, and high memory consumption. To tackle these problems, we propose a new GNN-based model that builds a graph for each input text with globally shared parameters, instead of a single graph for the whole corpus. This method removes the dependence between an individual text and the entire corpus, which supports online testing while still preserving global information. Besides, we build graphs with much smaller windows in the text, which not only extracts more local features but also significantly reduces the number of edges and the memory consumption. Experiments show that our model outperforms existing models on several text classification datasets while consuming less memory.
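The core construction the abstract describes is building a separate graph per input text, with edges only between words that co-occur within a small sliding window. A minimal sketch of that window-based graph construction is below; the function and parameter names are illustrative assumptions, not the authors' implementation, which additionally shares node and edge parameters globally across texts.

```python
def build_text_graph(tokens, window=2):
    """Build a per-text graph: nodes are the unique tokens of this text,
    and an undirected edge connects two tokens whenever they co-occur
    within `window` positions of each other. Keeping `window` small
    limits the number of edges and thus memory use."""
    nodes = sorted(set(tokens))
    index = {tok: i for i, tok in enumerate(nodes)}
    edges = set()
    for i, tok in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if i != j:
                a, b = index[tok], index[tokens[j]]
                # store each undirected edge once, as an ordered pair
                edges.add((min(a, b), max(a, b)))
    return nodes, sorted(edges)

# Example: a short text yields a small graph independent of any corpus.
nodes, edges = build_text_graph("the cat sat on the mat".split(), window=2)
```

Because the graph depends only on the input text itself, a new document at test time can be converted to a graph online, without rebuilding a corpus-level structure.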

