Graph convolutional neural networks have been applied to a variety of natural language processing (NLP) problems. However, few studies have employed graph convolutional networks for text classification, especially short-text classification. In this work, a special text graph is constructed over the short-text corpus, and a short-text graph convolutional network (STGCN) is developed on it. Specifically, topic models designed for short text are employed, and a short-text graph is built from word co-occurrence, document-word relations, and text topic information. The word and sentence representations produced by the STGCN are used as classification features. In addition, pre-trained word vectors obtained from BERT's hidden layers are employed, which greatly improves the classification performance of our model. Experimental results show that our model outperforms state-of-the-art models on multiple short-text datasets.
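As a rough illustration of the graph-construction step the abstract describes, the sketch below builds a heterogeneous text graph in the style of TextGCN: word-word edges weighted by positive PMI over sliding windows and document-word edges weighted by TF-IDF, plus self-loops. The toy corpus, window size, and the PMI/TF-IDF weighting are assumptions for illustration; the paper's actual construction additionally incorporates topic-model edges, which are omitted here.

```python
import math
from collections import Counter

# Toy short-text corpus (hypothetical example data).
docs = [
    "cheap flight deals",
    "flight delay news",
    "cheap hotel deals",
]

tokenized = [d.split() for d in docs]
vocab = sorted({w for doc in tokenized for w in doc})
word_id = {w: i for i, w in enumerate(vocab)}
n_docs, n_words = len(docs), len(vocab)
n_nodes = n_docs + n_words  # document nodes first, then word nodes
adj = [[0.0] * n_nodes for _ in range(n_nodes)]

# Word-word edges: positive PMI over sliding windows (window size 2, assumed).
window = 2
windows = []
for doc in tokenized:
    if len(doc) <= window:
        windows.append(doc)
    else:
        windows.extend(doc[i:i + window] for i in range(len(doc) - window + 1))

word_count, pair_count = Counter(), Counter()
for win in windows:
    uniq = sorted(set(win))
    word_count.update(uniq)
    for i, a in enumerate(uniq):
        for b in uniq[i + 1:]:
            pair_count[(a, b)] += 1

n_win = len(windows)
for (a, b), c in pair_count.items():
    # PMI = log( p(a,b) / (p(a) p(b)) ), estimated from window counts.
    pmi = math.log(c * n_win / (word_count[a] * word_count[b]))
    if pmi > 0:  # keep only positively associated word pairs
        i, j = n_docs + word_id[a], n_docs + word_id[b]
        adj[i][j] = adj[j][i] = pmi

# Document-word edges: TF-IDF of the word in the document.
df = Counter(w for doc in tokenized for w in set(doc))
for d, doc in enumerate(tokenized):
    for w, c in Counter(doc).items():
        weight = (c / len(doc)) * math.log(n_docs / df[w])
        adj[d][n_docs + word_id[w]] = adj[n_docs + word_id[w]][d] = weight

# Self-loops, i.e. A~ = A + I, before the usual GCN normalization.
for i in range(n_nodes):
    adj[i][i] = 1.0
```

The resulting symmetric adjacency matrix would then be degree-normalized and fed to GCN layers that propagate information between document and word nodes.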