Journal of Chinese Information Processing (《中文信息学报》)

Localized Bidirectional Long Short-Term Memory for Text Classification


Abstract

Deep learning has been increasingly applied to natural language processing in recent years. Models such as recurrent neural networks (RNNs) have been proposed to build text representations and to solve tasks such as text classification. Long short-term memory (LSTM) is a kind of RNN with specially designed neural cells. An LSTM takes the word sequence of a sentence as input, scans over the whole sequence, and outputs a representation of the entire sentence. However, the customary practice is to feed only the final representation, obtained after the whole sentence has been scanned, into the classifier, ignoring all intermediate representations produced during scanning. This practice cannot efficiently capture local text features, which are often crucial for determining a document's class. To address this problem, this paper proposes localized bidirectional LSTM models, including MaxBiLSTM and ConvBiLSTM. MaxBiLSTM applies max pooling directly to the intermediate representations of a bidirectional LSTM; ConvBiLSTM first applies a convolution to the intermediate representations and then max pooling. Experiments on two public text classification datasets show that localized bidirectional LSTM, and ConvBiLSTM in particular, consistently outperforms (bidirectional) LSTM and achieves state-of-the-art results.
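The two pooling schemes described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: it assumes the bidirectional LSTM's intermediate states have already been computed and stacked into a matrix `H` of shape (T, 2d) (T time steps, forward and backward states concatenated); the filter bank `W`, bias `b`, and the `tanh` nonlinearity in the convolutional variant are illustrative assumptions.

```python
import numpy as np

def max_bilstm_pool(H):
    """MaxBiLSTM: element-wise max over the T intermediate BiLSTM
    states H (shape T x 2d), yielding one sentence vector (2d,)."""
    return H.max(axis=0)

def conv_bilstm_pool(H, W, b):
    """ConvBiLSTM: slide a width-k filter bank over consecutive
    intermediate states, apply a nonlinearity, then max-pool over time.
    H: (T, 2d);  W: (k*2d, m) filters;  b: (m,)  ->  (m,) vector."""
    T, h = H.shape
    k = W.shape[0] // h  # filter width in time steps
    # Each window concatenates k consecutive states into one row.
    windows = np.stack([H[t:t + k].ravel() for t in range(T - k + 1)])
    feature_maps = np.tanh(windows @ W + b)  # (T-k+1, m)
    return feature_maps.max(axis=0)          # max pooling over time

# Toy example: 3 time steps, 2-dimensional (concatenated) states.
H = np.array([[1.0, -2.0],
              [3.0,  0.0],
              [-1.0, 4.0]])
print(max_bilstm_pool(H))                      # per-dimension max over time
W = np.eye(4)[:, :1]                           # one width-2 filter (4 = k*2d)
print(conv_bilstm_pool(H, W, np.zeros(1)))
```

The key difference is visible in the shapes: MaxBiLSTM pools each hidden dimension independently, while ConvBiLSTM first mixes a window of k adjacent states through learned filters, so its pooled features respond to local n-gram-like patterns rather than single positions.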
