Computational intelligence and neuroscience

Leveraging Contextual Sentences for Text Classification by Using a Neural Attention Model



Abstract

We explored several approaches to incorporating context information into a deep learning framework for text classification, including designing different attention mechanisms based on different neural networks and extracting additional features from the text with traditional methods as part of the representation. We propose two classification algorithms: one based on a convolutional neural network that fuses context information, and the other based on a bidirectional long short-term memory network. We integrate the context information into the final feature representation by designing attention structures at the sentence level and the word level, which increases the diversity of the feature information. Our experimental results on two datasets validate the advantages of the two models in terms of time efficiency and accuracy compared to different models with fundamental attention mechanism (AM) architectures.
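The abstract describes word-level attention structures built on top of a bidirectional LSTM to pool a sentence into a single context-aware vector. The following is a minimal illustrative sketch of that general idea in PyTorch; the class name WordAttention, the dimensions, and all hyperparameters are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch (not the authors' released code): word-level attention
# over BiLSTM outputs, pooling each sentence into one context-aware vector.
import torch
import torch.nn as nn


class WordAttention(nn.Module):
    """Scores each word's BiLSTM state and returns a weighted sentence vector."""

    def __init__(self, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              bidirectional=True, batch_first=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)   # one attention score per word

    def forward(self, x):                          # x: (batch, seq_len, embed_dim)
        h, _ = self.bilstm(x)                      # h: (batch, seq_len, 2*hidden_dim)
        scores = self.attn(torch.tanh(h))          # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)     # normalize over words
        return (weights * h).sum(dim=1)            # (batch, 2*hidden_dim) sentence vector


if __name__ == "__main__":
    model = WordAttention()
    dummy = torch.randn(4, 20, 100)                # 4 sentences, 20 words, 100-d embeddings
    print(model(dummy).shape)                      # torch.Size([4, 256])
```

A sentence-level attention layer of the same shape could then be applied over the resulting sentence vectors to weight contextual sentences before the final classifier.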
