Intelligent Data Analysis

An attention-gated convolutional neural network for sentence classification



Abstract

Sentence classification is very challenging, since sentences contain only limited contextual information. In this paper, we propose an Attention-Gated Convolutional Neural Network (AGCNN) for sentence classification, which generates attention weights from feature context windows of different sizes by using specialized convolution encoders. This makes full use of the limited contextual information to extract and enhance the influence of important features when predicting a sentence's category. Experimental results demonstrate that our model achieves up to 3.1% higher accuracy than standard CNN models and obtains competitive results against the baselines on four out of six tasks. In addition, we design an activation function, the Natural Logarithm rescaled Rectified Linear Unit (NLReLU). Experiments show that NLReLU outperforms ReLU and is comparable to other well-known activation functions on AGCNN.
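The abstract names two concrete components: an attention-gated convolution, in which dedicated convolution encoders look at context windows of different sizes and produce attention weights that rescale the n-gram feature maps, and the NLReLU activation. Below is a minimal PyTorch sketch of how such a block could be wired together. It is not the authors' implementation: the names `AttentionGatedConv` and `nlrelu`, the kernel sizes, the sigmoid gating, and the assumed NLReLU form `beta * ln(ReLU(x) + 1)` are illustrative assumptions made here for clarity.

```python
# Sketch of an attention-gated convolution block and NLReLU (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F


def nlrelu(x: torch.Tensor, beta: float = 1.0) -> torch.Tensor:
    """Natural-Logarithm-rescaled ReLU (assumed form): beta * ln(relu(x) + 1)."""
    return beta * torch.log(F.relu(x) + 1.0)


class AttentionGatedConv(nn.Module):
    """One attention-gated convolution block (sketch).

    A standard convolution extracts n-gram features; a second "attention
    encoder" convolution reads context windows of a (possibly different)
    size and produces per-position gates in [0, 1] that rescale the features.
    """

    def __init__(self, embed_dim: int, num_filters: int,
                 feature_kernel: int = 3, attn_kernel: int = 5):
        super().__init__()
        self.feature_conv = nn.Conv1d(embed_dim, num_filters, feature_kernel,
                                      padding=feature_kernel // 2)
        self.attn_conv = nn.Conv1d(embed_dim, num_filters, attn_kernel,
                                   padding=attn_kernel // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, embed_dim, seq_len) -- word embeddings, channels first
        features = nlrelu(self.feature_conv(x))   # n-gram feature maps
        gates = torch.sigmoid(self.attn_conv(x))  # attention weights in [0, 1]
        return features * gates                   # gated feature maps


if __name__ == "__main__":
    batch, embed_dim, seq_len = 2, 50, 20
    block = AttentionGatedConv(embed_dim, num_filters=100)
    sentences = torch.randn(batch, embed_dim, seq_len)
    gated = block(sentences)                              # (2, 100, 20)
    logits = nn.Linear(100, 5)(gated.max(dim=2).values)   # max-pool over time, then classify
    print(gated.shape, logits.shape)
```

The key design point illustrated here is that the gates are computed from a wider context window than the feature convolution itself, so each extracted feature is weighted by information about its surroundings before pooling and classification.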
