IEEE/CAA Journal of Automatica Sinica

Convolutional multi-head self-attention on memory for aspect sentiment classification


Abstract

This paper presents a method for aspect-based sentiment classification, named the convolutional multi-head self-attention memory network (CMA-MemNet). It improves on memory networks, making it possible to extract richer and more complex semantic information from sequences and aspects. To address the memory network's inability to capture context-related information at the word level, we use convolution to capture n-gram grammatical information. Multi-head self-attention compensates for the memory network's neglect of the semantic information of the sequence itself. Meanwhile, unlike most recurrent neural network (RNN) models such as long short-term memory (LSTM) and gated recurrent unit (GRU), the network remains parallelizable. We experiment on the open datasets SemEval-2014 Task 4 and SemEval-2016 Task 6; compared with popular baseline methods, our model performs excellently.
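To make the two building blocks named in the abstract concrete, below is a minimal PyTorch sketch of a convolutional multi-head self-attention layer: a same-padded 1-D convolution summarizes n-gram windows of the word embeddings, and multi-head self-attention then relates all positions of the convolved memory in parallel. The class name, hyperparameters (300-d embeddings, 6 heads, kernel size 3), and residual/normalization choices are illustrative assumptions, not the paper's published CMA-MemNet implementation.

```python
import torch
import torch.nn as nn

class ConvMultiHeadSelfAttention(nn.Module):
    """Illustrative sketch (not the paper's code): convolution for
    n-gram features followed by multi-head self-attention on memory."""

    def __init__(self, embed_dim=300, num_heads=6, kernel_size=3):
        super().__init__()
        # Same-padded 1-D convolution: each output position summarizes
        # a kernel_size-gram window of the input word embeddings.
        self.conv = nn.Conv1d(embed_dim, embed_dim, kernel_size,
                              padding=kernel_size // 2)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads,
                                          batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, x):
        # x: (batch, seq_len, embed_dim) word embeddings.
        # Conv1d expects (batch, channels, seq_len), so transpose.
        m = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        # The convolved memory attends to itself; all positions are
        # processed in parallel, unlike an RNN's sequential recurrence.
        out, _ = self.attn(m, m, m)
        return self.norm(out + m)  # residual connection

# Toy usage: a batch of 2 sentences, 10 tokens, 300-d embeddings.
block = ConvMultiHeadSelfAttention()
memory = block(torch.randn(2, 10, 300))
print(memory.shape)  # torch.Size([2, 10, 300])
```

In a full aspect-level model, the output memory would typically be attended over again using the aspect representation as the query; this sketch only covers the sequence-encoding step the abstract describes.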
