ReMemNN: A novel memory neural network for powerful interaction in aspect-based sentiment analysis



Abstract

Deep neural networks have been employed to analyze the sentiment of text sequences and have achieved significant results. However, these models still face two issues: the weakness of pre-trained word embeddings, and the weak interaction between the specific aspect and its context in the attention mechanism. Pre-trained word embeddings lack semantic information specific to the context. The weak interaction results in poor attention weights and produces a limited aspect-dependent sentiment representation in aspect-based sentiment analysis (ABSA). In this paper, we propose a novel end-to-end memory neural network, termed Recurrent Memory Neural Network (ReMemNN), to mitigate the above-mentioned problems. In ReMemNN, to tackle the weakness of pre-trained word embeddings, a special module named the embedding adjustment learning module is designed to transform pre-trained word embeddings into adjusted word embeddings. To tackle the weak interaction in the attention mechanism, a multi-element attention mechanism is designed to generate powerful attention weights and a more precise aspect-dependent sentiment representation. In addition, an explicit memory module is designed to store these different representations and to generate hidden states and representations. Extensive experimental results on all datasets show that ReMemNN outperforms typical baselines and achieves state-of-the-art performance. These results also demonstrate that ReMemNN is language-independent and dataset-type-independent. (C) 2020 Elsevier B.V. All rights reserved.
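The abstract describes an aspect-conditioned attention step: score each context word against the aspect representation, normalize the scores into attention weights, and pool the context into an aspect-dependent sentiment representation. The paper's actual multi-element attention and memory modules are not specified here, so the following is only a minimal, generic sketch of dot-product aspect attention; the function and variable names are illustrative, not from the paper.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of raw scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def aspect_attention(context, aspect):
    """Score each context word vector against the aspect vector (dot product),
    normalize into attention weights, and pool the context accordingly.
    Returns (weights, aspect-dependent representation)."""
    scores = [sum(c * a for c, a in zip(word, aspect)) for word in context]
    weights = softmax(scores)
    dim = len(context[0])
    pooled = [sum(w * word[d] for w, word in zip(weights, context))
              for d in range(dim)]
    return weights, pooled

# Toy example: 3 context words with 4-dim embeddings, one aspect vector
# (e.g. the mean of the aspect term's word embeddings).
context = [[0.1, 0.3, -0.2, 0.5],
           [0.7, -0.1, 0.4, 0.0],
           [-0.3, 0.2, 0.1, 0.6]]
aspect = [0.2, 0.1, 0.0, 0.4]
weights, rep = aspect_attention(context, aspect)
print([round(w, 3) for w in weights])
```

In ReMemNN this kind of weighted pooling would be repeated over multiple hops, with the explicit memory module storing the intermediate representations; that recurrence is omitted in the sketch above.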

Bibliographic details

  • Source
    Neurocomputing, 2020, Issue 28, pp. 66-77 (12 pages)
  • Authors

    Liu Ning; Shen Bo;

  • Affiliations

    Beijing Jiaotong Univ, Sch Elect & Informat Engn, Beijing 100044, Peoples R China | Beijing Municipal Commiss Educ, Key Lab Commun & Informat Syst, Beijing, Peoples R China;

    Beijing Jiaotong Univ, Sch Elect & Informat Engn, Beijing 100044, Peoples R China | Beijing Municipal Commiss Educ, Key Lab Commun & Informat Syst, Beijing, Peoples R China;

  • Indexed in: Science Citation Index (SCI); Engineering Index (EI)
  • Format: PDF
  • Language: English
  • Keywords

    Aspect-based sentiment analysis; Natural language processing; Attention mechanism; Deep learning;

