Home > Foreign Journals > Multimedia Tools and Applications > Using recurrent neural network structure with Enhanced Multi-Head Self-Attention for sentiment analysis

Using recurrent neural network structure with Enhanced Multi-Head Self-Attention for sentiment analysis



Abstract

Sentiment analysis is the process of analyzing, processing, summarizing, and reasoning about subjective text that carries emotional coloring. It is a research direction within Natural Language Processing (NLP) and is often used to extract people's attitudes towards someone or something, which can help users identify potential problems to fix or outcomes to predict. As one of the main sources of online media data, film reviews are frequently used as datasets in sentiment analysis, and researchers have proposed many sentiment-analysis models for film-review data. Accuracy, precision, recall, and F1-score are the standard criteria for measuring the quality of such a model. To improve on these criteria, this paper proposes a new model that combines a bidirectional Long Short-Term Memory network (biLSTM) or a bidirectional Gated Recurrent Unit (biGRU) with an Enhanced Multi-Head Self-Attention mechanism. The Enhanced Multi-Head Self-Attention is a two-layer modified Transformer encoder in which the masking operation and the final feedforward layer are removed. In addition, the loss function of the new model is the sum of a weighted root mean square error (RMSE) and the cross-entropy loss; this combination improves the autoencoder's ability to reconstruct its input, which in turn improves classification accuracy. The proposed model is an autoencoder classification model: biLSTM or biGRU serve as the encoder and decoder at the two ends of the network, while Enhanced Multi-Head Self-Attention encodes inter-sentence information as the middle hidden layer. A four-layer autoencoder network is constructed to perform sentiment analysis on movie reviews, using the IMDB movie-comment dataset and the SST-2 sentiment dataset in the experiments.
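The abstract's "Enhanced Multi-Head Self-Attention" is a Transformer encoder with the masking operation removed, so every token can attend to every other token. As a rough illustration only, here is a minimal single-head, unmasked scaled dot-product self-attention in numpy; the paper's actual module is multi-head with learned query/key/value projections, which are omitted here for brevity:

```python
import numpy as np

def self_attention(x):
    """Unmasked scaled dot-product self-attention over rows of x.

    Single head, no learned Q/K/V projections -- a simplification of the
    paper's Enhanced Multi-Head Self-Attention, kept only to show the
    unmasked attention pattern (every position attends to every position).
    """
    d = x.shape[-1]
    # similarity of every token with every token, scaled by sqrt(d)
    scores = x @ x.T / np.sqrt(d)
    # row-wise softmax (numerically stabilized); no mask is applied
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # each output row is a convex combination of all input rows
    return weights @ x
```

Because no mask is applied, each output position mixes information from the entire sequence, which matches the abstract's use of the module to encode inter-sentence information in the middle hidden layer.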
Experimental results show that the proposed model outperforms the baseline models in accuracy, precision, recall, and F1-score. Comparing the two recurrent units within the model, biLSTM performs better than biGRU. Finally, Bidirectional Encoder Representations from Transformers (BERT) is used in place of word2vec as the pre-training structure, and the proposed model again outperforms the BERT-based baseline.
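The abstract defines the training objective as the sum of a weighted RMSE (the autoencoder's reconstruction error) and the cross-entropy loss (the classification error). A minimal sketch of that combined loss, assuming a single weighting factor `alpha` (the paper's exact weighting scheme is not given in the abstract):

```python
import math

def combined_loss(reconstruction, target, class_probs, true_class, alpha=0.5):
    """Weighted RMSE + cross-entropy, per the abstract's loss definition.

    `alpha` is a hypothetical weight on the reconstruction term; the
    abstract only says the RMSE is weighted, not how.
    """
    n = len(target)
    # reconstruction term: root mean square error between decoder output
    # and the original input
    rmse = math.sqrt(sum((r - t) ** 2 for r, t in zip(reconstruction, target)) / n)
    # classification term: cross-entropy of the predicted class distribution
    ce = -math.log(class_probs[true_class])
    return alpha * rmse + ce
```

A perfect reconstruction drives the first term to zero, leaving only the classification loss; pushing the reconstruction term down is what the abstract credits with improving the autoencoder's ability to reproduce its input and, through it, classification accuracy.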
