International Symposium on Advanced Parallel Processing Technologies

DA-BERT: Enhancing Part-of-Speech Tagging of Aspect Sentiment Analysis Using BERT


Abstract

With the development of the Internet, text-based data on the web have grown exponentially, and these data carry a large amount of valuable information. As a vital branch of sentiment analysis, aspect sentiment analysis of short social media text has attracted the interest of researchers. Aspect sentiment classification is a kind of fine-grained textual sentiment classification. Currently, the attention mechanism is mainly combined with RNN (Recurrent Neural Network) or LSTM (Long Short-Term Memory) networks. Such neural network-based sentiment analysis models not only have complicated computational structures but also suffer from sequential computational dependence. To address these problems and improve the accuracy of target-based sentiment classification for short text, we propose a neural network model that combines deep attention with Bidirectional Encoder Representations from Transformers (DA-BERT). The DA-BERT model can fully mine the relationships between target words and sentiment words in a sentence, and it requires neither syntactic analysis of sentences nor external knowledge such as a sentiment lexicon. The training speed of the proposed DA-BERT model is greatly improved because the computational dependencies of the RNN structure are removed. Compared with LSTM, TD-LSTM, TC-LSTM, AT-LSTM, ATAE-LSTM, and PAT-LSTM, experimental results on the SemEval2014 Task 4 dataset show that the accuracy of the DA-BERT model in aspect sentiment classification improves by 13.63% on average with 300-dimensional word vectors.
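The attention mechanism the abstract refers to, scoring each context word against a target (aspect) word to surface related sentiment words, can be illustrated with a minimal, self-contained sketch. This is not the authors' DA-BERT implementation; the function name, random embeddings, and sentence length are illustrative, with only the 300-dimensional word vectors taken from the paper's setup:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def aspect_attention(context, target):
    """Scaled dot-product attention of a target (aspect) vector over
    context-word vectors; returns the attention weights and the
    attention-weighted sentence representation."""
    d = context.shape[1]
    scores = context @ target / np.sqrt(d)   # one score per context word
    weights = softmax(scores)                # weights sum to 1
    return weights, weights @ context        # weighted sum of word vectors

rng = np.random.default_rng(0)
words = rng.normal(size=(6, 300))   # six context words, 300-d embeddings
aspect = rng.normal(size=300)       # embedding of the target/aspect word
w, rep = aspect_attention(words, aspect)
print(w.shape, rep.shape)           # (6,) attention weights, (300,) sentence vector
```

In DA-BERT this kind of target-aware weighting is layered on top of BERT's contextual representations rather than static embeddings, which is what lets the model relate target words and sentiment words without a sentiment lexicon.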
