Journal: ACM Transactions on Internet Technology

Emo2Vec: Learning Emotional Embeddings via Multi-Emotion Category



Abstract

Sentiment analysis, or opinion mining, which extracts subjective information from text, has become increasingly dependent on natural language processing, especially in business and healthcare, since online product and service reviews influence consumer behavior. Word embeddings, which map words to low-dimensional vector representations, are widely used in natural language processing tasks. However, context-based embeddings such as Word2Vec and GloVe fail to capture sentiment information. Most existing sentiment analysis methods incorporate emotional polarity (positive and negative) to improve sentiment embeddings for emotion classification. This article instead leverages a model from emotional psychology to learn emotional embeddings, first for Chinese. To combine the semantic space with an emotional space, we present two purifying models, one local (LPM) and one global (GPM), based on Plutchik's wheel of emotions, which add emotional information to word vectors. Both models refine the word vectors so that not only semantically similar words but also sentimentally similar words are placed closer together. Plutchik's wheel of emotions assigns each word an eight-dimensional vector in emotional space, which captures more sentiment information than binary polarity labels. The local purifying model has the advantage that it can be applied to any pretrained word embeddings, while the global purifying model yields the final emotional embeddings directly. Both models have been extended to handle English texts. Experimental results on Chinese and English datasets show that our purifying models improve conventional word embeddings, as well as several previously proposed sentiment embeddings, for sentiment classification and multi-emotion classification.
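The abstract only sketches the purifying idea at a high level. The snippet below is a minimal, hypothetical Python illustration of a local, retrofitting-style purification step under assumed inputs: the toy pretrained vectors, the toy Plutchik lexicon, the cosine threshold, and the convex-combination update are illustrative assumptions, not the paper's actual LPM objective.

```python
import numpy as np

# Hypothetical pretrained semantic vectors (stand-ins for Word2Vec/GloVe output).
pretrained = {
    "happy":  np.array([0.9, 0.1, 0.3]),
    "joyful": np.array([0.2, 0.8, 0.4]),
    "sad":    np.array([0.5, 0.5, 0.5]),
}

# Hypothetical eight-dimensional Plutchik scores per word, ordered as
# (joy, trust, fear, surprise, sadness, disgust, anger, anticipation).
plutchik = {
    "happy":  np.array([1., 0., 0., 0., 0., 0., 0., 0.]),
    "joyful": np.array([1., 0., 0., 0., 0., 0., 0., 0.]),
    "sad":    np.array([0., 0., 0., 0., 1., 0., 0., 0.]),
}

def cosine(u, v, eps=1e-8):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + eps))

def purify_local(pretrained, plutchik, alpha=0.3, iters=10, threshold=0.5):
    """Retrofitting-style local purification: words with similar Plutchik
    emotion profiles are pulled closer in the semantic space, while each
    vector stays anchored to its original pretrained embedding."""
    vectors = {w: v.astype(float).copy() for w, v in pretrained.items()}
    words = list(vectors)
    for _ in range(iters):
        for w in words:
            # Emotional neighbours: words whose emotion vectors resemble w's.
            nbrs = [u for u in words
                    if u != w and cosine(plutchik[w], plutchik[u]) > threshold]
            if not nbrs:
                continue
            centroid = np.mean([vectors[u] for u in nbrs], axis=0)
            # Convex combination of the original vector and the neighbour centroid.
            vectors[w] = (1 - alpha) * pretrained[w] + alpha * centroid
    return vectors

purified = purify_local(pretrained, plutchik)
print("happy~joyful before:", round(cosine(pretrained["happy"], pretrained["joyful"]), 3))
print("happy~joyful after: ", round(cosine(purified["happy"], purified["joyful"]), 3))
```

In this sketch, "happy" and "joyful" share the same emotion profile and therefore drift toward each other, while "sad" is left untouched; this mirrors the stated goal that sentimentally similar words end up closer than in the purely contextual embedding space.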

