IEEE International Conference on Acoustics, Speech and Signal Processing

Gated Mechanism for Attention Based Multi Modal Sentiment Analysis


Abstract

Multimodal sentiment analysis has recently gained popularity because of its relevance to social media posts, customer service calls and video blogs. In this paper, we address three aspects of multimodal sentiment analysis: (1) cross-modal interaction learning, i.e. how multiple modalities contribute to the sentiment; (2) learning long-term dependencies in multimodal interactions; and (3) fusion of unimodal and cross-modal cues. Of these three, we find that learning cross-modal interactions is the most beneficial for this problem. We perform experiments on two benchmark datasets, the CMU Multimodal Opinion-level Sentiment Intensity (CMU-MOSI) and CMU Multimodal Opinion Sentiment and Emotion Intensity (CMU-MOSEI) corpora. Our approach yields accuracies of 83.9% and 81.1% on these two tasks respectively, which are absolute improvements of 1.6% and 1.34% over the current state of the art.
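The gated fusion of unimodal and cross-modal cues mentioned in the abstract can be sketched as a learned sigmoid gate that mixes, per dimension, a unimodal feature vector with a cross-modal attention output. The sketch below is illustrative only: the function names, weight shapes, and the exact gating formula (a convex combination `g * u + (1 - g) * c`) are assumptions for exposition, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(unimodal, crossmodal, W, b):
    """Hypothetical gating: a sigmoid gate computed from both inputs
    decides, per dimension, how much of the cross-modal cue to mix
    into the unimodal representation."""
    gate = sigmoid(np.concatenate([unimodal, crossmodal]) @ W + b)
    return gate * unimodal + (1.0 - gate) * crossmodal

d = 8                                  # illustrative feature dimension
u = rng.standard_normal(d)             # e.g. a text (unimodal) feature
c = rng.standard_normal(d)             # e.g. a text-audio attention output
W = rng.standard_normal((2 * d, d)) * 0.1   # gate weights (would be learned)
b = np.zeros(d)                        # gate bias

fused = gated_fusion(u, c, W, b)
print(fused.shape)  # (8,)
```

Because the gate produces values in (0, 1), each fused dimension lies between the corresponding unimodal and cross-modal values, so the gate smoothly interpolates between trusting one source or the other.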
