Neurocomputing

A cognitive brain model for multimodal sentiment analysis based on attention neural networks

Abstract

Multimodal sentiment analysis is one of the most attractive interdisciplinary research topics in artificial intelligence (AI). Unlike other classification problems, multimodal sentiment analysis of humans is a much finer-grained classification task. However, most current works accept all modalities together as input and output the final result at once after the fusion and decision processes. Few models divide their architectures into more than one fusion module with different fusion strategies so as to better adapt to different tasks. Additionally, most recent multimodal sentiment analysis methods focus heavily on binary classification, while the accuracy of multi-class classification remains difficult to improve. Inspired by the emotional processing procedure in cognitive science, our method improves both binary and multi-class classification by dividing the complicated problem into smaller issues that are easier to handle. In this paper, we propose a Hierarchical Attention-BiLSTM (Bidirectional Long Short-Term Memory) model based on the Cognitive Brain limbic system (HALCB). HALCB splits multimodal sentiment analysis into two modules responsible for two tasks: binary classification and multi-class classification. The former module divides the input items into two categories by recognizing their polarity and then sends them to the latter module separately. In this module, a hashing algorithm is utilized to improve retrieval accuracy and speed. Correspondingly, the latter module contains a positive sub-net dedicated to positive inputs and a negative sub-net dedicated to negative inputs. The binary module and the two sub-nets of the multi-class module each use a different fusion strategy and decision layer matched to their respective functions. We also add a random forest at the final stage to collect the outputs of all modules and fuse them at the decision level. Experiments are conducted on three datasets, and the results are compared with baselines on both binary and multi-class classification tasks. Our experimental results surpass state-of-the-art multimodal sentiment analysis methods on both binary and multi-class classification by a large margin. (c) 2020 Elsevier B.V. All rights reserved.
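The hierarchical routing described in the abstract (a binary polarity module that dispatches each input to a polarity-specific Attention-BiLSTM sub-net) can be sketched roughly as follows. This is a minimal, illustrative PyTorch sketch under our own assumptions, not the authors' implementation: the module names, layer sizes, and attention pooling are invented for illustration, and the hash-based retrieval, per-module fusion strategies, and random-forest decision fusion mentioned in the abstract are omitted.

```python
# Minimal sketch of the routing idea: a binary polarity module first decides
# positive vs. negative, then a dedicated Attention-BiLSTM sub-net for that
# polarity produces fine-grained sentiment scores. All names and sizes are
# illustrative assumptions, not the HALCB release.
import torch
import torch.nn as nn


class AttentionBiLSTM(nn.Module):
    """BiLSTM encoder with additive attention pooling over time steps."""

    def __init__(self, input_dim, hidden_dim, num_classes):
        super().__init__()
        self.bilstm = nn.LSTM(input_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                             # x: (batch, time, input_dim)
        h, _ = self.bilstm(x)                         # (batch, time, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # attention over time steps
        pooled = (weights * h).sum(dim=1)             # (batch, 2*hidden)
        return self.classifier(pooled)


class HierarchicalSentimentModel(nn.Module):
    """Binary polarity module routing inputs to polarity-specific sub-nets."""

    def __init__(self, input_dim, hidden_dim, fine_classes_per_polarity):
        super().__init__()
        self.binary_module = AttentionBiLSTM(input_dim, hidden_dim, 2)
        self.positive_subnet = AttentionBiLSTM(input_dim, hidden_dim,
                                               fine_classes_per_polarity)
        self.negative_subnet = AttentionBiLSTM(input_dim, hidden_dim,
                                               fine_classes_per_polarity)

    def forward(self, x):
        polarity_logits = self.binary_module(x)
        is_positive = polarity_logits.argmax(dim=-1) == 1
        pos_scores = self.positive_subnet(x)
        neg_scores = self.negative_subnet(x)
        # Route each sample to the sub-net matching its predicted polarity.
        fine_scores = torch.where(is_positive.unsqueeze(-1),
                                  pos_scores, neg_scores)
        return polarity_logits, fine_scores


if __name__ == "__main__":
    # Stand-in for fused multimodal features: 4 sequences, 20 steps, 128 dims.
    features = torch.randn(4, 20, 128)
    model = HierarchicalSentimentModel(input_dim=128, hidden_dim=64,
                                       fine_classes_per_polarity=3)
    polarity, fine = model(features)
    print(polarity.shape, fine.shape)  # torch.Size([4, 2]) torch.Size([4, 3])
```

Routing each sample by its predicted polarity keeps every fine-grained sub-net specialized on a smaller, easier sub-problem, which is the division-of-labor idea the abstract motivates; in the full HALCB design each module would additionally carry its own multimodal fusion strategy and decision layer.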