IFAC PapersOnLine

K-Means Clustering-based Kernel Canonical Correlation Analysis for Multimodal Emotion Recognition


Abstract

Emotion is an important part of human interaction, and emotion recognition can greatly advance human-centered interaction techniques. Multimodal feature fusion can effectively improve the emotion recognition rate. However, most feature-level fusion methods do not consider the intrinsic relationship between different modalities: simply analyzing and transforming the feature matrices of each modality before fusing them fails to exploit modal differences to improve the recognition rate. This problem motivated us to propose a feature fusion method based on K-Means clustering and kernel canonical correlation analysis (KCCA). Clustering groups features not by modality but by their degree of influence on the emotion labels, which positively affects the KCCA result. Experimental results on the SAVEE database show that the proposed K-Means-based KCCA improves overall classification performance and yields a higher recognition rate than state-of-the-art methods such as the Informed Segmentation and Labeling Approach.
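The abstract only outlines the fusion pipeline, so the sketch below shows one plausible reading of it, not the authors' implementation: features are grouped by K-Means on a per-feature influence score (here assumed to be the absolute correlation with the emotion labels), and the two groups are then fused with a simple regularized kernel CCA using RBF kernels. All function names, parameters, and the toy data are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import rbf_kernel

def center_kernel(K):
    """Double-center a kernel matrix so the implicit features have zero mean."""
    n = K.shape[0]
    J = np.ones((n, n)) / n
    return K - J @ K - K @ J + J @ K @ J

def kcca(X1, X2, n_components=2, reg=1e-3, gamma=None):
    """Regularized kernel CCA between two feature groups (RBF kernels).
    Solves (K1 + reg*I)^-1 K2 (K2 + reg*I)^-1 K1 alpha = rho^2 alpha."""
    K1 = center_kernel(rbf_kernel(X1, gamma=gamma))
    K2 = center_kernel(rbf_kernel(X2, gamma=gamma))
    n = K1.shape[0]
    I = np.eye(n)
    M = np.linalg.solve(K1 + reg * I, K2) @ np.linalg.solve(K2 + reg * I, K1)
    vals, vecs = np.linalg.eig(M)
    order = np.argsort(-vals.real)[:n_components]
    alpha = vecs[:, order].real
    beta = np.linalg.solve(K2 + reg * I, K1 @ alpha)
    return K1 @ alpha, K2 @ beta          # canonical projections of the samples

def cluster_based_kcca_fusion(X, y, n_components=2):
    """Split concatenated multimodal features into two groups by K-Means on
    their influence on the emotion labels, then fuse the groups with KCCA."""
    # per-feature influence score: |Pearson correlation| with the label vector
    scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
        scores.reshape(-1, 1))
    Z1, Z2 = kcca(X[:, groups == 0], X[:, groups == 1], n_components=n_components)
    return np.hstack([Z1, Z2])            # fused representation for a classifier

# toy usage: random "audio + video" features with 7 emotion classes (as in SAVEE)
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 40))            # 120 samples, 40 concatenated features
y = rng.integers(0, 7, size=120)
Z = cluster_based_kcca_fusion(X, y)
print(Z.shape)                            # (120, 4)
```

The fused representation Z would then be fed to an ordinary classifier (e.g., an SVM); the key design choice the abstract emphasizes is that the two KCCA views come from label-driven clusters rather than from the raw modality split.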

