Emotions in engineering: Methods for the interpretation of ambiguous emotional content.


Abstract

Emotion has intrigued researchers for generations. This fascination has permeated the engineering community, motivating the development of affective computational models for the classification of affective states. However, human emotion remains notoriously difficult to interpret computationally, both because of the mismatch between the emotional cue generation (the speaker) and perception (the observer) processes and because of the presence of complex emotions: emotions that contain shades of multiple affective classes. Proper representations of emotion would ameliorate this problem by introducing multidimensional characterizations of the data that permit the quantification and description of the varied affective components of each utterance. The mathematical representation of emotion remains an under-explored area.

Research in emotion expression and perception provides a complex and human-centered platform for the integration of machine learning techniques and multimodal signal processing toward the design of interpretable data representations. The focus of this dissertation is to provide a computational description of human emotion perception and to combine this knowledge with the information gleaned from emotion classification experiments to develop a mathematical characterization capable of interpreting naturalistic expressions of emotion, using a data representation method called Emotion Profiles.

The analysis of human emotion perception provides an understanding of how humans integrate audio and video information during emotional presentations. The goals of this work are to determine how audio and video information interact during the human emotional evaluation process and to identify a subset of the features that contribute to specific types of emotion perception.
We identify perceptually relevant feature modulations and multimodal feature integration trends using statistical analyses of the evaluator reports.

The trends in evaluator reports are analyzed using emotion classification. We study evaluator performance using a combination of Hidden Markov Model (HMM) and Naive Bayes (NB) classification. The HMM classification is used to predict individual evaluators' emotional assessments. The NB classification provides an estimate of the consistency of each evaluator's mental model of emotion. We demonstrate that reports from evaluators with higher estimated consistency are predicted more accurately than reports from less consistent evaluators.

The insights gleaned from the emotion perception and classification studies are aggregated to develop a novel emotional representation scheme called Emotion Profiles (EPs). The design of the EPs is predicated on the knowledge that naturalistic emotion expressions can be approximately described using one or more labels from a set of basic emotions. EPs are a quantitative measure expressing the degree of presence or absence of each of a set of basic emotions within an expression. They avoid the need for a hard-labeled assignment by instead providing a method for describing the shades of emotion present in an utterance. These profiles can be used to determine a most likely assignment for an utterance, to map the evolution of the emotional tenor of an interaction, or to interpret utterances that have multiple affective components.
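The abstract describes an Emotion Profile as a vector of soft scores over a set of basic emotions, from which a hard label can be derived when needed. A minimal sketch of that idea follows; the class name, label set, scoring scale, and ambiguity rule here are illustrative assumptions, not the dissertation's actual implementation:

```python
from dataclasses import dataclass

# Illustrative basic-emotion label set, matching the discrete labels
# mentioned later in the abstract (angry, happy, sad, neutral).
BASIC_EMOTIONS = ("angry", "happy", "sad", "neutral")

@dataclass
class EmotionProfile:
    """Soft scores in [0, 1] expressing the degree to which each basic
    emotion is present in a single utterance (hypothetical structure)."""
    scores: dict  # emotion label -> degree of presence

    def most_likely(self) -> str:
        """Hard-label assignment: the basic emotion with the highest score."""
        return max(self.scores, key=self.scores.get)

    def is_ambiguous(self, margin: float = 0.1) -> bool:
        """Flag utterances whose top two components are within `margin`,
        i.e. blended emotions that a single hard label would obscure."""
        top_two = sorted(self.scores.values(), reverse=True)[:2]
        return (top_two[0] - top_two[1]) < margin

# Example: an utterance carrying shades of both anger and sadness.
ep = EmotionProfile({"angry": 0.62, "happy": 0.05, "sad": 0.55, "neutral": 0.18})
ep.most_likely()   # "angry" -- the strongest component
ep.is_ambiguous()  # True -- anger and sadness differ by less than 0.1
```

A representation like this retains the "shades of emotion" that a single hard label discards, which is the property the profiles exploit when interpreting utterances with multiple affective components.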
The Emotion Profile technique accurately identifies the emotion of utterances with definable ground truths (emotions with an evaluator consensus) and can interpret the affective content of utterances with ambiguous emotional content (no evaluator consensus), which are typically discarded during classification tasks.

The algorithms and statistical analyses presented in this work are tested on two databases. The first database is a combination of synthetic (facial information) and natural human (vocal information) cues. The affective content of the two modalities is either matched (congruent presentation) or mismatched (conflicting presentation). The congruent and conflicting presentations are used to assess the affective perceptual relevance of both the individual modalities and the specific feature modulations of those modalities. The second database is an audio-visual and motion-capture database collected at the University of Southern California, the USC IEMOCAP database. This database is used to assess the efficacy of the EP technique for quantifying the emotional content of an utterance. The IEMOCAP database is also used in the classification studies to determine how well individual evaluators can be modeled and how accurately discrete emotional labels (e.g., angry, happy, sad, neutral) can be predicted given audio and motion-capture feature information.

The future directions of this work include the unification of the emotion perception, classification, and quantification studies. The classification framework will be extended to include evaluator-specific features (an extension of the emotion perception studies) and temporal features based on EP estimates. This unification will produce a classification framework that is not only more effective than previous versions but is also able to adapt to specific users' emotion production and perception styles.
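The distinction between "definable ground truth" (evaluator consensus) and "ambiguous emotional content" (no consensus) can be read as a majority-vote rule over evaluator labels. The dissertation's exact consensus criterion is not stated in this abstract, so the strict-majority threshold below is an assumption:

```python
from collections import Counter

def ground_truth(labels):
    """Majority-vote ground truth from a list of evaluator labels
    (illustrative rule): return the consensus label if a strict majority
    agrees, else None, marking the utterance as ambiguous -- the case
    conventional classifiers typically discard but EPs can still describe."""
    (label, count), = Counter(labels).most_common(1)
    return label if count > len(labels) / 2 else None

ground_truth(["angry", "angry", "sad"])    # "angry": consensus exists
ground_truth(["angry", "sad", "neutral"])  # None: no consensus, ambiguous
```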

Bibliographic record

  • Author: Mower, Emily K.
  • Institution: University of Southern California
  • Degree-granting institution: University of Southern California
  • Subject: Engineering, Electronics and Electrical
  • Degree: Ph.D.
  • Year: 2010
  • Pages: 174 p.
  • Format: PDF
  • Language: English
