Journal on Multimodal User Interfaces

Head movements, facial expressions and feedback in conversations: empirical evidence from Danish multimodal data


Abstract

This article deals with multimodal feedback in two Danish multimodal corpora, i.e., a collection of map-task dialogues and a corpus of free conversations in first encounters between pairs of subjects. Machine learning techniques are applied to both sets of data to investigate various relations between the non-verbal behaviour—more specifically head movements and facial expressions—and speech with regard to the expression of feedback. In the map-task data, we study the extent to which the dialogue act type of linguistic feedback expressions can be classified automatically based on the non-verbal features. In the conversational data, on the other hand, non-verbal and speech features are used together to distinguish feedback from other multimodal behaviours. The results of the two sets of experiments indicate in general that head movements, and to a lesser extent facial expressions, are important indicators of feedback, and that gestures and speech disambiguate each other in the machine learning process.
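The first set of experiments described above classifies the dialogue act type of a linguistic feedback expression from co-occurring non-verbal features. A minimal sketch of that kind of setup, with entirely hypothetical feature names, toy data, and a simple nearest-neighbour classifier (not the authors' actual pipeline or corpus annotations):

```python
# Toy illustration: predict the dialogue act type of a feedback expression
# from categorical non-verbal features (head movement, facial expression)
# plus the co-occurring speech token. All labels and data are invented.

from collections import Counter

def hamming(a, b):
    """Number of differing categorical features between two samples."""
    return sum(x != y for x, y in zip(a, b))

def knn_predict(train, sample, k=3):
    """Majority vote among the k training samples whose feature
    tuples are most similar to the query sample."""
    neighbours = sorted(train, key=lambda ex: hamming(ex[0], sample))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Features: (head movement, facial expression, speech token) -> act type
train = [
    (("nod", "smile", "yes"), "Accept"),
    (("nod", "none", "mm"), "Acknowledge"),
    (("shake", "frown", "no"), "Reject"),
    (("nod", "none", "yes"), "Accept"),
    (("shake", "none", "no"), "Reject"),
    (("nod", "smile", "mm"), "Acknowledge"),
]

print(knn_predict(train, ("shake", "frown", "no")))  # → Reject
```

Combining gesture and speech features in one vector, as here, mirrors the article's finding that the two modalities disambiguate each other: a "nod" alone is compatible with several act types, while a nod plus "yes" narrows the choice.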
