Temporal relationship between auditory and visual prosodic cues

Abstract

It has been reported that non-articulatory visual cues to prosody tend to align with auditory cues, emphasizing auditory events that are in close alignment (the visual alignment hypothesis). We investigated the temporal relationship between visual and auditory prosodic cues in a large corpus of utterances to determine the extent to which non-articulatory visual prosodic cues align with auditory ones. Six speakers were recorded in a dialogue exchange task, each saying 30 sentences in three prosodic conditions (two repetitions), to measure how often eyebrow movements and rigid head tilts aligned with auditory prosodic cues, the temporal distribution of such movements, and the variation across prosodic conditions. The timing of brow raises and head tilts was not aligned with auditory cues, and the occurrence of visual cues was inconsistent, lending little support to the visual alignment hypothesis. Different types of visual cues may combine with auditory cues in different ways to signal prosody.