Scientific Data

A 204-subject multimodal neuroimaging dataset to study language processing

Abstract

This dataset, colloquially known as the Mother Of Unification Studies (MOUS) dataset, contains multimodal neuroimaging data acquired from 204 healthy human subjects. The neuroimaging protocol consisted of magnetic resonance imaging (MRI) to derive high-spatial-resolution information about brain anatomy and structural connections, as well as functional data during task performance and at rest. In addition, magnetoencephalography (MEG) was used to obtain high-temporal-resolution electrophysiological measurements during task performance and at rest. All subjects performed a language task, during which they processed linguistic utterances consisting of either normal or scrambled sentences. Half of the subjects read the stimuli; the other half listened to them. The resting-state measurements consisted of 5 minutes eyes-open for MEG and 7 minutes eyes-closed for fMRI. The neuroimaging data, as well as the information about the experimental events, are shared according to the Brain Imaging Data Structure (BIDS) format. This unprecedented collection of neuroimaging language data allows for the investigation of various aspects of the neurobiological correlates of language.
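Because the data and the event information are organized according to BIDS, standard BIDS tooling can be used to index and query the files. The following is a minimal sketch using the pybids library (not part of the original article); the local dataset path, the choice of query suffixes, and the compressed-NIfTI extension are illustrative assumptions rather than details taken from the dataset itself.

    # Minimal sketch: indexing and querying a local copy of a BIDS dataset with pybids.
    # The path "/data/MOUS" is a hypothetical download location.
    from bids import BIDSLayout

    # Index the BIDS directory tree (validation skipped for speed on a large dataset).
    layout = BIDSLayout("/data/MOUS", validate=False)

    # Subject labels discovered in the dataset.
    subjects = layout.get_subjects()
    print(f"{len(subjects)} subjects indexed")

    # Anatomical T1-weighted scans (assumed here to be stored as compressed NIfTI).
    t1w_files = layout.get(suffix="T1w", extension=".nii.gz")

    # Tab-separated event files describing the experimental stimuli.
    event_files = layout.get(suffix="events", extension=".tsv")

The same BIDSLayout queries extend to the functional and MEG recordings by changing the suffix or datatype arguments, which is the main practical benefit of the BIDS organization described in the abstract.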
