Journal: Sleep

Assessment of automated scoring of polysomnographic recordings in a population with suspected sleep-disordered breathing.



Abstract

STUDY OBJECTIVES: To assess the accuracy of an automated system (Morpheus I Sleep Scoring System) for analyzing and quantifying polysomnographic data from a population with sleep-disordered breathing. SETTING: Sleep laboratory affiliated with a tertiary care academic medical center. MEASUREMENTS AND RESULTS: 31 diagnostic polysomnograms were prospectively and blindly analyzed with the investigational automated system (A) and manually by 2 registered polysomnography technologists (M1 and M2) from the same laboratory. Sleep stages, arousals, periodic limb movements, and respiratory events (apneas and hypopneas) were scored by all 3. Agreement, Cohen kappa, and intraclass correlation coefficients were tabulated for each variable and compared between scoring pairs (A-M1, A-M2, M1-M2). In total, 26,876 epochs (224 hours of recording time) were analyzed. For sleep staging, agreement/kappa were A-M1: 78%/0.67, A-M2: 73%/0.61, and M1-M2: 82%/0.73. The mean respiratory disturbance indices were M1: 20.6 ± 23.0, M2: 22.5 ± 24.5, and A: 23.7 ± 23.4 events per hour of sleep. The respiratory disturbance index concordance between each scoring pair was excellent (intraclass correlation coefficients ≥ 0.95 for all pairs), although there was disagreement in the classification of moderate sleep-disordered breathing (percentage of positive agreement: A-M1, 37.5% and A-M2, 44.4%), defined as a respiratory disturbance index between 15 and 30 events per hour of sleep. For respiratory-event detection, agreement/kappa were A-M1 and A-M2: 90%/0.66 and M1-M2: 95%/0.82. The agreement and kappa for limb-movement detection were A-M1: 93%/0.68, A-M2: 92%/0.66, and M1-M2: 96%/0.77. The scoring of arousals was less reliable (agreement range: 76%-84%; kappa range: 0.28-0.57) for all pairs. CONCLUSIONS: Agreement between manual scorers in a population with moderate sleep-disordered breathing was close to the average pairwise agreement of 87% reported in the Sleep Heart Health Study.
The automated classification of sleep stages was also close to this standard. The automated scoring system holds promise as a rapid method to score polysomnographic records, but expert verification of the automated scoring is required.
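The agreement statistics reported above (percent agreement and Cohen kappa, computed epoch by epoch between scoring pairs) can be sketched as follows. This is a minimal illustration of the standard formulas, not the study's analysis code; the epoch stagings below are hypothetical, and the stage labels (W, 1, 2, 3, R) are assumed for the example.

```python
from collections import Counter

def cohen_kappa(scorer_a, scorer_b):
    """Chance-corrected agreement between two epoch-by-epoch scorings."""
    assert len(scorer_a) == len(scorer_b) and scorer_a
    n = len(scorer_a)
    # Observed proportion of epochs on which the two scorers agree
    observed = sum(a == b for a, b in zip(scorer_a, scorer_b)) / n
    # Expected chance agreement: sum over stages of the product of
    # each scorer's marginal proportion for that stage
    freq_a, freq_b = Counter(scorer_a), Counter(scorer_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical 30-second epoch stagings for an automated (A) and a manual (M) scorer
auto   = ["W", "W", "1", "2", "2", "2", "3", "R", "R", "W"]
manual = ["W", "1", "1", "2", "2", "3", "3", "R", "W", "W"]
agreement = sum(a == b for a, b in zip(auto, manual)) / len(auto)
print(f"agreement = {agreement:.0%}, kappa = {cohen_kappa(auto, manual):.2f}")
# prints "agreement = 70%, kappa = 0.62"
```

Kappa discounts agreement expected by chance from the stage distributions, which is why it runs well below raw agreement (e.g., 78% agreement vs. kappa 0.67 for A-M1 sleep staging above), and why the arousal scoring with comparable raw agreement can yield a kappa as low as 0.28.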
