Using convolutional neural networks to estimate time-of-flight from PET detector waveforms

Abstract

Although there have been impressive strides in detector development for time-of-flight positron emission tomography (PET), most detectors still make use of simple signal processing methods to extract the time-of-flight information from the detector signals. In most cases, the timing pick-off for each waveform is computed using leading edge discrimination or constant fraction discrimination, as these were historically easily implemented with analog pulse processing electronics. However, now with the availability of fast waveform digitizers, there is an opportunity to make use of more of the timing information contained in the coincident detector waveforms with advanced signal processing techniques. Here we describe the application of deep convolutional neural networks (CNNs), a type of machine learning, to estimate time-of-flight directly from the pair of digitized detector waveforms for a coincident event. One of the key features of this approach is the simplicity in obtaining ground-truth-labeled data needed to train the CNN: the true time-of-flight is determined from the difference in path length between the positron emission and each of the coincident detectors, which can be easily controlled experimentally. The experimental setup used here made use of two photomultiplier tube-based scintillation detectors and a point source stepped in 5 mm increments over a 15 cm range between the two detectors. The detector waveforms were digitized at 10 GS/s using a bench-top oscilloscope. The results shown here demonstrate that CNN-based time-of-flight estimation improves timing resolution by 20% compared to leading edge discrimination (231 ps vs. 185 ps), and 23% compared to constant fraction discrimination (242 ps vs. 185 ps). By comparing several different CNN architectures, we also showed that CNN depth (number of convolutional and fully connected layers) had the largest impact on timing resolution, while the exact network parameters, such as convolutional filter size and number of feature maps, had only a minor influence.
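
As a concrete reference for the two conventional pick-off methods named in the abstract, the sketch below shows how a fixed-threshold leading edge discriminator and a simple digital constant fraction discriminator (thresholding at a fraction of the pulse's peak amplitude, rather than the analog delayed-and-attenuated zero-crossing scheme) can be applied to a digitized waveform. This is not code from the paper; the function names, threshold values, and the 10 GS/s sample period are assumptions for illustration.

import numpy as np

def leading_edge_time(waveform, threshold, dt=1e-10):
    """Time (s) at which the pulse first crosses a fixed threshold.

    dt is the sample period (1e-10 s for a 10 GS/s digitizer); the crossing
    is refined by linear interpolation between the two bracketing samples.
    """
    idx = int(np.argmax(waveform >= threshold))
    if waveform[idx] < threshold:      # threshold never reached
        return float("nan")
    if idx == 0:
        return 0.0
    y0, y1 = waveform[idx - 1], waveform[idx]
    return (idx - 1 + (threshold - y0) / (y1 - y0)) * dt

def constant_fraction_time(waveform, fraction=0.2, dt=1e-10):
    """Digital CFD approximation: threshold at a fixed fraction of the peak amplitude."""
    return leading_edge_time(waveform, fraction * float(waveform.max()), dt)

# The coincidence time difference these methods produce,
#   tof = pick_off(waveform_1) - pick_off(waveform_2),
# is the quantity the CNN approach estimates directly from the raw samples.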
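
The CNN estimator itself can be pictured as a small 1D convolutional network that takes the two coincident waveforms as two input channels and regresses a single time-of-flight value. The sketch below is a minimal illustration under assumed hyperparameters (1024 samples per waveform, layer counts, filter sizes), not a reproduction of the authors' architecture; as the abstract notes, network depth mattered more than the exact filter size or number of feature maps.

import torch
import torch.nn as nn

class TOFNet(nn.Module):
    # Assumed architecture: three Conv1d/ReLU/pool stages followed by two
    # fully connected layers; the paper compares deeper and shallower variants.
    def __init__(self, n_samples=1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * (n_samples // 8), 128), nn.ReLU(),
            nn.Linear(128, 1),                 # scalar time-of-flight estimate
        )

    def forward(self, x):
        # x: (batch, 2, n_samples) -- the pair of digitized detector waveforms
        return self.regressor(self.features(x)).squeeze(-1)

def train_step(model, optimizer, waveforms, true_tof):
    # Ground truth comes from the known source position: the path-length
    # difference between the point source and the two detectors, divided by
    # the speed of light, as described in the abstract.
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(waveforms), true_tof)
    loss.backward()
    optimizer.step()
    return loss.item()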

Bibliographic details

  • Journal: other
  • Authors: Eric Berg; Simon R. Cherry;
  • Volume (issue): 63(2)
  • Pages: 02LT01
  • Page count: 15
  • Format: PDF
  • Date added: 2022-08-21 11:08:14
