Conference on imaging processing

Multi-Object Model-based Multi-Atlas Segmentation for Rodent Brains using Dense Discrete Correspondences



Abstract

The delineation of rodent brain structures is challenging due to low-contrast cortical and subcortical organs that closely interface with each other. Atlas-based segmentation has been widely employed because it can delineate multiple organs simultaneously via image registration. The use of multiple atlases and subsequent label fusion techniques has further improved the robustness and accuracy of atlas-based segmentation. However, atlas-based segmentation is still prone to registration errors; for example, the segmentation of in vivo MR images can be less accurate and less robust to image artifacts than the segmentation of post-mortem images. To improve the accuracy and robustness of atlas-based segmentation, we propose a multi-object, model-based, multi-atlas segmentation method. We first establish spatial correspondences across atlases using a set of dense pseudo-landmark particles. We then build a multi-object point distribution model from those particles to capture inter- and intra-subject variation among brain structures. The segmentation is obtained by fitting the model to a subject image, followed by a label fusion process. Our results show that the proposed method achieves greater accuracy than comparable segmentation methods, including the widely used ANTs registration tool.
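The label-fusion step described above can be illustrated with a minimal per-voxel majority-vote sketch. This is a generic baseline, not the paper's specific fusion scheme; the function name and array shapes are illustrative assumptions:

```python
import numpy as np

def majority_vote_fusion(label_maps):
    """Fuse candidate segmentations by per-voxel majority vote.

    label_maps: list of integer label volumes with identical shapes,
    each representing one atlas's labels warped into subject space
    by registration.
    """
    stacked = np.stack(label_maps, axis=0)  # (n_atlases, *volume_shape)
    n_labels = int(stacked.max()) + 1
    # Count votes for each label at every voxel, then pick the winner.
    votes = np.zeros((n_labels,) + stacked.shape[1:], dtype=np.int32)
    for lab in range(n_labels):
        votes[lab] = (stacked == lab).sum(axis=0)
    return votes.argmax(axis=0)

# Toy example: three 2x2 "warped atlas" label maps for one subject.
a = np.array([[0, 1], [1, 2]])
b = np.array([[0, 1], [2, 2]])
c = np.array([[0, 0], [1, 2]])
fused = majority_vote_fusion([a, b, c])  # each voxel takes the modal label
```

In practice, more advanced fusion (e.g., weighting atlases by local image similarity) typically outperforms simple majority voting, which is part of the motivation for model-based approaches such as the one proposed here.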

