RASNet: Segmentation for Tracking Surgical Instruments in Surgical Videos Using Refined Attention Segmentation Network

Abstract

Segmentation for tracking surgical instruments plays an important role in robot-assisted surgery. Segmentation of surgical instruments contributes to capturing accurate spatial information for tracking. In this paper, a novel network, the Refined Attention Segmentation Network, is proposed to simultaneously segment surgical instruments and identify their categories. The U-shaped architecture, which is popular in segmentation, is used. Different from previous work, an attention module is adopted to help the network focus on key regions, which improves segmentation accuracy. To address the class imbalance problem, the weighted sum of the cross entropy loss and the logarithm of the Jaccard index is used as the loss function. Furthermore, transfer learning is adopted in our network: the encoder is pre-trained on ImageNet. The dataset from the MICCAI EndoVis Challenge 2017 is used to evaluate our network. On this dataset, our network achieves state-of-the-art performance: 94.65% mean Dice and 90.33% mean IoU.
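The loss described in the abstract combines per-pixel cross entropy with the logarithm of the Jaccard index, which helps when instrument pixels are far rarer than background. Below is a minimal PyTorch sketch of such a combined loss; the weight alpha, the smoothing constant eps, and the soft (differentiable) Jaccard formulation are illustrative assumptions, not the authors' exact settings.

# Sketch of a weighted cross-entropy + log-Jaccard loss (assumed formulation).
import torch
import torch.nn.functional as F

def ce_log_jaccard_loss(logits, targets, alpha=0.5, eps=1e-7):
    """logits: (N, C, H, W) raw network outputs; targets: (N, H, W) class ids."""
    # Standard multi-class cross entropy over all pixels.
    ce = F.cross_entropy(logits, targets)

    # Soft Jaccard index, averaged over classes, so it stays differentiable.
    num_classes = logits.shape[1]
    probs = torch.softmax(logits, dim=1)                  # (N, C, H, W)
    one_hot = F.one_hot(targets, num_classes)             # (N, H, W, C)
    one_hot = one_hot.permute(0, 3, 1, 2).float()         # (N, C, H, W)

    intersection = (probs * one_hot).sum(dim=(0, 2, 3))
    union = (probs + one_hot - probs * one_hot).sum(dim=(0, 2, 3))
    jaccard = (intersection + eps) / (union + eps)

    # Maximizing log(J) is implemented by subtracting it from the loss.
    return alpha * ce - (1.0 - alpha) * torch.log(jaccard).mean()

For example, with alpha = 0.5 the two terms contribute equally; shifting alpha toward 0 emphasizes the overlap (Jaccard) term, which is the part that directly counters class imbalance.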