Neural Processing Letters

CT-UNet: Context-Transfer-UNet for Building Segmentation in Remote Sensing Images



Abstract

With the proliferation of remote sensing images, segmenting buildings accurately in such images is a critical challenge. First, most networks have poor recognition ability on high-resolution images, resulting in blurred boundaries in the segmented building maps. Second, the similarity between buildings and background leads to intra-class inconsistency. To address these two problems, we propose a UNet-based network named Context-Transfer-UNet (CT-UNet). Specifically, we design a Dense Boundary Block: the Dense Block uses a feature-reuse mechanism to refine features and increase recognition capability, while the Boundary Block introduces low-level spatial information to resolve fuzzy boundaries. Then, to handle intra-class inconsistency, we construct a Spatial Channel Attention Block, which combines contextual spatial information and selects more discriminative features along both the spatial and channel dimensions. Finally, we propose an improved loss function that strengthens the training objective by adding an evaluation indicator as an extra term. Based on the proposed CT-UNet, we achieve 85.33% mean IoU on the Inria dataset, 91.00% mean IoU on the WHU dataset, and an 83.92% F1-score on the Massachusetts dataset. These results outperform our baseline (U-Net with a ResNet-34 encoder) by 3.76%, exceed Web-Net by 2.24%, and surpass HFSA-Unet by 2.17%.
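The abstract only names the Dense Boundary Block, so the following is a minimal PyTorch sketch of the two ideas it describes: DenseNet-style feature reuse (each layer is fed the concatenation of all earlier feature maps) fused with a low-level, boundary-bearing feature map from an earlier encoder stage. The class name `DenseBoundaryBlock` and all layer counts and channel sizes are illustrative assumptions, not the paper's specification.

```python
import torch
import torch.nn as nn

class DenseBoundaryBlock(nn.Module):
    """Sketch: dense feature reuse + low-level spatial fusion for boundaries."""
    def __init__(self, in_ch, low_ch, growth=32, layers=3):
        super().__init__()
        self.dense_layers = nn.ModuleList()
        ch = in_ch
        for _ in range(layers):
            self.dense_layers.append(nn.Sequential(
                nn.Conv2d(ch, growth, 3, padding=1, bias=False),
                nn.BatchNorm2d(growth),
                nn.ReLU(inplace=True),
            ))
            ch += growth  # each new layer sees all previous feature maps
        # 1x1 conv merges the dense output with the low-level boundary feature
        self.fuse = nn.Conv2d(ch + low_ch, in_ch, 1)

    def forward(self, x, low):
        feats = [x]
        for layer in self.dense_layers:
            feats.append(layer(torch.cat(feats, dim=1)))  # feature reuse
        dense_out = torch.cat(feats, dim=1)
        # resample the low-level feature if its spatial size differs
        if low.shape[-2:] != dense_out.shape[-2:]:
            low = nn.functional.interpolate(
                low, size=dense_out.shape[-2:],
                mode='bilinear', align_corners=False)
        return self.fuse(torch.cat([dense_out, low], dim=1))
```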
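For the Spatial Channel Attention Block, the abstract states only that features are selected "along both the spatial and channel dimensions". The sketch below is a hedged reading in the spirit of SE- and CBAM-style gating: a channel gate from globally pooled statistics followed by a per-pixel spatial gate; the paper's exact formulation may differ.

```python
import torch
import torch.nn as nn

class SpatialChannelAttention(nn.Module):
    """Sketch: channel gating followed by spatial gating (assumed design)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        # channel attention: squeeze (global pool) -> excitation MLP -> sigmoid
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )
        # spatial attention: per-pixel gate from pooled channel statistics
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(2, 1, 7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = x * self.channel_gate(x)           # reweight channels
        avg = x.mean(dim=1, keepdim=True)      # per-pixel channel average
        mx, _ = x.max(dim=1, keepdim=True)     # per-pixel channel maximum
        return x * self.spatial_gate(torch.cat([avg, mx], dim=1))
```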
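Since mean IoU and F1 are the reported evaluation metrics, one natural reading of "adding an evaluation indicator" to the loss is a differentiable soft-IoU term added to binary cross-entropy. That reading is an assumption; the function below is a sketch of that combination, with `weight` as a hypothetical balancing coefficient.

```python
import torch
import torch.nn as nn

def bce_soft_iou_loss(logits, target, weight=1.0, eps=1e-6):
    """Sketch: BCE plus a soft-IoU penalty (assumed form of the improved loss).

    logits, target: float tensors of shape (N, 1, H, W); target in {0, 1}.
    """
    bce = nn.functional.binary_cross_entropy_with_logits(logits, target)
    prob = torch.sigmoid(logits)
    inter = (prob * target).sum(dim=(1, 2, 3))
    union = (prob + target - prob * target).sum(dim=(1, 2, 3))
    soft_iou = (inter + eps) / (union + eps)   # differentiable IoU per image
    return bce + weight * (1.0 - soft_iou.mean())
```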
