IEEE Geoscience and Remote Sensing Letters

PEGNet: Progressive Edge Guidance Network for Semantic Segmentation of Remote Sensing Images



Abstract

Owing to the rapid development of deep neural networks, prominent advances have recently been achieved in the semantic segmentation of remote sensing images. As vital components of computer vision, semantic segmentation and edge detection are strongly correlated, both in the features they extract and in their task objectives. Prior studies treated edge detection as a postprocessing step for semantic segmentation, or combined the two tasks only implicitly. We observe that pixels around edges are prone to misclassification because of prevalent intraclass inconsistency and interclass indistinction, which limit a model's ability to discriminate between classes. In this letter, we first propose a multipath atrous module to enrich deep semantic information. We then combine the enhanced deep semantic information with dilated edge information, generated by Canny and morphological operations, to obtain edge-region maps via an edge-region detection module that identifies pixels around the edges. These error-prone pixels are then relearned by a guidance module in the segmentation branch in a progressively guided manner. Combining the edge and segmentation branches, our progressive edge guidance network achieves an overall accuracy of 91.0% on the ISPRS Vaihingen test set, a new state-of-the-art result.
