International Journal of Remote Sensing

SCA-CDNet: a robust siamese correlation-and-attention-based change detection network for bitemporal VHR images


Abstract

Change detection is a key step in various geographic information applications such as land cover change monitoring, agricultural assessment, natural disaster evaluation, and illegal building investigation. In practice, discovering or outlining these changes manually is labour-intensive and time-consuming. To address this problem, a novel end-to-end Siamese correlation-and-attention-based change detection network (SCA-CDNet) is proposed for bitemporal very-high-resolution images in this paper. In this method, five strategies are adopted to improve the final change detection results. First, data augmentation is used to effectively reduce overfitting and improve the generalization ability of the trained model. Second, in encoding, classic networks (e.g. ResNet) are introduced to extract multiscale image features, making full use of the networks' existing pretrained weights to reduce the difficulty of subsequent model training. Third, a new correlation module is designed to stack the corresponding bitemporal features and extract change features of smaller dimensionality. Fourth, an attention module is introduced between the correlation module and the decoder to make the network attend to the regions or channels that most affect change analysis. Fifth, a new weighted cross-entropy loss function is designed, which focuses training on detection errors and improves the final accuracy of the trained model. Finally, extensive experiments on three public data sets, including an evaluation of data augmentation, an ablation study, and a comparison with the state of the art, demonstrate the effectiveness and superiority of the proposed method, which achieves an intersection over union (IoU) of 84.15%, 83.50%, and 77.29% on the three data sets, respectively.
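The abstract names two quantitative ingredients: a weighted cross-entropy loss that up-weights detection errors, and intersection over union (IoU) as the evaluation metric. The paper's exact weighting scheme is not given here, so the following is a minimal NumPy sketch of both ideas; the per-class weights `w_change` and `w_unchange` are hypothetical placeholders, not the authors' values.

```python
import numpy as np

def weighted_bce(pred, target, w_change=2.0, w_unchange=1.0, eps=1e-7):
    """Weighted binary cross-entropy for change maps.

    Changed pixels (target == 1) receive a larger weight, so the rarer
    change class contributes more to the loss. Weights are illustrative.
    """
    pred = np.clip(pred, eps, 1.0 - eps)          # avoid log(0)
    weights = np.where(target == 1, w_change, w_unchange)
    loss = -weights * (target * np.log(pred)
                       + (1 - target) * np.log(1 - pred))
    return loss.mean()

def iou(pred_mask, gt_mask):
    """Intersection over union of two binary change masks."""
    inter = np.logical_and(pred_mask, gt_mask).sum()
    union = np.logical_or(pred_mask, gt_mask).sum()
    return inter / union if union else 1.0        # empty masks agree
```

For example, a predicted mask that covers one of two ground-truth change pixels plus one false alarm would score an IoU well below 1, which is why the reported 84.15% on the first data set is a strong result for pixel-level change detection.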
