Asian Conference on Remote Sensing (ACRS)

Comparison Between UNet, Modified UNet and Dense-Attention Network (DAN) for Building Extraction from TripleSat Imagery

Abstract

Building extraction from high-resolution remote sensing imagery is of great importance for land-use analysis, urban planning, and many other applications. Convolutional neural networks (CNNs) in particular have shown significant advantages over traditional methods for this task. Among the various CNN models, UNet has gained popularity for its simplicity, efficiency, and robustness, and many modified versions have been proposed. More recently, the dense attention network (DAN), a model based on DenseNets and an attention mechanism, was proposed and achieved good performance in building extraction from very-high-resolution imagery. Building on these developments, this paper compares three architectures (UNet, a modified UNet with residual blocks and a recurrent feature, and DAN) for building extraction in Kuala Lumpur, Malaysia, using 0.8 m TripleSat imagery. In the modified UNet, skip connections were implemented in each encoder block to mix features of different levels, and the output was multiplied element-wise with the input and fed through the same UNet again. The comparison showed that the modified UNet achieved the highest F1-score, while the DAN achieved a higher average F1-score than the UNet. The DAN, however, had the highest accuracy on validation patches containing large buildings.
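The recurrent scheme described above (the network's output mask is multiplied element-wise with the input image and passed through the same network a second time) can be sketched as follows. This is a minimal illustration of the control flow only: `toy_unet` is a hypothetical stand-in for a real UNet forward pass, not the authors' model.

```python
import numpy as np

def toy_unet(x):
    # Hypothetical placeholder for a UNet forward pass: a sigmoid of the
    # mean-centered input, yielding a soft building-probability mask in (0, 1).
    return 1.0 / (1.0 + np.exp(-(x - x.mean())))

def recurrent_refine(model, image, steps=2):
    """Sketch of the recurrence: each pass re-weights the input image by
    the previous output mask and runs it through the *same* network again
    (shared weights), so the second pass focuses on confident regions."""
    out = model(image)
    for _ in range(steps - 1):
        out = model(image * out)  # element-wise product of input and mask
    return out

# Usage: refine a random 8x8 "image" with two passes of the same model.
mask = recurrent_refine(toy_unet, np.random.rand(8, 8), steps=2)
```

Under this reading, the recurrence adds no extra parameters; it only adds a second forward pass through the existing weights.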
