IEEE International Conference on Systems, Man, and Cybernetics

Multimodal Noisy Segmentation based fragmented burn scars identification in Amazon Rainforest



Abstract

Detection of burn marks due to wildfires in inaccessible rain forests is important for disaster management and ecological studies. Diverse cropping patterns and the fragmented nature of arable landscapes amid similar-looking land cover often thwart the precise mapping of burn scars. Recent advances in remote sensing and the availability of multimodal data offer a viable, time-sensitive alternative to classical methods, which often require human expert intervention. However, computer vision based segmentation methods have not been applied to this problem, largely due to the lack of labelled datasets. In this work we present AmazonNET, a convolutional network that extracts burn patterns from multimodal remote sensing images. The network is based on UNet, a well-known encoder-decoder architecture with skip connections commonly used in biomedical segmentation. The proposed framework uses stacked RGB-NIR channels to segment burn scars from pastures by training on a new, weakly labelled, noisy dataset from Amazonia. Our model demonstrates superior performance by correctly identifying partially labelled burn scars and rejecting incorrectly labelled samples, making our approach one of the first to effectively apply deep learning based segmentation models to multimodal burn scar identification.
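The abstract states that the network consumes stacked RGB-NIR channels, but does not spell out the preprocessing. The sketch below is a minimal, hypothetical illustration (not the authors' code) of how co-registered RGB and NIR rasters could be stacked into the 4-channel input a UNet-style model would receive; the function name, shapes, and dtypes are assumptions.

```python
import numpy as np

def stack_rgb_nir(rgb: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Stack an (H, W, 3) RGB image and an (H, W) NIR band into an
    (H, W, 4) multimodal input tensor. Assumes both rasters are
    co-registered (same ground footprint and resolution)."""
    if rgb.shape[:2] != nir.shape:
        raise ValueError("RGB and NIR rasters must be co-registered")
    # Append NIR as a fourth channel alongside R, G, B.
    return np.concatenate([rgb, nir[..., np.newaxis]], axis=-1)

# Random data standing in for a satellite tile.
rgb = np.random.rand(128, 128, 3).astype(np.float32)
nir = np.random.rand(128, 128).astype(np.float32)
x = stack_rgb_nir(rgb, nir)
print(x.shape)  # (128, 128, 4)
```

A segmentation model would then take this 4-channel array (channels-last here; frameworks such as PyTorch expect channels-first) and predict a per-pixel burn/no-burn mask.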