Journal of Robotic Systems

Automated crop plant detection based on the fusion of color and depth images for robotic weed control


Abstract

Robotic weeding enables weed control near or within crop rows automatically, precisely, and effectively. A computer-vision system was developed for detecting crop plants at different growth stages for robotic weed control. Fusion of color images and depth images was investigated as a means of enhancing the detection accuracy of crop plants in the presence of dense weed populations. In-field images of broccoli and lettuce were acquired with a Kinect v2 sensor 3-27 days after transplanting. The image-processing pipeline included data preprocessing, vegetation pixel segmentation, plant extraction, feature extraction, feature-based localization refinement, and crop plant classification. For the detection of broccoli and lettuce, the color-depth fusion algorithm produced high true-positive detection rates (91.7% and 90.8%, respectively) and low average false discovery rates (1.1% and 4.0%, respectively). Mean absolute localization errors of the crop plant stems were 26.8 and 7.4 mm for broccoli and lettuce, respectively. The fusion of color and depth proved beneficial for segmenting crop plants from the background, improving the average segmentation success rate from 87.2% (depth-based) and 76.4% (color-based) to 96.6% for broccoli, and from 74.2% (depth-based) and 81.2% (color-based) to 92.4% for lettuce. The fusion-based algorithm showed reduced performance in detecting crop plants at early growth stages.
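The abstract does not detail how the color and depth cues are combined. As a rough illustration only, the Python sketch below fuses an excess-green (ExG) color mask with a depth-window mask for vegetation pixel segmentation; it is not the authors' implementation. The function name, thresholds, and depth window are hypothetical, and the color and depth frames are assumed to be already registered (e.g., aligned Kinect v2 frames).

import numpy as np

def fuse_color_depth_masks(rgb, depth, depth_range=(0.4, 0.9), exg_thresh=0.05):
    # rgb: HxWx3 float array scaled to [0, 1]; depth: HxW array in meters,
    # both assumed registered to the same viewpoint (hypothetical parameters).
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-6                       # avoid division by zero
    exg = 2.0 * g / total - r / total - b / total  # excess-green index per pixel
    color_mask = exg > exg_thresh                  # greenish (vegetation-like) pixels
    near, far = depth_range
    depth_mask = (depth > near) & (depth < far)    # pixels within an assumed canopy window
    return color_mask & depth_mask                 # fused vegetation mask

In such a scheme, pixels must look green and lie within a plausible height band to count as vegetation, which is one simple way the two modalities could complement each other under heavy weed cover.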
