
Extraction method of position and posture information of robot arm picking up target based on RGB-D data


Abstract

Traditional methods for extracting the position and attitude of a robot arm's pick-up target suffer from large errors. To address this, a method for extracting the target's position and attitude information from RGB-D data is proposed. The position and attitude of the manipulator's target are first acquired by depth-image processing; the detected target position is sent to the manipulator control node, and feature points of the manipulator are extracted. A 3-D mapping is then performed on the acquired RGB image, and the depth and RGB values of the feature points, together with the position and attitude information, are computed using a Gaussian mixture model. Finally, the target is extracted by combining the covariance matrix of the feature points. Experimental results show that the coordinate error and angle error of the robot-arm poses extracted by this method are small: the maximum extraction error is only 28%, far lower than that of the traditional method, indicating that the proposed method is more applicable.
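The pipeline in the abstract (feature points from RGB-D data → Gaussian model → covariance matrix → pose) can be illustrated with a minimal sketch. This is not the authors' implementation; it reduces the Gaussian mixture step to a single Gaussian and assumes the feature points have already been back-projected from the depth image into 3-D camera coordinates. The mean of the points serves as the position estimate, and the principal axis of their covariance matrix serves as an orientation estimate:

```python
import numpy as np

def estimate_pose(points_xyz):
    """Estimate a pick-up target's position and orientation from 3-D
    feature points (hypothetical single-Gaussian simplification).

    points_xyz : (N, 3) array of feature points in the camera frame,
                 assumed back-projected from the depth image.
    Returns (position, covariance, principal axis).
    """
    position = points_xyz.mean(axis=0)
    cov = np.cov(points_xyz, rowvar=False)   # 3x3 covariance of the points
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    axis = eigvecs[:, -1]                    # direction of largest variance
    return position, cov, axis

# Synthetic demo: an elongated cloud of feature points along the x axis,
# centred 0.8 m in front of the camera (all values are illustrative).
rng = np.random.default_rng(0)
pts = rng.normal([0.5, 0.0, 0.8], [0.10, 0.01, 0.01], size=(500, 3))
position, cov, axis = estimate_pose(pts)
```

In a mixture with several components, the same covariance-to-axis step would be applied to the dominant component; the covariance matrix is what links the feature-point statistics to the angle (attitude) estimate evaluated in the paper.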
