International Symposium on Measurement and Control in Robotics

Integrating Human Hand Gestures with Vision Based Feedback Controller to Navigate a Virtual Robotic Arm



Abstract

This paper reports the design and development of a real-time IMU-vision-based hybrid control algorithm for interacting with a 6-DOF Kinova virtual robotic arm. The Human-Robot Interaction (HRI) control scheme proposed in this paper uses the embedded gyroscope of a Myo Gesture Control Armband's inertial measurement unit and a Microsoft HD camera with 800×600-pixel resolution. The algorithm applies a numerical discrete-time integrator and a mean filter to the raw angular velocity data from the gyroscope. The processed data provides the angular displacements of the robotic arm's end-effector as the user performs clockwise or counterclockwise actions about the x, y, and z axes. The end-effector (gripper) motion is controlled simultaneously by the user's roll action through a threshold comparison in the algorithm. A vision-based feedback system was designed using a computer vision toolbox and a blob-analysis technique to make the system more reliable and to control the distance of the end-effector while reaching for desired objects. The results demonstrated significant control of the 6-DOF virtual robotic arm using the gyroscopic information and user inputs. The virtual robotic arm stopped its movement after reaching 320 mm from the desired object, as expected. Across three different objects, the maximum error between the real and the measured distance was 15.3 cm, observed for the cylindrical object. Owing to its smooth control and arm-gesture controller, this technology has the potential to assist people with physical impairments or neurological disorders in performing activities of daily living with an assistive robotic arm in the near future.
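The gyroscope-processing pipeline described in the abstract (a mean filter on raw angular velocity, discrete-time integration to obtain angular displacement, and a threshold comparison on the roll angle for the gripper) can be sketched as below. This is a minimal illustration, not the paper's implementation; the sampling interval, filter window, and roll threshold are assumed values chosen for the example.

```python
import numpy as np

def mean_filter(raw, window=5):
    """Smooth raw gyroscope angular-velocity samples with a moving-average (mean) filter."""
    kernel = np.ones(window) / window
    return np.convolve(raw, kernel, mode="same")

def integrate(omega, dt):
    """Numerical discrete-time integrator: accumulate angular velocity (deg/s)
    into angular displacement (deg) at sampling interval dt (s)."""
    return np.cumsum(omega) * dt

# Illustrative roll threshold for the gripper (not a value from the paper)
ROLL_THRESHOLD_DEG = 30.0

def gripper_command(roll_angle_deg):
    """Threshold comparison on the roll angle drives gripper open/close."""
    if roll_angle_deg > ROLL_THRESHOLD_DEG:
        return "close"
    if roll_angle_deg < -ROLL_THRESHOLD_DEG:
        return "open"
    return "hold"

# Example: 1 s of a steady 10 deg/s rotation sampled at an assumed 50 Hz
dt = 0.02
omega_raw = np.full(50, 10.0)
theta = integrate(mean_filter(omega_raw), dt)  # final displacement ~10 deg
```

The integrated angle drives the commanded end-effector displacement along each axis, while the roll channel is routed to the gripper.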
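The vision-based feedback loop (blob analysis to gauge the object's apparent size, then stopping the end-effector at 320 mm) can be sketched with a pinhole-camera distance estimate. The 320 mm stopping distance comes from the abstract; the focal length in pixels and the object's real width are illustrative assumptions, and the blob-width measurement itself (done in the paper with a computer vision toolbox) is taken as an input here.

```python
def estimate_distance_mm(blob_width_px, real_width_mm=60.0, focal_px=700.0):
    """Pinhole-camera estimate: distance = f * W_real / w_pixels.
    real_width_mm and focal_px are assumed example values, not from the paper."""
    return focal_px * real_width_mm / blob_width_px

STOP_DISTANCE_MM = 320.0  # stopping distance reported in the abstract

def should_stop(blob_width_px):
    """Feedback rule: halt end-effector motion once the estimated
    distance to the detected object is within 320 mm."""
    return estimate_distance_mm(blob_width_px) <= STOP_DISTANCE_MM
```

As the arm approaches the object, the blob grows in the image, the estimated distance shrinks, and the controller halts motion at the threshold.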
