IEEE International Conference on Robotics and Automation

Real-time inertial lower body kinematics and ground contact estimation at anatomical foot points for agile human locomotion



Abstract

The ability to accurately capture locomotion is relevant in various use cases, in particular in the sports and health domains. With the major goal of providing a measurement system that can deliver different types of relevant information (3D body segment kinematics, spatiotemporal locomotion parameters, and locomotion patterns) in-field and in real time, we propose a novel probabilistic (single-plane) ground contact estimation method that uses four contact points defined through a biomechanical foot model, and integrate it into an existing inertial motion capturing method. The resulting method is quantitatively evaluated on simulated and real IMU data against an optical motion capture system on walking, running, and jumping sequences. The results show that it maintains a good average 3D kinematics estimation error on both low- and high-acceleration locomotion, whereas many previous accuracy studies restrict themselves to movements with low to moderate global accelerations, such as upper body activities or slow locomotion. Moreover, a qualitative evaluation of the estimated ground contact probabilities demonstrates the method's ability to provide consistent information for deriving spatiotemporal locomotion parameters as well as locomotion patterns (e.g., over-pronation/-supination) simultaneously with the 3D kinematics.
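To illustrate the general idea of a probabilistic per-point ground contact estimate, the following is a minimal Python sketch. The function contact_probability, its Gaussian height/velocity form, the parameter values, and the concrete point names and numbers are illustrative assumptions for this sketch, not the paper's actual model or data.

import numpy as np

def contact_probability(height, velocity, h_scale=0.02, v_scale=0.3):
    """Illustrative per-point ground contact probability (not the paper's model).

    Combines a height term (point close to the single ground plane z = 0) with a
    velocity term (point nearly stationary), each as a zero-mean Gaussian likelihood.
    """
    p_height = np.exp(-0.5 * (height / h_scale) ** 2)
    p_velocity = np.exp(-0.5 * (np.linalg.norm(velocity) / v_scale) ** 2)
    return p_height * p_velocity

# Four anatomical foot points (names assumed): heel, medial/lateral ball, toe.
# Heights (m) above the ground plane and velocities (m/s) would come from the
# inertial kinematic estimate at the current time step; values here are placeholders.
foot_points = {
    "heel":         (0.005, np.array([0.02, 0.00, 0.01])),
    "ball_medial":  (0.030, np.array([0.40, 0.00, 0.20])),
    "ball_lateral": (0.035, np.array([0.45, 0.00, 0.22])),
    "toe":          (0.060, np.array([0.80, 0.00, 0.50])),
}

probs = {name: contact_probability(h, v) for name, (h, v) in foot_points.items()}
print(probs)  # e.g. heel near 1, toe near 0 during a heel-strike sample

# A derived pattern cue: persistently higher medial-ball than lateral-ball contact
# probability over a stance phase could hint at over-pronation.
if probs["ball_medial"] > probs["ball_lateral"]:
    print("medial loading dominates this sample")

In such a scheme, per-point probabilities over time also yield spatiotemporal parameters (e.g., stance and swing durations from contact onsets/offsets) alongside the 3D kinematics.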
