Scientific Reports

Gait Estimation from Anatomical Foot Parameters Measured by a Foot Feature Measurement System using a Deep Neural Network Model



Abstract

An accurate and credible measurement of human gait is essential in multiple areas of medical science and rehabilitation, yet the methods currently available are not only arduous but also costly. Researchers who have investigated the relationship between foot and gait parameters found that the two sets of parameters are closely interrelated, and suggested that measuring foot characteristics could be an alternative to the strenuous quantification currently in use. This study aims to verify the potential of foot characteristics for predicting actual gait temporo-spatial parameters and to develop a deep neural network (DNN) model that can estimate and quantify gait temporo-spatial parameters from foot characteristics. Foot features measured in sitting, standing, and one-leg standing conditions for 42 subjects were used as the input data, and gait temporo-spatial parameters at fast, normal, and slow walking speeds were set as the outputs of the DNN regressor. With a prediction accuracy of 95% or higher, the feasibility of the developed model was verified. This study may be the first to attempt experimental verification of foot features as predictors of individual gait. Once certain limitations are properly overcome, the DNN regressor will help researchers expand the data pool with less labor and expense.
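As a concrete illustration of the modeling setup described in the abstract, the sketch below shows a minimal multi-output DNN regressor that maps foot measurements to gait temporo-spatial parameters. This is not the authors' implementation: the feature counts, network size, training settings, and variable names are assumptions, and the data are random placeholders standing in for the 42-subject measurements.

```python
# Minimal sketch (assumed setup, not the paper's code) of a multi-output
# DNN regressor: foot features in -> gait temporo-spatial parameters out.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Placeholder data: 42 subjects, with foot features from the sitting,
# standing, and one-leg standing conditions stacked into one input vector,
# and gait temporo-spatial parameters at fast, normal, and slow speeds as
# the regression targets. Dimensions are illustrative assumptions.
n_subjects, n_foot_features, n_gait_params = 42, 18, 9
X = rng.normal(size=(n_subjects, n_foot_features))   # foot measurements
y = rng.normal(size=(n_subjects, n_gait_params))     # gait parameters

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Standardize inputs so that differently scaled foot measurements
# (lengths, widths, angles) contribute comparably.
scaler = StandardScaler().fit(X_train)

# Small fully connected network; one model predicts all gait outputs.
dnn = MLPRegressor(hidden_layer_sizes=(64, 32),
                   activation="relu",
                   max_iter=5000,
                   random_state=0)
dnn.fit(scaler.transform(X_train), y_train)

y_pred = dnn.predict(scaler.transform(X_test))
print("R^2 on held-out subjects:", r2_score(y_test, y_pred))
```

With only 42 subjects, a small network with held-out evaluation (or cross-validation) is the natural design choice; larger architectures would be prone to overfitting on a data set of this size.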
