Journal: Pattern Recognition Letters

Frequency features and GMM-UBM approach for gait-based person identification using smartphone inertial signals



Abstract

This paper describes the development of a Gait-based Person Identification (GPI) system based on a Gaussian Mixture Model-Universal Background Model (GMM-UBM) approach that uses inertial signals from a smartphone. The system integrates five main modules or steps: signal pre-processing, feature extraction, GMM-UBM training, Maximum A Posteriori (MAP) adaptation, and a comparison module that outputs the identified user. The system also incorporates recently proposed feature extraction strategies, Mel Frequency Cepstral Coefficients (MFCCs) and Perceptual Linear Prediction (PLP) coefficients, to improve the results. The study uses the publicly available UCI Human Activity Recognition Using Smartphones dataset. A six-fold cross-validation procedure was carried out, reporting the average value for every experiment. The final results demonstrate the capability of the GMM-UBM approach for gait recognition, and show how the PLP coefficients can improve system performance while drastically reducing the number of features (from 561 to 90). The best result shows a User Recognition Error Rate (URER) of 34.0% with 30 enrolled users. The error rate decreases as the number of enrolled users is reduced: with fewer than six enrolled users, the URER drops below 10%. (C) 2016 Elsevier B.V. All rights reserved.
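The GMM-UBM workflow summarized above (train a universal background model on pooled data, MAP-adapt it per enrolled user, then identify by likelihood comparison) can be sketched as follows. This is a hedged illustration, not the paper's implementation: the MFCC/PLP feature extraction is replaced by synthetic per-user feature frames, the feature dimensionality and mixture size are arbitrary, and the MAP step uses mean-only relevance adaptation, a common choice that the abstract does not specify.

```python
# Minimal GMM-UBM identification sketch (assumptions: synthetic features,
# mean-only MAP adaptation with relevance factor r, diagonal covariances).
import numpy as np
from sklearn.mixture import GaussianMixture

N_FEATURES = 8  # stand-in for the 90 PLP-derived features in the paper

def user_frames(user_id, n=200):
    """Synthetic 'gait feature' frames for one user (hypothetical data)."""
    rng = np.random.default_rng(user_id)
    return rng.normal(loc=rng.normal(size=N_FEATURES), scale=1.0,
                      size=(n, N_FEATURES))

# 1) Train the Universal Background Model on data pooled over all users.
ubm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0)
ubm.fit(np.vstack([user_frames(u) for u in range(6)]))

# 2) MAP-adapt the UBM means toward each enrolled user's data.
def map_adapt_means(ubm, X, r=16.0):
    post = ubm.predict_proba(X)               # responsibilities, (n, K)
    n_k = post.sum(axis=0)                    # soft counts per component
    ex_k = post.T @ X / np.maximum(n_k, 1e-10)[:, None]  # per-component means
    alpha = (n_k / (n_k + r))[:, None]        # data-vs-prior weights
    return alpha * ex_k + (1.0 - alpha) * ubm.means_

user_means = {u: map_adapt_means(ubm, user_frames(u)) for u in range(6)}

# 3) Comparison module: score a test segment against every adapted model
#    and return the user whose model gives the highest log-likelihood.
def identify(X):
    scores = {}
    for u, means in user_means.items():
        gmm = GaussianMixture(n_components=4, covariance_type="diag")
        gmm.weights_ = ubm.weights_           # reuse UBM weights/covariances,
        gmm.covariances_ = ubm.covariances_   # swap in the adapted means
        gmm.means_ = means
        gmm.precisions_cholesky_ = 1.0 / np.sqrt(ubm.covariances_)
        scores[u] = gmm.score(X)              # mean per-frame log-likelihood
    return max(scores, key=scores.get)
```

Mean-only adaptation keeps the UBM's weights and covariances shared across users, which is what makes enrollment cheap: each user contributes only a small amount of gait data, and the relevance factor `r` controls how far the means move from the background model.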

