Recognizing Activities of Daily Living from Egocentric Images

Iberian Conference on Pattern Recognition and Image Analysis


Abstract

Recognizing Activities of Daily Living (ADLs) has a large number of health applications, such as characterizing lifestyle for habit improvement, nursing, and rehabilitation services. Wearable cameras can gather large amounts of image data daily, providing richer visual information about ADLs than other wearable sensors. In this paper, we explore the classification of ADLs from images captured by a low-temporal-resolution wearable camera (2 fpm) using a Convolutional Neural Network (CNN) approach. We show that the classification accuracy of a CNN improves substantially when its output is combined, through a random decision forest, with contextual information from a fully connected layer. The proposed method was tested on a subset of the NTCIR-12 egocentric dataset, consisting of 18,674 images, and achieved an overall activity recognition accuracy of 86% over 21 classes.
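The abstract describes a late-fusion scheme: class scores from a CNN are combined with activations from a fully connected layer and fed to a random decision forest. Below is a minimal sketch of that idea, assuming PyTorch/torchvision and scikit-learn; the ResNet-18 backbone, the choice of layer, and helper names such as cnn_descriptors are illustrative assumptions, not the authors' exact pipeline.

```python
# Sketch of the fusion idea from the abstract: concatenate CNN softmax scores with
# penultimate-layer features and classify the result with a random decision forest.
# Assumptions: a pretrained ResNet-18 stands in for the paper's CNN; the dataset
# loading (egocentric images and labels) is not shown.
import torch
import torchvision.models as models
import torchvision.transforms as T
from sklearn.ensemble import RandomForestClassifier

# Pretrained backbone used as a generic feature extractor (assumption: ResNet-18).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.eval()

# Capture the penultimate activations (input to the final fully connected layer).
features = {}
def save_penultimate(module, inp, out):
    features["penultimate"] = torch.flatten(out, 1)
backbone.avgpool.register_forward_hook(save_penultimate)

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def cnn_descriptors(images):
    """Return per-image descriptors: [softmax scores, penultimate features]."""
    batch = torch.stack([preprocess(img) for img in images])
    with torch.no_grad():
        logits = backbone(batch)
        probs = torch.softmax(logits, dim=1)
    return torch.cat([probs, features["penultimate"]], dim=1).numpy()

# The random decision forest fuses both sources of information.
# train_images / train_labels would come from the egocentric dataset (hypothetical names).
# forest = RandomForestClassifier(n_estimators=200, random_state=0)
# forest.fit(cnn_descriptors(train_images), train_labels)
# predictions = forest.predict(cnn_descriptors(test_images))
```

The design choice mirrored here is that the forest sees both the CNN's class predictions and the richer contextual representation from the fully connected layer, rather than relying on the softmax output alone.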
