Human Activity Recognition-Oriented Incremental Learning with Knowledge Distillation


Abstract

A variety of machine learning methods have recently improved the applicability of activity recognition systems across different scenarios. Many current activity recognition models assume that all data are prepared in advance and that the device has no storage space limitation. In practice, however, sensor data collection changes dynamically over time, the set of activity categories may keep growing, and the device has limited storage space. In this study, we therefore propose a comprehensive class-incremental learning solution for activity recognition based on knowledge distillation. In addition, we develop a representative sample selection method to choose and update a fixed number of preserved old-class samples. When samples of new activity classes arrive, only the new-class samples and the representative old samples are needed to preserve the network's performance on old classes while learning to identify the new ones. Experiments on two public datasets show good accuracy for both old and new categories, and the method significantly reduces the space required to store old-class samples.
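The abstract names two components without giving implementation details: a knowledge-distillation loss that keeps the new model's outputs on old classes close to the old model's, and a representative-sample (exemplar) selection step. Below is a minimal NumPy sketch of both, assuming a temperature-scaled distillation loss and a herding-style selection as popularized by iCaRL; all function names, the temperature value, and the selection strategy are illustrative assumptions, not details taken from this paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """Cross-entropy between the old model's softened outputs (teacher)
    and the new model's outputs on the old-class logits (student)."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return -np.mean(np.sum(p_teacher * np.log(p_student + 1e-12), axis=-1))

def select_representative_samples(features, m):
    """Herding-style selection (hypothetical stand-in for the paper's
    method): greedily pick m samples whose running mean best
    approximates the class-mean feature vector."""
    class_mean = features.mean(axis=0)
    selected = []
    acc = np.zeros_like(class_mean)
    for k in range(m):
        # distance of each candidate's running mean to the class mean
        dists = np.linalg.norm(class_mean - (acc + features) / (k + 1), axis=1)
        dists[selected] = np.inf  # never pick the same sample twice
        i = int(np.argmin(dists))
        selected.append(i)
        acc += features[i]
    return selected
```

In this sketch, the distillation term would be added to the standard classification loss on new-class samples, and `select_representative_samples` would be run per old class so the stored exemplar set stays within a fixed memory budget.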

Bibliographic Information

  • Source
    Journal of circuits, systems and computers | 2021, Issue 6 | pp. 2150096.1-2150096.21 | 21 pages
  • Author Affiliations

    National Engineering Research Center for Big Data Technology and Systems, Services Computing Technology Lab, School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China | Cluster and Grid Computing Lab, Big Data Technology and Systems Lab, Services Systems Lab, School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China;

    Department of Information and Electronic Engineering, Muroran Institute of Technology, Muroran, Hokkaido, Japan;

    Department of Information and Electronic Engineering, Muroran Institute of Technology, Muroran, Hokkaido, Japan;

    National Engineering Research Center for Big Data Technology and Systems, Services Computing Technology Lab, School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China | Cluster and Grid Computing Lab, Big Data Technology and Systems Lab, Services Systems Lab, School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China;

    National Engineering Research Center for Big Data Technology and Systems, Services Computing Technology Lab, School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China | Cluster and Grid Computing Lab, Big Data Technology and Systems Lab, Services Systems Lab, School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China;

  • Indexing Information
  • Original Format: PDF
  • Language: English
  • Chinese Library Classification
  • Keywords

    Class incremental learning; knowledge distillation; representative samples selection; activity recognition;

