
Prior-knowledge and attention based meta-learning for few-shot learning



Abstract

Recently, meta-learning has been shown to be a promising way to solve few-shot learning. In this paper, inspired by the human cognition process, which utilizes both prior knowledge and visual attention when learning new knowledge, we present a novel meta-learning paradigm that capitalizes on three developments to introduce an attention mechanism and prior knowledge into meta-learning. In our approach, prior knowledge helps the meta-learner express the input data in a high-level representation space, and the attention mechanism enables the meta-learner to focus on key data features in that space. Compared with existing meta-learning approaches, which pay little attention to prior knowledge and visual attention, our approach alleviates the meta-learner's few-shot cognition burden. Furthermore, we identify a Task-Over-Fitting (TOF) problem, in which the meta-learner generalizes poorly across different K-shot learning tasks. To model the TOF problem, we propose a novel Cross-Entropy across Tasks (CET) metric. Extensive experiments demonstrate that our techniques bring the meta-learner to state-of-the-art performance on several few-shot learning benchmarks while also substantially alleviating the TOF problem. (C) 2020 Elsevier B.V. All rights reserved.
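The pipeline the abstract describes — a prior-knowledge encoder mapping inputs into a high-level representation space, an attention mechanism highlighting key features there, and few-shot classification on the attended representations — can be illustrated with a minimal sketch. This is not the authors' implementation: the fixed linear encoder, the channel-attention vector, and the nearest-prototype classifier are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def embed(x, W):
    """Stand-in for the prior-knowledge encoder: a fixed (pretrained)
    map into a high-level representation space."""
    return np.tanh(x @ W)

def attend(z, v):
    """Channel attention: re-weight each feature dimension so the
    meta-learner focuses on key features of the representation."""
    return z * softmax(v)

def prototypes(support_z, support_y, n_way):
    """One class prototype per class: the mean attended embedding."""
    return np.stack([support_z[support_y == c].mean(axis=0)
                     for c in range(n_way)])

def classify(query_z, protos):
    """Posterior over classes from (negative) squared distance to prototypes."""
    d = ((query_z[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return softmax(-d)

# Toy 2-way 5-shot episode with well-separated class means.
D, H, n_way, k_shot = 16, 8, 2, 5
W = rng.normal(size=(D, H))          # "pretrained" encoder weights
v = rng.normal(size=H)               # attention logits over features
means = rng.normal(size=(n_way, D)) * 3.0
support_x = np.concatenate(
    [means[c] + rng.normal(size=(k_shot, D)) for c in range(n_way)])
support_y = np.repeat(np.arange(n_way), k_shot)
query_x = means[0] + rng.normal(size=(4, D))

support_z = attend(embed(support_x, W), v)
query_z = attend(embed(query_x, W), v)
probs = classify(query_z, prototypes(support_z, support_y, n_way))
pred = probs.argmax(axis=-1)
```

In the paper's framing, `embed` would be learned offline (the prior knowledge), while episode-level adaptation and the attention weights would be meta-learned across tasks; the sketch fixes both only to keep the example self-contained.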

Bibliographic Information

  • Source
    Knowledge-Based Systems, 2021, No. 15, pp. 106609.1–106609.12 (12 pages)
  • Author Affiliations

    Northwestern Polytechnical University, Xi'an 710129, China;

    Northwestern Polytechnical University, Xi'an 710129, China;

    MiningLamp Technology, Beijing 100094, China;

    Beijing Kwai Technology, Beijing 102600, China;

    National Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences, Beijing 100000, China;

    Northwestern Polytechnical University, Xi'an 710129, China;

    Huawei Cloud, Seattle, WA 90876, USA;

    National Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences, Beijing 100000, China | School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China;

  • Indexing Information
  • Format: PDF
  • Language: English (eng)
  • CLC Classification
  • Keywords

    Meta-learning; Few-shot learning; Prior-knowledge; Representation; Attention mechanism;
