JMLR: Workshop and Conference Proceedings

Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace

Abstract

Gradient-based meta-learning methods leverage gradient descent to learn the commonalities among various tasks. While previous such methods have been successful in meta-learning tasks, they resort to simple gradient descent during meta-testing. Our primary contribution is the MT-net, which enables the meta-learner to learn on each layer’s activation space a subspace that the task-specific learner performs gradient descent on. Additionally, a task-specific learner of an MT-net performs gradient descent with respect to a meta-learned distance metric, which warps the activation space to be more sensitive to task identity. We demonstrate that the dimension of this learned subspace reflects the complexity of the task-specific learner’s adaptation task, and also that our model is less sensitive to the choice of initial learning rates than previous gradient-based meta-learning methods. Our method achieves state-of-the-art or comparable performance on few-shot classification and regression tasks.
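To make the mechanism in the abstract concrete, below is a minimal sketch (not the authors' released code) of a single MT-net-style layer's task-specific update, written in JAX under simplifying assumptions: one fully connected layer, a squared-error task loss, `T` standing in for the meta-learned transformation that warps the activation space, `W` for the task-specific weights, and `mask` for a fixed binary row mask that restricts the gradient step to a subspace. In the paper the mask is sampled from learned logits, and `T`, the mask parameters, and the initial `W` are all trained in an outer meta-learning loop, which is omitted here.

```python
import jax
import jax.numpy as jnp

def layer(params, x):
    # MT-net-style layer: the meta-learned transformation T warps the
    # activation space produced by the task-specific weights W.
    return params["T"] @ (params["W"] @ x)

def loss(params, x, y):
    # Simple squared-error task loss, for illustration only.
    return jnp.mean((layer(params, x) - y) ** 2)

def inner_update(params, mask, x, y, lr=0.1):
    # Task-specific learner: one gradient-descent step on W only.
    # The binary row mask confines the update to a learned subspace;
    # T is meta-learned and stays fixed during task adaptation.
    grads = jax.grad(loss)(params, x, y)
    new_W = params["W"] - lr * mask[:, None] * grads["W"]
    return {"W": new_W, "T": params["T"]}

key = jax.random.PRNGKey(0)
d_in, d_out = 4, 3
params = {
    "W": jax.random.normal(key, (d_out, d_in)),  # task-specific weights
    "T": jnp.eye(d_out),                         # meta-learned metric (placeholder)
}
mask = jnp.array([1.0, 0.0, 1.0])                # illustrative binary mask over rows of W
x = jax.random.normal(key, (d_in,))
y = jnp.ones(d_out)
adapted = inner_update(params, mask, x, y)
```

In a full implementation, an outer loop would differentiate the post-update task loss with respect to `T`, the mask logits, and the initial `W`, so that the metric and the update subspace are shaped by performance across many tasks.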