International Conference on Machine Learning

Nonlinear Hebbian learning as a universal principle in unsupervised feature learning

Abstract

Successful representation learning models appear to develop strikingly similar features to each other, raising the prospect of a fundamental underlying principle. We show that nonlinear Hebbian learning gives a parsimonious account of feature learning, underlying models such as sparse coding, neural networks and independent component analysis. For all datasets considered, the most hyper-Gaussian features are learned irrespective of the effective nonlinearity of the model. In particular, this explains why Gabor filters are ubiquitously developed for image inputs. Our results reveal that feature learning is robust to normative assumptions, exposing a large class of models with comparable learning properties.
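
To make the underlying rule concrete, here is a minimal sketch of a generic nonlinear Hebbian update of the form Δw ∝ g(w·x)·x followed by weight renormalization. The cubic nonlinearity g(u) = u³, the learning rate, and the synthetic two-source data are illustrative assumptions, not the paper's exact setup; the point is only that such a rule pulls the weight vector toward the most hyper-Gaussian (heavy-tailed) direction of the input.

```python
# Minimal sketch of a nonlinear Hebbian update, assuming the generic form
# dw ∝ g(w·x) x with renormalization of w. The choice g(u) = u**3 and the
# whitened synthetic data are illustrative assumptions, not the paper's
# exact experimental setup.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, approximately whitened inputs: one heavy-tailed (hyper-Gaussian)
# source mixed with one Gaussian source, each scaled to unit variance.
n_samples, dim = 20000, 2
sources = np.column_stack([
    rng.laplace(size=n_samples),   # heavy-tailed (hyper-Gaussian) source
    rng.normal(size=n_samples),    # Gaussian source
])
sources /= sources.std(axis=0)

w = rng.normal(size=dim)
w /= np.linalg.norm(w)
eta = 1e-3                         # learning rate (assumed value)

for x in sources:
    u = w @ x                      # postsynaptic activation
    w += eta * (u ** 3) * x        # nonlinear Hebbian update, g(u) = u**3
    w /= np.linalg.norm(w)         # keep the weight vector on the unit sphere

# The learned direction should roughly align (up to sign) with the first,
# heavy-tailed source axis.
print("learned direction:", w)
```

With whitened inputs, other increasing odd nonlinearities behave comparably in this sketch, which is in the spirit of the abstract's claim that the learned features are largely insensitive to the effective nonlinearity of the model.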
