Neurocomputing

Feature learning for stacked ELM via low-rank matrix factorization


Abstract

The extreme-learning-machine-based auto-encoder (ELM-AE) is regarded as a useful architecture with fast learning speed and general approximation ability, and stacked ELMs are used to build efficient and effective deep learning networks. However, features learned by conventional ELM-AEs suffer from weak nonlinear representation ability and from random factors in the feature projection. This paper therefore proposes an improved ELM-AE architecture that uses low-rank matrix factorization to learn optimal low-dimensional features. It offers two advantages over conventional ELM-AEs. First, the dimensionality of the hidden layer can be set arbitrarily; for example, a higher-dimensional hidden layer reduces the random effect in feature learning and enhances the representation ability of the features. Second, the nonlinear ability of the features is enhanced, since they are learned directly from the nonlinear outputs of the hidden layer. Finally, comparison experiments on numerical and image datasets verify the superior performance of the proposed ELM-AE.
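The sketch below illustrates the idea described in the abstract, not the paper's exact algorithm: data are passed through a random nonlinear hidden layer (as in an ELM-AE), and the low-dimensional features are obtained by a low-rank factorization of the nonlinear hidden-layer outputs, here a truncated SVD. The function name, parameter choices, and the use of SVD as the factorization are illustrative assumptions.

```python
# Minimal sketch, assuming a sigmoid hidden layer and truncated SVD as the
# low-rank factorization; the paper's actual objective may differ.
import numpy as np

def elm_lowrank_features(X, hidden_dim=1024, feature_dim=64, seed=0):
    """Map X (n_samples, n_inputs) to feature_dim-dimensional features."""
    rng = np.random.default_rng(seed)
    n_samples, n_inputs = X.shape

    # Random hidden-layer parameters (fixed, not trained), as in ELM / ELM-AE.
    W = rng.standard_normal((n_inputs, hidden_dim))
    b = rng.standard_normal(hidden_dim)

    # Nonlinear hidden-layer outputs; the hidden width can be chosen freely,
    # e.g. wider than feature_dim, to dilute the randomness of W.
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # sigmoid activation

    # Low-rank factorization of H: keep the top feature_dim singular
    # directions, so features come directly from the nonlinear outputs H
    # rather than from the ELM-AE output weights.
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    features = U[:, :feature_dim] * s[:feature_dim]  # = H @ Vt[:feature_dim].T

    # Return the projection so new samples can be mapped the same way.
    return features, (W, b, Vt[:feature_dim].T)

if __name__ == "__main__":
    # Example: 200 random samples, 30 inputs -> 10-dimensional features.
    X = np.random.default_rng(1).standard_normal((200, 30))
    feats, params = elm_lowrank_features(X, hidden_dim=256, feature_dim=10)
    print(feats.shape)  # (200, 10)
```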
