Neurocomputing

Cholesky factorization based online regularized and kernelized extreme learning machines with forgetting mechanism



Abstract

In this paper, we propose two alternative schemes of fast online sequential extreme learning machine (ELM) for training single hidden-layer feedforward neural networks (SLFN), termed Cholesky factorization based online regularized ELM with forgetting mechanism (CF-FORELM) and Cholesky factorization based online kernelized ELM with forgetting mechanism (CF-FOKELM). First, the solutions of regularized ELM (RELM) and kernelized ELM (KELM) based on matrix Cholesky factorization are introduced. Then, a recursive method for updating the Cholesky factor of the matrix involved in RELM and KELM is designed for the case where RELM and KELM are applied to train an SLFN online; the CF-FORELM and CF-FOKELM algorithms follow from this update. Numerical simulation results show that CF-FORELM demands less computational effort than Dynamic Regression ELM (DR-ELM), that CF-FOKELM offers higher computational efficiency than both FOKELM and online sequential ELM with kernels (OS-ELMK), and that CF-FORELM is less sensitive to model parameters than CF-FOKELM. (C) 2015 Elsevier B.V. All rights reserved.
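To make the abstract's starting point concrete, the following is a minimal sketch (not the authors' implementation) of the RELM batch solution computed via a Cholesky factorization. The hidden-layer sizes, the regularization constant, the sigmoid activation, and all variable names are illustrative assumptions; the sketch only shows the standard identity beta = (H^T H + I/C)^{-1} H^T T solved through the factorization A = U^T U with two triangular solves instead of an explicit inverse, which is the kind of structure the paper's recursive Cholesky updates exploit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumed, for illustration only).
X = rng.standard_normal((200, 3))
T = np.sin(X.sum(axis=1, keepdims=True))

# Random hidden layer of an SLFN with sigmoid activation.
L, C = 20, 100.0                     # hidden nodes and regularization constant (assumed)
W = rng.standard_normal((3, L))      # random input weights
b = rng.standard_normal((1, L))      # random biases
H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # hidden-layer output matrix

# RELM output weights: beta = (H^T H + I/C)^{-1} H^T T,
# obtained via the Cholesky factorization A = U^T U.
A = H.T @ H + np.eye(L) / C          # symmetric positive definite
U = np.linalg.cholesky(A).T          # upper-triangular Cholesky factor
y = np.linalg.solve(U.T, H.T @ T)    # solve the lower-triangular system U^T y = H^T T
beta = np.linalg.solve(U, y)         # solve the upper-triangular system U beta = y

pred = H @ beta                      # SLFN output on the training inputs
```

In the online setting the paper addresses, A changes by low-rank terms as samples arrive and old samples are forgotten, so its Cholesky factor can be updated recursively rather than refactorized from scratch; the sketch above only fixes the notation for that batch solution.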
