
A fast and accurate online self-organizing scheme for parsimonious fuzzy neural networks



Abstract

In this paper, we present a fast and accurate online self-organizing scheme for parsimonious fuzzy neural networks (FAOS-PFNN), in which a novel structure learning algorithm incorporating a pruning strategy into new growth criteria is developed. The proposed growing procedure without separate pruning not only speeds up the online learning process but also yields a more parsimonious fuzzy neural network, while achieving comparable performance and accuracy by virtue of the combined growing-and-pruning strategy. The FAOS-PFNN starts with no hidden neurons and parsimoniously generates new hidden units according to the proposed growth criteria as learning proceeds. In the parameter learning phase, all the free parameters of hidden units, regardless of whether they are newly created or already existing, are updated by the extended Kalman filter (EKF) method. The effectiveness and superiority of the FAOS-PFNN paradigm are compared with other popular approaches, namely the resource allocation network (RAN), RAN via the extended Kalman filter (RANEKF), minimal resource allocation network (MRAN), adaptive-network-based fuzzy inference system (ANFIS), orthogonal least squares (OLS), RBF-AFS, dynamic fuzzy neural networks (DFNN), generalized DFNN (GDFNN), generalized GAP-RBF (GGAP-RBF), online sequential extreme learning machine (OS-ELM) and self-organizing fuzzy neural network (SOFNN), on various benchmark problems in the areas of function approximation, nonlinear dynamic system identification, chaotic time-series prediction and real-world regression. Simulation results demonstrate that the proposed FAOS-PFNN algorithm achieves faster learning and a more compact network structure with comparably high approximation and generalization accuracy.
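The parameter-learning phase described above updates all free parameters of the hidden units with a sequential extended Kalman filter. Below is a minimal, illustrative sketch of such an EKF-style update for a scalar-output model; the function and variable names (`ekf_update`, noise settings `r` and `q`) are assumptions for demonstration, not the paper's actual implementation, and a simple linear model stands in for the fuzzy neural network.

```python
import numpy as np

def ekf_update(w, P, grad, err, r=1.0, q=1e-4):
    """One EKF step for a scalar-output model.

    w    : (n,) parameter vector
    P    : (n, n) error covariance of the parameters
    grad : (n,) gradient of the model output w.r.t. w (the Jacobian row)
    err  : scalar innovation, y_target - y_model
    r    : assumed measurement-noise variance
    q    : small process noise that keeps P from collapsing to zero
    """
    Pg = P @ grad
    s = r + grad @ Pg                 # innovation variance (scalar)
    K = Pg / s                        # Kalman gain, shape (n,)
    w = w + K * err                   # parameter correction
    P = P - np.outer(K, Pg) + q * np.eye(len(w))  # covariance update
    return w, P

# Toy usage: sequentially fit y = a*x + b from streaming samples.
rng = np.random.default_rng(0)
w = np.zeros(2)            # [a, b], starting from scratch
P = np.eye(2) * 10.0       # large initial uncertainty
for _ in range(200):
    x = rng.uniform(-1.0, 1.0)
    y = 3.0 * x - 0.5                 # ground-truth generator
    grad = np.array([x, 1.0])         # d(a*x + b)/d[a, b]
    err = y - w @ grad
    w, P = ekf_update(w, P, grad, err, r=0.01)
```

In FAOS-PFNN the same kind of update is applied jointly to the centers, widths and weights of every hidden unit, with the Jacobian taken with respect to that full parameter vector.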

Bibliographic details

  • Source
    Neurocomputing | 2009, Issue 18 | pp. 3818-3829 | 12 pages
  • Author affiliations

    Institution of Automation, Dalian Maritime University, Dalian 116026, China; School of EEE, Nanyang Technological University, Singapore 639798, Singapore;

    School of EEE, Nanyang Technological University, Singapore 639798, Singapore;

    Institution of Automation, Dalian Maritime University, Dalian 116026, China;

  • Indexed in: Science Citation Index (SCI); Engineering Index (EI)
  • Original format: PDF
  • Language: English
  • Chinese Library Classification (CLC):
  • Keywords

    fuzzy neural network (FNN); online self-organizing; extended Kalman filter (EKF); growing criterion;


