IEEE Transactions on Neural Networks

A generalized growing and pruning RBF (GGAP-RBF) neural network for function approximation



Abstract

This work presents a new sequential learning algorithm for radial basis function (RBF) networks, referred to as the generalized growing and pruning algorithm for RBF (GGAP-RBF). The paper first introduces the concept of significance for the hidden neurons and then uses it in the learning algorithm to realize parsimonious networks. The growing and pruning strategy of GGAP-RBF is based on linking the required learning accuracy with the significance of the nearest or intentionally added new neuron. The significance of a neuron is a measure of that neuron's average information content. The GGAP-RBF algorithm can be used with any arbitrary sampling density for the training samples and is derived from a rigorous statistical point of view. Simulation results for benchmark problems in the function approximation area show that GGAP-RBF outperforms several other sequential learning algorithms in learning speed, network size, and generalization performance, regardless of the sampling density function of the training data.
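The growing and pruning strategy sketched in the abstract can be illustrated in code. The following is a minimal, hedged sketch, not the paper's exact algorithm: it assumes a uniform sampling density (so a neuron's significance reduces to |weight| × width^dim / input-space volume up to a constant), uses RAN-style heuristics for quantities the abstract does not specify (the `kappa` overlap factor, the minimum width, the LMS step size), and replaces the paper's extended-Kalman-filter parameter update with a simple LMS step. All names and constants here are illustrative assumptions.

```python
import numpy as np

class GrowingPruningRBF:
    """Minimal sketch of a growing-and-pruning Gaussian RBF network.

    Assumptions (not from the paper's exact derivation): uniform sampling
    density, RAN-style heuristics for new-neuron parameters, and an LMS
    update in place of the paper's EKF update.
    """

    def __init__(self, dim, volume=1.0, e_min=0.05, kappa=0.9, lr=0.1):
        self.dim = dim
        self.volume = volume  # size of the input region (assumption)
        self.e_min = e_min    # required learning accuracy / significance floor
        self.kappa = kappa    # overlap factor for new widths (assumption)
        self.lr = lr          # LMS step size (assumption)
        self.centers = np.empty((0, dim))
        self.widths = np.empty(0)
        self.weights = np.empty(0)

    def _phi(self, x):
        # Gaussian activations of all hidden neurons at input x.
        if len(self.weights) == 0:
            return np.empty(0)
        d2 = np.sum((self.centers - x) ** 2, axis=1)
        return np.exp(-d2 / self.widths ** 2)

    def predict(self, x):
        return float(self.weights @ self._phi(x)) if len(self.weights) else 0.0

    def significance(self, k):
        # Approximate average information content of neuron k under a
        # uniform sampling density (assumption): |w_k| * width_k^dim / volume.
        return abs(self.weights[k]) * (self.widths[k] ** self.dim) / self.volume

    def learn(self, x, y):
        x = np.asarray(x, dtype=float)
        err = y - self.predict(x)
        if len(self.weights):
            dists = np.linalg.norm(self.centers - x, axis=1)
            nearest = float(np.min(dists))
        else:
            nearest = np.inf
        # Grow only if the candidate neuron would itself be significant:
        # this is the link between required accuracy and significance.
        new_width = max(self.kappa * nearest, 0.1) if np.isfinite(nearest) else 1.0
        new_sig = abs(err) * (new_width ** self.dim) / self.volume
        if new_sig > self.e_min:
            self.centers = np.vstack([self.centers, x])
            self.widths = np.append(self.widths, new_width)
            self.weights = np.append(self.weights, err)
        elif len(self.weights):
            # Otherwise adjust the nearest neuron (LMS here; the paper
            # uses an EKF update) and prune it if it became insignificant.
            k = int(np.argmin(dists))
            self.weights[k] += self.lr * err * self._phi(x)[k]
            if self.significance(k) < self.e_min:
                self.centers = np.delete(self.centers, k, axis=0)
                self.widths = np.delete(self.widths, k)
                self.weights = np.delete(self.weights, k)
```

Because both growth and pruning are gated by the same significance measure, the hidden layer only retains neurons whose average contribution exceeds the required accuracy, which is what keeps the resulting network parsimonious.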


