Journal: IEEE Transactions on Neural Networks

Reformulated radial basis neural networks trained by gradient descent



Abstract

This paper presents an axiomatic approach for constructing radial basis function (RBF) neural networks. This approach results in a broad variety of admissible RBF models, including those employing Gaussian RBFs. The form of the RBFs is determined by a generator function. New RBF models can be developed according to the proposed approach by selecting generator functions other than exponential ones, which lead to Gaussian RBFs. This paper also proposes a supervised learning algorithm based on gradient descent for training reformulated RBF neural networks constructed using the proposed approach. A sensitivity analysis of the proposed algorithm relates the properties of RBFs with the convergence of gradient descent learning. Experiments involving a variety of reformulated RBF networks generated by linear and exponential generator functions indicate that gradient descent learning is simple, easily implementable, and produces RBF networks that perform considerably better than conventional RBF models trained by existing algorithms.
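To make the abstract's setup concrete: under the proposed framework, an exponential generator function yields Gaussian RBFs of the form phi(r^2) = exp(-beta * r^2), and the resulting network can be trained by gradient descent. The sketch below is an illustrative reconstruction, not the paper's exact algorithm: it fits a Gaussian RBF network to toy 1-D regression data, updating only the output weights for brevity (the paper's supervised algorithm also adapts the RBF parameters). All data, center placements, and hyperparameters here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative only): y = sin(x).
X = np.linspace(-3, 3, 40).reshape(-1, 1)
y = np.sin(X).ravel()

# Exponential generator g(x) = exp(beta * x) gives Gaussian RBFs:
# phi(r^2) = 1 / g(r^2) = exp(-beta * r^2).
beta = 1.0
centers = np.linspace(-3, 3, 8).reshape(-1, 1)    # fixed centers, for brevity
w = rng.normal(scale=0.1, size=centers.shape[0])  # output weights

def design_matrix(X, centers, beta):
    # Squared distances between each input and each center, then
    # Gaussian RBF activations: shape (n_samples, n_centers).
    r2 = (X - centers.T) ** 2
    return np.exp(-beta * r2)

# Gradient descent on the mean squared error w.r.t. the output weights.
lr = 0.1
for epoch in range(5000):
    Phi = design_matrix(X, centers, beta)
    err = Phi @ w - y
    grad_w = 2 * Phi.T @ err / len(y)
    w -= lr * grad_w

mse = np.mean((design_matrix(X, centers, beta) @ w - y) ** 2)
```

A full reformulated-RBF implementation would also differentiate the loss with respect to the centers (and, for non-exponential generators, the generator's parameters), using the same chain-rule pattern shown for the weights.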


