International Conference on Information Science and Technology

An Efficient Hybrid Incremental Algorithm for Complex-Valued Feedforward Neural Networks

Abstract

The complex back-propagation (CBP) algorithm, based on gradient descent, is one of the most popular optimization algorithms for training complex-valued feedforward neural networks (CVFNNs). However, it suffers from slow convergence and easily falls into local minima. Most previous investigations of CVFNNs focused on improving the optimization algorithm, while the network structure was determined by trial and error, a time-consuming process. To address this issue, this paper proposes an efficient complex-valued hybrid incremental algorithm (CV-HIA) for three-layer split CVFNNs. The size of the underlying CVFNN is determined adaptively by an incremental scheme: when the training error falls into a local minimum, hidden neurons are added to the network one by one. The nonlinear parameters between the input and hidden layers are adjusted by gradient descent, while the linear weights between the hidden and output layers are obtained by the least-squares (LS) method. Experimental results on several benchmark datasets show that the proposed method not only achieves good performance but also yields a compact network structure.
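The abstract describes the training scheme only at a high level; the minimal Python sketch below illustrates the kind of hybrid procedure it outlines for a three-layer split CVFNN. This is not the authors' implementation: the split-tanh activation, the simplified split-complex gradient, and all names (`split_tanh`, `train_incremental`, the hyperparameters) are illustrative assumptions. What the sketch preserves is the structure the abstract names: gradient descent on the input-to-hidden parameters, a closed-form least-squares solve for the hidden-to-output weights, and a new hidden neuron added whenever training stalls.

```python
import numpy as np

def split_tanh(z):
    """Split complex activation: tanh applied separately to real and imaginary parts."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def solve_output_weights(H, Y):
    """Linear hidden-to-output weights by complex least squares (pseudo-inverse)."""
    return np.linalg.pinv(H) @ Y

def train_incremental(X, Y, max_hidden=20, epochs=200, lr=0.01, tol=1e-3):
    """Hybrid GD + LS training; the hidden layer grows one neuron at a time."""
    N, d = X.shape
    rng = np.random.default_rng(0)
    def new_neuron():
        return 0.1 * (rng.standard_normal((d, 1)) + 1j * rng.standard_normal((d, 1)))
    W = new_neuron()                        # d x m complex input-to-hidden weights
    while True:
        for _ in range(epochs):
            Z = X @ W                       # N x m net inputs
            H = split_tanh(Z)               # N x m hidden activations
            V = solve_output_weights(H, Y)  # m x c output weights (LS step)
            E = H @ V - Y                   # N x c residual
            # Gradient-descent step on the nonlinear hidden parameters.
            # A simplified split-complex gradient is assumed here.
            dH = E @ V.conj().T
            g = dH.real * (1.0 - np.tanh(Z.real) ** 2) \
                + 1j * dH.imag * (1.0 - np.tanh(Z.imag) ** 2)
            W -= lr * (X.conj().T @ g) / N
        err = np.mean(np.abs(E) ** 2)
        if err < tol or W.shape[1] >= max_hidden:
            return W, V
        # Training error has stalled in a (presumed) local minimum:
        # add one hidden neuron and continue training.
        W = np.hstack([W, new_neuron()])
```

Solving the output weights by least squares at every step confines the gradient search to the hidden-layer parameters, which is the usual motivation for such hybrid schemes over plain CBP applied to all weights at once.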
