International Conference on Information Science and Technology

An Efficient Hybrid Incremental Algorithm for Complex-Valued Feedforward Neural Networks



Abstract

The complex back-propagation (CBP) algorithm, based on gradient descent, is one of the most popular optimization algorithms for training complex-valued feedforward neural networks (CVFNNs). However, it suffers from slow convergence and a tendency to fall into local minima. Most previous investigations of CVFNNs focused on improving the optimization algorithm, while the network structure was determined by trial and error, a time-consuming process. To address this issue, an efficient complex-valued hybrid incremental algorithm (CV-HIA) is proposed in this paper for three-layer split CVFNNs. The size of the underlying CVFNN is determined adaptively by an incremental scheme: when the training error falls into a local minimum, hidden neurons are added to the network one by one. The nonlinear parameters between the hidden- and input-layer neurons are adjusted by gradient descent, while the linear weights between the output- and hidden-layer neurons are obtained by the least-squares (LS) method. Experimental results on several benchmark datasets show that the proposed method not only achieves good performance but also yields a compact network structure.
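The hybrid scheme described in the abstract can be illustrated with a minimal sketch: a split CVFNN (the real activation applied separately to real and imaginary parts), output weights solved in closed form by complex least squares, hidden weights refined by descent, and one hidden neuron appended whenever the training error stalls. All function names here are hypothetical, and for brevity the descent step uses a numerical gradient over the real and imaginary parts rather than the paper's analytic split-CBP gradient; this is an assumption-laden illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def split_tanh(z):
    # Split activation: tanh applied separately to real and imaginary parts.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def forward_hidden(X, W):
    # Hidden output for complex inputs X (n, d) and hidden weights W (d, h).
    return split_tanh(X @ W)

def output_ls(H, Y):
    # LS step: linear output weights in closed form (complex least squares).
    return np.linalg.lstsq(H, Y, rcond=None)[0]

def mse(X, Y, W):
    # Training error with the output layer solved optimally for current W.
    H = forward_hidden(X, W)
    E = H @ output_ls(H, Y) - Y
    return np.mean(np.abs(E) ** 2)

def grad_step(X, Y, W, lr=0.05, eps=1e-6):
    # One descent step on the nonlinear hidden weights. Numerical gradient
    # (real and imaginary parts perturbed separately) stands in for the
    # analytic split-CBP gradient derived in the paper.
    g = np.zeros_like(W)
    base = mse(X, Y, W)
    for idx in np.ndindex(W.shape):
        for unit in (1.0, 1j):
            Wp = W.copy()
            Wp[idx] += eps * unit
            g[idx] += unit * (mse(X, Y, Wp) - base) / eps
    return W - lr * g

def train_cv_hia(X, Y, max_hidden=8, steps_per_phase=60,
                 stall_tol=1e-5, target=1e-3):
    # Incremental scheme: start with one hidden neuron and grow one at a
    # time whenever the error stagnates (a proxy for a local minimum).
    d = X.shape[1]
    W = 0.3 * (rng.standard_normal((d, 1)) + 1j * rng.standard_normal((d, 1)))
    while True:
        prev = mse(X, Y, W)
        cur = prev
        for _ in range(steps_per_phase):
            W = grad_step(X, Y, W)
            cur = mse(X, Y, W)
            if prev - cur < stall_tol:   # training error has stalled
                break
            prev = cur
        if cur <= target or W.shape[1] >= max_hidden:
            break
        # Add one hidden neuron: append a fresh random weight column.
        new_col = 0.3 * (rng.standard_normal((d, 1))
                         + 1j * rng.standard_normal((d, 1)))
        W = np.hstack([W, new_col])
    return W, output_ls(forward_hidden(X, W), Y)

# Toy usage: fit a small synthetic complex-valued mapping.
X = rng.standard_normal((64, 2)) + 1j * rng.standard_normal((64, 2))
Y = split_tanh(X @ np.array([[0.5 - 0.2j], [0.3 + 0.4j]]))
W, V = train_cv_hia(X, Y)
final = mse(X, Y, W)
```

Because the output weights are linear in the hidden activations, the LS step gives their global optimum for any fixed hidden weights, so gradient descent only has to search the (much smaller) nonlinear parameter space; this is the division of labor the abstract describes.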
