To obtain a fast, accurate, and parsimonious fuzzy neural network, an effective incremental sequential learning algorithm for parsimonious fuzzy neural networks (ISL-FNN) is proposed. A pruning strategy is incorporated into the neuron generation process, and the error reduction ratio is used to quantify the influence of input data on the system output; this influence then drives neuron growth. In the parameter learning phase, all the free parameters of the hidden units, both newly created and previously existing, are updated by the extended Kalman filter method. The performance of ISL-FNN is compared with several existing algorithms on benchmark problems through simulation experiments. Results indicate that ISL-FNN achieves similar or even better accuracy with fewer rules and a more compact structure.
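The abstract states that all free parameters of the hidden units are updated by the extended Kalman filter. As a rough illustration only, the sketch below shows one generic EKF step for a flattened parameter vector; the function name, the scalar noise terms `r` and `q`, and the treatment of the Jacobian are assumptions, not the paper's exact formulation.

```python
import numpy as np

def ekf_update(w, P, jac, err, r=1.0, q=1e-4):
    """One extended-Kalman-filter step for a parameter vector.

    w   : (n,)  current free parameters (e.g. centers, widths, weights flattened)
    P   : (n,n) parameter-error covariance
    jac : (n,)  gradient of the network output w.r.t. w at the current input
    err : float, desired output minus current network output
    r, q: assumed observation-noise and process-noise scalars (hypothetical)
    """
    B = jac.reshape(-1, 1)
    # Kalman gain: K = P B / (r + B^T P B)
    K = P @ B / (r + float(B.T @ P @ B))
    # Correct the parameters in the direction of the gain, scaled by the error
    w_new = w + (K * err).ravel()
    # Covariance update with a small process-noise term to keep P well conditioned
    P_new = P - K @ (B.T @ P) + q * np.eye(len(w))
    return w_new, P_new
```

For a network that is linear in `w` near the current estimate, one such step shrinks the residual on the presented sample, which is why sequential EKF learning can adapt both new and existing neurons with each incoming datum.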