IEEE Transactions on Fuzzy Systems

An Efficient Optimization Method for Improving Generalization Performance of Fuzzy Neural Networks



Abstract

Fuzzy neural networks (FNNs), with suitable structures, have been demonstrated to be an effective tool for approximating nonlinearity between input and output variables. However, it is time-consuming to construct an FNN with an appropriate number of fuzzy rules to ensure its generalization ability. To solve this problem, an efficient optimization technique is introduced in this paper. First, a self-adaptive structural optimal algorithm (SASOA) is developed to minimize the structural risk of an FNN, leading to improved generalization performance. Second, with the proposed SASOA, the fuzzy rules of the SASOA-based FNN (SASOA-FNN) are generated or pruned systematically. This SASOA-FNN is able to organize its structure and adjust its parameters simultaneously during the learning process. Third, the convergence of SASOA-FNN is proved for the cases with fixed and updated structures, and guidelines for selecting the parameters are given. Finally, experimental studies of the proposed SASOA-FNN have been performed on several nonlinear systems to verify its effectiveness. Comparisons with other existing methods demonstrate that the proposed SASOA-FNN achieves better performance.
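The abstract describes an FNN whose fuzzy rules are generated or pruned during learning. As a rough illustration only, and not the paper's SASOA algorithm, the sketch below shows a minimal Takagi-Sugeno-style fuzzy neural network in which rules with low average normalized firing strength are pruned. The pruning threshold here is a simple placeholder for the structural-risk criterion the paper proposes, and the class and parameter names are hypothetical.

```python
import numpy as np

class SimpleFNN:
    """Minimal Takagi-Sugeno-style fuzzy neural network (illustrative only).

    Each rule r has Gaussian membership functions with centers c[r] and
    widths s[r], and a constant consequent w[r]. Pruning removes rules
    whose mean normalized firing strength falls below a threshold; this
    stands in for (but does not reproduce) SASOA's structural-risk test.
    """

    def __init__(self, centers, sigmas, weights):
        self.c = np.asarray(centers, dtype=float)   # (R, d) rule centers
        self.s = np.asarray(sigmas, dtype=float)    # (R, d) rule widths
        self.w = np.asarray(weights, dtype=float)   # (R,)  rule consequents

    def firing(self, X):
        """Normalized Gaussian rule activations, shape (N, R)."""
        X = np.atleast_2d(X)                        # (N, d)
        d2 = ((X[:, None, :] - self.c[None]) / self.s[None]) ** 2
        phi = np.exp(-0.5 * d2.sum(axis=-1))        # raw firing strengths
        return phi / phi.sum(axis=1, keepdims=True)

    def predict(self, X):
        """Weighted average of rule consequents, shape (N,)."""
        return self.firing(X) @ self.w

    def prune(self, X, tol=0.05):
        """Drop rules with mean firing strength below tol; return count removed."""
        keep = self.firing(X).mean(axis=0) >= tol
        self.c, self.s, self.w = self.c[keep], self.s[keep], self.w[keep]
        return int((~keep).sum())
```

For example, given three rules centered at 0, 1, and 10 and inputs confined to [0, 1], the third rule fires negligibly and is pruned, while predictions remain a convex combination of the surviving consequents.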
