Neural Processing Letters

A Smoothing Algorithm with Constant Learning Rate for Training Two Kinds of Fuzzy Neural Networks and Its Convergence

Abstract

In this paper, a smoothing algorithm with a constant learning rate is presented for training two kinds of fuzzy neural networks (FNNs): max-product and max-min FNNs. Weak and strong convergence results for the algorithm are established: the error function decreases monotonically, its gradient tends to zero, and the weight sequence converges to a fixed value during the iteration. Furthermore, conditions on the constant learning rate are specified to guarantee convergence. Finally, three numerical examples are given to illustrate the feasibility and efficiency of the algorithm and to support the theoretical findings.
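
The abstract does not spell out the smoothing function or the network equations, so the sketch below is only an illustration of the general idea, not the authors' construction: a single-output max-min FNN y = max_j min(w_j, x_j) is smoothed with the common surrogate |t| ≈ sqrt(t² + μ²) so that gradient descent with a constant learning rate can be applied. The helpers smooth_max, smooth_min, maxmin_output, and train, and all parameter values, are assumptions for illustration; the gradient is approximated by central differences to keep the sketch short, whereas the paper works with the exact gradient of the smoothed error function.

```python
import numpy as np

# Smooth surrogates for the non-differentiable max/min operators, based on
# |t| ~= sqrt(t^2 + mu^2):
#   max(a, b) ~= (a + b + sqrt((a - b)^2 + mu^2)) / 2
#   min(a, b) ~= (a + b - sqrt((a - b)^2 + mu^2)) / 2
def smooth_max(a, b, mu):
    return 0.5 * (a + b + np.sqrt((a - b) ** 2 + mu ** 2))

def smooth_min(a, b, mu):
    return 0.5 * (a + b - np.sqrt((a - b) ** 2 + mu ** 2))

def maxmin_output(w, x, mu):
    """Smoothed single-output max-min FNN: y = max_j min(w_j, x_j)."""
    vals = smooth_min(w, x, mu)
    out = vals[0]
    for v in vals[1:]:
        out = smooth_max(out, v, mu)
    return out

def train(w, samples, eta=0.05, mu=0.05, epochs=500, h=1e-6):
    """Gradient descent with a constant learning rate eta on the squared error.

    The gradient of the smoothed error is approximated by central differences
    here purely for brevity of the sketch.
    """
    for _ in range(epochs):
        grad = np.zeros_like(w)
        for x, t in samples:
            for j in range(len(w)):
                wp, wm = w.copy(), w.copy()
                wp[j] += h
                wm[j] -= h
                ep = 0.5 * (maxmin_output(wp, x, mu) - t) ** 2
                em = 0.5 * (maxmin_output(wm, x, mu) - t) ** 2
                grad[j] += (ep - em) / (2.0 * h)
        w = np.clip(w - eta * grad, 0.0, 1.0)  # keep weights in [0, 1]
    return w

# Tiny usage example with hypothetical training pairs (x, target).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = [(np.array([0.2, 0.9, 0.5]), 0.7),
               (np.array([0.8, 0.1, 0.4]), 0.4)]
    w = train(rng.uniform(0.0, 1.0, size=3), samples)
    for x, t in samples:
        print(t, round(float(maxmin_output(w, x, mu=0.05)), 3))
```

A max-product FNN can be handled the same way by replacing smooth_min(w, x, mu) with the ordinary product w * x, since only the outer max operator then needs smoothing.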