Advances in Artificial Neural Systems

Stochastic Search Algorithms for Identification, Optimization, and Training of Artificial Neural Networks



Abstract

This paper presents stochastic search algorithms (SSA) suitable for effective identification, optimization, and training of artificial neural networks (ANN). The author introduces a modified nonlinear stochastic search algorithm (MN-SDS). Its main objective is to improve the convergence of the original nonlinear stochastic search (N-SDS) method defined by Professor Rastrigin. Given the vast range of possible algorithms and procedures, a so-called stochastic direct search (SDS) method is applied (referred to in the literature as stochastic local search, SLS). The convergence of MN-SDS is considerably better than that of N-SDS, and it even outperforms a range of gradient-based optimization procedures. SDS, that is, SLS, has not been applied widely enough to the identification, optimization, and training of ANNs. Its efficiency in some cases of purely nonlinear systems makes it suitable for ANN optimization and training. The presented examples only partially illustrate the operation and efficiency of SDS, that is, MN-SDS. The backpropagation error (BPE) method was used for comparison.
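
The abstract does not give the MN-SDS update rules, so the following is only a minimal sketch of a generic stochastic direct (local) search applied to ANN weight training: perturb the weight vector with a random trial step, accept the step only if the loss improves, and slowly shrink the search radius. The network size, toy XOR data, step schedule, and all parameter values are illustrative assumptions, not the paper's method.

import numpy as np

rng = np.random.default_rng(0)

# Toy XOR data (assumed for illustration)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def unpack(w):
    # 2-2-1 network: W1 (2x2), b1 (2), W2 (2x1), b2 (1) packed in one vector
    W1 = w[:4].reshape(2, 2)
    b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1)
    b2 = w[8:9]
    return W1, b1, W2, b2

def loss(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                      # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output
    return np.mean((out - y) ** 2)                # mean squared error

# Stochastic local search: random trial point, accept improving moves only
w = rng.normal(scale=0.5, size=9)
best = loss(w)
step = 0.5
for it in range(20000):
    candidate = w + rng.normal(scale=step, size=w.shape)  # random perturbation
    c_loss = loss(candidate)
    if c_loss < best:
        w, best = candidate, c_loss
    step = max(0.01, step * 0.9999)               # gradually shrink search radius

print(f"final MSE: {best:.4f}")

Unlike backpropagation, this search uses only loss evaluations (no gradients), which is what makes derivative-free stochastic search attractive for strongly nonlinear cases; the paper's MN-SDS refines this basic scheme to improve convergence.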