Journal of Optimization Theory and Applications

Novel algorithms for noisy minimization problems with applications to neural networks training

Abstract

The supervisor and searcher cooperation (SSC) framework, introduced in Refs. 1 and 2, provides an effective way to design efficient optimization algorithms that combine the desirable features of two existing ones. This work aims to develop efficient algorithms for a wide range of noisy optimization problems, including those posed by feedforward neural network training. It introduces two basic SSC algorithms: the first seems suited for generic problems, while the second is motivated by neural network training problems. It also introduces inexact variants of the two algorithms, which seem to possess desirable properties. It establishes general theoretical results about the convergence and speed of SSC algorithms and illustrates their appealing attributes through numerical tests on deterministic, stochastic, and neural network training problems.
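
The abstract does not spell out how the SSC algorithms work (they are defined in Refs. 1 and 2). Purely as an illustration of the general supervisor/searcher idea for noisy minimization, the sketch below lets a "searcher" take cheap stochastic-gradient steps while a "supervisor" monitors smoothed noisy loss estimates and shrinks the step size when progress stalls. All names (noisy_loss, noisy_grad, ssc_minimize) and the specific update rules are assumptions made for illustration, not the paper's algorithms.

    # Illustrative sketch only: not the SSC algorithms of Refs. 1-2, just the
    # general supervisor/searcher flavor on a noisy quadratic test problem.
    import numpy as np

    rng = np.random.default_rng(0)

    def noisy_loss(x, sigma=0.1):
        # Quadratic objective observed with additive noise (hypothetical test problem).
        return 0.5 * np.dot(x, x) + sigma * rng.standard_normal()

    def noisy_grad(x, sigma=0.1):
        # Gradient observed with additive noise.
        return x + sigma * rng.standard_normal(x.shape)

    def ssc_minimize(x0, step=0.5, iters=200, window=10, shrink=0.5):
        x, history = np.array(x0, dtype=float), []
        for _ in range(iters):
            x = x - step * noisy_grad(x)           # searcher: stochastic descent step
            history.append(noisy_loss(x))
            if len(history) >= 2 * window:         # supervisor: compare smoothed loss windows
                recent = np.mean(history[-window:])
                earlier = np.mean(history[-2 * window:-window])
                if recent >= earlier:              # no measurable progress -> damp the step
                    step *= shrink
                    history = history[-window:]
        return x, step

    x_star, final_step = ssc_minimize(np.ones(5))
    print(x_star, final_step)

Averaging the noisy losses over a window before comparing them is just one simple way to make the supervisor's decision robust to observation noise; the paper's actual supervisor and searcher rules and their convergence analysis are given in the full text.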