Home > Foreign journals > Machine Learning > Annealing stochastic approximation Monte Carlo algorithm for neural network training

Annealing stochastic approximation Monte Carlo algorithm for neural network training


Abstract

We propose a general-purpose stochastic optimization algorithm, the so-called annealing stochastic approximation Monte Carlo (ASAMC) algorithm, for neural network training. ASAMC can be regarded as a space-annealing version of the stochastic approximation Monte Carlo (SAMC) algorithm. Under mild conditions, we show that ASAMC converges weakly at a rate of Ω(1/t^{1/2}) toward a neighboring set (in the space of energy) of the global minimizers. ASAMC is compared with simulated annealing, SAMC, and the BFGS algorithm for training multilayer perceptrons (MLPs) on a number of examples. The numerical results indicate that ASAMC outperforms the other algorithms in both training and test errors. Like other stochastic algorithms, ASAMC requires longer training time than gradient-based algorithms do. It provides, however, an efficient approach to training MLPs whose energy landscape is rugged.
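The scheme described in the abstract can be sketched in code. The following is a minimal, illustrative implementation, not the paper's method: it assumes a toy setting (a 2-2-1 tanh MLP trained on XOR, a uniform energy partition, a Gaussian random-walk proposal, and specific values of the temperature tau, bin width, annealing band delta, and gain sequence, none of which come from the article). The sampler maintains SAMC's adaptive log bin weights theta and, in the spirit of space annealing, only accepts proposals whose energy bin lies at most delta bins above the best energy found so far.

```python
import math
import random

random.seed(0)

# XOR dataset: the classic rugged-energy toy problem for a tiny MLP
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def mlp(w, x):
    # 2-2-1 MLP with tanh hidden units; w has 9 entries:
    # w[0..5] = hidden weights/biases, w[6..8] = output weights/bias
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return w[6] * h1 + w[7] * h2 + w[8]

def energy(w):
    # mean squared error over the four XOR patterns
    return sum((mlp(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

# Uniform energy partition: M bins of width DE; the last bin catches the rest
M, DE = 41, 0.025

def bin_of(e):
    return min(int(e / DE), M - 1)

def asamc(iters=30000, tau=0.1, delta=4, step=0.4, t0=1000):
    w = [random.gauss(0, 1) for _ in range(9)]
    e = energy(w)
    theta = [0.0] * M          # adaptive log weights of the energy bins
    best_w, best_e = list(w), e
    for t in range(iters):
        gamma = t0 / max(t0, t)                     # decreasing gain sequence
        kmax = min(bin_of(best_e) + delta, M - 1)   # space annealing: shrink region
        wp = [wi + random.gauss(0, step) for wi in w]
        ep = energy(wp)
        jp = bin_of(ep)
        if jp <= kmax:                              # proposal inside annealed region?
            j = bin_of(e)
            # SAMC acceptance: min(1, exp((e - ep)/tau + theta[j] - theta[jp]))
            log_r = (e - ep) / tau + theta[j] - theta[jp]
            if random.random() < math.exp(min(0.0, log_r)):
                w, e = wp, ep
                if e < best_e:
                    best_w, best_e = list(w), e
        # stochastic-approximation update of the active bin weights
        j = bin_of(e)
        for i in range(kmax + 1):
            theta[i] += gamma * ((1.0 if i == j else 0.0) - 1.0 / (kmax + 1))
    return best_w, best_e

best_w, best_e = asamc()
print(f"best energy found: {best_e:.4f}")
```

A constant predictor of 0.5 already achieves MSE 0.25 on XOR, so the sampler should settle well below that; the random-walk chain needs many more energy evaluations than a gradient method would, which mirrors the abstract's remark about training time.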
