IEEE International Conference on Neural Networks

Global descent replaces gradient descent to avoid local minima problem in learning with artificial neural networks


Abstract

One of the fundamental limitations of artificial neural network learning by gradient descent is its susceptibility to local minima during training. A new approach to learning is presented in which the gradient descent rule in the backpropagation learning algorithm is replaced with a novel global descent formalism. This methodology is based on a global optimization scheme, acronymed TRUST (terminal repeller unconstrained subenergy tunneling), which formulates optimization in terms of the flow of a special deterministic dynamical system. The ability of the new dynamical system to overcome local minima is tested on common benchmark examples and a pattern recognition example. The results demonstrate that the new method does indeed escape encountered local minima and thus finds the global minimum solution for the specific problems tested.
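Since the abstract only sketches the method, the following is a minimal, illustrative one-dimensional sketch of a TRUST-style global descent loop in Python. It alternates ordinary gradient descent with a terminal-repeller "tunneling" phase that drives the state out of the local minimum just found. All names (trust_like_global_descent, lr, rho, bounds), the escape test, and the try-both-directions tunneling heuristic are assumptions made for illustration, not the paper's exact formulation.

```python
import numpy as np

def trust_like_global_descent(f, grad, x0, lr=1e-2, rho=1.0,
                              n_cycles=20, inner_steps=2000,
                              tol=1e-8, bounds=(-10.0, 10.0)):
    """Simplified 1-D sketch of TRUST-style global descent.

    Alternates between (1) plain gradient descent into the nearest
    local minimum and (2) a terminal-repeller "tunneling" phase that
    pushes the state away from that minimum until a point with a
    lower function value is found. Parameter names and stopping rules
    are illustrative assumptions, not the paper's formulation.
    """
    x = float(x0)
    best_x, best_f = x, f(x)
    for _ in range(n_cycles):
        # Phase 1: gradient descent to a local minimum x*.
        for _ in range(inner_steps):
            step = lr * grad(x)
            x -= step
            if abs(step) < tol:
                break
        x_star = x
        if f(x_star) < best_f:
            best_x, best_f = x_star, f(x_star)
        # Phase 2: terminal repeller. The cube-root term is
        # non-Lipschitz at x*, so the flow leaves the repelling
        # equilibrium in finite time ("terminal" repeller). The
        # paper treats tunneling direction more carefully; this
        # sketch simply tries both directions.
        escaped = False
        for direction in (+1.0, -1.0):
            x = x_star + direction * 1e-3  # nudge off the equilibrium
            for _ in range(inner_steps):
                x += lr * rho * np.cbrt(x - x_star)
                if not (bounds[0] <= x <= bounds[1]):
                    break          # left the search domain; give up
                if f(x) < best_f:  # entered a lower basin
                    escaped = True
                    break
            if escaped:
                break
        if not escaped:
            break  # no lower basin found in either direction
    return best_x, best_f

# Usage on a multimodal test function with several local minima.
f = lambda x: 0.1 * x**2 + np.sin(3 * x)
g = lambda x: 0.2 * x + 3 * np.cos(3 * x)
x_min, f_min = trust_like_global_descent(f, g, x0=4.0)
print(f"approximate global minimum: x = {x_min:.3f}, f = {f_min:.3f}")
```

The cube-root repeller is the key design choice in this sketch: because (x - x*)^(1/3) violates the Lipschitz condition at x*, the state escapes the repelling equilibrium in finite rather than asymptotic time, which is what makes the repeller "terminal" and lets each tunneling phase terminate.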
