Global descent replaces gradient descent to avoid local minima problem in learning with artificial neural networks

Abstract

One of the fundamental limitations of artificial neural network learning by gradient descent is its susceptibility to local minima during training. A new approach to learning is presented in which the gradient descent rule in the backpropagation learning algorithm is replaced with a novel global descent formalism. This methodology is based on a global optimization scheme, acronymed TRUST (terminal repeller unconstrained subenergy tunneling), which formulates optimization in terms of the flow of a special deterministic dynamical system. The ability of the new dynamical system to overcome local minima is tested on common benchmark examples and on a pattern recognition example. The results demonstrate that the new method does indeed escape the local minima it encounters, and thus finds the global minimum solution to the specific problems.
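
The abstract does not reproduce the TRUST equations, but the two-phase behavior it describes can be sketched compactly. The following is a minimal, hypothetical one-dimensional illustration, not the authors' exact dynamical system: `x_star` records the lowest minimum found so far, plain gradient descent runs while the current point is below `f(x_star)`, and otherwise a terminal-repeller drift proportional to the cube root of the distance from `x_star` tunnels the state out of the current basin. All function names, constants, and the toy objective are assumptions made for illustration.

```python
import numpy as np

def trust_sketch(f, grad_f, x0, lr=1e-3, rho=3.0, eps=1e-2, iters=200_000):
    """Hypothetical sketch of a TRUST-style global descent flow in 1-D.

    Not the paper's exact formulation: it simply alternates
      - a descent phase (plain gradient descent) while f(x) < f(x_star), and
      - a tunneling phase, in which a terminal-repeller drift
        rho * cbrt(x - x_star) pushes the state away from the lowest
        minimum found so far until a lower basin is entered.
    """
    x_star = x0            # lowest local minimum found so far
    x = x0 + eps           # small kick off the repeller's equilibrium point
    for _ in range(iters):
        if f(x) < f(x_star):                     # descent phase
            x_new = x - lr * grad_f(x)
            if abs(x_new - x) < 1e-9:            # converged to a lower minimum:
                x_star, x = x_new, x_new + eps   # record it, restart tunneling
            else:
                x = x_new
        else:                                    # tunneling phase
            x += lr * rho * np.cbrt(x - x_star)  # repelled away from x_star
    return x_star

# Toy multimodal objective: local minimum near x = -1.42, global minimum
# near x = 1.70.  Started in the left basin, the flow tunnels out of the
# local minimum and the sketch returns the global minimizer.
f = lambda x: 0.2 * x**4 - x**2 - 0.5 * x
grad_f = lambda x: 0.8 * x**3 - 2.0 * x - 0.5
print(trust_sketch(f, grad_f, x0=-2.0))          # prints approximately 1.70
```

The cube-root repeller is what makes the escape "terminal" in the sense used for terminal repellers: its slope is unbounded at x = x_star, violating the Lipschitz condition there, so the flow leaves the equilibrium in finite time rather than only asymptotically.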
