International Joint Conference on Neural Networks

A systematic algorithm to escape from local minima in training feed-forward neural networks



Abstract

A learning process is easily trapped in a local minimum when training multi-layer feed-forward neural networks. An algorithm called Wrong Output Modification (WOM) was proposed to help a learning process escape from local minima, but WOM still cannot fully solve the local minimum problem. Moreover, no performance analysis has shown that learning with this algorithm has a higher probability of converging to a global solution. Additionally, the generalization performance of this algorithm was not investigated when early stopping is applied during training. Motivated by these limitations of WOM, we propose a new algorithm that ensures the learning process can escape from local minima, and we analyze its performance. We also evaluate the generalization performance of the new algorithm when early stopping is applied.
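The abstract does not spell out WOM's update rule, so the following is only a generic sketch of the underlying idea it addresses: plain gradient descent can converge to a poor stationary point, and one simple escape strategy is to perturb the converged solution, re-descend, and keep the best candidate. The toy 1-D loss, learning rate, and jump distances below are illustrative assumptions, not the authors' method.

```python
def loss(w):
    """Toy 1-D loss: local minimum near w ~ 1.35, global minimum near w ~ -1.47."""
    return w**4 - 4 * w**2 + w

def grad(w):
    """Derivative of the toy loss."""
    return 4 * w**3 - 8 * w + 1

def descend(w, lr=0.01, steps=2000):
    """Plain gradient descent; converges to whichever minimum's basin contains w."""
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Starting at w = 2 lands in the shallower, local minimum.
w_local = descend(2.0)

# Escape attempt: perturb the converged solution in both directions,
# re-descend from each perturbed point, and keep the best candidate.
candidates = [w_local] + [descend(w_local + jump) for jump in (-2.5, 2.5)]
w_best = min(candidates, key=loss)
```

With these settings the leftward perturbation crosses into the global basin, so `w_best` attains a strictly lower loss than the unperturbed solution. Real escape schemes such as WOM operate on network outputs and weights rather than a scalar, but the trapped-then-perturbed pattern is the same.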
