Journal: Neural Computation

Improving Generalization Performance of Natural Gradient Learning Using Optimized Regularization by NIC



Abstract

Natural gradient learning is known to be efficient at escaping plateaus, which are a main cause of the slow learning speed of neural networks. An adaptive natural gradient learning method suitable for practical implementation has also been developed, and its advantage on real-world problems has been confirmed. In this letter, we address the generalization performance of the natural gradient method. Since natural gradient learning fits the parameters to the training data quickly, overfitting can easily occur, resulting in poor generalization. To solve this problem, we introduce a regularization term into natural gradient learning and propose an efficient method for optimizing the regularization strength using a generalized Akaike information criterion, the network information criterion (NIC). We discuss the properties of the regularization strength optimized by NIC through theoretical analysis as well as computer simulations, and we confirm the computational efficiency and generalization performance of the proposed method through experiments on benchmark problems.
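The idea can be illustrated with a minimal sketch: natural-gradient descent on a regularized loss, with the regularization strength chosen by an information-criterion-style score of the form "training loss + (1/n) tr(G Q⁻¹)". The toy data, function names, and the simplified penalty below are illustrative assumptions, not the authors' implementation; for a Gaussian regression model the Fisher information is proportional to XᵀX/n, which is what the sketch uses as the preconditioner.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative, not from the paper)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.5 * rng.normal(size=n)

def fit_natural_gradient(X, y, lam, steps=500, lr=0.1):
    """Natural-gradient descent on an L2-regularized squared loss.

    For a Gaussian regression model the Fisher matrix is X^T X / n,
    so each step preconditions the ordinary gradient with its inverse.
    """
    n, d = X.shape
    F = X.T @ X / n + 1e-8 * np.eye(d)          # damped Fisher matrix
    F_inv = np.linalg.inv(F)
    w = np.zeros(d)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n + lam * w  # regularized gradient
        w -= lr * F_inv @ grad                  # natural-gradient step
    return w

def nic_score(X, y, w, lam):
    """NIC-style score: training loss + (1/n) tr(G Q^{-1}), where G is
    the empirical covariance of per-sample gradients and Q the Hessian
    of the regularized loss. A simplified stand-in for the criterion
    discussed in the letter."""
    n, d = X.shape
    resid = X @ w - y
    train_loss = 0.5 * np.mean(resid ** 2)
    Q = X.T @ X / n + lam * np.eye(d)           # Hessian of regularized loss
    Xr = X * resid[:, None]                     # per-sample gradients
    G = Xr.T @ Xr / n
    return train_loss + np.trace(G @ np.linalg.inv(Q)) / n

# Pick the regularization strength minimizing the NIC-style score
lams = [0.0, 1e-3, 1e-2, 1e-1, 1.0]
scores = {lam: nic_score(X, y, fit_natural_gradient(X, y, lam), lam)
          for lam in lams}
best_lam = min(scores, key=scores.get)
```

With zero regularization the iteration converges to the ordinary least-squares solution; the NIC-style penalty then trades training loss against the effective model complexity when ranking the candidate strengths.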


