Adaptive Natural Gradient Method for Learning Neural Networks with Large Data set in Mini-Batch Mode

Abstract

Natural gradient learning, one of the gradient descent learning methods, is known to have ideal convergence properties when training hierarchical machines such as layered neural networks. However, a few limitations degrade its practical usability: it requires the true probability density function of the input variables, and it incurs a heavy computational cost due to matrix inversion. Although an adaptive approximation has been developed, it is essentially derived for the online learning mode, in which a single update is performed for a single data sample. Noting that the online learning mode is not appropriate for tasks with a huge number of training data, this paper proposes a practical implementation of natural gradient for the mini-batch learning mode, which is the most common setting in real applications with large data sets. Computational experiments on benchmark datasets show the efficiency of the proposed methods.
