Conference: Chinese Automation Congress

An Adaptive Natural Gradient Method with Adaptive Step Size in Multilayer Perceptrons



Abstract

Multilayer perceptrons (MLPs) have been widely used in many communication applications; however, their learning process often becomes very slow due to the existence of singularities in the parameter space. Because these singularities significantly affect the learning dynamics of MLPs, the standard gradient descent method is not Fisher efficient. To overcome this problem, the natural gradient method and the adaptive natural gradient method were proposed to accelerate the learning process. As is well known, the step size used in each iteration plays a key role in the performance of such algorithms. In this paper, a modified adaptive natural gradient method is proposed in which the step size is adaptively adjusted at each iteration. The aim of the proposed algorithm is to accelerate convergence and improve the performance of MLPs. The simulation results verify the validity of the analytical results.
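The abstract names the ingredients (natural gradient updates and an adaptive step size) but does not state the update rule. The sketch below is a minimal Python/NumPy illustration of natural-gradient training for a tiny one-hidden-layer MLP with a simple adaptive step size. The empirical-Fisher approximation, the damping term, and the grow/shrink step-size heuristic are generic stand-ins assumed for illustration only, not the method proposed in the paper, and all names (per_sample_grads, eta, damping, ...) are hypothetical.

```python
# Illustrative sketch: natural-gradient training of a one-hidden-layer MLP
# (regression, half squared-error loss) with a simple adaptive step size.
# NOT the paper's algorithm; the Fisher approximation and step-size rule
# are generic textbook heuristics used only to make the idea concrete.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = sin(x) plus noise
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X) + 0.1 * rng.standard_normal(X.shape)

n_hidden = 8
W1 = rng.standard_normal((1, n_hidden)) * 0.5
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, 1)) * 0.5
b2 = np.zeros(1)

def pack(W1, b1, W2, b2):
    return np.concatenate([W1.ravel(), b1, W2.ravel(), b2])

def unpack(theta):
    i = 0
    W1 = theta[i:i + n_hidden].reshape(1, n_hidden); i += n_hidden
    b1 = theta[i:i + n_hidden]; i += n_hidden
    W2 = theta[i:i + n_hidden].reshape(n_hidden, 1); i += n_hidden
    b2 = theta[i:i + 1]
    return W1, b1, W2, b2

def forward(theta, X):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2, h

def per_sample_grads(theta, X, y):
    """Per-sample gradients of the half squared-error loss, stacked as rows."""
    W1, b1, W2, b2 = unpack(theta)
    out, h = forward(theta, X)
    err = out - y                                      # (N, 1)
    dW2 = h[:, :, None] * err[:, None, :]              # (N, H, 1)
    db2 = err                                          # (N, 1)
    dh = err @ W2.T                                    # (N, H)
    dpre = dh * (1 - h ** 2)                           # tanh derivative
    dW1 = X[:, :, None] * dpre[:, None, :]             # (N, 1, H)
    db1 = dpre                                         # (N, H)
    G = np.concatenate([dW1.reshape(len(X), -1), db1,
                        dW2.reshape(len(X), -1), db2], axis=1)
    return G, float(np.mean(err ** 2))

theta = pack(W1, b1, W2, b2)
eta, damping = 0.1, 1e-3
prev_loss = np.inf

for it in range(200):
    G, loss = per_sample_grads(theta, X, y)
    g = G.mean(axis=0)
    # Empirical Fisher approximation with damping for invertibility.
    F = G.T @ G / len(X) + damping * np.eye(len(g))
    nat_grad = np.linalg.solve(F, g)
    # Simple adaptive step size: grow when the loss improves, shrink otherwise.
    eta = eta * 1.05 if loss < prev_loss else eta * 0.5
    prev_loss = loss
    theta = theta - eta * nat_grad
    if it % 50 == 0:
        print(f"iter {it:3d}  loss {loss:.4f}  step {eta:.4f}")
```

The natural gradient preconditions the ordinary gradient with the inverse Fisher information, which is what mitigates the slowdown near singularities described in the abstract; the adaptive step size then controls how far each preconditioned step moves.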
