Neurocomputing

Deterministic convergence of complex mini-batch gradient learning algorithm for fully complex-valued neural networks



Abstract

This paper investigates the fully complex mini-batch gradient algorithm for training complex-valued neural networks. The mini-batch gradient method has been widely used in neural network training; however, its convergence analysis is usually restricted to real-valued neural networks and is probabilistic in nature. By introducing a new Taylor mean value theorem for analytic functions, this paper establishes deterministic convergence results for the fully complex mini-batch gradient algorithm under mild conditions. Deterministic convergence here means that the algorithm converges with certainty rather than merely in probability, and both weak convergence and strong convergence are proved. Benefiting from the newly introduced mean value theorem, the results are global in nature: they are valid for arbitrarily given initial values of the weights. The theoretical findings are validated with a simulation example. (C) 2020 Elsevier B.V. All rights reserved.

