Regularization of differential-algebraic equations and recurrent training of neural networks

Abstract

In this paper, a mathematical framework is provided for the analysis of local and global convergence properties of continuous-time recurrent training schemes for dynamical neural networks. This framework combines concepts from the theory of differential-algebraic equations (DAEs) with notions related to non-critical linearly implicit differential equations. The continuous-time analogue of recurrent backpropagation can be seen as a semi-explicit DAE in which the differential term is linearly implicit. Several regularizations of this DAE lead to singularly perturbed systems for which local convergence results may be proved under mild conditions, while the linearly implicit nature of the resulting equations allows for a study of global convergence.
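
For orientation, the objects named in the abstract can be sketched as follows; this is a minimal illustration, and the symbols A, f, g, x, z and \varepsilon are generic placeholders rather than notation taken from the paper. A semi-explicit DAE whose differential term is linearly implicit has the form

    A(x)\,\dot{x} = f(x, z), \qquad 0 = g(x, z),

and a singular-perturbation regularization replaces the algebraic constraint by a fast differential equation,

    A(x)\,\dot{x} = f(x, z), \qquad \varepsilon\,\dot{z} = g(x, z), \qquad 0 < \varepsilon \ll 1,

whose solutions approach those of the original DAE as \varepsilon \to 0, provided the constraint manifold g(x, z) = 0 is suitably attracting (the non-critical case).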
