IFAC PapersOnLine

Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition



Abstract

We propose a deep neural network architecture for storing approximate Lyapunov functions of systems of ordinary differential equations. Under a small-gain condition on the system, the number of neurons needed for an approximation of a Lyapunov function with fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach is able to overcome the curse of dimensionality.
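The scaling claim can be illustrated with a minimal NumPy sketch. Under a small-gain condition, a Lyapunov function for the interconnected system can be built compositionally from terms that each depend only on a low-dimensional subsystem state; a network of small subnetworks whose outputs are summed then has a parameter count that grows linearly in the number of subsystems, hence polynomially in the total state dimension. The architecture below is a hypothetical illustration of this general idea, not the paper's exact construction:

```python
import numpy as np

def init_subnet(d_sub, width, rng):
    """One small ReLU network approximating the term W_i on a d_sub-dimensional subsystem."""
    return {
        "W1": rng.standard_normal((width, d_sub)) * 0.1,
        "b1": np.zeros(width),
        "W2": rng.standard_normal((1, width)) * 0.1,
    }

def subnet_forward(p, x_sub):
    h = np.maximum(0.0, p["W1"] @ x_sub + p["b1"])  # ReLU hidden layer
    return (p["W2"] @ h).item()

def compositional_lyapunov(params, x, dims):
    """Candidate V(x) = sum_i W_i(x_i), where x_i are the subsystem states."""
    out, start = 0.0, 0
    for p, d in zip(params, dims):
        out += subnet_forward(p, x[start:start + d])
        start += d
    return out

def num_params(params):
    return sum(a.size for p in params for a in p.values())

rng = np.random.default_rng(0)
dims = [2] * 10   # 10 coupled 2-dimensional subsystems, total state dimension 20
width = 32
params = [init_subnet(d, width, rng) for d in dims]

x = rng.standard_normal(sum(dims))
V = compositional_lyapunov(params, x, dims)

# Each subnet has width*(d_sub + 2) parameters, so the total grows
# linearly with the number of subsystems rather than exponentially
# in the full state dimension.
print(num_params(params))
```

Doubling the number of subsystems here doubles the parameter count, whereas a generic approximation over the full 20-dimensional state would typically require a budget growing exponentially in the dimension for a fixed accuracy.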


