IEEE Transactions on Signal Processing

Learning Topology and Dynamics of Large Recurrent Neural Networks


Abstract

Large-scale recurrent networks have drawn increasing attention recently because of their capability to model a large variety of real-world phenomena and physical mechanisms. This paper studies how to identify all authentic connections and estimate the system parameters of a recurrent network, given a sequence of node observations. This task becomes extremely challenging in modern network applications, because the available observations are usually very noisy and limited, and the associated dynamical system is strongly nonlinear. By formulating the problem as multivariate sparse sigmoidal regression, we develop simple-to-implement network learning algorithms, with rigorous theoretical convergence guarantees, for a variety of sparsity-promoting penalty forms. A quantile variant of progressive recurrent network screening is proposed for efficient computation; it allows for direct cardinality control of the network topology during estimation. Moreover, we investigate recurrent network stability conditions in Lyapunov's sense, and integrate such stability constraints into sparse network learning. Experiments show excellent performance of the proposed algorithms in network topology identification and forecasting.
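To make the quantile-screening idea concrete, here is a minimal sketch, not the paper's exact algorithm: we assume simplified discrete-time sigmoidal dynamics x[t+1] = sigmoid(W x[t]) + noise, fit W by gradient descent on the squared prediction error, and after each step apply a quantile hard-threshold that keeps only the q largest-magnitude coefficients per row, giving the direct cardinality control the abstract describes. The dynamics model, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def quantile_threshold(W, q):
    """Keep the q largest-|entry| coefficients in each row, zero the rest.

    This is the 'quantile' screening step: it enforces exactly q incoming
    connections per node rather than tuning a penalty weight.
    """
    W = W.copy()
    for i in range(W.shape[0]):
        cutoff = np.sort(np.abs(W[i]))[-q]
        W[i, np.abs(W[i]) < cutoff] = 0.0
    return W

def learn_topology(X, q, lr=0.5, iters=2000):
    """Sparse sigmoidal regression sketch.

    X : (T, n) array of node observations over time.
    q : number of connections retained per node (cardinality budget).
    """
    T, n = X.shape
    W = np.zeros((n, n))
    for _ in range(iters):
        Z = X[:-1] @ W.T                # pre-activations at each step
        pred = sigmoid(Z)               # predicted next states
        err = pred - X[1:]              # one-step prediction residuals
        # Gradient of 0.5 * mean squared error through the sigmoid:
        grad = (err * pred * (1.0 - pred)).T @ X[:-1] / (T - 1)
        W -= lr * grad
        W = quantile_threshold(W, q)    # progressive screening step
    return W

# Simulate a small network with q true incoming edges per node.
rng = np.random.default_rng(0)
n, q = 8, 2
W_true = np.zeros((n, n))
for i in range(n):
    W_true[i, rng.choice(n, q, replace=False)] = rng.normal(0.0, 2.0, q)

X = np.zeros((600, n))
X[0] = rng.uniform(size=n)
for t in range(599):
    X[t + 1] = sigmoid(W_true @ X[t]) + 0.05 * rng.normal(size=n)

W_hat = learn_topology(X, q)
# Overlap between estimated and true supports (not guaranteed to be perfect
# in this toy setting; the paper's algorithms come with formal guarantees).
hits = int(((W_hat != 0) & (W_true != 0)).sum())
```

Because the thresholding is re-applied every iteration, previously zeroed entries can re-enter the support if their gradients grow, which is what makes the screening "progressive" rather than a one-shot pruning.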


