IEEE Transactions on Industrial Electronics

Computing Gradient Vector and Jacobian Matrix in Arbitrarily Connected Neural Networks


Abstract

This paper describes a new algorithm with neuron-by-neuron computation methods for the gradient vector and the Jacobian matrix. The algorithm can handle networks with arbitrarily connected neurons. Its training speed is comparable with that of the Levenberg–Marquardt algorithm, which many currently consider the fastest algorithm for neural network training. More importantly, it is shown that computing the Jacobian, which is required by second-order algorithms, has a computational complexity similar to that of computing the gradient for first-order learning methods. The new algorithm is implemented in the newly developed software, Neural Network Trainer, which has the unique capability of handling arbitrarily connected networks. Such networks, with connections across layers, can be more efficient than commonly used multilayer perceptron networks.
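
The abstract's key claim, that the Jacobian needed by second-order methods costs about as much to obtain as the gradient needed by first-order methods, rests on the standard sum-of-squares relations g = Jᵀe and Δw = −(JᵀJ + μI)⁻¹Jᵀe: once J is available, the gradient is a single matrix–vector product. The NumPy sketch below illustrates only these two textbook relations, not the paper's neuron-by-neuron procedure; the function names and toy data are illustrative assumptions.

import numpy as np

# Sketch of the standard sum-of-squares relations only; NOT the paper's
# neuron-by-neuron algorithm. e is the error vector (outputs - targets),
# J the Jacobian of the errors with respect to the weights, mu the
# Levenberg-Marquardt damping factor.

def gradient_from_jacobian(J, e):
    # Gradient of E = 0.5 * sum(e**2): g = J^T e (first-order ingredient).
    return J.T @ e

def levenberg_marquardt_step(J, e, mu):
    # LM weight update: dw = -(J^T J + mu*I)^{-1} J^T e.
    n = J.shape[1]
    return -np.linalg.solve(J.T @ J + mu * np.eye(n), J.T @ e)

# Toy example: 5 error terms (patterns x outputs), 3 weights.
rng = np.random.default_rng(0)
J = rng.standard_normal((5, 3))   # rows: error terms, cols: weights
e = rng.standard_normal(5)        # error vector at the current weights

g = gradient_from_jacobian(J, e)           # what first-order methods need
dw = levenberg_marquardt_step(J, e, 1e-2)  # what second-order LM needs

Note how both quantities are derived from the same J, which is why a Jacobian routine of near-gradient cost makes second-order training practical.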

