Conference: Pacific Rim Knowledge Acquisition Workshop

Accelerating the Backpropagation Algorithm by Using NMF-Based Method on Deep Neural Networks



Abstract

Backpropagation (BP) is the most widely used algorithm for training deep neural networks (DNNs) and is considered the de facto standard. However, the BP algorithm often requires substantial computation time, which remains a major challenge. Several methods have been proposed to reduce its time complexity, but few of them apply directly to the BP algorithm. Meanwhile, a new DNN training algorithm based on nonnegative matrix factorization (NMF) has been proposed, and its convergence characteristics differ from those of the BP algorithm. We found that the NMF-based method can yield rapid performance improvement early in DNN training, and we developed a technique that exploits this to shorten the training time of the BP algorithm. In this paper, we propose a novel training method that accelerates the BP algorithm by using an NMF-based algorithm. Furthermore, we present a technique that boosts the efficiency of the proposed method by concurrently training DNNs with the BP and NMF-based algorithms. Experimental results indicate that our method significantly reduces the training time of the BP algorithm.


