International Joint Conference on Neural Networks

Using recurrent networks for non-temporal classification tasks

Abstract

In recent years, deep neural networks have led to considerable advances in the performance of neural network architectures. However, deep architectures tend to have a large number of parameters, leading to long training times and the need for huge amounts of training data and regularization. In addition, biological neural networks make extensive use of recurrent and feedback connections, which are absent from most commonly used deep architectures. In this paper, we investigate the use of recurrent neural networks as an alternative to deep architectures. The approach replaces depth with recurrent computation through time; it can also be seen as a deep architecture with parameter tying. We show that, for a comparable number of parameters or complexity, replacing depth with recurrence can result in improved performance.
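The core idea admits a compact illustration. The sketch below is a minimal NumPy example, not the authors' implementation: the layer sizes and the number of unrolling steps T are hypothetical choices. A single recurrent layer is unrolled for T steps over one static (non-temporal) input, reusing the same weights at every step, which is exactly a T-layer feedforward network with its parameters tied across layers.

# Minimal sketch (assumed sizes, not the paper's architecture) of trading
# depth for recurrence on a non-temporal input: the same weights are reused
# for T steps while the static input is re-injected at every step.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

n_in, n_hidden, n_classes, T = 64, 128, 10, 4   # hypothetical dimensions

# Deep feedforward baseline: T untied hidden layers.
W_deep = [rng.standard_normal((n_hidden, n_in if t == 0 else n_hidden)) * 0.05
          for t in range(T)]

# Recurrent alternative: one input projection and one recurrent matrix,
# unrolled for T steps over the *same* static input.
W_in  = rng.standard_normal((n_hidden, n_in)) * 0.05
W_rec = rng.standard_normal((n_hidden, n_hidden)) * 0.05
W_out = rng.standard_normal((n_classes, n_hidden)) * 0.05

def recurrent_forward(x):
    h = np.zeros(n_hidden)
    for _ in range(T):                     # depth replaced by iterations in time
        h = relu(W_in @ x + W_rec @ h)     # identical parameters at every step
    return W_out @ h                       # class scores read out after T steps

x = rng.standard_normal(n_in)
print("deep (untied) hidden parameters:", sum(w.size for w in W_deep))
print("recurrent (tied) parameters:    ", W_in.size + W_rec.size)
print("logits:", recurrent_forward(x))

Because the recurrent weights are shared across steps, the parameter count grows with the layer width rather than with the effective depth, which is the trade-off the abstract describes when comparing architectures at comparable numbers of parameters.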