Journal: IEEE Transactions on Neural Networks and Learning Systems

Recurrent Neural Networks With External Addressable Long-Term and Working Memory for Learning Long-Term Dependences



Abstract

Learning long-term dependences (LTDs) with recurrent neural networks (RNNs) is challenging due to their limited internal memories. In this paper, we propose a new external memory architecture for RNNs called an external addressable long-term and working memory (EALWM)-augmented RNN. This architecture has two distinct advantages over existing neural external memory architectures: the external memory is divided into two parts, a long-term memory and a working memory, both of which are addressable; and the network can learn LTDs without suffering from vanishing gradients, under necessary assumptions. The experimental results on algorithm learning, language modeling, and question answering demonstrate that the proposed neural memory architecture is promising for practical applications.
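The abstract's core idea — an RNN controller reading from and writing to an external memory split into an addressable long-term bank and an addressable working bank — can be illustrated with a toy sketch. The class and slot/width shapes below are hypothetical, and the content-based (cosine-similarity softmax) addressing and additive write rule are generic memory-network conventions, not the paper's actual EALWM update equations.

```python
import numpy as np

def content_address(memory, key, beta=5.0):
    """Content-based addressing: softmax over the cosine similarity
    of the query key to each memory slot, sharpened by beta."""
    eps = 1e-8
    sims = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + eps)
    w = np.exp(beta * (sims - sims.max()))  # subtract max for numerical stability
    return w / w.sum()

class TwoBankMemory:
    """Toy external memory with two separately addressable banks,
    loosely mirroring the long-term / working split described above."""
    def __init__(self, slots=8, width=4, seed=0):
        rng = np.random.default_rng(seed)
        self.long_term = rng.standard_normal((slots, width))  # slowly changing store
        self.working = np.zeros((slots, width))               # rewritten during a task

    def read(self, key):
        # Read a weighted average from each bank, then concatenate
        # both read vectors for the RNN controller to consume.
        r_lt = content_address(self.long_term, key) @ self.long_term
        r_wm = content_address(self.working, key) @ self.working
        return np.concatenate([r_lt, r_wm])

    def write_working(self, key, value):
        # Additive write to the working bank at content-addressed slots
        # (a simplification; the paper's actual write rule is not reproduced here).
        w = content_address(self.working, key)
        self.working += np.outer(w, value)
```

A controller would emit the `key` and `value` vectors at each time step; keeping the long-term bank read-mostly while the working bank is freely rewritten is one plausible way to realize the division the abstract describes.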


