Continuous Online Sequence Learning with an Unsupervised Neural Network Model

Abstract

The ability to recognize and predict temporal sequences of sensory inputs is vital for survival in natural environments. Based on many known properties of cortical neurons, hierarchical temporal memory (HTM) sequence memory recently has been proposed as a theoretical framework for sequence learning in the cortex. In this letter, we analyze properties of HTM sequence memory and apply it to sequence learning and prediction problems with streaming data. We show the model is able to continuously learn a large number of variable order temporal sequences using an unsupervised Hebbian-like learning rule. The sparse temporal codes formed by the model can robustly handle branching temporal sequences by maintaining multiple predictions until there is sufficient disambiguating evidence. We compare the HTM sequence memory with other sequence learning algorithms, including statistical methods—autoregressive integrated moving average; feedforward neural networks—time delay neural network and online sequential extreme learning machine; and recurrent neural networks—long short-term memory and echo-state networks on sequence prediction problems with both artificial and real-world data. The HTM model achieves comparable accuracy to other state-of-the-art algorithms. The model also exhibits properties that are critical for sequence learning, including continuous online learning, the ability to handle multiple predictions and branching sequences with high-order statistics, robustness to sensor noise and fault tolerance, and good performance without task-specific hyperparameter tuning. Therefore, the HTM sequence memory not only advances our understanding of how the brain may solve the sequence learning problem but is also applicable to real-world sequence learning problems from continuous data streams.
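The abstract emphasizes two properties that are easy to gloss over: learning happens continuously from a stream, with no separate training phase, and the model keeps several candidate predictions alive at branch points until later inputs disambiguate them. The sketch below is only a hypothetical plain-Python illustration of those two properties with a toy variable-order predictor; it is not the HTM algorithm described in the paper, whose reference implementation is part of Numenta's NuPIC codebase.

```python
from collections import defaultdict


class StreamingPredictor:
    """Toy online sequence predictor (illustration only, not HTM).

    Learns variable-order transitions from a symbol stream one element at
    a time and remembers every continuation seen for a context, so a
    branching pair of sequences such as A-B-C-D and X-B-C-Y yields several
    predictions until a longer context disambiguates them.
    """

    def __init__(self, order=3):
        self.order = order             # longest context length considered
        self.table = defaultdict(set)  # context tuple -> set of next symbols
        self.history = []              # bounded window of recent symbols

    def step(self, symbol):
        """Consume one symbol, learn from it online, and predict what comes next."""
        # Hebbian-flavored update: associate every suffix of the current
        # history with the symbol that just arrived after it.
        for k in range(1, len(self.history) + 1):
            context = tuple(self.history[-k:])
            self.table[context].add(symbol)

        # Slide the bounded history window forward.
        self.history.append(symbol)
        self.history = self.history[-self.order:]

        # Predict using the longest known context, falling back to shorter
        # ones; at a branch point this naturally returns several candidates.
        for k in range(len(self.history), 0, -1):
            context = tuple(self.history[-k:])
            if context in self.table:
                return self.table[context]
        return set()


if __name__ == "__main__":
    predictor = StreamingPredictor(order=3)
    for s in "ABCD" + "XBCY" + "ABC":
        print(s, "->", sorted(predictor.step(s)))
    # After both A-B-C-D and X-B-C-Y have been seen, the short context
    # (B, C) predicts {D, Y}, while the longer context (A, B, C) narrows
    # the prediction back to {D}.
```

In the paper, the analogous mechanism is the sparse temporal code: overlapping sets of active cells encode the same input in different temporal contexts, which is what lets the model hold multiple simultaneous predictions without supervision.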

Bibliographic Information

  • Source
    Neural Computation | 2016, Issue 11 | pp. 2474-2504 | 31 pages
  • Author Affiliations

    Numenta Inc., Redwood City, CA 94063, U.S.A.; ycui@numenta.com

    Numenta Inc., Redwood City, CA 94063, U.S.A.; sahmad@numenta.com

    Numenta Inc., Redwood City, CA 94063, U.S.A.; jhawkins@numenta.com

  • Indexed in: Science Citation Index (SCI); Chemical Abstracts (CA)
  • Format: PDF
  • Language: English (eng)
  • CLC Classification:
  • Keywords:
