Neurocomputing

A Bayesian model for canonical circuits in the neocortex for parallelized and incremental learning of symbol representations


Abstract

We present a Bayesian model for parallelized canonical circuits in the neocortex, which can partition a cognitive context into orthogonal symbol representations. The model is capable of learning from infinite sensory streams, updating itself with every new instance without having to keep any instances older than the last seen instance per symbol. The inherently incremental and parallel qualities of the model allow it to scale to any number of symbols as they appear in the sensory stream, and to transparently follow non-stationary distributions for existing symbols. These qualities are made possible in part by a novel Bayesian inference method, which can run Metropolis-Hastings incrementally on a data stream, and significantly outperforms particle filters in a Bayesian neural network application.
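
The abstract does not give algorithmic details of the incremental inference method. As a rough illustration only (not the authors' algorithm), the following hypothetical Python sketch shows one way Metropolis-Hastings can be run incrementally on a data stream: the stream is folded into sufficient statistics, so no raw instances need to be retained, and a few MH steps are taken after each new observation to track the updated posterior. Model choice (Gaussian data with unknown mean, known variance, Normal prior), class name, and parameters are assumptions made for the example.

```python
# Hypothetical sketch of incremental Metropolis-Hastings on a data stream.
# Illustrative only; NOT the method described in the paper.
import math
import random

class IncrementalMH:
    def __init__(self, prior_mean=0.0, prior_var=10.0, noise_var=1.0,
                 proposal_scale=0.5):
        self.prior_mean = prior_mean
        self.prior_var = prior_var
        self.noise_var = noise_var
        self.proposal_scale = proposal_scale
        self.n = 0            # number of instances seen so far
        self.sum_x = 0.0      # running sum of instances (sufficient statistic)
        self.mu = prior_mean  # current sample of the unknown mean

    def _log_post(self, mu):
        # log prior + log likelihood via sufficient statistics:
        # sum_i (x_i - mu)^2 = sum_i x_i^2 - 2*mu*sum_x + n*mu^2; the x_i^2
        # term is constant in mu, so it cancels in the MH ratio and is dropped.
        log_prior = -0.5 * (mu - self.prior_mean) ** 2 / self.prior_var
        log_lik = (mu * self.sum_x - 0.5 * self.n * mu ** 2) / self.noise_var
        return log_prior + log_lik

    def observe(self, x, steps=20):
        # Fold the new instance into the sufficient statistics, then take a
        # few MH steps to move the sample toward the updated posterior.
        self.n += 1
        self.sum_x += x
        for _ in range(steps):
            proposal = self.mu + random.gauss(0.0, self.proposal_scale)
            log_accept = self._log_post(proposal) - self._log_post(self.mu)
            if math.log(random.random()) < log_accept:
                self.mu = proposal
        return self.mu

# Usage: stream noisy observations of a true mean of 3.0, one at a time.
sampler = IncrementalMH()
for _ in range(500):
    estimate = sampler.observe(random.gauss(3.0, 1.0))
print("posterior sample for mu:", estimate)
```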
