
Complex Neural Computation with Simple Digital Neurons.



Abstract

The desire to understand, simulate, and capture the computational capability of the brain is not a new one. In recent years, however, many advances have been made toward building better models of neurons and cortical networks, and a number of high-profile projects have proposed, designed, and fabricated neuromorphic substrates inspired by the structure, organization, and behavior of biological brains.

This dissertation explores both the software and hardware elements of these neuromorphic systems. On the software side, it begins with an exploration of the leaky integrate-and-fire (LIF) spiking neuron and demonstrates that a network composed of simple LIF neurons can perform simple object recognition and motion detection tasks. It then identifies a number of complex neuronal behaviors that significantly extend the computational power of the LIF neuron, and proposes that an extended LIF neuron model can be used to construct a large-scale functional model of the visual cortex with metastable attractor dynamics. This hierarchical metastable attractor supports invariant object recognition, image reconstruction, and working memory tasks, and demonstrates functional integration across multiple modeled regions.

On the hardware side, this dissertation investigates the challenges associated with neuromorphic hardware in the context of IBM's Neurosynaptic Core. This neuromorphic substrate, composed of simple digital neurons, highlights the neuromorphic semantic gap that exists between software models such as the ones described in this dissertation and the hardware on which they will be deployed. The dissertation demonstrates how this semantic gap can be effectively bridged and proposes a number of automated techniques for deploying large-scale cortical models on the Neurosynaptic Core hardware.
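For context, the leaky integrate-and-fire neuron referenced above is typically simulated as a discrete-time update: the membrane potential leaks toward a resting value, integrates weighted input, and emits a spike (followed by a reset) when it crosses a threshold. The sketch below illustrates this standard formulation in Python; it is not drawn from the dissertation, and all parameter values and names (tau, v_thresh, v_reset, the synthetic input) are illustrative assumptions.

import numpy as np

def lif_step(v, i_in, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    # Leak the membrane potential toward rest, then integrate the input current.
    v = v + (dt / tau) * (v_rest - v) + i_in
    # Emit a spike wherever the threshold is crossed, then reset those neurons.
    spikes = v >= v_thresh
    v = np.where(spikes, v_reset, v)
    return v, spikes

# Illustrative usage: a small layer of LIF neurons driven by weighted input spikes.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.2, size=(10, 50))    # 10 neurons, 50 input lines
v = np.zeros(10)
for t in range(100):
    in_spikes = rng.random(50) < 0.05            # sparse, Poisson-like input activity
    v, out_spikes = lif_step(v, weights @ in_spikes.astype(float))

The complex neuronal behaviors and the hardware mapping discussed in the abstract extend this basic integrate-fire-reset cycle.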

Bibliographic record

  • Author

    Nere, Andrew Thomas.

  • Affiliation

    The University of Wisconsin - Madison.

  • Degree grantor: The University of Wisconsin - Madison.
  • Subjects: Computer engineering; Neurobiology.
  • Degree: Ph.D.
  • Year: 2013
  • Pagination: 182 p.
  • Total pages: 182
  • Format: PDF
  • Language: eng
  • Chinese Library Classification:
  • Keywords:

