Published in: International conference on artificial neural networks

Stochasticity, Spike Timing, and a Layered Architecture for Finding Iterative Roots



Abstract

The human brain and human intelligence have come a long way, having evolved the ability to do many things, including solving difficult mathematical problems and understanding complex operations and computations. How these abilities are implemented within the brain is poorly understood, but the question is typically tackled by asking what types of computation networks of neurons are capable of. Previous studies have shown how both artificial neural networks (based upon multilayer perceptrons, MLPs) and networks of (biologically inspired) spiking neurons can solve single tasks efficiently, such as character recognition or nonlinear function approximation [1, 5, 6]. There are, however, many nontrivial yet important industrial and physical problems that rely upon computations based on iteration and composition. Demonstrations of neural networks solving multiple tasks simultaneously are rare, but the few available examples have shown that they are able to solve functional equations. This suggests that the computational power of neural populations has been underestimated and that their true capabilities are far greater than previously thought. Significantly, previous studies have shown that the solution to a particular class of functional equations, called the functional iterative root or half-iterate, is attainable using MLPs and is continuous in nature [3, 4]. Methods employing networks of spiking neurons have, until now, shown only that piecewise continuous solutions are obtainable [2]. Here, we demonstrate that, by taking advantage of the stochastic or probabilistic nature of spike generation and of population coding, spiking neural networks can learn to find solutions to the iterative root that are continuous in nature.
Specifically, we show how plasticity, the stochastic nature of neuronal spike generation, and population coding allow spiking neural networks to find solutions to functional equations, such as the iterative root of monotonically increasing functions, in a continuous manner. Significantly, our work expands the foundations of neural-based computation by demonstrating a nontrivial underlying computational principle: robustness through uncertainty.
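For readers unfamiliar with the functional equation at issue: an iterative root (or half-iterate) of g is a function f satisfying f(f(x)) = g(x). As a minimal illustration (this example is not from the paper), the monotonically increasing function g(x) = x^2 on x > 0 admits the closed-form iterative root f(x) = x^sqrt(2), since (x^sqrt(2))^sqrt(2) = x^2:

```python
import math

# g is the target function; f is a candidate iterative root of g,
# i.e. composing f with itself should reproduce g on x > 0.
def g(x):
    return x ** 2

def f(x):
    # Closed-form half-iterate of x**2 for positive x.
    return x ** math.sqrt(2)

# Numerically verify f(f(x)) == g(x) at a few sample points.
for x in [0.5, 1.0, 2.0, 3.0]:
    assert abs(f(f(x)) - g(x)) < 1e-9
```

Most functions have no such closed-form half-iterate, which is why learned, network-based approximations of iterative roots are of interest.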
