
A Lyapunov method for correlational learning in two layer neural networks.


Abstract

A class of two-layer networks is defined. There are feedforward connections between the input and output layers, and lateral connections within the output layer. A single input to the network consists of a fixed pattern of activation in the input layer. The lateral weights are fixed, symmetric, and constrained so that for any given input pattern x and feedforward weight matrix W, the activation dynamics within the output layer has a globally attracting equilibrium $z = F(x, W)$. Input patterns are chosen ergodically from a fixed, finite set, and each feedforward weight is changed, or learned, according to the average correlation of activity at either end of the connection.

The main result of the dissertation is a Lyapunov function for the averaged learning equations of this class of neural networks. In addition, two saturation results are proved concerning the existence of solutions in which outputs are near their limiting values.

These results are applied to networks with two different patterns of lateral connectivity. In the first application, uniform lateral inhibition is used to implement "soft competition" within a layer of category-detecting nodes; algebraic conditions on the resulting categories are derived. In the second network, both the input layer and the output layer consist of a one-dimensional "ring" of nodes, and the lateral connectivity is "center-surround". A topographic solution for this network is a locally stable configuration of the feedforward weight matrix in which the input/output function F commutes with translation. A geometrical representation of existence conditions for such solutions is presented.

Assuming a high degree of symmetry and synchrony, the method is extended to include the amplitude equations for networks with oscillatory dynamics. Finally, it is shown how the structure of the Lyapunov function suggests a general approach to a broader class of such two-layer networks.
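The model described above can be sketched in code: relax the output layer under fixed, symmetric lateral weights to the equilibrium $z = F(x, W)$, then update the feedforward weights by the correlation of activity at the two ends of each connection, averaged over the input set. This is a minimal illustrative sketch, not the dissertation's exact equations; the additive-dynamics form, the tanh nonlinearity, and all names and parameters (`relax`, `averaged_hebbian_step`, `dt`, `eta`) are assumptions.

```python
import numpy as np

def relax(x, W, L, steps=500, dt=0.05):
    """Run the output-layer activation dynamics to (approximate) equilibrium.

    x : fixed input pattern
    W : feedforward weight matrix (outputs x inputs)
    L : fixed symmetric lateral weight matrix within the output layer
    Returns z approximating F(x, W). The specific additive dynamics with a
    tanh saturation is an assumption; the symmetry of L is what makes a
    globally attracting equilibrium plausible.
    """
    z = np.zeros(W.shape[0])
    for _ in range(steps):
        z += dt * (-z + np.tanh(W @ x + L @ z))
    return z

def averaged_hebbian_step(W, L, patterns, eta=0.01):
    """One correlational learning step averaged over the input ensemble:
    each feedforward weight moves with the mean product of the activity
    at its input and output ends."""
    dW = np.zeros_like(W)
    for x in patterns:
        z = relax(x, W, L)
        dW += np.outer(z, x)  # correlation of activity at either end
    return W + eta * dW / len(patterns)
```

With uniform lateral inhibition (a constant negative off-diagonal `L`, as in the "soft competition" application), iterating `averaged_hebbian_step` drives the output nodes to compete over the input patterns.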

Bibliographic details

  • Author: Troyer, Todd William.
  • Affiliation: University of California, Berkeley.
  • Degree-granting institution: University of California, Berkeley.
  • Subjects: Biology, Neuroscience; Mathematics; Artificial Intelligence.
  • Degree: Ph.D.
  • Year: 1993
  • Pages: 106 p.
  • Format: PDF
  • Language: English
  • Classification: Neuroscience; Mathematics; Artificial intelligence theory

