
Role of activation function on hidden units for sample recording in three-layer neural networks



Abstract

It is shown that k hidden units with an asymptotic activation function can map any given k+1 different inputs to linearly independent GHUVs (generated hidden unit vectors) by properly setting the weights and thresholds. For a polynomial activation function, the number of hidden units with the LIT (linearly independent transformation) capability is limited by the order of the polynomial. For analytic asymptotic activation functions and given distinct inputs, LIT is a generic capability and holds with probability 1 when weights and thresholds are set randomly. Conversely, if a weight and threshold setting has the LIT capability for some k+1 inputs, LIT is a generic and probability-1 property for any random input. Three-layer nets with k hidden units, an asymptotic activation function in the hidden layer, and no activation function in the output layer are therefore sufficient to record k+1 arbitrary real samples. Recording k+2 random real samples has probability 0 when the activation is a unit step function, and the same holds for the sigmoid function in the case of associative memory. These conclusions lead to a scheme for understanding associative memory in three-layer networks.
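To make the recording claim concrete, the following is a minimal numerical sketch (not the paper's construction), assuming a sigmoid as the asymptotic activation, a scalar input, and randomly set weights and thresholds; augmenting the hidden outputs with a constant bias component is one way to obtain k+1 linearly independent vectors and is an assumption about what the GHUVs contain. With probability 1 the augmented hidden matrix is nonsingular, so a linear output layer records k+1 arbitrary real samples exactly.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's construction):
# k sigmoid hidden units with random weights/thresholds map k+1
# distinct inputs to hidden unit vectors that, augmented with a
# constant bias component, are linearly independent with
# probability 1, so a linear output layer records k+1 arbitrary
# real samples exactly.

rng = np.random.default_rng(0)

k = 5                                   # number of hidden units
x = np.linspace(-1.0, 1.0, k + 1)       # k+1 distinct scalar inputs
y = rng.normal(size=k + 1)              # k+1 arbitrary real targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W = rng.normal(size=k)                  # random hidden weights
b = rng.normal(size=k)                  # random hidden thresholds

H = sigmoid(np.outer(x, W) + b)         # (k+1) x k hidden outputs
G = np.hstack([H, np.ones((k + 1, 1))]) # augment with bias column

# Generic (probability-1) linear independence of the k+1 vectors.
assert np.linalg.matrix_rank(G) == k + 1

v = np.linalg.solve(G, y)               # output weights and threshold
print(np.allclose(G @ v, y))            # True: all samples recorded
```

With a unit step activation, the augmented hidden vectors can take only finitely many values, so for k+2 fixed inputs the achievable outputs form a finite union of at most (k+1)-dimensional subspaces of R^(k+2); this measure-zero set is one way to see why recording k+2 random samples then has probability 0.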
