Science and Information Conference

An on-line learning algorithm using the decomposition and coordination of a neural network



Abstract

A neural network with a feed-forward structure consisting of one input, one hidden and one output layer can be presented as a hierarchical two-level structure with two independent sub-networks, one on the first level and one on the second level. This process is known as the decomposition of an Artificial Neural Network (ANN) into two sub-networks. Two target functions are defined: the output target function Ψ, which defines an error function for the entire network, and the local target function Φ, which defines the error for the adjustment of the first- and second-level sub-networks. For the coordination level, two independent functions are defined: G(V) for the feed-forward direction and H(V) for the backward direction. The coordinator ensures that the learning algorithms of the first and second levels are concatenated. Solving the local tasks leads to the minimum of the global target function Ψ (the global task). The article defines the conditions that must be fulfilled, for both the first- and the second-level tasks, for the algorithm to converge and reach the minimum of the global target function (the output function). A three-argument function allows the general learning characteristics of both the first and the second level to be studied. The final results are discussed, and the positive and negative properties of the two-stage learning algorithm are presented. The matrix weight coefficients are modified after each presentation of the learning vectors X (input) and Z (output).
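The abstract only outlines the decomposition and coordination scheme and gives none of the update formulas, so the following is a minimal sketch of how one such two-level on-line learning step might look. It assumes that the coordination variable V is the hidden-layer signal, that G(V) simply passes V forward to the second-level sub-network, that H(V) returns a back-propagated error signal serving as the first level's local target, and that both local adjustments are plain gradient steps on sigmoid units; the names (online_step, coordinate_forward, coordinate_backward, eta) and the learning rate are illustrative, not taken from the paper.

```python
import numpy as np

# Sketch only (not the authors' exact algorithm): a one-hidden-layer network
# decomposed into two sub-networks that exchange information solely through
# the coordination variable V (assumed here to be the hidden-layer signal).
rng = np.random.default_rng(0)

n_in, n_hid, n_out = 4, 6, 2
W1 = rng.normal(scale=0.5, size=(n_hid, n_in))   # first-level sub-network weights
W2 = rng.normal(scale=0.5, size=(n_out, n_hid))  # second-level sub-network weights
eta = 0.1                                        # learning rate (assumed)

sigmoid = lambda s: 1.0 / (1.0 + np.exp(-s))

def coordinate_forward(V):
    # G(V): pass the first-level output up to the second level (identity here).
    return V

def coordinate_backward(V, delta2):
    # H(V): return the second-level error to the first level as its local
    # target signal (here a back-propagated error term, an assumption).
    return (W2.T @ delta2) * V * (1.0 - V)

def online_step(X, Z):
    """One presentation of a learning pair: X (input) and Z (desired output)."""
    global W1, W2
    V = sigmoid(W1 @ X)                         # first-level sub-network output
    Y = sigmoid(W2 @ coordinate_forward(V))     # second-level sub-network output
    # Local adjustment of the second level, driven by the output error:
    delta2 = (Z - Y) * Y * (1.0 - Y)
    # Local adjustment of the first level, obtained through the coordinator H(V):
    delta1 = coordinate_backward(V, delta2)
    # Matrix weight coefficients are modified after each presentation of X and Z:
    W2 += eta * np.outer(delta2, V)
    W1 += eta * np.outer(delta1, X)
    return 0.5 * np.sum((Z - Y) ** 2)           # contribution to the global target Ψ

# Example: one on-line pass over a toy data set.
for X, Z in [(rng.random(n_in), rng.random(n_out)) for _ in range(100)]:
    online_step(X, Z)
```

The structural point this illustrates is that the two weight matrices W1 and W2 are adjusted by separate local procedures that communicate only through the coordinator's G(V) and H(V), and that both matrices are updated after every presentation of a learning pair (X, Z), as stated in the abstract.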

