German Conference on Artificial Intelligence

Sequential Learning Algorithm of Neural Networks Systems for Time Series



Abstract

This article describes a new structure for creating an RBF neural network that uses regression weights in place of the constant weights normally used. These regression weights are assumed to be functions of the input variables, which reduces the number of hidden units needed within an RBF neural network. A new type of nonlinear function is proposed: the pseudo-Gaussian function. With it, the neural system gains flexibility, as the neurons possess an activation field that does not have to be symmetric with respect to the centre or to the location of the neuron in the input space. In addition to this new structure, we propose a sequential learning algorithm that is able to adapt the structure of the network; it can create new hidden units and also detect and remove inactive ones. We present conditions for increasing and for decreasing the number of neurons, based respectively on the novelty of the data and on the overall behaviour of the neural system (for example, pruning the hidden units with the lowest relevance to the neural system using Orthogonal Least Squares (OLS) and other operators). The feasibility of the evolution and learning capability of the resulting algorithm is demonstrated by predicting time series.
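
To make the described structure concrete, below is a minimal sketch of an RBF network of the kind the abstract outlines, assuming a per-dimension pseudo-Gaussian activation (a different width on each side of the centre, so the field is asymmetric) and first-order regression weights w_i(x) = b_i0 + b_i . x. The class name RegressionWeightRBF and all parameter values are illustrative assumptions, not taken from the paper, and the sequential growing/pruning logic is not sketched here.

import numpy as np

def pseudo_gaussian(x, center, sigma_left, sigma_right):
    # Asymmetric "pseudo-Gaussian": a different width on each side of the
    # centre, so the activation field need not be symmetric (assumed form).
    sigma = np.where(x < center, sigma_left, sigma_right)
    return np.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

class RegressionWeightRBF:
    # RBF network whose output weights are linear functions of the input
    # (regression weights) rather than constants.
    def __init__(self, centers, sigma_left, sigma_right, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = np.asarray(centers, dtype=float)         # shape (m, d)
        self.sigma_left = np.asarray(sigma_left, dtype=float)   # shape (m, d)
        self.sigma_right = np.asarray(sigma_right, dtype=float) # shape (m, d)
        m, d = self.centers.shape
        self.b0 = rng.normal(scale=0.1, size=m)      # bias term of w_i(x)
        self.B = rng.normal(scale=0.1, size=(m, d))  # slope terms of w_i(x)

    def hidden(self, x):
        # Per-unit activation: product of per-dimension pseudo-Gaussians.
        phi = pseudo_gaussian(x[None, :], self.centers,
                              self.sigma_left, self.sigma_right)
        return phi.prod(axis=1)                      # shape (m,)

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        phi = self.hidden(x)
        w = self.b0 + self.B @ x                     # regression weights w_i(x)
        return float(np.dot(w, phi))

# Example: two hidden units on a 3-lag time-series window; the input is a
# window of past samples and the output a one-step-ahead prediction.
net = RegressionWeightRBF(centers=[[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]],
                          sigma_left=[[0.5] * 3, [0.8] * 3],
                          sigma_right=[[1.5] * 3, [0.8] * 3])
print(net.predict([0.2, 0.4, 0.6]))
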
