IEEE Transactions on Neural Networks

Parallel, self-organizing, hierarchical neural networks with continuous inputs and outputs

Abstract

Parallel, self-organizing, hierarchical neural networks (PSHNNs) are multistage networks whose stages operate in parallel rather than in series during testing. Each stage can be any particular type of network. Previous PSHNNs assumed quantized, for example binary, outputs. A new type of PSHNN is discussed in which the outputs are allowed to be continuous-valued. The performance of the resulting networks is tested on the problem of predicting speech signal samples from past samples. Three types of networks are described, in which the stages are trained by the delta rule, sequential least-squares, and the backpropagation (BP) algorithm, respectively. In all cases studied, the new networks achieve better performance than linear prediction. A revised BP algorithm is discussed for learning input nonlinearities. When the BP algorithm is used, better performance is achieved by replacing a single BP network with a PSHNN of equal overall complexity in which each stage is a BP network of smaller complexity than the single network.
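To make the staged structure concrete, the following is a minimal sketch in Python/NumPy of a continuous-output PSHNN applied to the sample-prediction task described in the abstract. The delta-rule stage training and the parallel-at-test organization follow the abstract; the residual-style combination of stage outputs, the tanh input nonlinearity, and all hyperparameters are illustrative assumptions, not the paper's exact construction.

# A minimal sketch of a continuous-output PSHNN, assuming NumPy only.
import numpy as np

def delta_rule_fit(X, y, lr=0.01, epochs=30):
    """Train one linear stage with the delta (LMS) rule: w += lr * err * x."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, y):
            w += lr * (t - w @ x) * x
    return w

def train_pshnn(X, y, n_stages=3):
    """Stages are trained in series: each stage fits the error left by the
    previous stages, seen through a nonlinear transform of the input."""
    stages, Xs, resid = [], X, y.astype(float)
    for _ in range(n_stages):
        w = delta_rule_fit(Xs, resid)
        stages.append(w)
        resid = resid - Xs @ w   # error handed to the next stage
        Xs = np.tanh(Xs)         # fixed input nonlinearity (an assumption)
    return stages

def predict_pshnn(stages, X):
    """Testing: each stage's contribution depends only on the raw input, so
    all stages can be evaluated in parallel; their outputs are summed."""
    Xs, out = X, np.zeros(len(X))
    for w in stages:
        out += Xs @ w
        Xs = np.tanh(Xs)
    return out

# Usage: predict each sample of a toy signal from its past p samples.
rng = np.random.default_rng(0)
t = np.arange(500)
s = np.sin(0.05 * t) + 0.3 * np.sin(0.21 * t) + 0.05 * rng.standard_normal(500)
p = 8
X = np.stack([s[i:i + p] for i in range(len(s) - p)])
y = s[p:]
stages = train_pshnn(X, y)
print("MSE:", np.mean((predict_pshnn(stages, X) - y) ** 2))

To mirror the BP-based variant the abstract mentions, delta_rule_fit could be replaced by a small BP-trained network per stage, keeping the same staged training and parallel testing structure.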

